How to optimize website load time in Ruby?

Updated: 2023-12-04 22:59:16



I'm a newbie creating a lightweight photo showcase site written on the CakePHP framework with RoR. I plan to use effects from the script.aculo.us library, as well as jQuery, for photo display and transition effects.

As the site will be very photo-rich, what programming steps can I take to ensure that all photos and other pages load quickly?

i think you're mixing things up a bit. ror mixed with php/cake?

so, about performance. it mostly depends on how many users you think you'll have, who those users are and what they do. 10 per hour or 100 per second? do they look at an image for a long time or are they rapidly hopping from page to page?

here are some tips that are not too technical: no server configuration tuning, no memcached and so on. start thinking about performance with common sense. it's not the holy grail!

  • is your site/application too slow? most often, that's not the case. it never hurts to speed it up, but often, people care about performance too much. always remember: it's not about being fast, it's about being fast enough. nobody notices some extra milliseconds. a speedup of 50% is noticeable if your page needs a second to load, but mostly irrelevant if it takes only 100ms.

  • to find out if your site is slow, benchmark it. there are a lot of ways to do this; one is automated, like ab (apache benchmark), which simulates lots of users connecting to your site and gives you a nice summary of how long it took to respond. the other is: just use the site yourself, and not on the local network! if it feels too slow, then do something.

  • a photo showcase heavily depends on the images. images are big. so make sure your server has enough bandwidth to deliver them fast.

  • if you scale the images (that's very probable), don't resize the image on every page request, cache the scaled image instead! cache the thumbnails too. cache everything. processing and delivering a static file is a lot cheaper than constantly redoing all the processing (see the sketch after this list).

  • think about the quality of the image. is fast delivery more important than high image quality? play around with the image size - better compression means lower file size, lower quality and faster delivery.

  • think about usability. if there is no thumbnails page, people have to navigate through your library sequentially, looking at a lot of photos they don't want to see. if they can already see the images as thumbnails, they can jump straight to those that matter (lowering bandwidth usage and requests per second). think about flickr: the size of the images shown ... they're like stamps - 500 pixels wide, and people are still happy. if they need a bigger version, they click on the "all sizes" link anyway.

  • tricks, tricks, tricks: earlier, when users surfed with modems, sometimes a low-resolution/high-compression image was transferred first, so the user had something to look at after a short amount of time. only after that first image was loaded did the bigger version start loading. it's not common anymore, because today most users have broadband, so sending an additional image is just additional workload.

  • think about the audience. are they gonna visit your site with 14.4k modems or broadband? are they used to slow loading sites (photographers probably are)? check your statistics to find out about them.

  • your backend scripting language is most probably not your problem. php is not really fast, ruby is not really fast - compared to, say, c or java or ocaml. frameworks are slower than hand-crafted, optimized code. profile your code to see where the slow parts are. my guess? image resizing and database access. that won't change by switching to another language or micro-optimizing your code.
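
To make the resize-and-cache bullet above concrete, here is a minimal Ruby sketch. It assumes ImageMagick's `convert` command is installed; the cache directory, geometry and quality setting are illustrative choices, not part of the original answer.

```ruby
require "digest"
require "fileutils"

CACHE_DIR = "cache/thumbs" # illustrative location

# Return the path of a cached, resized copy of +source+, creating it on first use.
def cached_thumbnail(source, geometry = "500x500>")
  FileUtils.mkdir_p(CACHE_DIR)
  key  = Digest::SHA1.hexdigest("#{source}-#{geometry}-#{File.mtime(source).to_i}")
  dest = File.join(CACHE_DIR, "#{key}.jpg")

  unless File.exist?(dest)
    # ImageMagick geometry: the trailing ">" means "only shrink, never enlarge";
    # -quality trades file size against image quality (see the quality bullet above)
    system("convert", source, "-resize", geometry, "-quality", "82", dest) or
      raise "convert failed for #{source}"
  end
  dest
end

# Link or serve the cached file instead of resizing on every request.
puts cached_thumbnail("photos/beach.jpg")
```

The same idea carries over to Rails or CakePHP: resize once, write the result somewhere static (disk, S3, a CDN), and let the web server deliver it without touching the framework.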

regarding the speed of websites

there are a lot of factors involved. some of them are:

  1. server-side processing: is your application fast, is your hardware fast?

  2. delivery: how fast are requests and files transferred from the client to the server and vice versa? (depends on bandwidth)

  3. client-side rendering: how fast is their browser, how much work has to be done?

  4. user habits: does the client even need speed? sometimes, slow pages are no problem, e.g. if users spend a long time on a page without clicking around. think about flash game sites: if you spend an hour playing a flash game, you probably won't even notice whether the page loaded in 3 or 5 seconds.

the perceived speed - a mixture of all four - is the important metric.

if you've confirmed your site really is too slow, be sure to optimize the right part. optimizing the server-side scripts is useless if the server is fast enough but the page takes ages to render in the browser. no need to optimize rendering time if your bandwidth is clogged.
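
To see which part is actually slow (the benchmarking and profiling advice above), Ruby's standard Benchmark module is often enough for a first impression. A rough sketch, assuming your app is running locally on port 3000; the URL and the timed block are placeholders:

```ruby
require "benchmark"
require "net/http"

url = URI("http://localhost:3000/gallery") # placeholder URL

# Time a whole request as the client sees it.
total = Benchmark.realtime do
  response = Net::HTTP.get_response(url)
  puts "status #{response.code}, #{response.body.bytesize} bytes"
end
puts format("whole request: %.0f ms", total * 1000)

# The same wrapper works around any suspect block inside the app,
# e.g. the image-resizing call or a database query.
elapsed = Benchmark.realtime { sleep 0.05 } # stand-in for the real work
puts format("suspect block: %.0f ms", elapsed * 1000)
```

For real load numbers, a tool like ab (e.g. `ab -n 500 -c 10 http://yoursite/`) is still the better choice, since it also measures behaviour under concurrency.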

regarding optimization

performance is an integral part when building an application. if you really need something fast, you have to plan for speed from the very beginning. if it's not designed for speed, effective optimization is often not possible.

that's not really true for web apps all the time, because they easily scale horizontally, meaning: throw hardware at it.
all things cost money, and if money is important to you (or your boss), don't forget about it. how much do two weeks of optimising an application cost? say, optimising costs you (or your boss) X € (i'm european) in salary. now, think about buying another server: that costs Y € (including setup). if Y < X, just buy the server and you'll be fine.
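
A back-of-the-envelope version of that comparison, with made-up numbers (the hourly rate and server price below are pure placeholders):

```ruby
# Is two weeks of optimisation cheaper than simply buying another server?
hourly_rate       = 60.0                    # € per developer hour (made up)
optimisation_cost = 2 * 5 * 8 * hourly_rate # 2 weeks x 5 days x 8 hours
server_cost       = 2_500.0                 # hardware + setup (made up)

puts optimisation_cost # => 4800.0
puts server_cost < optimisation_cost ? "buy the server" : "optimise the code"
```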

random buzzwords

last but not least i'll throw some random (unordered) buzzwords at you, maybe there is something that might help. just google, that should help ...

content delivery networks, (intel) SSDs, sprites (combining images to save requests), page compression (gzip, deflate), memcached, APC (bytecode cache for PHP), minifying and merging of multiple CSS and JS files, conscious handling of HTTP status codes (304 Not Modified), separation of static and dynamic content (different servers & domains), step-by-step loading via AJAX (important content first), ...
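
One of those buzzwords made concrete: page compression. With a Rack-based Ruby app (Rails, Sinatra or plain Rack), a single middleware enables gzip/deflate for clients that ask for it. A minimal config.ru sketch, assuming the rack gem is installed; the app itself is just a placeholder:

```ruby
# config.ru -- run with `rackup`
require "rack"

use Rack::Deflater # compresses responses when the client sends Accept-Encoding: gzip

app = lambda do |env|
  body = "<html><body>#{'photo gallery ' * 200}</body></html>"
  [200, { "content-type" => "text/html" }, [body]]
end

run app
```

Images themselves are usually already compressed (JPEG/PNG), so gzip mostly pays off for HTML, CSS and JS; memcached, APC and the other items on the list address different bottlenecks.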

now i'm out of ideas.

edit/update

things/techniques i forgot:

  • implement a progress bar or something comparable, so users at least feel something's going on. you can't show a real progress bar when working only with javascript, but you can at least show some kind of animated hourglass or clock. if you use flash, you can show an actual progress bar.

  • you can skip complete page reloads by working with AJAX or flash - just load the data you need. you often see this implemented in flash image galleries. just load the image and the description.

  • preloading: if users look at one image for an extended period of time, you can already start loading the next image, so it's browser-cached if the user continues.
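
The preloading idea works without Flash: while the current photo is on screen, emit a hint for the next one so the browser fetches it into its cache. A small ERB sketch with hypothetical file names:

```ruby
require "erb"

current_image = "photos/042.jpg" # hypothetical paths
next_image    = "photos/043.jpg"

template = ERB.new(<<~HTML)
  <img src="<%= current_image %>" alt="photo">
  <%# ask the browser to fetch the next image while the user is still looking at this one %>
  <link rel="prefetch" href="<%= next_image %>">
HTML

puts template.result(binding)
```

A line of client-side JavaScript (`new Image().src = nextUrl`) achieves the same effect.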

disclaimer

i never implemented performance critical apps (with 2 exceptions), so most of what i've written above is speculation and the experience of others. today you read stories about successful startups and how they coped (performance-wise) with going from 100 to a bazillion users a day, and how they used nifty tricks to solve all those problems all the time.
but is that going to happen to you? probably not. everyone talks about it, almost nobody really needs it (but i admit, it's still good to know).

my real world experience (yes, i like writing long answers):

once i did parts of a website with several thousand unique visitors a day, powered by a cms (typo3) and running on a single dedicated SAMP server (think of a used, decade-old solaris box, not GHz-class hardware!). you could search for flats, and the form told you how many results you'd get (e.g. 20-40m²: 400 hits, 30-60m²: 600 hits) by reloading an iframe ON-CLICK. it was very, very slow (but users still used it), with the server constantly at 100% load. it was my job to solve that problem.
what did i do? first, find out why it was so slow. my first guess was right, the on-click request also used typo3 (w/o caching, of course). by replacing this single action with a custom script that just queried the database directly, bypassing typo3, the problem was solved. load went down to almost nothing. took me about 2 hours.

the other project had about 1500 unique visitors a day, displaying data served by an oracle database with millions of rows and complicated joins that took forever (= several seconds) to run. i didn't have much experience optimizing oracle, but i knew: the database was updated only once or twice a week. my solution: i just cached the contents by writing the html to the filesystem. after each update (in the middle of the night) i cleared the cache and began rebuilding it. so, instead of expensive queries i had just cheap filesystem reads. problem solved.
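
That second example boils down to a file-based page cache. A simplified Ruby sketch of the idea (paths and the cache key are illustrative; the original project was not written in Ruby):

```ruby
require "fileutils"

CACHE_ROOT = "cache/pages" # illustrative path

# Serve a cached HTML file if it exists, otherwise build it once and store it.
def cached_page(key)
  path = File.join(CACHE_ROOT, "#{key}.html")
  return File.read(path) if File.exist?(path)

  html = yield # the expensive part: queries, joins, rendering ...
  FileUtils.mkdir_p(CACHE_ROOT)
  File.write(path, html)
  html
end

page = cached_page("overview") do
  "<h1>overview</h1>" # stand-in for the slow database query + rendering
end
puts page.bytesize

# After the weekly data import, wipe the cache and let it refill:
# FileUtils.rm_rf(CACHE_ROOT)
```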

both examples taught me that performance in web development is not rocket science. most of the time the solution is simple. and: there are other parts that are way more important 99% of the time: developer cost and security.