
Table of contents: SETI@Home hit parades, week 33
- DPC SETI@Home hit parade of 15 August 2010
- DPC SETI@Home hit parade of 16 August 2010
- DPC SETI@Home hit parade of 17 August 2010
- DPC SETI@Home hit parade of 18 August 2010 *
- DPC SETI@Home hit parade of 19 August 2010 *
- DPC SETI@Home hit parade of 20 August 2010
- DPC SETI@Home hit parade of 21 August 2010
What is SETI@home?
SETI@home is a scientific experiment that uses computers connected via the internet to search for extraterrestrial intelligence. You can take part by running a free computer program that downloads and analyzes radio telescope data.
IMPORTANT: weekly three-day server outage
Weekly Outage
Every Tuesday morning (Pacific time) we begin a 3 day data distribution outage to focus on science processing and development plus any needed systems maintenance. The upload/download servers will be offline during this time. The web site (including the forums) will only be offline during the database maintenance and backup portion of the outage.
3 Aug 2010 17:29:57 UTC
NEWS UPDATE
20 Aug 2010 15:18:54 UTC
We're starting with just the uploads for an hour or two. This was suggested on the forums as a way to minimize timeouts. Also, we need to clear some workunit storage space and moving completed work through quickly will do this.
As for the limits, I accidentally started with the ending limits last week! But it was OK, so that's where we'll start this week. I may raise the limits even more as the run progresses and we come off the peak.
19 Aug 2010 21:58:18 UTC
Hey gang. Another week slips by much faster than expected. Maybe it seemed fast because I've been lost in a land of javascript, php, broken web standards, pointless browser differences, and ultimately little final results. What's this all about? I'm working on some more fun features for the NTPCkr candidate public voting pages coming down the pike. For example, a way to easily zoom into these waterfall plots to closely inspect interference near candidates. There's some neat flash/javascript based graphic packages out there that sort of do this, but underneath the flashy good looks it's all clumsy and client side and can't handle the amount of data we're pushing out. So I'm rolling my own tools, after trying out another javascript based package that should have been plug and play but was more like just plug.
This should have been easy, but nothing works as expected on the WWW. It's becoming a major time sink, though I'm close to finishing one test example - which only works on Chrome. And Chrome does this terribly annoying thing of resizing images however it sees fit, with no option for (a) users to turn this off or (b) web designers to force a certain size. One general problem I have with the internet and all related technology is that there are way too many people who implement "practical" features with zero thought about design, and somehow even less consideration for the actual designers. I swear - I don't know how anybody does web development full time without stabbing themselves in the eye with a fork. It's like being a surgeon who only has access to a random pile of variably sized band aids. And you're asking yourself, "well how do I make an incision?" and the experts reply, "well, duh, you use the wrapping and make a papercut, you n00b!" Anyway...
Server-wise, the databases are playing nice this week thus far, and the mysql replica is working and caught up for the first time in a week. We had some issues with the upload storage just before the planned start of the outage on Tuesday. This is just one of those things that will go away as the server shuffles continue. Bob is working on getting Astropulse copied to its new server. I didn't have much time for any other upgrades beyond that, but have been helping Jeff brainstorm through the current NTPCkr performance issues. Oh yeah - he's running the show tomorrow and may try the "only let uploads through at first" trick for a couple hours upon opening the floodgates.
Hunh. Just noticed now our workunit storage server is quite full again. Well, other things are stored on that server, and I'm finding one of the causes of bloat is the db purge archives, which archive all workunit/result information from the mysql database as flat files before deleting it. If we didn't purge these rows from mysql we'd have billions of them by now, which would be impossible to deal with. At any rate, the only really useful information in these files is which participant worked on which result, which will come in handy when we need to figure out who gets to share our Nobel prize. So I guess I have some file parsing/management in my near future to whittle these 700GB of archives down to 10GB of user-to-result lists.
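A compaction pass like the one Matt describes could be sketched as a streaming filter over the archive files. This is only an illustration, not the actual SETI@home tooling: it assumes the archives are XML-ish flat files (as BOINC's db_purge writes) in which each result record carries `<id>` and `<userid>` fields, and the field names and record layout here are assumptions.

```python
import re

# Assumed record layout: one <result>...</result> block per archived result,
# containing <id> (result id) and <userid> (participant id) tags.
ID_RE = re.compile(r"<id>(\d+)</id>")
USERID_RE = re.compile(r"<userid>(\d+)</userid>")

def extract_pairs(lines):
    """Yield (userid, result_id) pairs from an archive, one record at a time.

    Streams line by line, so a multi-gigabyte archive never has to fit
    in memory; only the two ids of the current record are kept.
    """
    result_id = userid = None
    for line in lines:
        m = ID_RE.search(line)
        if m and result_id is None:   # first <id> in the record is the result id
            result_id = m.group(1)
        m = USERID_RE.search(line)
        if m:
            userid = m.group(1)
        if line.strip() == "</result>":
            if result_id and userid:
                yield userid, result_id
            result_id = userid = None  # reset for the next record

def compact_archive(in_path, out_path):
    """Reduce one archive file to a compact 'userid resultid' list."""
    with open(in_path, errors="replace") as src, open(out_path, "a") as dst:
        for uid, rid in extract_pairs(src):
            dst.write(f"{uid} {rid}\n")
```

Since each output line is just two ids instead of a full record, this is the kind of pass that could plausibly shrink 700GB of archives to a few GB of user-to-result lists.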
- Matt
[ Edited 47% by grass460 on 22-08-2010 12:36 ]