
Saturday, April 18, 2009

Downgrade plan for Windows 7 PCs


Anyone buying a PC with Windows 7 pre-installed will be able to swap it for XP or Vista.

Microsoft has confirmed that the licence conditions under which the software will be sold will allow people to downgrade.

The conditions will apply to both businesses that buy licences for Windows in bulk and consumers who get the operating system on a PC or laptop.

No firm date has been given for the release of Windows 7's final version.

New life

Downgrade rights are common in Microsoft licensing terms and conditions and customers who buy large volumes of Windows operating systems have always been able to roll back to previous versions.

Microsoft has twice granted Windows XP a reprieve to allow computer makers to get licences for it for far longer than was originally planned.

Windows XP, released to consumers in 2001, was also granted a lifeline to ensure that it could be used on so-called netbooks - cut-down net-capable laptops that are proving very popular.

At the same time, computer makers such as Dell and HP have been exploiting clauses in the licensing terms that let them roll back machines with Vista pre-installed to the older operating system.

The news comes as free mainstream support for Windows XP reaches its cut-off date. From 14 April, Windows XP and Office 2003 enter their "extended support" period.

This means the only updates and bug fixes these products will get will be to improve security.

Microsoft has said that the release candidate of Windows 7, which will be broadly similar to the final version, will be released in late May 2009. The final version is expected in January 2010.



Spam 'produces 17m tons of CO2'


A study into spam has blamed it for the use of more than 33bn kilowatt-hours of energy every year, enough to power more than 2.4m homes.

The Carbon Footprint of e-mail Spam report estimated that 62 trillion spam emails are sent globally every year.

This amounted to emissions of more than 17 million tons of CO2, the research by climate consultants ICF International and anti-virus firm McAfee found.

Searching for legitimate e-mail and deleting spam accounted for some 80% of that energy.

The study found that the average business user generates 131kg of CO2 every year, of which 22% is related to spam.
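For a sense of scale, the report's headline figures can be cross-checked with some simple arithmetic. The sketch below uses only the numbers quoted above; the derived per-message and per-kilowatt-hour values are rough back-of-the-envelope estimates, not figures from the report.

```python
# Rough cross-check of the report's headline figures (values quoted above).
spam_per_year = 62e12      # spam e-mails sent globally per year
energy_kwh = 33e9          # kilowatt-hours attributed to spam per year
co2_tonnes = 17e6          # tonnes of CO2 attributed to spam per year

energy_per_spam_wh = energy_kwh * 1000 / spam_per_year   # watt-hours per message
implied_kg_per_kwh = co2_tonnes * 1000 / energy_kwh      # kg of CO2 per kWh

user_total_kg = 131        # average business user's e-mail CO2 per year
spam_share = 0.22          # fraction of that attributed to spam

print(f"Energy per spam message:    {energy_per_spam_wh:.2f} Wh")          # ~0.53 Wh
print(f"Implied carbon intensity:   {implied_kg_per_kwh:.2f} kg CO2/kWh")  # ~0.52 kg/kWh
print(f"Spam CO2 per business user: {user_total_kg * spam_share:.0f} kg/year")  # ~29 kg
```

In other words, each spam message costs only about half a watt-hour; it is the sheer volume that drives the total into the millions of tonnes.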

Unwanted traffic

ICF says that spam filtering would reduce unwanted spam by 75%, the equivalent of taking 2.3 million cars off the road.

However, the ICF goes on to say that while spam filtering is effective in reducing energy waste, fighting it at the source is far better.

The report highlights the case of McColo, a US web hosting firm that had ties to spammers. The day after it was taken offline by its two internet service providers, global spam volume fell by 70%.

Although the respite was only temporary, McAfee said the "day without spam amounted to taking 2.2 million cars off the road" and that tackling spam should be part of the campaign to reduce carbon emissions.

Richi Jennings - an independent spam analyst who helped produce the report - told the BBC that the figures were based on the extra energy use spent dealing with spam.

"The PC on our desks uses more power when they do work, so the numbers are based on the additional work they use when dealing with spam," he said.

The Spam Report follows only a few days after Symantec's bi-annual Internet Security Threat report, which found that spam had increased by 192%, with bot networks responsible for approximately 90% of all spam e-mail.

Mr Jennings said that while McAfee and Symantec had different ways of measuring spam, he was in total agreement with the bot network figure.

"Our report was based on mail that spammers attempt to send, including ones that are blocked by an ISP at source. Symantec only measures spam that is successfully sent.

"The vast majority of spam is sent via botnets. We've got Conficker building a fantastic network and you can bet your bottom dollar that it will wind up being used to send spam.

"There is speculation that the botnet Conficker is building up is owned and run by the owners of another active botnet - Waledac, itself probably connected to the classic Storm botnet - and the theory is that the owners are keeping their powder dry at the moment and will activate it once Waledac goes down."



Monday, April 6, 2009

OnLive games service 'will work'


The founder of online streaming games firm OnLive has defended the technology underpinning the service after accusations it was unworkable.

Steve Perlman said critics had not even used the system.

OnLive turns games into video data sent across the net to a hardware add-on, or software plug-in, which decompresses the data back into video.

The firm says a revolutionary video compression algorithm and custom silicon make it possible.

OnLive has been in development for the last seven years and has signed up content partners, including EA, Ubisoft, Take2, Eidos, Atari, Codemasters, Epic and THQ.

The subscription service will feature games such as Burnout, Fear 2, Tomb Raider: Underworld and Crysis: Warhead.


Mr Perlman, who led early development of the QuickTime video technology while at Apple, told BBC News: "We have nine of the largest game publishers in the world signed up.

"They have spent several years in some cases actually going and reviewing our technology before allowing us to associate with their company names and allowing us to have access to their first-tier franchises."

The service has raised eyebrows in some quarters given the difficulties of encoding High Definition video in near real time at servers in data centres, and streaming it over the open internet to a user.

Delivering real-time streaming game play is seen by some as an insurmountable problem, even before factoring in the necessity of sending back telemetry from a game controller across the net to the data centre.

"We are not doing video encoding in the conventional sense," explained Mr Perlman, dismissing an article in gaming website Eurogamer that said the service was unworkable.

"It's a very ignorant article," said Mr Perlman, who said Eurogamer had conflated issues of frame rate and latency.

"They are independent factors," he said.

OnLive has said it has created a video compression algorithm designed specifically for video games that can encode and compress video into data in about one millisecond.


A custom-built silicon chip designed by OnLive does the actual encoding calculations at the server end, as well as the decompression at the gamer end, inside a cheap hardware add-on.

Mr Perlman said it had taken "tens of thousands" of man hours to develop the algorithm.

He said: "First of all it was a postage stamp size screen with no latency over the internet. It looked like the silliest kind of game because the screen size was smaller than a cell phone but nonetheless there was no lag.

"We were running Quake actually - or micro quake as we called it. It was very unimpressive to anyone apart from an engineer."


After years spent refining the technology OnLive has said it was able to make the video window bigger and bigger until achieving a resolution of 1280 by 720 at 60 frames per second.

Technologists contacted by BBC News said that that level and speed of video encoding would not be "beyond the bounds of credibility" but would require custom hardware.

The algorithm was developed on dual quad core Xeon processors, which cost thousands of pounds, but OnLive has said it has distilled it down so it can run on a custom chip which costs "under 20 bucks to make".

Mr Perlman said the chip was "high performance for video compression", running at less than 100MHz clock speed and drawing about two watts of power.

"We can make millions of these things. Because of the economy there is plenty of excess capacity in fabrication plants."

Mr Perlman said OnLive had already ordered a "very large batch".

He said the OnLive experience was almost as good as sitting in front of a console and playing a game.

"The algorithm is not perfect. You will sometimes see little artefacts on the screen. Video compression is part science and part art.

Net imperfections

"Every time you present new material to it, you will see something it does not compress so well. We note those and correct the algorithm."

Mr Perlman said the algorithm had been designed with the imperfections of the internet in mind.

"Rather than fighting against the internet... and dropped, delayed or out of order packets we designed an algorithm that deals with these characteristics.

"Every compression algorithm leaves something out. It's about figuring out what kind of stuff you drop out."

OnLive said a broadband connection of 5Mbps will be fast enough for high definition gaming, while 1.5Mbps will be sufficient for standard definition.
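As a rough illustration of what those bit rates imply, the arithmetic below assumes uncompressed 24-bit colour frames at the 1280 by 720, 60 frames-per-second target quoted earlier; the figures are illustrative estimates, not OnLive specifications.

```python
# Illustrative per-frame budget implied by the quoted streaming figures.
bitrate_bps = 5_000_000    # quoted HD streaming rate (5Mbps)
fps = 60                   # quoted frame rate
width, height = 1280, 720  # quoted resolution
bits_per_pixel = 24        # assumed uncompressed 8-bit RGB

budget_bits = bitrate_bps / fps               # bits available per frame on average
raw_bits = width * height * bits_per_pixel    # size of an uncompressed frame

print(f"Average per-frame budget: {budget_bits / 8 / 1024:.1f} KiB")    # ~10 KiB
print(f"Raw frame size:           {raw_bits / 8 / 1024 ** 2:.1f} MiB")  # ~2.6 MiB
print(f"Implied compression:      ~{raw_bits / budget_bits:.0f}:1")     # ~265:1
```

In practice a codec spends that budget unevenly across frames, but the ratio gives a sense of how aggressive the compression needs to be.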

At those speeds, and with a data centre no further than 1,000 miles from any gamer in the US, the inevitable latency as data physically travels across the network is within tolerable limits, said Mr Perlman.

The MicroConsole connects the TV to the internet

OnLive currently has two data centres in the US running a beta version of the service. In order to minimise lag across the country when the commercial service goes live at the end of 2009, the company has said it will need five data centres.

"The round trip latency from pushing a button on a controller and it going up to the server and back down, and you seeing something change on screen should be less than 80 milliseconds.

"We usually see something between 35 and 40 milliseconds."

The games themselves will be running on "off the shelf motherboards" at the data centres.

The company has calculated that each server will be dealing with about 10 different gamers, because of the varying demands games have on hardware.

"Most games run fine on dual core processors. What you really want is a high performance graphics processor unit," said Mr Perlman.

He said that while work continued on refining the algorithm the bulk of the technical work had been completed.

A wider beta test begins this summer and feedback from the testing will be used to refine the service.



EA 'dumps DRM' for next Sims game


Electronic Arts have confirmed that the next version of The Sims will be free of Digital Rights Management (DRM).

The firm came in for considerable criticism last year, when the copy protection limited users to three installations of the game Spore.

The Sims division head, Rod Humble, said the game would use traditional serial code copy protection as "this is a good, time-proven solution".

DRM was introduced to combat game piracy but proved unpopular with users.

"The game will have disc-based copy protection - there is a serial code, just like The Sims 2," said Mr Humble in a blog posting.

"To play the game there will not be any online authentication needed."

"We feel like this is a good, time-proven solution, that makes it easy for you to play the game without DRM methods that feel overly invasive or leave you concerned about authorization server access in the distant future," he added.

Fighting pirates

The issue of software piracy is one that has dogged the games industry since its inception.

One of the earliest attempts - and still the most popular with users - is a serial code check. Users have to enter a code, made up of numbers and letters, printed on the back of the game manual before installation of the game can complete.
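A check of this kind is usually just a local validity test, for example a checksum over the code's characters. The sketch below is purely hypothetical and is not EA's scheme; it only illustrates why such codes can be verified offline and, by the same token, why they are easy to reproduce once the rule is known.

```python
# Hypothetical serial-code check: the last character is a checksum over the rest.
# This illustrates the general idea only; real schemes differ.
ALPHABET = "ABCDEFGHJKLMNPQRSTUVWXYZ23456789"   # ambiguous characters omitted

def checksum(body: str) -> str:
    """Return the check character for a code body."""
    total = sum(ALPHABET.index(c) for c in body)
    return ALPHABET[total % len(ALPHABET)]

def is_valid(code: str) -> bool:
    """Accept a 16-character code whose final character matches the checksum."""
    code = code.replace("-", "").upper()
    if len(code) != 16 or any(c not in ALPHABET for c in code):
        return False
    return code[-1] == checksum(code[:-1])

body = "ABCD2345EFGH678"            # 15 characters; the 16th is the checksum
code = body + checksum(body)
print(code, is_valid(code))         # prints the generated code and True
```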

Other copy protection methods include CD checks, dongles and DRM.

The problem for software developers is that hackers usually crack the copy protection system within a few days of release.

The issue came to a head in 2008 when EA released Will Wright's Spore. The SecuROM DRM restricted users to a maximum of three installs and required online verification before the game could be played.


But despite the DRM, Spore was cracked within 24 hours of release and consumers felt they were being penalised for buying a legitimate copy of the game, rather than downloading a hacked version.

"It's such a shame that the distributor of the game treats its own customers as criminals and attempts to do their best to prevent you from actually playing the game," one user wrote on Amazon.com.

Speaking to the BBC, Tiffany Steckler, a spokesperson for EA, said a final decision on the future of DRM for the company has yet to be made.

"There is always going to be a level of protection for games and this solution [DRM free] is right for The Sims 3.

"How these things roll out in the future will be down to the developers and we will make announcements in due course."

Fighting back

But developers may be making progress on solutions that obviate the need for DRM.

At this year's Game Developers Conference in San Francisco, Valve - the developers behind the Half-Life series - unveiled a new set of features for its Steamworks platform, saying its distribution system had "made DRM obsolete".

Steam's new "custom executable generation" technology makes copies of the games for each user, meaning players can access their games on multiple machines without install limits.

The only restriction is that users need to log onto their account to actually play.


Google sees voice search as core


Google has said it sees voice search as a major opportunity for the company in building a presence on the mobile web.

The company's vice president of engineering made the comments during a wide-ranging discussion at the Web 2.0 Expo in San Francisco.

"We believe voice search is a new form of search and that it is core to our business," said Vic Gundotra.

SearchEngineLand editor Greg Sterling agreed: "If done right, it could be a valuable strategic feature for Google."

Mr Gundotra acknowledged to the audience that "voice recognition in the early days was a nice trick but not very usable".

There were early complaints that Google's offering could not understand accents other than American and that results were often garbled.

"Look how far we have come. I get the advantage of looking at daily voice queries coming in and it's amazing. It's working. It's reached a tipping point. It's growing and growing very, very fast and we are thrilled about it," said Mr Gundotra.

He declined to share figures about just how many queries the company deals with via voice search.

However, Mr Gundotra did say: "It's one of those technologies we think gets better with usage.

"We launched it on the iPhone and have seen a 15% jump in accuracy because, as more people use it, we collect more data and our accuracy gets better."

'Queen's English'

In 2002, Google Labs introduced a service that allowed users to search with a simple phone call. The company admitted it "wasn't very useful because the results were displayed on your computer", and Google discontinued it.

Six years later, the search giant introduced an improved feature under the Google Mobile App for the iPhone.


It is also available on the Android based T-Mobile G1 and last month was introduced on the BlackBerry as a free download. The New York Times's Gadgetwise blog rated the BlackBerry version the "App of the Week" earlier this week.

Early iterations that worked best with North American accents had problems understanding other accents, including British. BBC technology correspondent Rory Cellan-Jones reported in November last year that his attempts to use it were "pure gibberish".

For example, his query about the next train from West Ealing to Paddington "delivered some useful information about 'neck strain' - but no train times".

Those problems have since been largely ironed out and Google said it was continuing to work on improving the accuracy of the service. This, Mr Sterling said, is crucial if the company wants the service to give it an edge in the marketplace.

"My view is voice search could be a strategic differentiator if it works well. It depends on how much better Google's system is compared to, say, Yahoo's or Microsoft's.

"If they come up with a really great version that is really accurate, it could retain users and likely increase search usage for Google," said Mr Sterling.

"Stay tuned"

At Web 2.0, Mr Gundotra also talked about a web-based version of Google's e-mail service, Gmail.


He demonstrated a "technical prototype" on the iPhone and the G1 and said "Stay tuned" for a release date.

Mr Gundotra said the prototype used HTML 5, an as-yet incomplete version of the markup language of the world wide web.

He revealed that Google would create a whole suite of offline apps using HTML 5 and that "we are going to be leaders in taking advantage of HTML 5".

Mr Gundotra also said that engineers were working hard to bring the Chrome browser to the Mac and that while there was no date for delivery, "we are making progress to get it out as fast as we can".

Twitter purchase

During a question and answer session, Mr Gundotra was quizzed on rumours circulating in the blogosphere that Google is looking to buy the micro-blogging service Twitter.


"I'm a big fan of Twitter but we don't as a policy comment on rumour or speculation," he said.

Meanwhile, Twitter co-founder Biz Stone has said that he has been "flooded with requests for a response to the latest internet speculation about where Twitter is headed".

In a blog post entitled Sometimes We Talk, Mr Stone wrote: "It should come as no surprise that Twitter engages in discussions with other companies regularly and on a variety of subjects.

"Our goal is to build a profitable, independent company and we're just getting started."

