
Thursday, August 27, 2015

The incredible shrinking Firefox faces endangered species status


Desktop browser continues to bleed user share; combined desktop + mobile share falls under 10%

Mozilla's Firefox is in danger of making the endangered species list for browsers.

Just two weeks after Mozilla's top Firefox executive said that rumors of its demise were "dead wrong," the iconic browser dropped another three-tenths of a percentage point in analytics firm Net Applications' tracking, ending February with 11.6%.

That was Firefox's lowest share since July 2006, when the browser had been in the market for less than two years.

Firefox 1.0 was released in November 2004, at a time when Microsoft's Internet Explorer (IE) had a stranglehold on the browser space, having driven Netscape -- Firefox's forerunner -- out of the market. Mozilla has been credited with restarting browser development, which had been moribund under IE.

But Firefox has fallen on hard times.

In the last 12 months, Firefox's user share -- an estimate of the portion of all those who reach the Internet via a desktop browser -- has plummeted by 34%. Since cresting at 25.1% in April 2010, Firefox has lost 13.5 percentage points, or 54% of its peak share.

At Firefox's 12-month average rate of decline, Mozilla's desktop browser will slip under the 10% bar in June, joining other third-tier applications like Apple's Safari (with just a 4.8% user share in February) and Opera Software's Opera (1.1%). If the trend continues, Firefox on the desktop could drop below 8% as soon as October.
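The projection above can be reproduced by simple linear extrapolation of the quoted figures (a sketch, not Net Applications' methodology; the year-earlier share is implied by the 34% annual decline):

```python
# Linear extrapolation of Firefox's desktop share, using only the article's
# figures: 11.6% in February 2015, down 34% over the preceding 12 months.
start = 11.6 / (1 - 0.34)           # implied share a year earlier, ~17.6%
monthly_drop = (start - 11.6) / 12  # average loss, ~0.5 points per month

share, months = 11.6, 0
while share >= 10.0:
    share -= monthly_drop
    months += 1

print(months)  # 4: four months after February, i.e. June, share dips under 10%
```

Running the same loop down to the 8% threshold takes eight months from February, i.e. October, matching the article's second projection.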

The numbers for Firefox were even worse when both the desktop and mobile data are combined.

Firefox's total user share -- an amalgamation of desktop and mobile -- was 9.5% for February, its lowest level since Computerworld began tracking the metric nearly six years ago, and 3.4 percentage points lower than in July 2014, the last time Computerworld analyzed the data.

Mozilla faces a double whammy: Its flagship desktop browser continues to bleed share, while the company has been unable to attract a significant mobile audience. Although the company has long offered Firefox on Android and its Firefox OS has landed on an increasing number of low-end smartphone makers' devices, its February mobile share was less than seven-tenths of one percent, about a quarter of the share of the next-smallest mobile browser, Microsoft's IE.

Apple, which had long trailed Mozilla in desktop + mobile browser user share, has leapfrogged its rival because of Firefox's decline: Safari on desktop and mobile had a cumulative 11.8% user share, down half a point from July 2014. More than two-thirds of Apple's total was credited to Safari on iOS.

Google has been the biggest beneficiary of the losses suffered by Mozilla and, to a lesser extent, Apple, adding to its lead over both in February. Last month, it had a combined desktop/mobile user share of 27.6%, 5 percentage points higher than seven months ago.

Together, the aged stock Android browser and its replacement, Chrome, accounted for 41.5% of all mobile browsers by Net Applications' count. Google's pair remained behind Apple's Safari on mobile, but has narrowed the gap.

Two weeks ago, Johnathan Nightingale, vice president of Firefox, argued that the browser had a "fierce momentum," citing Mozilla's internal data. Nightingale said that January's desktop download numbers had been "the best they've been in years" and claimed that Mozilla's own numbers showed a tick upward that had not yet been confirmed by third-party measurements.

Neither Net Applications' data nor that from StatCounter, an Irish analytics vendor, supported Nightingale's contention. According to StatCounter, which measures usage share -- how active each browser's users are on the Web -- Firefox on the desktop stood at 18.2% in February, down half a percentage point from the month prior.

Mozilla, of course, remains committed to Firefox. Last month, Mozilla's CEO Chris Beard announced that the company had combined its cloud services group with the one responsible for Firefox. "We have been exploring how we can integrate client software on desktops and mobile with cloud service approaches to evolve what Firefox can do for people," Beard said.

Like Nightingale, Beard asserted that Firefox was in good shape. "In the last year, Firefox turned a corner. We achieved positive growth again and dramatically reset our global search strategy," he said, referring to the move late in 2014 when Mozilla dropped Google as its global search partner and signed a five-year deal with Yahoo to make its search engine the default for Firefox in the U.S.

But third-party measurements -- the only data available, since browser makers don't regularly disclose active user counts -- do not back up Beard's claim that Firefox experienced "positive growth" in 2014.

Mozilla has also said it will develop an iOS version of Firefox that will run on Apple's iPhone and iPad, but the project has not yet produced a browser suitable for public testing.

(Chart: Browsers' desktop and mobile share for February 2015. Data: Net Applications)

Mozilla now stands in fourth place, with less than 10% of the global user share of the combined desktop and mobile market.

MIT proves flash is as fast as RAM, and cheaper, for big data

40 servers using 10TB of RAM were no faster than 20 using 20TB of flash

When it comes to high-speed data processing, RAM has always been the go-to memory for computers because it's tens of thousands of times faster than disk drives and many times faster than NAND flash.

Researchers at MIT, however, have built a server network that proves that, for big data applications, flash is just as fast as RAM and vastly cheaper.

In the age of big data, where massive data sets are used to uncover the purchasing trends of millions of people or predict financial market trends based on millions of data points, a single computer's RAM won't do.

For example, processing a single human genome requires the combined memory of between 40 and 100 typical computers.

NAND flash is about a tenth as expensive as RAM, and it also consumes about a tenth as much power. So at the International Symposium on Computer Architecture last month, MIT researchers revealed a new system demonstrating that, for these workloads, flash memory is just as efficient as conventional RAM while cutting power and hardware costs.

"Say, we need to purchase a system [to] process a dataset that is 10TBs large. To process it in DRAM, we would need a cluster of about 100 computers, assuming servers with 100GB of DRAM," Arvind Mithal, the Johnson Professor of Computer Science and Engineering at MIT, said in an email reply to Computerworld. "Such a cluster will cost around $400K to build."

In the researchers' prototype, each server was connected to a field-programmable gate array, or FPGA, a kind of chip that can be reprogrammed to mimic different types of electrical circuits. Each FPGA, in turn, was connected to two 500GB flash chips and to the two FPGAs nearest it in the server rack.

Networked together, the FPGAs became a fast network that allowed any server to retrieve data from any flash drive. The FPGAs also controlled the flash drives.

Arvind, as Mithal typically goes by, said to process the same 10TB dataset in flash, only 10 computers -- each with 1TB of flash storage -- would be needed. Even including the cost of FPGA-based accelerator hardware, the total price of the system would be less than $70,000 or so, he said.

"This price may go down even further if we consider the fact we don't need as much DRAM on each server on a flash based system," Arvind said. "If we use a lower-end server with less DRAM, the system will cost around $40K."
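Arvind's comparison reduces to simple arithmetic on the quoted figures (per-server capacities and the two cluster prices come straight from the article; the server counts and ratio are derived):

```python
dataset_tb = 10.0

# DRAM route: servers with 100 GB (0.1 TB) of DRAM each, $400K total (quoted).
dram_servers = dataset_tb / 0.1        # 100 servers
dram_cost = 400_000

# Flash route: servers with 1 TB of flash each, roughly $70K total
# including the FPGA accelerator boards (quoted).
flash_servers = dataset_tb / 1.0       # 10 servers
flash_cost = 70_000

print(int(dram_servers), int(flash_servers))   # 100 10
print(round(dram_cost / flash_cost, 1))        # ~5.7x cheaper, before the
                                               # further DRAM savings quoted
```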

Maintaining a flash-based system is also much cheaper, he continued, because flash consumes much less power than DRAM, and also because it would require fewer servers. Even when the additional power consumption of flash and FPGA accelerators were factored in, MIT's server network prototype showed that the flash storage device added only about 10% power consumption to the whole system.

In fact, even without their new network configuration, the researchers showed that if servers working on a distributed computation use disk drives to retrieve data just 5% of the time, performance is the same as if they were using flash.

For example, 40 servers with 10TB of RAM could not handle a 10.5TB computation any faster than 20 servers with 20TB worth of flash memory. And, the flash would cost less and consume a fraction of the power.

The researchers were able to make a network of 20 flash-based servers competitive with a network of RAM-based servers by moving some of the computational power off the servers and onto the flash drives' controller chips.

The researchers used flash drives to preprocess some of the data before passing it back to the servers, increasing the efficiency of the distributed computation.

"This is not a replacement for DRAM [dynamic RAM] or anything like that," Arvind said.

Arvind performed the work with a group of graduate students and researchers at Quanta Computer. The research showed there are likely many applications that could replace RAM with flash and take advantage of a flash-based computer architecture's lower cost.

"Everybody's experimenting with different aspects of flash. We're just trying to establish another point in the design space," Arvind said.

Thursday, August 20, 2015

Capturing cell growth in 3-D

Spinout’s microfluidics device better models how cancer and other cells interact in the body.


Replicating how cancer and other cells interact in the body is somewhat difficult in the lab. Biologists generally culture one cell type in plastic plates, which doesn’t represent the dynamic cell interactions within living organisms.

Now MIT spinout AIM Biotech has developed a microfluidics device — based on years of research — that lets researchers co-culture multiple cell types in a 3-D hydrogel environment that mimics natural tissue.

Among other things, the device can help researchers better study biological processes, such as cancer metastasis, and more accurately capture how cancer cells react to chemotherapy agents, says AIM Biotech co-founder Roger Kamm, the Cecil H. Green Distinguished Professor in MIT’s departments of mechanical engineering and biological engineering.

“If you want realistic models of these processes, you have to go to a 3-D matrix, with multiple cell types … to see cell-to-cell contact and let cells signal to each other,” Kamm says. “None of those processes can be reproduced realistically in the current cell-culture methods.”

Designed originally for Kamm’s lab, the new commercial device is a plastic chip with three chambers: a middle chamber for hydrogel and any cell type, such as cancer cells or endothelial cells (which line blood vessels), and two side channels for culturing additional cell types. The hydrogel chamber has openings along each side, so cells can interact with each other, as they would in the body. Cancer drugs or other therapeutics can then be added to better monitor how cells respond in a patient.

Lab-fabricated devices have been used for various applications described in more than 40 research publications to date, including cancer and stem cell research, neuroscience, and studies of the circulatory system. This month, AIM Biotech will begin deploying the commercial devices to 47 research groups in 13 countries for user feedback.

Other systems for 3-D cell culturing involve filling deep dishes with hydrogels. Because of the distance these dishes must be kept from the microscope, Kamm says, it’s difficult to capture high-resolution images. AIM Biotech’s devices, on the other hand, he says, can be put directly under the microscope like a traditional plate, which is beneficial for imaging.

“Everything here happens within about 200 microns of the cover slip, so you can get really good high-resolution, real-time images and movies,” Kamm says.

Lab to world

In 2005 at MIT, Kamm’s lab created a prototype of the microfluidics device to better study angiogenesis — the forming of new blood vessels. But there was a major issue: The hydrogel in the middle chamber would spread into the side channels before solidifying, which disturbed the cell cultures.  

As a solution, the researchers lined the hydrogel chamber with minute posts. When injected, the hydrogel seeps out to the posts, but surface tension keeps it from leaking into the side channels, while still allowing the cells to enter. “That’s the key,” Kamm says. “When you put liquid into a small space, surface tension drives where it goes, so we decided to use surface tension to our advantage.”

Soon, Kamm was using the device in his lab: In a 2011 study, researchers in his group discovered that breast cancer cells can break free from tumors and travel against flows normally present inside the tissue; in a 2012 study, they found that macrophages — a type of white blood cell — were key in helping tumor cells break through blood vessels.

And in a 2013 study, Kamm was able to capture high-resolution videos of how the cells escape through minute holes in endothelial walls and travel through the body. “People try to do this in vivo, but you can’t possibly get the kind of resolution you can within a microfluidic system,” Kamm says.

Researchers worldwide began taking notice of the device, which led to several collaborations with researchers locally and in Singapore: The device’s development had been funded, in part, by the Singapore-MIT Alliance for Research and Technology (SMART).

“It became apparent that, if there’s this much interest in these systems and that much need for them, we should set up a company to develop the technology and market it,” Kamm says.

After securing seed funding from Draper Laboratory, the National Institutes of Health, and SMART, Kamm brought the idea for the device to Innovation Teams (i-Teams), where MIT students from across disciplines flesh out strategies for turning lab technologies into commercial products. Among other things, this experience helped Kamm home in on the product’s target market.

“At the time, [I was] trying to decide whether to go for researchers, go directly to pharmaceutical industry, or something that is useful in the clinic,” Kamm says. “One of the i-Teams’ recommendations was to develop systems for researchers. It reinforced what we were heading toward, but it was nice to get that confirmation.”

AIM Biotech launched in Singapore in 2012, under current CEO Kuan Chee Mun, whom Kamm met through SMART.

Fighting cancer

A major application for the device, Kamm says, is studying cancer metastasis — as demonstrated with his own work — to develop better treatments.

In the body, cells break loose from a tumor and migrate through tissue into the blood system, where they get stuck in the small blood vessels of a distant organ or adhere to vessel walls. Then they can escape from inside the vessel to form another tumor. AIM Biotech’s microfluidics device produces a similar microenvironment: When endothelial cells are seeded into the side channels or the central gel region, they form a 3-D network of vessels in the hydrogel. Tumor cells can be introduced, flowing naturally or getting stuck in the vessels.

Kamm says this environment could be useful in testing cancer drugs, as well as anti-angiogenesis compounds that prevent the development of blood vessels, effectively killing tumors by cutting off their blood supply. While many such treatments have shown limited success, “there’s a lot of interest in screening for new ones,” Kamm says.

In the future, Kamm adds, AIM Biotech may offer to more accurately screen cancer drugs for pharmaceutical companies. In fact, he says, AIM Biotech recently discovered that its devices revealed discrepancies in some clinically tested therapeutics.

In a study published in Integrative Biology, MIT researchers used Kamm's microfluidics technology to screen several drugs that aim to prevent tumors from breaking up and dispersing throughout the body. Results indicated that the level of drugs needed was often two orders of magnitude higher than predictions based on traditional assays. “So there’s no way to effectively predict, from the 2-D assays, what the efficacy of a particular drug was,” Kamm says.

If pharmaceutical companies were to winnow potential drugs from, say, 1,000 to 100 for testing, Kamm says, “We could test those drugs out in a more realistic setting.”

Real-time data for cancer therapy

Biochemical sensor implanted at initial biopsy could allow doctors to better monitor and adjust cancer treatments.


In the battle against cancer, which kills nearly 8 million people worldwide each year, doctors have in their arsenal many powerful weapons, including various forms of chemotherapy and radiation. What they lack, however, is good reconnaissance — a reliable way to obtain real-time data about how well a particular therapy is working for any given patient.

Magnetic resonance imaging and other scanning technologies can indicate the size of a tumor, while the most detailed information about how well a treatment is working comes from pathologists’ examinations of tissue taken in biopsies. Yet these methods offer only snapshots of tumor response, and the invasive nature of biopsies makes them a risky procedure that clinicians try to minimize.

Now, researchers at MIT’s Koch Institute for Integrative Cancer Research are closing that information gap by developing a tiny biochemical sensor that can be implanted in cancerous tissue during the initial biopsy. The sensor then wirelessly sends data about telltale biomarkers to an external “reader” device, allowing doctors to better monitor a patient’s progress and adjust dosages or switch therapies accordingly. Making cancer treatments more targeted and precise would boost their efficacy while reducing patients’ exposure to serious side effects.

“We wanted to make a device that would give us a chemical signal about what’s happening in the tumor,” says Michael Cima, the David H. Koch (1962) Professor in Engineering in the Department of Materials Science and Engineering and a Koch Institute investigator who oversaw the sensor’s development. “Rather than waiting months to see if the tumor is shrinking, you could get an early read to see if you’re moving in the right direction.”

Two MIT doctoral students in Cima’s lab worked with him on the sensor project: Vincent Liu, now a postdoc at MIT, and Christophoros Vassiliou, now a postdoc at the University of California at Berkeley. Their research is featured in a paper in the journal Lab on a Chip that has been published online.

Measurements without MRI

The sensors developed by Cima’s team provide real-time, on-demand data concerning two biomarkers linked to a tumor’s response to treatment: pH and dissolved oxygen.

As Cima explains, when cancerous tissue is under assault from chemotherapy agents, it becomes more acidic. “Many times, you can see the response chemically before you see a tumor actually shrink,” Cima says. In fact, some therapies will trigger an immune system reaction, and the inflammation will make the tumor appear to be growing, even while the therapy is effective.

Oxygen levels, meanwhile, can help doctors gauge the proper dose of a therapy such as radiation, since tumors thrive in low-oxygen (hypoxic) conditions. “It turns out that the more hypoxic the tumor is, the more radiation you need,” Cima says. “So, these sensors, read over time, could let you see how hypoxia was changing in the tumor, so you could adjust the radiation accordingly.”

The sensor housing, made of a biocompatible plastic, is small enough to fit into the tip of a biopsy needle. It contains 10 microliters of chemical contrast agents typically used for magnetic resonance imaging (MRI) and an on-board circuit to communicate with the external reader device.

Devising a power source for these sensors was critical, Cima explains. Four years ago, his team built a similar implantable sensor that could be read by an MRI scanner. “MRI scans are expensive and not easy to make part of routine care,” he says. “We wanted to take the next step and put some electronics on the device so we could take these measurements without an MRI.”

For power, these new sensors rely on the reader. Specifically, there’s a metal coil inside the reader and a much smaller coil in the sensor itself. An electric current magnetizes the coil inside the reader, and that magnetic field creates a voltage in the sensor’s coil when the two coils are close together — a process called mutual inductance. The reader sends out a series of pulses, and the sensor “rings back,” as Cima puts it. The variation in this return signal over time is interpreted by a computer to which the reader is wired, revealing changes in the targeted biomarkers.
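The power transfer described above follows the standard mutual-inductance relation V = M·dI/dt; with a sinusoidal reader current, the voltage induced in the sensor coil peaks at M·2πf·I_peak. The component values below are illustrative assumptions, not figures from the paper:

```python
import math

M = 1e-6        # mutual inductance between reader and sensor coils, H (assumed)
f = 1e6         # reader drive frequency, Hz (assumed)
I_peak = 0.1    # peak reader-coil current, A (assumed)

# For a reader current I(t) = I_peak * sin(2*pi*f*t), the voltage induced
# in the sensor coil, V = M * dI/dt, peaks at M * 2*pi*f * I_peak.
V_peak = M * 2 * math.pi * f * I_peak
print(round(V_peak, 3))  # 0.628 volts available to the sensor's circuit
```

The same coupling carries the data path: the reader's pulses ring the sensor's coil, and the decay of that return signal encodes the biomarker readings.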

“With these devices, it’s like taking blood pressure. It’s a simple measurement. You get the readout and move on,” says Ralph Weissleder, a radiologist and director of the Center for Systems Biology lab at Massachusetts General Hospital who is familiar with the research. “Whatever you can do right then and there without any complicated testing, the better it is.”

Additional applications

Cima’s team successfully tested the sensors in lab experiments, including implanting them in rodents. While the sensors were only implanted for a few weeks, Cima believes they could be used to monitor a person’s health over many years.

“There are thousands of people alive today because they have implantable electronics, like pacemakers and defibrillators,” he says. “We’re making these sensors out of materials that are in these kinds of long-term implants, and given that they’re so small, I don’t think there will be a problem.”

These initial experiments showed that the sensors could quickly, reliably, and accurately detect pH and oxygen concentration in tissue. The researchers next want to see how well the sensors do measuring changes in pH over an extended period of time.

“I want to push these probes so we can use them to monitor tumor response,” Cima says. “We did a little bit of that in these experiments, but we need to make that really robust.”

While the primary application of these sensors would be cancer care, Cima is also eager to collaborate with researchers in other fields, such as environmental science. “For example, you could use these to measure dissolved oxygen or pH from a lot of different sites all over a pond or a lake,” Cima says. “I’m excited about using these sensors to bring big data to environmental monitoring.”

Tuesday, August 4, 2015

Toward tiny, solar-powered sensors

June 22, 2015

Larry Hardesty | MIT News Office

New ultralow-power circuit improves efficiency of energy harvesting to more than 80 percent.

(Image: The MIT researchers' prototype for a chip measuring 3 millimeters by 3 millimeters. The magnified detail shows the chip's main control circuitry, including the startup electronics; the controller that determines whether to charge the battery, power a device, or both; and the array of switches that control current flow to an external inductor coil. This active area measures just 2.2 millimeters by 1.1 millimeters. Courtesy of the researchers.)


The latest buzz in the information technology industry regards “the Internet of things” — the idea that vehicles, appliances, civil-engineering structures, manufacturing equipment, and even livestock would have their own embedded sensors that report information directly to networked servers, aiding with maintenance and the coordination of tasks.

Realizing that vision, however, will require extremely low-power sensors that can run for months without battery changes — or, even better, that can extract energy from the environment to recharge.

Last week, at the Symposia on VLSI Technology and Circuits, MIT researchers presented a new power converter chip that can harvest more than 80 percent of the energy trickling into it, even at the extremely low power levels characteristic of tiny solar cells. Previous ultralow-power converters that used the same approach had efficiencies of only 40 or 50 percent.

Moreover, the researchers’ chip achieves those efficiency improvements while assuming additional responsibilities. Where most of its ultralow-power predecessors could use a solar cell to either charge a battery or directly power a device, this new chip can do both, and it can power the device directly from the battery.

All of those operations also share a single inductor — the chip’s main electrical component — which saves on circuit board space but increases the circuit complexity even further. Nonetheless, the chip’s power consumption remains low. “We still want to have battery-charging capability, and we still want to provide a regulated output voltage,” says Dina Reda El-Damak, an MIT graduate student in electrical engineering and computer science and first author on the new paper. “We need to regulate the input to extract the maximum power, and we really want to do all these tasks with inductor sharing and see which operational mode is the best. And we want to do it without compromising the performance, at very limited input power levels — 10 nanowatts to 1 microwatt — for the Internet of things.” The prototype chip was manufactured through the Taiwan Semiconductor Manufacturing Company's University Shuttle Program.

Ups and downs

The circuit’s chief function is to regulate the voltages between the solar cell, the battery, and the device the cell is powering. If the battery operates for too long at a voltage that’s either too high or too low, for instance, its chemical reactants break down, and it loses the ability to hold a charge. To control the current flow across their chip, El-Damak and her advisor, Anantha Chandrakasan, the Joseph F. and Nancy P. Keithley Professor in Electrical Engineering, use an inductor, which is a wire wound into a coil. When a current passes through an inductor, it generates a magnetic field, which in turn resists any change in the current.

Throwing switches in the inductor’s path causes it to alternately charge and discharge, so that the current flowing through it continuously ramps up and then drops back down to zero. Keeping a lid on the current improves the circuit’s efficiency, since the rate at which it dissipates energy as heat is proportional to the square of the current.
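The efficiency argument is Joule's law: resistive loss is I²R, so moving the same charge at a lower current over a longer time wastes less energy. A sketch with illustrative numbers (R is an assumed parasitic resistance, not a figure from the chip):

```python
R = 1.0  # parasitic series resistance, ohms (assumed)

def heat(current_a, duration_s):
    """Energy dissipated in R, in joules (Joule's law: I^2 * R * t)."""
    return current_a**2 * R * duration_s

# The same charge (1 coulomb) delivered either way, but half the current
# sustained for twice as long dissipates only half the energy.
print(heat(1.0, 1.0))   # 1.0 J at 1 A for 1 s
print(heat(0.5, 2.0))   # 0.5 J at 0.5 A for 2 s
```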

Once the current drops to zero, however, the switches in the inductor’s path need to be thrown immediately; otherwise, current could begin to flow through the circuit in the wrong direction, which would drastically diminish its efficiency. The complication is that the rate at which the current rises and falls depends on the voltage generated by the solar cell, which is highly variable. So the timing of the switch throws has to vary, too.
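The timing problem follows from the inductor law dI/dt = V/L: the time for the current to ramp back down to zero is t = L·I_peak/V, so a varying solar-cell voltage shifts the moment the switches must be thrown. The component values below are illustrative assumptions:

```python
L_coil = 10e-6   # inductance, henries (assumed)
I_peak = 0.1     # peak inductor current, amperes (assumed)

def ramp_down_time(v):
    """Seconds for the current to fall from I_peak to zero at voltage v."""
    return L_coil * I_peak / v

for v in (0.5, 1.0, 2.0):   # a plausible spread of cell voltages
    print(f"{v:.1f} V -> {ramp_down_time(v) * 1e6:.1f} microseconds")
```

A fourfold change in voltage moves the zero-crossing by a factor of four, which is why a fixed switch schedule cannot work.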

Electric hourglass

To control the switches’ timing, El-Damak and Chandrakasan use an electrical component called a capacitor, which can store electrical charge. The higher the current, the more rapidly the capacitor fills. When it’s full, the circuit stops charging the inductor.

The rate at which the current drops off, however, depends on the output voltage, whose regulation is the very purpose of the chip. Since that voltage is fixed, the variation in timing has to come from variation in capacitance. El-Damak and Chandrakasan thus equip their chip with a bank of capacitors of different sizes. As the current drops, it charges a subset of those capacitors, whose selection is determined by the solar cell’s voltage. Once again, when the capacitor fills, the switches in the inductor’s path are flipped.
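The capacitor acts as an electric hourglass because a constant current I charges a capacitance C to a threshold V_th in t = C·V_th/I; selecting a larger C from the bank stretches that window to match a slower inductor ramp. All values here are illustrative, not from the paper:

```python
V_TH = 1.0                           # comparator threshold, volts (assumed)
CAP_BANK = (1e-12, 2e-12, 4e-12)     # selectable capacitances, farads (assumed)

def fill_time(capacitance_f, current_a):
    """Seconds for a constant current to charge C from 0 V to V_TH."""
    return capacitance_f * V_TH / current_a

# A larger capacitor from the bank stretches the timing window
# proportionally, tracking a slower current ramp.
for c in CAP_BANK:
    print(f"{c * 1e12:.0f} pF at 1 uA -> {fill_time(c, 1e-6) * 1e6:.1f} us")
```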

“In this technology space, there’s usually a trend to lower efficiency as the power gets lower, because there’s a fixed amount of energy that’s consumed by doing the work,” says Brett Miwa, who leads a power conversion development project as a fellow at the chip manufacturer Maxim Integrated. “If you’re only coming in with a small amount, it’s hard to get most of it out, because you lose more as a percentage. [El-Damak’s] design is unusually efficient for how low a power level she’s at.”

“One of the things that’s most notable about it is that it’s really a fairly complete system,” he adds. “It’s really kind of a full system-on-a-chip for power management. And that makes it a little more complicated, a little bit larger, and a little bit more comprehensive than some of the other designs that might be reported in the literature. So for her to still achieve these high-performance specs in a much more sophisticated system is also noteworthy.”

Source: https://www.eecs.mit.edu/news-events/media/toward-tiny-solar-powered-sensors