“The point is this: No matter how fat a pipe you have to the Internet, at some given amount of data, it’s going to be faster, cheaper, or both to use some manual method to ship data on some storage medium.”
I wrote that in September, 2011, in what remains (if I do say so myself) a fascinating and true look at the challenges in transferring really, really big amounts of data (aka “calculating the bandwidth of a station wagon full of tapes hurtling down the highway,” with a side look at the Avian Carrier Protocol, or sending data by carrier pigeon).
Well, it’s time to update it — and, actually, there was something back from 2009 I failed to include.
Let’s say you’re setting up a cloud storage service. Okay. How do you fill it up in the first place? If you’ve got terabytes or petabytes of data, getting it over to the new cloud storage service is going to be a hassle. And a time-consuming hassle at that — 166 days to move a terabyte of data over DSL, according to Amazon.
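Amazon's 166-day figure is just size divided by sustained throughput; reproducing it implies a sustained DSL uplink of roughly 0.56 Mbps, which is my back-calculation, not a number Amazon published:

```python
def transfer_days(size_bytes, mbps):
    """Days to move size_bytes over a link sustaining mbps megabits/sec."""
    seconds = size_bytes * 8 / (mbps * 1_000_000)
    return seconds / 86_400

TB = 10**12
print(f"DSL:      {transfer_days(TB, 0.557):6.0f} days")  # ≈ 166 days
print(f"100 Mbps: {transfer_days(TB, 100):6.1f} days")    # under a day
```

Which is the point of the original post: past some data size, the station wagon (or the FedEx envelope) wins.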
It turns out that Amazon announced in 2009, and Google duplicated this week, a service that enables you to burn all your data to a hard disk, and simply mail it to a facility, where elves from Amazon or Google will upload the data for you. (No word on whether Google sends the disk drives back to you afterwards, though the form does ask for a return mailing address; Amazon specifies that it returns the drive.) Amazon’s service is called AWS Import/Export, while Google’s is called Offline Disk Import for Google Cloud Storage.
The companies each charge $80 per disk drive for this service, which means it behooves you to use the biggest disk drive you can rather than a whole lot of commodity disks (a la Backblaze). Amazon has an upper limit of 16 TB, though it indicates it might be able to handle bigger ones on request; Google’s form lists varying total size options, ending at more than 400 TB. Amazon also charges a $2.49 per hour upload fee; Google doesn’t mention such a charge. For the time being, at least, Google is limiting the service to U.S. users; while Amazon supports it in Singapore and Ireland as well as in several U.S. locations, the return address for the Ireland location must be in the European Union.
In addition, Google requires that you encrypt the data before sending. While this adds time and space and money, it adds an interesting wrinkle in terms of security. We now know that online services are regularly being monitored by the Feds, and are required to decrypt data based on court orders and other legal documents. But vendors such as Dropbox, which got busted for this (coincidentally) in 2011, pointed out that if the data was encrypted before it was uploaded to the service, there was nothing they could do if the Feds came calling. So if you’re newly worried about what the NSA might find in your data on the cloud, send it all encrypted on a disk drive, and Bob’s your uncle.
While Amazon’s 2009 announcement didn’t create a stampede of providers offering a similar service, it will be interesting to see, in light of Google’s announcement, whether other providers will follow suit – particularly after the NSA revelations. (One wonders if Google’s timing was a coincidence.)
Google said the feature was experimental, that it might “rapidly evolve the feature” in ways that could break backwards compatibility, and that it couldn’t guarantee quality of service.
Meanwhile, a number of people want to know whether the shipping-data-on-disk services will also work the other way, for quicker restores.
Whether you believe Edward Snowden is a traitor or a hero, one thing is clear: the federal government is still apparently clueless when it comes to thumb drive security.
Word is that Snowden — as well as Bradley Manning before him, three years ago — downloaded information onto a thumb drive that he’d smuggled in. “Apparently he’s got a thumb drive,” Sen. Saxby Chambliss (R-Ga.) said Tuesday in the New York Daily News. “He’s already exposed part of it and I guess he’s going to expose the rest of it.”
The thing is, what Snowden allegedly did could have been done just as easily by many other people. “You can walk out of a building with a Zip drive or a USB stick on the end of your keychain with all of the information that’s in that building and walk right out without sweating a bit or anybody noticing what you’re doing,” says Joel Brenner, former inspector for the National Security Agency, on NPR.
This is nothing new. As far back as Stuxnet, thumb drives have been implicated in all sorts of security issues, both bringing malware in and taking legitimate data out. A 2011 Ponemon study found all sorts of security issues around thumb drives.
Thumb drives have been banned from the Pentagon, including the NSA, since October, 2008, according to the LA Times. Oh goody. That should have solved the problem, because of course everyone obeys regulations, especially people who are about to blow the whistle on the country’s security agency, right?
Aside from the fact that there were always “exceptions” to the bans, especially for network administrators, look at it this way: there is no way that the NSA, or any other organization (including yours) is going to be able to keep people from smuggling in thumb drives. Even if they set off metal detectors (and I’m not sure they do), for someone dedicated enough, they’re going to find a way. (Let’s just say it’s a new meaning for “dark fiber” and leave it at that, shall we?)
“Investigators are now saying they know how many documents Snowden allegedly downloaded and what server they came from,” according to an official who would not be named while speaking about the ongoing investigation. Well, that’s very nice, but why didn’t they know that before he left the building?
“The federal government uses a variety of tools that could identify the activities of employees,” writes Eric Chabrow in BankInfoSecurity. “Those include keylogging software and computer logs that pinpoint staff members’ whereabouts and actions within federal IT systems and networks, sources familiar with the federal government’s security clearance systems say. But having the tools in place — and not all tools are used by all agencies at all times — doesn’t mean that the proper authorities are alerted in a timely manner to activities that could jeopardize the nation’s security.”
Chabrow went on to quote Robert Bigman, who retired last year after 15 years as the CIA’s CISO, who said the Defense Department and the intelligence community continually rejected the idea of using digital rights management tools to restrict access to specified content in order to secure intelligence reporting. “They need to re-evaluate that decision,” he says in the article. You think?
So the question is: what can you do to keep someone from using these smuggled-in thumb drives?
- Do your computers have functioning USB slots?
- Can someone plug something into one of these USB slots without being detected?
- To what sort of data do people have access?
- Can someone download that data onto a thumb drive without being detected?
- Is that downloaded data unencrypted?
If any of these things are true in your organization, you, too, are vulnerable. And whistleblower, traitor, or run-of-the-mill thief, it won’t make a difference.
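On the first two questions, one common partial mitigation is to de-authorize USB mass-storage devices at the operating-system level. Here is a minimal sketch for Linux hosts; the rule file name is my choice, and a real deployment would whitelist approved device serials rather than block everything (Windows shops would use Group Policy instead):

```python
# Sketch: generate a udev rule that de-authorizes any USB mass-storage
# device on hotplug, so a plugged-in thumb drive never mounts.
RULE = (
    'ACTION=="add", SUBSYSTEMS=="usb", '
    'DRIVERS=="usb-storage", ATTR{authorized}="0"\n'
)

def install_rule(path="/etc/udev/rules.d/99-block-usb-storage.rules"):
    """Write the rule; follow with `udevadm control --reload-rules`."""
    with open(path, "w") as f:
        f.write(RULE)

print(RULE, end="")
```

It's only a partial answer, of course; a determined insider with physical access has other options, which is rather the point of this post.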
Legality and ethics aside — which is something for the law and politics blogs to talk about — the interesting part about the alleged NSA tracking issue is the logistics involved. How much data is there? How did the NSA manage it? Where did they put it?
“The NSA is storing all those Verizon (and, presumably, other carrier records) in a massive database system called Accumulo, which it built itself (on top of Hadoop) a few years ago because there weren’t any other options suitable for its scale and requirements around stability or security,” claims Derrick Harris in GigaOm. “The NSA is currently storing tens of petabytes of data in Accumulo.”
With Accumulo, the NSA has said it can process a 4.4-trillion-node (callers), 70-trillion-edge (connections between two callers) graph, according to the NSA’s own slide presentation, Harris says. “By way of comparison, the graph behind Facebook’s Graph Search feature contains billions of nodes and trillions of edges.”
(And, in keeping with the Obama Administration’s effort to use open source tools, the NSA donated Accumulo to the Apache Foundation in 2011.)
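For scale, even a bare edge list at those counts runs to about a petabyte. The 16-bytes-per-edge figure below is my assumption (two 8-byte node IDs, no timestamps or other call metadata), which suggests the “tens of petabytes” in Accumulo is dominated by the metadata attached to each record rather than the graph structure itself:

```python
edges = 70e12           # 70 trillion connections between callers
bytes_per_edge = 16     # assumption: two 8-byte node IDs per edge

raw_pb = edges * bytes_per_edge / 1e15
print(f"~{raw_pb:.1f} PB for the bare edge list")  # ~1.1 PB
```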
While people like to think that the amount of data required to spy on a country just isn’t realistic, in practice it doesn’t take that much – particularly if the data stored is only traffic analysis, or the date, time, duration, and parties to a call, and not the content of the call itself. Researcher John Villasenor noted last summer that storage costs had dropped by a factor of a million since 1984. And the NSA has always been on the bleeding edge of such research.
“In the 1960s, the National Security Agency used rail cars to store magnetic tapes containing audio recordings and other material that the agency had collected but had never managed to examine, said James Bamford, an author of three books on the agency,” reported Scott Shane in the New York Times, about Villasenor’s work. “In those days, the agency used the I.B.M. 350 disk storage unit, bigger than a full-size refrigerator but with a capacity of 4.4 megabytes of data. Today, some flash drives that are small enough to put on a keychain hold a terabyte of data, about 227,000 times as much.”
And how many calls are we talking now? AT&T has 107.3 million wireless customers and 31.2 million landline customers; Verizon has 98.9 million wireless customers and 22.2 million landline customers; and Sprint has 55 million customers in total, according to the Wall Street Journal. Another Journal article went into more detail on the requirements.
“The task of storing and processing the metadata for all the calls in the U.S. is actually rather trivial, according to Jack Norris, chief marketing officer at MapR Technologies Inc., a company that provides commercial-grade services based on open source database technology such as Hadoop, originally developed by Google Inc. ‘This amount of data is easily analyzed on a MapR Hadoop cluster,’ Mr. Norris said in an email. He assumed, in his calculation, that there are 250 million teenagers and adults in the U.S., each making an average of 10 calls a day, or 2.5 billion calls in total. He also assumed that the average call data record is 2,000 bytes. That means all the call records take up five terabytes worth of storage.”
That would be per day. The storage costs, the Journal quoted Gartner analyst David Cappuccio as saying, would be $46.8 million — 20% less if open source technologies were used.
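Norris's bottom line of five terabytes implies a record size of about 2,000 bytes (at 2,000 kilobytes per record it would be five petabytes), so the calculation works out as:

```python
people = 250_000_000        # US teenagers and adults
calls_per_day = 10
bytes_per_record = 2_000    # implied size of one call detail record

calls = people * calls_per_day
daily_tb = calls * bytes_per_record / 1e12
print(f"{calls / 1e9:.1f} billion calls/day, {daily_tb:.0f} TB/day")
# 2.5 billion calls/day, 5 TB/day
```

At 5 TB a day, a year of metadata is under 2 PB, comfortably inside the “tens of petabytes” Accumulo is said to hold.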
Reportedly, the data is being stored and analyzed in a $2 billion NSA data center in Utah, code-named Bumblehive (a double allusion to Utah’s large LDS population, which uses the bee as its symbol – to the extent that state highway road signs are beehive-shaped). The 1 million sq. ft. facility is thought to use flash storage for improved performance. Some reports said the center was due to be completed in March of this year, others this fall, while others said it could be as far away as 2016.
Interestingly, this isn’t the first time the NSA Utah data center has come up. The Salt Lake City Tribune reported on the data center as long ago as July, 2009. “The enormous building, which will have a footprint about three times the size of the Utah State Capitol building, will be constructed on a 200-acre site near the Utah National Guard facility’s runway.”
Numerous other sites have reported on the progress of the Utah data center over the years since then, along with data centers from vendors such as Oracle and eBay being built there – not to mention Twitter, which would be convenient for NSA monitoring of its data. (And hey! The NSA data center is reportedly compliant with the silver-level Leadership in Energy and Environmental Design specifications in sustainable development!) In fact, many states and small cities love such data center developments for the construction jobs (as many as 10,000 for the NSA site) and other economic development benefits they bring.
According to the Tribune, the Baltimore Sun had reported in 2006 that the NSA was forced to move outside the Beltway area because it had maxed out the local power grid. The Utah data center will reportedly use up to 65 megawatts of power — or as much as the entire city of Salt Lake City itself. The facility also asked to be annexed into the nearby city of Bluffdale to ensure it would have an adequate water supply for cooling the computers.
In fact, the state of Utah passed a law earlier this year that would enable it to add a new 6% tax to the power used, which could raise up to $2.4 million annually on the expected power costs of $40 million. The NSA is apparently quite put out at the potential additional charges.
Courts have been ruling one way or the other in the past few years on whether someone accused of a crime must reveal to law enforcement the encryption key protecting potentially incriminating information. Now we actually have a case where the judge reversed himself.
In April, Judge William Callahan ruled that Jeffrey Feldman — a Wisconsin software engineer accused of possessing child pornography, and who had 16 storage devices, nine of which were encrypted — did not have to reveal his encryption key, saying it would violate his Fifth Amendment right against self-incrimination.
But this week, Judge Callahan reversed himself. His original ruling had been based on there not being enough evidence tying Feldman to child pornography or the disk drives in question. However, prosecutors were able to decrypt one of the drives — out of a total of almost 20 TB of storage — and reportedly found some 700,000 child pornography files, along with enough personal information about Feldman to tie him to the disk. This was enough to persuade the judge to change his ruling.
This is part of a continuing process where courts are trying to figure out what an encryption key is, legally speaking. Is it a physical thing, like a key to a lockbox, which is not protected by the Fifth Amendment? Or is it like the combination to a safe — the “expression of the contents of an individual’s mind” — which is protected? In some countries, people have even been jailed for refusing to reveal an encryption key.
This case, like most of the other ones regarding revealing encryption keys, has to do with child pornography, which adds another nuance to the issue. Are law enforcement and the legal profession more likely to push the envelope of legal search because they so badly want to catch child pornographers? Or because they think people will be less likely to criticize their methods because the crime is so heinous? (Or as Mike Wheatley put it in his blog, Silicon Angle, about the original case, “Data Encryption Makes Perverts Untouchable.”)
“That’s also the whole point of the Bill of Rights: ‘mere suspicion’ is not enough to let the government search your premises and invade your privacy; the government needs actual evidence of wrongdoing before it can interfere with your life,” countered Jennifer Abel, in the Daily Dot, about the April case. “Nowhere in the text of the U.S. Constitution does it say ‘All rights listed herein may be suspended, if cops suspect you did something really really bad.'”
Legal experts expect the issue will eventually be decided by the Supreme Court.
There’s a new arms race, which all started a week ago when Yahoo! announced that users of its Flickr online photo-sharing service would get 1 terabyte of storage each, free.
The mind reels.
“To provide a perspective on that terabyte” — which, it claims, is 70 times what competitors offer — “it represents the equivalent, Yahoo says, of 537,731 6.5-megapixel images,” writes Edward Baig in USA Today. “Indeed if you took a photo every hour every day, it would take you more than 61 years to fill that space, Yahoo says. Moreover, you can upload and download your pictures in full resolution and share them, Yahoo claims, without any loss of quality. You can upload videos of up to three minutes in length as well.”
(According to Marketing Land, Yahoo! CEO Marissa Mayer said that if all the photos in the world were uploaded to Flickr today at full resolution, they would take up just a tenth of Flickr’s allotted storage space.)
I know I’m not the only one who found it a good time to blow the dust off a long-unused Flickr account (with one photographer noting he’d never had as much Flickr engagement as he had the three days after the announcement). Like many people, one of the biggest data hogs on my home storage is photos — and not just photos, but duplicates, backups, backups of duplicates, and duplicates of backups. The 5- and 10-gigabyte freebies offered by online storage services such as Google Drive and Box wouldn’t make a dent and would just exacerbate the multiple repository problem.
(According to USA Today, Flickr has 87 million users. At a terabyte each? 87 million terabytes? That’s 83 exabytes. Where’s Yahoo! getting all that storage if everyone actually takes them up on it? And how much of its $5 billion cash hoard will go to provide it? I’d love to see some followup from them with all the geekly details.)
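Yahoo's arithmetic is easy to verify. The roughly 2 MB average file size for a 6.5-megapixel JPEG is implied by the image count rather than stated, and I'm assuming Flickr counts its terabyte in binary units:

```python
TB = 2**40                      # assuming a binary terabyte
images = 537_731                # Yahoo's claimed capacity in 6.5 MP images
avg_image_bytes = TB / images   # implied average file size
years_at_one_per_hour = images / (24 * 365.25)

users = 87_000_000
total_eb = users * TB / 2**60   # exabytes if every user filled 1 TB

print(f"{avg_image_bytes / 1e6:.1f} MB per image")      # 2.0 MB
print(f"{years_at_one_per_hour:.0f} years to fill it")  # 61 years
print(f"{total_eb:.0f} EB if everyone maxes out")       # 83 EB
```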
And the 1TB offer seems to have gone a long way to assuage the fears of Flickr users — and, again, I’m one — that Yahoo! would unceremoniously pull the rug out from under the service, which it acquired in 2005, as it did with Delicious. Granted, that’s still a possibility — and would be even more difficult to deal with as a 1TB system — but few people are talking about that now. (In fact, there’s some interesting maybe-we-were-too-hasty backtracking going on.) It’s also likely to drive a new wave of innovation as people develop add-ons, links, and other tweaks to Flickr.
Even with Google rejiggering its free online storage to give people 15GB shared among all its services, all the attention is being focused now on Flickr. (Google charges $49.99 for a terabyte.) Expect a number of online storage providers to match Yahoo!’s offer — and, if they don’t, to face some hard questions about why not.
It isn’t the first time that the Apple-Samsung case has come up in an electronic discovery context, but in the bravo-for-life’s-little-ironies department, Google was criticized by a judge for saying search is too hard.
“U.S. Magistrate Judge Paul S. Grewal in San Jose, Calif., ordered Google within two days to disclose what terms it’s using to find documents Apple has requested in pretrial information sharing, and to tell Apple which Google employees those documents came from,” writes Bloomberg Businessweek. “Google had argued the collection of information would be too burdensome.”
“The court cannot help but note the irony that Google, a pioneer in searching the Internet, is arguing that it would be unduly burdened by producing a list of how it searched its own files,” Grewal wrote in a footnote to his order.
Apple took the step of asking the judge to intervene because it believed the search terms Google was using in response to Apple’s document production requests weren’t inclusive enough, and so left things out.
“Apple believes Google purposely uses suboptimal search terms,” writes the FossPatents blog. “For example, Apple claims to know that Google uses a different term internally for what Apple calls ‘slide to unlock.’ As a result, searches for ‘slide to unlock’ wouldn’t deliver too many documents in which Google employees discussed this patented technology.”
(Why is Google involved in the Samsung case in the first place? Because all of the products said to be infringing run the Android operating system that Google developed.)
Apple was also criticized by the judge for not being more cooperative, such as by telling Google what documents it thought were missing. As you may recall, the 2006 rules for electronic discovery require the different sides in a legal suit to work together and agree on how they will search for documents.
There is, of course, more to the story.
“Warren, a lawyer for Google who is also representing Samsung, explained to the judge that turning over the requested information Apple is seeking could lead to ‘future discovery that we don’t think they’re entitled to’ and give the company ‘ideas about how to proceed that they wouldn’t have had,'” writes BGR.
Good story though.
It’s ironic that EMC’s EMC World party is featuring Bruno Mars, whose most recent hit, “When I Was Your Man,” features the singer lamenting the loss of his girlfriend because he took her for granted. Billed as the “Customer Appreciation Event,” the party is the culmination of almost a week in Las Vegas where EMC is attempting to demonstrate to its customers that it does, too, notice that other things are going on in the storage industry besides bigger and bigger, more expensive boxes to hold data.
EMC is sort of the IBM of the storage industry — big, not necessarily terribly exciting or innovative, but continuing to be a major player because, remember big? Just like IBM can suddenly decide to make a particular technology front-page news by throwing a billion dollars at it, like it did with flash a few weeks ago, EMC can make a big deal about storage virtualization, software-defined storage, mobile, cloud, and so on simply by virtue of being EMC, even though other storage vendors have been doing it for years.
There are other places to read about the specific announcements so I won’t go into them, other than to observe that EMC is saying you will be able to use them to have your own Facebook-like data center. Except the whole point of Facebook’s data center storage is that it uses commodity hardware, and if you’re using commodity hardware, then what do you need EMC for anyway? I know, I know, it’s a metaphor, never mind.
Befitting the conference’s theme of “transformation,” EMC seemed to be spending an awful lot of time explaining the various reorganizations it’s had over the past few years, starting when CEO Joe Tucci decided he was going to retire, then changed his mind, followed by a lot of musical chairs between EMC and VMware, and culminating in the recent announcement of Pivotal, which rearranges yet more pieces of EMC and VMware.
At the same time, the company also spent a lot of time talking about the “third platform” — a conglomeration of mobile, big data, cloud, and so on, after the first platform of mainframes and the second platform of client/server. After all, if EMC can make mobile and the cloud sound like just another generational version of mainframes, it sounds more like they’ll continue to be the logical alternative, right?
And of course EMC is going to do all it can to promote big data. Like Cowboy Curtis, who knows that “big feet” means “big boots,” EMC knows that big data means big hardware to put it on, and nobody does it bigger than EMC.
Ironically, this was all happening against a backdrop of EMC announcing it was laying off more than 1,000 people, with VMware laying off another 800. The company said it was always doing this and that by the end of the year it would actually have more people than it started with. Okay. But seriously? After all the investment in hiring and training those people, the company sees no other way but to do a forklift upgrade of its employees?
On second thought, for EMC, maybe that isn’t so surprising after all.
In any event, EMC has to at least go through the motions of being up on what users are interested in, lest it sound too much like another Bruno Mars number, “The Lazy Song” [mildly NSFW]:
Today I don’t feel like doing anything
I just wanna lay in my bed
Don’t feel like picking up my phone, so leave a message at the tone
‘Cause today I swear I’m not doing anything
I’ll be lounging on the couch just chilling in my Snuggie
Click to MTV so they can teach me how to dougie
‘Cause in my castle I’m the freaking man
In another case of governments behaving badly with personal data, the state of Utah has learned that a data breach a year ago is likely to be even more costly than originally estimated – and that’s after the initial estimate was itself increased by almost 30 times.
“In late March 2012, hackers broke into a Medicaid server that a technician had placed online without changing the factory password and downloaded the personal information of 780,000 Utahns,” writes the Salt Lake City Tribune. (To put that in perspective, that’s one out of every six Utahns.) “Some were on Medicaid, but also affected were the privately insured, uninsured and retirees on Medicare whose providers had sent their data to Medicaid in the hopes of billing the low-income program.” Of those, 280,000 people had their Social Security numbers exposed, which puts them at particular risk.
Initially, it was thought that only 24,000 people had had their information put at risk. Stephen Fletcher, executive director of the state’s Department of Technology Services, lost his job over the incident.
“Utah’s Medicaid Management Information System, which receives eligibility inquiries and billing information from providers, was not protected by a firewall as it was upgrading on March 10, when hackers in Eastern Europe first gained access to the state server,” wrote the Deseret News last May. “That server was also installed by an independent contractor more than a year ago, which is not typical protocol for the department, [new DTS director Mark] VanOrden said. A process to ensure that new servers are monitored and a risk assessment performed prior to use was not followed, and factory-issued default passwords were still in effect on the server, which is also not ‘routine.’ The final ‘mistake,’ he said, is that information stayed on the server for too long and while it was there, it was not encrypted, leaving it vulnerable to hackers who began downloading the sensitive information March 30.”
A year later, the state is now saying that the damage is estimated to be $9 million, with $3.4 million coming from the department. It includes $467,000 to hire an ombudsman, staff a hotline, run ads and hold community meetings to notify victims; $1.9 million to provide two years of credit monitoring for those whose Social Security numbers were compromised; $741,000 on a legal consultant and forensic security audit; and $300,000 to create an Office of Health Information and Data Security. The state also spent $1.2 million on a review of state servers and $4.4 million to increase security, according to the Associated Press.
In addition, state residents and businesses face potential fraud of up to $406 million, according to new estimates from Javelin Strategy & Research, which examined the Utah breach. “Based on Javelin’s calculations, 122,000 cases of fraud will occur as a result of this breach, with each incident resulting in $3,327.87 of loss,” wrote the company – which admittedly has a vested interest in making the case look as bad as possible. “Each Utahn whose info is misused as a result of this data theft will incur $770.49 in out of pocket costs and spend 20 hours resolving these cases.” The company estimates that victims of data theft now have a 1 in 4 chance – up from 1 in 9 – of having their information used fraudulently.
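The reported totals hold up when you sum the line items quoted above:

```python
# Department of Health line items: ombudsman/hotline/ads, credit
# monitoring, legal consultant and audit, new data security office
department = [467_000, 1_900_000, 741_000, 300_000]
# Statewide: server review, additional security spending (per the AP)
statewide = [1_200_000, 4_400_000]

dept_total = sum(department)
grand_total = dept_total + sum(statewide)
projected_fraud = 122_000 * 3_327.87   # Javelin's fraud estimate

print(f"${dept_total / 1e6:.1f}M from the department")   # $3.4M
print(f"${grand_total / 1e6:.1f}M total")                # $9.0M
print(f"${projected_fraud / 1e6:.0f}M projected fraud")  # $406M
```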
Unfortunately, this is not uncommon. “According to information posted by the Privacy Rights Clearinghouse, of the 203 data breaches reported so far this year in the US, 103 involved either government or healthcare information,” Mary Jander of Internet Evolution wrote last year. “Of that subset, 16 cases were the result of hacking.”
As in a similar case in South Carolina last fall, Utah said it didn’t encrypt the data because the federal government didn’t require it. After the South Carolina incident, politicians from the Republican party – normally the party of small government that is against federal mandates – called for the federal government to require encryption of PII by state governments, apparently not trusting state governments to connect the dots themselves. Like South Carolina, Utah is also a Republican state, but thus far its politicians have limited themselves to a state bill that requires more notifications – but not encryption.
Jingming Zhang is one unlucky SOB. After five years of research, as he was working on the thesis required for his PhD in chemistry from Rutgers University, the laptop containing all of his data was reportedly stolen from an unlocked lab in the college.
Zhang wrote a note and put up flyers about the theft, which was picked up by ABC News and which a friend of his posted to his Facebook page, and which was then posted to Reddit and many other websites beyond that. He offered $1000 to the thieves for the data, telling them exactly where on the disk they could find it, giving them the password, and telling them they could keep the computer already; he just wanted to graduate.
Now, in honor of the “Everything Wrong With … in X Minutes” CinemaSins YouTube movie spoofs (and they’re hysterical), here’s everything wrong with this story.
- “Zhang’s laptop had been in an unlocked room in Wright-Rieman, which houses laboratories.” People can walk into Rutgers University lab rooms and walk out with laptops? Doesn’t campus security worry about thieves stealing other equipment, student records, dangerous chemicals, and so on?
- “Rutgers is an open campus,” said [Rutgers Police Lt. Paul ] Fischer. “It’s not like a small liberal arts college where it’s gated in. So, even if the buildings are secured, people can piggyback in.” This is the reaction of the security guy, whose job it supposedly is to keep the campus secure? Oh well, people can walk in and take things?
- Campus security doesn’t have security cameras, even in laboratories where people are working with chemicals and on laptops?
- Does Rutgers really want their security guy on national television telling everyone how easy it is to steal things from the campus?
- Just how many things get stolen from Rutgers, anyway?
- If it’s so easy to steal things from Rutgers, wouldn’t it be a good idea for the campus police to tell this to the students, before students lose five years of research?
- “Fischer said that he wouldn’t suggest offering monetary rewards in the future” because it can invite fraud. Okay. What should the student have done differently (other than your barn-door suggestion that he hang on to his laptop next time)? Can’t he get the student to withdraw the reward if it’s such a bad idea?
- Is the Rutgers security guy working with this student to ensure he doesn’t agree to meet someone, get bopped on the head, and also be out $1000? Or to otherwise protect him from fraud?
- Does the Rutgers security guy think that having the theft nationally publicized on ABC News is a smart move? And on Facebook? And on Reddit?
- Shouldn’t the Rutgers security guy suggest to Facebook that maybe it would be a good idea to redact the student’s personal information from the posting, which has more than 33,000 shares?
- Is the Rutgers security guy maybe checking Craigslist? And eBay?
- Doesn’t the chemistry department have a server to which students can save their data? Hell, I went to Boise State and we had that.
- If it’s this easy to steal things from campus, and there’s no provision for students to back up their data on campus, and nobody warns students their work is that vulnerable, and the student may have to start his research over, doesn’t he have the basis of a nice lawsuit?
- Just what sort of chemical research is this student doing, anyway? Do we need to worry about a new kind of poison gas or IED springing up in New Jersey?
- How competitive is the chemistry research program at Rutgers? Is it possible the thief is someone in his department who’s fighting with him for grants or something?
- What are the chances that the student isn’t actually ready for his thesis defense and this is his way of procrastinating until the laptop is “found”?
- This student’s been going to Rutgers for five years and he didn’t know the buildings are insecure?
- “…from where his computer was taken sometime between 10 a.m. and 5:15 p.m.” This student leaves his laptop unattended in an unlocked room from 10 a.m. to 5:15 p.m. and is surprised that it’s gone? Are we sure that Lost & Found didn’t pick it up?
- We’ve got a student smart enough to be getting a PhD in chemistry but not smart enough to keep from leaving his laptop in an unlocked room?
- Or to copy his data to a DVD?
- To a thumb drive?
- To a cloud storage service?
- To an external hard disk?
- To email it to himself?
- To do a backup? “A lot of people are asking me why I didn’t back up my data,” Jim told the Daily Dot. “I think the reason is that I am pretty busy recently and this kind of thing never happened to me before.”
- “The posters contained very specific instructions and details regarding his dilemma, including his laptop’s password.” Well, that certainly makes it easier for the thieves to use the laptop.
- Where is the student getting the $1000, anyway? And how did he come up with that figure?
- The posts also included his phone number. If the thieves even wanted to call, would they be able to make it through the blizzard of harassing phone calls he must be getting by now?
- He has also suffered several scamming attempts. “There are a few people sending me messages saying they have my laptop and asking for money, but when I asked for proof, they cannot give anything to me,” he said. You think?
- Really, should this student even be allowed to be messing with chemicals in the first place?
- Does the student think that the thief is stupid enough to show up to a meeting to exchange the data and money?
- Or to pick it up at a mailbox?
- How exactly does the student think this is going to work? The thief will send him the data and trust him to send the money? He’ll send the money and trust the thief to send him the data? The thief will hand him the data and hang around while he checks it?
- Even if he gets the data back, how is he going to know that the thief didn’t change some of the data just to mess with him?
- How many backup companies are offering to pay all the student’s expenses in return for his doing an ad for them?
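Every backup option in that list amounts to the same thing: keep a second copy somewhere else. As a minimal sketch (the paths and function name here are hypothetical, and this is a bare mirror, not a versioned backup tool), even a few lines of Python pointed at a thumb drive or an external disk would have done it:

```python
# Minimal backup sketch: mirror a research folder to a second location.
# Point dst at a thumb drive, an external disk, or a locally synced
# cloud-storage folder.
import shutil
from pathlib import Path

def mirror(src: str, dst: str) -> int:
    """Copy every file under src into dst, preserving the directory
    layout; returns the number of files copied."""
    src_root, dst_root = Path(src), Path(dst)
    copied = 0
    for f in src_root.rglob("*"):
        if f.is_file():
            target = dst_root / f.relative_to(src_root)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 keeps timestamps too
            copied += 1
    return copied
```

Run something like that on a schedule, or at least before leaving a laptop unattended in an unlocked room all day, and five years of research survives the theft.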
You know how every time you go to a new doctor, you have to sign this form (does anybody read it?) that talks about your rights to privacy for your medical records? Vendors of medical services have their own requirements to live up to, and Box has announced that it is complying with those regulations, in hopes that it will become more widely used as a file transfer medium in the healthcare industry.
“Compliance with the Health Insurance Portability and Accountability Act means that Box provides file redundancy to prevent data loss in a disaster, restrictions on employees’ access to documents, a breach-notification policy, data encryption and other features,” writes GigaOm’s Jordan Novet.
In addition, the company now has ten new healthcare applications. Box is doing this by partnering with a number of other vendors. According to Jasmine Pennic at HIT Consultant Media, those applications are:
- Clinical documentation: drchrono, a cloud and web-based EHR application accessible from iPads and iPhones; and Umbie DentalCare, a dental care web-based practice management system for dentists available on the desktop and tablet.
- Care coordination: TigerText, an encrypted SaaS platform for secure text messaging in a clinical setting; Doximity, an online professional network designed for U.S. physicians; Medigram, a secure group messaging app for the hospital environment; and PostureScreen Mobile, posture analysis screening and evaluation software for mobile devices.
- Interoperability: MedViewer, a DICOM viewer for viewing, communicating and sharing medical images on iPhone and iPad; iPaxera PACS Viewer, a PACS viewing app designed for iPad, iPhone and iPod; and Medi-Copy, which provides Release of Information (ROI) request services and creates electronic copies of patient medical records.
- Access to care: HealthTap, which provides users with personalized health information and free online and mobile answers from physicians.
Box is also supporting the requirements of the Health Information Technology for Economic and Clinical Health (HITECH) Act, and is investing in drchrono.
Compliance requirements include the following, writes Patrick Ouellette in Health IT Security.
- Data encryption occurs in transit and at rest
- Restricted physical access to production servers
- Strict logical system access controls
- Data file access granted by customers
- Audit trail of account activities on both user and content
- Formally defined and tested breach notification policy
- Training of employees on security policies and controls
- Employee access to customer data files is highly restricted
- Redundant data center facilities to mitigate disaster situations
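Most of the items on that list are policy, but some come down to plain engineering. As an illustrative sketch of the audit-trail requirement (the field names and log format here are my own invention, not Box’s actual schema), an append-only record of who touched which document can be as simple as:

```python
# Sketch of an audit trail: append one JSON line per account action,
# recording who acted, on which document, and when. Field names are
# illustrative only.
import json
import time

def record_event(log_path: str, user: str, doc_id: str, action: str) -> dict:
    """Append one audit entry to the log file and return it."""
    entry = {
        "ts": time.time(),  # when it happened
        "user": user,       # which account acted
        "doc": doc_id,      # which document was touched
        "action": action,   # e.g. "view", "download", "share"
    }
    with open(log_path, "a") as log:  # append-only: never rewrite history
        log.write(json.dumps(entry) + "\n")
    return entry
```

The append-only shape matters: an auditor wants a record that grows but is never edited, so each action becomes one immutable line.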
Support for HIPAA and HITECH could also help the cloud storage company improve its reputation for security and privacy overall; various incidents have sometimes led to such services, rightly or wrongly, being seen as insecure. In particular, noted GigaOm, it may make Box more attractive to enterprise users, as well as for a planned initial public offering.
Moreover, HIPAA support could also make it easier for healthcare providers to implement BYOD, writes Ouellette. “Clinicians would now be able to set up secure cloud folders for a patient’s medical records or collaborate on a patient’s diagnosis with the Box mobile application in a compliant manner,” he writes.
HIPAA requirements can be pretty arduous; for example, WhiteCloud Analytics, a Boise-based healthcare analytics software company, had to install a separate set of doors, through which one can enter only by being buzzed in, to satisfy them.
Chances are, this won’t be the last such announcement. Now that Box has come up with the idea, one can expect that other cloud storage vendors — like Dropbox, Microsoft’s SkyDrive, Google’s Drive, and so on — will soon follow suit. Microsoft’s Office 365 already supports HIPAA, and in fact the company has also announced improvements in its HIPAA support.