Then why wouldn’t I rave over a brilliant new device like the Kindle, or the somewhat larger 9.7″-screen Kindle DX? Both of these e-readers are already in their second, improved generation. After following a link from Yale, I found that even David Byrne has one. Who am I to question him?
It isn’t any one reason that keeps me from the Kindle. I’ll admit, I like paper. I’m all for recycling and not wasting resources in the first place, but I’m a little tradition-bound, having started seriously reading paperback and hardback novels at age six. Reading is a daily vice for me, but there are worse ones.
It does have some convenience factors vs. my other typical Amazon orders (old tech / paper and ink) but I just prefer to have a physical product to hold onto, and to pass down.
While the “1984 recall” was not-so-hot, it isn’t quite the point. To Amazon’s credit, they did refund the purchase price of the books they deleted, but I’m against any product or service that has a mechanism to recall, remove, or change data on my end of things. I simply don’t want to license my purchases; I want to own my particular copy.
Yes, I like to pay cash for things too, rather than purchase on someone else’s credit. And “librarians, unite!”
Good Saturday to all. Some of the Tech bits I’ve posted in the past covered deploying the Windows operating system from a corporate network, RIS, and so on. This time, we won’t be pushing images via PXE.
Today I’d like to focus on lower-volume users: the home and home office. You know how it is; someone wants you to help them because “their system doesn’t seem to be working right.” Ahem. You know the realistic course, rather than trying to debug everything that has plagued their system, is a reinstall.
Back up the data you care about before I come over, you tell them. (This is the point where you hear “what’s a backup?”, so you know you’ll need to verify that all of their data is backed up before you “floor mat” their system; yes, someone once asked me over the phone to floor mat instead of format.)
Then you contemplate manually installing everything; even the Windows install now looks like significant work.
If you do this often enough, it doesn’t have to be. Invest the time up front in a couple of things, and it will go quickly, and without your intervention:
While some of this automation was technically possible before Nuhi and team created these tools, it was a lot more arcane. These two are simple and menu-driven: you can add or remove major or minor components of the OS, adjust tweaks, automate user creation and login, set the default time zone, and slipstream service packs and patches current to today’s date. Each also boasts integrated .iso creation, or burning straight to disc.
One of the nicest things that these programs can do is to integrate specific text-mode drivers for certain system components. This is a godsend under XP, if not generally as much of a factor in Windows 7. That custom motherboard you run for the “gaming machine” can now have the built-in RAID recognized during the install, without the need to try to find a USB floppy drive to install drivers from.
I just enjoy having a slimmed-down version of exactly what I want to install, with security patches current to the day I created my image. Less to download, less security risk. Security is also sometimes improved by removing unneeded components; if you’re not going to use them, perhaps they shouldn’t be there.
Regardless, one of these two (as appropriate) is really worth checking into.
Very handy for reinstalling any set of programs within Windows. I’ve used this both for DVD-based installs, run directly from the DVD (automatically, after Windows installs), and more commonly from a fast USB memory stick after the install, by double-clicking it. The latter method takes advantage of faster read speeds and the space for a huge menu of programs for friends, family, self, etc.
Per the WPIW site:
Windows Post-Install Wizard (WPI for short) is a hypertext application designed for giving users choice. While Windows XP offers many ways of customizing the setup process out of the box, its major drawback is the lack of being able to select which applications an end user may install. In the past, end users and administrators needed to either download the files manually, or create overly complex scripts that could only be used once. WPI allows you to create one image, which can then be custom configured, and optionally, automated, so that end users can install any applications.
WPI is a simple to use automation program for the choice and installation of multiple programs, tweaks and scripts. No longer will you need a dozen CD’s or more when doing a fresh Installation. No longer will you need multiple files when you are servicing another PC. With Windows Post-Install Wizard all that you will need is 1 or 2 CD\DVD’s to fully install your PC with all of your apps, scripts, registry files or tweaks. Instead of having to re-download those apps like adobe reader, flash or updates to programs you can have them all on one disk. With WPI you can have all of them all on one disk and then have a nice interface for selecting which apps to install and after configured properly WPI will install all of them without any needed input from you. WPI also and is commonly used added to your windows installation disks. This way you can automate the complete process of Windows and program installations.
With your typical setup of WPI you have your OS and all of the apps, tweaks and such on one disk. After windows installs, WPI kicks in and you are given a selection of everything you have configured WPI with, then you can select the ones you want or simply let the timer run out and your default apps will install.
This can be customized with multiple default menus and layouts, with a little experimentation, so you can handle your Windows 7 Ultimate 64-bit developer needs as well as your less technical friends’ needs with 32-bit Windows and a quite different (and possibly smaller) set of programs.
While WPI does take a bit of time to configure, that time is easily recouped after just a few installs. You do still have to maintain an up-to-date set of install programs, such as Firefox, Java, etc. Hint to Adobe: please release a new downloadable version of at least Acrobat Reader, and don’t force folks to go online to get from 9.1.0 (the latest available for download) to 9.1.3, which includes security fixes. Ahem.
If you’ve never done this, you will learn about the whole new world of silent / switchless installs. I much prefer to configure everything for an install of the operating system, or all of the application programs, on one screen, start it, and walk away. You can always work on or enjoy other things while it runs for however long it needs to install all that you require.
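To make the idea concrete, here is a minimal sketch of the kind of unattended install queue a tool like WPI manages for you. The installer filenames and switches below are hypothetical examples; real silent switches vary by packaging tool (NSIS, Inno Setup, and MSI each use their own flags), so check each vendor’s documentation:

```python
import subprocess

# Hypothetical install queue: (command line, friendly name).
# Silent switches differ per packager; these are illustrative only.
INSTALL_QUEUE = [
    (["FirefoxSetup.exe", "-ms"], "Firefox"),
    (["msiexec", "/i", "jre.msi", "/quiet", "/norestart"], "Java"),
]

def run_queue(queue, run=subprocess.run):
    """Run each installer unattended; collect failures instead of stopping.

    The `run` callable is injectable so the logic can be tested without
    actually launching installers.
    """
    failures = []
    for cmd, name in queue:
        result = run(cmd)
        if result.returncode != 0:
            failures.append((name, result.returncode))
    return failures
```

The point of the hands-off approach is exactly this: every installer runs back to back with no prompts, and you review one failure list at the end instead of babysitting each step.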
Between these two things, I think you’ll find your world to be better off. Once you’ve gotten that nice new system in place, you might consider using at least Windows backup to the cloud, for one off-site copy of the original / basics. If you’re looking for more features and insurance, I can highly recommend Acronis True Image Home 2010.
Pretty interesting NAS device for the starter market (like at home). That’s a lot of features; granted, their bottom-level, one-drive NAS is $300, but it’s always refreshing to see innovation like this. And the ability to hook up other drives externally is a plus; you could get a backup rotation going there.
I’ve gotten a few inquiries about backups and storage since there’s been some focus here, so I’ll share my (not perfect, but seems reasonable) backup plan.
At home, I tend to copy off any new install CD, any new movie I’ve purchased, etc. to my main system, and then place the original on edge on one of the many bookshelves. This means it’ll both be locatable if I need it again, and pretty safe from scratching. Add to that the music collection I’ve carefully digitized to lossless, etc. as well as many high resolution digital pictures, and we have quite a bit of space taken up.
These reside internally on the primary system, across four 1.5-terabyte drives running in RAID, as two partitions in the RAID manager: 400 GB of striped RAID-0 for speed, and 2.5 TB of RAID-5 for redundancy. Within Windows 7, the stripe is split in half into C: and D: drives, with D: being scratch / temp space. Drive E: is all of the RAID-5; all are set up as primary GPT partitions under Windows 7.
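For the curious, the usable-capacity arithmetic behind a split like this can be sketched as follows. The per-drive slice sizes are my back-of-envelope guesses to match the figures above, not the exact partition layout:

```python
def raid0_usable_tb(drives: int, slice_tb: float) -> float:
    """RAID-0 stripes with no redundancy: every byte is usable."""
    return drives * slice_tb

def raid5_usable_tb(drives: int, slice_tb: float) -> float:
    """RAID-5 spends one drive's worth of space on parity."""
    return (drives - 1) * slice_tb

# A 400 GB stripe across 4 drives consumes 0.1 TB of each drive.
fast = raid0_usable_tb(4, 0.1)
# A RAID-5 slice of roughly 0.85 TB per drive yields about 2.5 TB usable.
redundant = raid5_usable_tb(4, 0.85)
```

Note that RAID-5’s parity overhead shrinks as a fraction of total capacity as you add drives, which is why it’s a popular choice for bulk redundant storage at home.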
I back up to external drives, as above, and I’m paranoid enough about lightning and the like that, even with a quality UPS per system, I prefer knowing it’s hard for electricity to jump that air gap when the external drive is unplugged from both power and data (at the drive end, for my convenience).
Multiple local drives for backup (I reused a number of old 0.75 TB drives), plus online backup via a service and multiple Gladinet targets for the second tier. This keeps me from having to shuttle external disks between locations in order to keep some of the backups off-site. It’s not hard to tell who has experienced data loss, is it?
Acronis is very confident in their new product, and I have to agree; it looks even easier to use than before, and they’ve been steadily adding features instead of bling. It’s always a good sign when they’re willing to let you try it out for free: Acronis True Image Home 2010 Free Trial Download.
For each picture below, clicking it should show an enlarged version in a new window.
I ran through a couple of beta versions before this launch, and it looks like Acronis made the minor changes needed. I was really glad to see the release, as I’ve been relying on this product for some time now that I’m running production Windows 7 on my (Windows) machines.
Acronis says : “With Acronis True Image Home 2010, rest assured that all your important data including images, music, documents and applications are well protected and can easily be recovered in the event of any disaster. Also the newest Acronis True Image Home 2010 is the best solution for moving your system to Windows 7 and storing your backups online.”
The only thing I would add is that the online backup is optional; you can still use the conventional backup mode to practically any device (DVD, network, FireWire or USB hard disk, etc.).
Since I prefer to have both a local copy (an external hard disk, unplugged from the electrical system when not in use) and an offsite copy (online is increasingly attractive, as long as it’s well encrypted, which True Image 2010 supports), this really fits my needs.
I’m quite happy with the dual destination backup feature as well; it’s refreshing to see this brought from their Enterprise market down to a much less expensive home / home office product.
Thanks for reading this launch information and review of Acronis True Image 2010. I hope you’ll be as satisfied as I am with the newest version of their flagship product. You can download a completely functional evaluation copy for free here, or you can order the full product for $49.99 directly from the picture link below.
Excerpt : “Samsung is showing a robot vacuum cleaner at the IFA that will ensure you won’t have to move an inch or oversee anything while it is performing its cleaning chore. Called as the Furot II, it has an integrated camera and Visionary Mapping System that allows the robot cleaner to see, think and remember.”
Excerpt : Roomba not sucking the way it used to? Samsung sure hopes not, as it has just recently pushed out a robotic vacuum cleaner of its very own. Quietly showcased during IFA earlier this month, the Furot II packs an oh-so-familiar design and sports an integrated camera and mapping system that enables it to find its way, remember its course and clean your floors with practically no human assistance.
In this issue:
Eighth Anniversary of 9/11
Real-World Access Control
On London’s Surveillance Cameras
Robert Sawyer’s Alibis
Stealing 130 Million Credit Card Numbers
“The Cult of Schneier”
Comments from Readers
** *** ***** ******* *********** *************
Eighth Anniversary of 9/11
On September 30, 2001, I published a special issue of Crypto-Gram discussing the terrorist attacks. I wrote about the novelty of the attacks, airplane security, diagnosing intelligence failures, the potential of regulating cryptography — because it could be used by the terrorists — and protecting privacy and liberty. Much of what I wrote is still relevant today.
Skein is one of the 14 SHA-3 candidates chosen by NIST to advance to the second round. As part of the process, NIST allowed the algorithm designers to implement small “tweaks” to their algorithms. We’ve tweaked the rotation constants of Skein.
The revised Skein paper contains the new rotation constants, as well as information about how we chose them and why we changed them, the results of some new cryptanalysis, plus new IVs and test vectors.
Tweaks were due today, September 15. Now the SHA-3 process moves into the second round. According to NIST’s timeline, they’ll choose a set of final round candidate algorithms in 2010, and then a single hash algorithm in 2012. Between now and then, it’s up to all of us to evaluate the algorithms and let NIST know what we want. Cryptanalysis is important, of course, but so is performance.
The second-round algorithms are: BLAKE, Blue Midnight Wish, CubeHash, ECHO, Fugue, Grøstl, Hamsi, JH, Keccak, Luffa, Shabal, SHAvite-3, SIMD, and Skein. You can find details on all of them, as well as the current state of their cryptanalysis, at the SHA-3 Zoo.
In other news, we’re making Skein shirts available to the public. Those of you who attended the First Hash Function Candidate Conference in Leuven, Belgium, earlier this year might have noticed the stylish black Skein polo shirts worn by the Skein team. Anyone who wants one is welcome to buy it, at cost. All orders must be received before 1 October, and then we’ll have all the shirts made in one batch.
** *** ***** ******* *********** *************

Real-World Access Control

Access control is difficult in an organizational setting. On one hand, every employee needs enough access to do his job. On the other hand, every time you give an employee more access, there’s more risk: he could abuse that access, or lose information he has access to, or be socially engineered into giving that access to a malfeasant. So a smart, risk-conscious organization will give each employee the exact level of access he needs to do his job, and no more.
Over the years, there’s been a lot of work put into role-based access control. But despite the large number of academic papers and high-profile security products, most organizations don’t implement it — at all — with the predictable security problems as a result.
Regularly we read stories of employees abusing their database access-control privileges for personal reasons: medical records, tax records, passport records, police records. NSA eavesdroppers spy on their wives and girlfriends. Departing employees take corporate secrets.
A spectacular access control failure occurred in the UK in 2007. An employee of Her Majesty’s Revenue & Customs had to send a couple of thousand sample records from a database of all children in the country to the National Audit Office. But it was easier for him to copy the entire database of 25 million people onto a couple of discs and put them in the mail than to select out just the records needed. Unfortunately, the discs got lost in the mail, and the story was a huge embarrassment for the government.
Eric Johnson at Dartmouth’s Tuck School of Business has been studying the problem, and his results won’t startle anyone who has thought about it at all. RBAC is very hard to implement correctly. Organizations generally don’t even know who has what role. The employee doesn’t know, the boss doesn’t know — and these days the employee might have more than one boss — and senior management certainly doesn’t know. There’s a reason RBAC came out of the military; in that world, command structures are simple and well-defined.
Even worse, employees’ roles change all the time — Johnson chronicled one business group of 3,000 people that made 1,000 role changes in just three months — and it’s often not obvious what information an employee needs until he actually needs it. And information simply isn’t that granular. Just as it’s much easier to give someone access to an entire file cabinet than to only the particular files he needs, it’s much easier to give someone access to an entire database than only the particular records he needs.
This means that organizations either over-entitle or under-entitle employees. But since getting the job done is more important than anything else, organizations tend to over-entitle. Johnson estimates that 50 percent to 90 percent of employees are over-entitled in large organizations. In the uncommon instance where an employee needs access to something he normally doesn’t have, there’s generally some process for him to get it. And access is almost never revoked once it’s been granted. In large formal organizations, Johnson was able to predict how long an employee had worked there based on how much access he had.
Clearly, organizations can do better. Johnson’s current work involves building access-control systems with easy self-escalation, audit to make sure that power isn’t abused, violation penalties (Intel, for example, issues “speeding tickets” to violators), and compliance rewards. His goal is to implement incentives and controls that manage access without making people too risk-averse.
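That incentive model can be sketched in miniature. Everything here, the class name, the policy, the ticket threshold, is invented for illustration; it is not Johnson’s or Intel’s actual system:

```python
from collections import defaultdict

class AccessControl:
    """Toy model of self-escalation with audit: never block the work,
    but log out-of-role access and 'ticket' habitual over-reachers."""

    def __init__(self, ticket_threshold=3):
        self.entitlements = defaultdict(set)  # user -> granted resources
        self.audit_log = []                   # (user, resource) escalations
        self.ticket_threshold = ticket_threshold

    def grant(self, user, resource):
        self.entitlements[user].add(resource)

    def access(self, user, resource):
        """Access always succeeds; out-of-role access is recorded for review."""
        if resource not in self.entitlements[user]:
            self.audit_log.append((user, resource))
        return True

    def tickets(self):
        """Users whose escalations exceed the threshold earn a 'speeding ticket'."""
        counts = defaultdict(int)
        for user, _ in self.audit_log:
            counts[user] += 1
        return {u for u, n in counts.items() if n > self.ticket_threshold}
```

The design choice worth noticing is that the system errs toward availability: getting the job done is never blocked, and deterrence comes entirely from the audit trail and the penalties behind it.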
In the end, a perfect access control system just isn’t possible; organizations are simply too chaotic for it to work. And any good system will allow a certain number of access control violations, if they’re made in good faith by people just trying to do their jobs. The “speeding ticket” analogy is better than it looks: we post limits of 55 miles per hour, but generally don’t start ticketing people unless they’re going over 70.
There is a movement in the U.K. to replace the pint glasses in pubs with plastic because too many of them are being used as weapons. I don’t think this will go anywhere, but the sheer idiocy is impressive. Reminds me of the call to ban pointy knives. That recommendation also came out of the UK. What’s going on over there? http://www.schneier.com/blog/archives/2009/08/banning_beer_gl.html
Hacking swine flu: ”So it takes about 25 kilobits — 3.2 Kbytes — of data to code for a virus that has a non-trivial chance of killing a human. This is more efficient than a computer virus, such as MyDoom, which rings in at around 22 Kbytes. It’s humbling that I could be killed by 3.2 Kbytes of genetic data. Then again, with 850 Mbytes of data in my genome, there’s bound to be an exploit or two.” http://www.bunniestudios.com/blog/?p=353
Nils Gilman’s lecture on the global illicit economy. Malware is one of his examples, at about the nine-minute mark. http://video.google.com/videoplay?docid=3173247273890946684#
The seven rules of the illicit global economy (he seems to use “illicit” and “deviant” interchangeably in the talk):
1. Perfectly legitimate forms of demand can produce perfectly deviant forms of supply.
2. Uneven global regulatory structures create arbitrage opportunities for deviant entrepreneurs.
3. Pathways for legitimate globalization are always also pathways for deviant globalization.
4. Once a deviant industry professionalizes, crackdowns merely promote innovation.
5. States themselves undermine the distinction between legitimate and deviant economics.
6. Unchecked, deviant entrepreneurs will overtake the legitimate economy.
7. Deviant globalization presents an existential challenge to state legitimacy.
File deletion is all about control. This used to not be an issue. Your data was on your computer, and you decided when and how to delete a file. You could use the delete function if you didn’t care about whether the file could be recovered or not, and a file erase program — I use BCWipe for Windows — if you wanted to ensure no one could ever recover the file.
As we move more of our data onto cloud computing platforms such as Gmail and Facebook, and closed proprietary platforms such as the Kindle and the iPhone, deleting data is much harder.
You have to trust that these companies will delete your data when you ask them to, but they’re generally not interested in doing so. Sites like these are more likely to make your data inaccessible than they are to physically delete it. Facebook is a known culprit: actually deleting your data from its servers requires a complicated procedure that may or may not work. And even if you do manage to delete your data, copies are certain to remain in the companies’ backup systems. Gmail explicitly says this in its privacy notice.
Online backups, SMS messages, photos on photo sharing sites, smartphone applications that store your data in the network: you have no idea what really happens when you delete pieces of data or your entire account, because you’re not in control of the computers that are storing the data.
This notion of control also explains how Amazon was able to delete a book that people had previously purchased on their Kindle e-book readers. The legalities are debatable, but Amazon had the technical ability to delete the file because it controls all Kindles. It has designed the Kindle so that it determines when to update the software, whether people are allowed to buy Kindle books, and when to turn off people’s Kindles entirely.
Vanish is a research project by Roxana Geambasu and colleagues at the University of Washington. They designed a prototype system that automatically deletes data after a set time interval. So you can send an e-mail, create a Google Doc, post an update to Facebook, or upload a photo to Flickr, all designed to disappear after a set period of time. And after it disappears, no one — not anyone who downloaded the data, not the site that hosted the data, not anyone who intercepted the data in transit, not even you — will be able to read it. If the police arrive at Facebook or Google or Flickr with a warrant, they won’t be able to read it.
The details are complicated, but Vanish breaks the data’s decryption key into a bunch of pieces and scatters them around the web using a peer-to-peer network. Then it uses the natural turnover in these networks — machines constantly join and leave — to make the data disappear. Unlike previous programs that supported file deletion, this one doesn’t require you to trust any company, organization, or website. It just happens.
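To make the mechanism concrete, here is a much-simplified sketch. Vanish itself uses threshold (Shamir) secret sharing over a live peer-to-peer network, so the key survives the loss of some shares; the version below is a bare n-of-n XOR split, which only illustrates the “no single party holds the key” property:

```python
import os

def split_key(key: bytes, n: int) -> list:
    """Split `key` into n shares; ALL shares are needed to rebuild it.
    Each share alone is indistinguishable from random bytes."""
    shares = [os.urandom(len(key)) for _ in range(n - 1)]
    last = key
    for s in shares:
        # XOR the running value with each random share.
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def combine(shares: list) -> bytes:
    """XOR all shares back together to recover the key."""
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out
```

Scatter the shares across machines that churn in and out of a peer-to-peer network, and once enough of them are gone, the key, and therefore the data it encrypted, is unrecoverable by anyone.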
Of course, Vanish doesn’t prevent the recipient of an e-mail or the reader of a Facebook page from copying the data and pasting it into another file, just as Kindle’s deletion feature doesn’t prevent people from copying a book’s files and saving them on their computers. Vanish is just a prototype at this point, and it only works if all the people who read your Facebook entries or view your Flickr pictures have it installed on their computers as well; but it’s a good demonstration of how control affects file deletion. And while it’s a step in the right direction, it’s also new and therefore deserves further security analysis before being adopted on a wide scale.
We’ve lost the control of data on some of the computers we own, and we’ve lost control of our data in the cloud. We’re not going to stop using Facebook and Twitter just because they’re not going to delete our data when we ask them to, and we’re not going to stop using Kindles and iPhones because they may delete our data when we don’t want them to. But we need to take back control of data in the cloud, and projects like Vanish show us how we can.
Now we need something that will protect our data when a large corporation decides to delete it.
** *** ***** ******* *********** *************

On London’s Surveillance Cameras

A recent report has concluded that London’s surveillance cameras have solved one crime per thousand cameras per year.
I haven’t seen the report, but I know it’s hard to figure out when a crime has been “solved” by a surveillance camera. To me, the crime has to have been unsolvable without the cameras. Repeatedly I see pro-camera lobbyists pointing to the surveillance-camera images that identified the 7/7 London Transport bombers, but it is obvious that they would have been identified even without the cameras.
And it would really help my understanding of the report’s per-crime cost-to-detect of £20,000 (I assume it is calculated from £200 million for the cameras times 1 in 1000 cameras used to solve a crime per year divided by ten years) if I knew what sorts of crimes the cameras “solved.” If the £200 million solved 10,000 murders, it might very well be a good security trade-off. But my guess is that most of the crimes were of a much lower level.
** *** ***** ******* *********** *************

Robert Sawyer’s Alibis

Back in 2002, science fiction author Robert J. Sawyer wrote an essay about the trade-off between privacy and security. I’ve never forgotten the first sentence: ”Whenever I visit a tourist attraction that has a guest register, I always sign it. After all, you never know when you’ll need an alibi.”
Since I read that, whenever I see a tourist attraction with a guest register, I do the same thing. I sign “Robert J. Sawyer, Toronto, ON” — because you never know when he’ll need an alibi.
Here’s a video of a talk, “The Future of the Security Industry,” I gave at an OWASP meeting in August in Minneapolis. http://vimeo.com/6495257
** *** ***** ******* *********** *************
Stealing 130 Million Credit Card Numbers
Someone has been charged with stealing 130 million credit card numbers.
Yes, it’s a lot, but that’s the sort of quantity credit card numbers come in. They come by the millions, in large database files. Even if you only want ten, you have to steal millions. I’m sure every one of us has a credit card in our wallet whose number has been stolen. It’ll probably never be used for fraudulent purposes, but it’s in some stolen database somewhere.
Years ago, when giving advice on how to avoid identity theft, I would tell people to shred their trash. Today, that advice is completely obsolete. No one steals credit card numbers one by one out of the trash when they can be stolen by the millions from merchant databases.
** *** ***** ******* *********** *************

“The Cult of Schneier”

If there’s actually a cult out there, I want to hear about it. In an essay by that name, John Viega writes about the dangers of relying on Applied Cryptography to design cryptosystems:
But, after many years of evaluating the security of software
systems, I’m incredibly down on using the book that made Bruce
famous when designing the cryptographic aspects of a system. In
fact, I can safely say I have never seen a secure system come out
the other end, when that is the primary source for the crypto
design. And I don’t mean that people forget about the buffer
overflows. I mean, the crypto is crappy.
My rule for software development teams is simple: Don’t use
Applied Cryptography in your system design. It’s fine and
fun to read it, just don’t build from it.
The book talks about the fundamental building blocks of
cryptography, but there is no guidance on things like, putting
together all the pieces to create a secure, authenticated
connection between two parties.
Plus, in the nearly 13 years since the book was last revised, our
understanding of cryptography has changed greatly. There are
things in it that were thought to be true at the time that turned
out to be very false….
I agree. And, to his credit, Viega points out that I agree:
But in the introduction to Bruce Schneier’s book, Practical
Cryptography, he himself says that the world is filled with
broken systems built from his earlier book. In fact, he wrote
Practical Cryptography in hopes of rectifying the problem.
This is all true.
Designing a cryptosystem is hard. Just as you wouldn’t give a person — even a doctor — a brain-surgery instruction manual and then expect him to operate on live patients, you shouldn’t give an engineer a cryptography book and then expect him to design and implement a cryptosystem. The patient is unlikely to survive, and the cryptosystem is unlikely to be secure.
Even worse, security doesn’t provide immediate feedback. A dead patient on the operating table tells the doctor that maybe he doesn’t understand brain surgery just because he read a book, but an insecure cryptosystem works just fine. It’s not until someone takes the time to break it that the engineer might realize that he didn’t do as good a job as he thought. Remember: Anyone can design a security system that he himself cannot break. Even the experts regularly get it wrong. The odds that an amateur will get it right are extremely low.
For those who are interested, a second edition of Practical Cryptography will be published in early 2010, renamed Cryptography Engineering and featuring a third author: Tadayoshi Kohno.
Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise. You can subscribe, unsubscribe, or change your address on the Web at <http://www.schneier.com/crypto-gram.html>. Back issues are also available at that URL.
Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.
CRYPTO-GRAM is written by Bruce Schneier. Schneier is the author of the best sellers “Schneier on Security,” “Beyond Fear,” “Secrets and Lies,” and “Applied Cryptography,” and an inventor of the Blowfish, Twofish, Threefish, Helix, Phelix, and Skein algorithms. He is the Chief Security Technology Officer of BT BCSG, and is on the Board of Directors of the Electronic Privacy Information Center (EPIC). He is a frequent writer and lecturer on security topics. See <http://www.schneier.com>.
Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of BT.
SAN FRANCISCO (Reuters) — Google is disappointed with the lack of breakthrough investment ideas in the green technology sector but the company is working to develop its own new mirror technology that could reduce the cost of building solar thermal plants by a quarter or more.
“We’ve been looking at very unusual materials for the mirrors both for the reflective surface as well as the substrate that the mirror is mounted on,” the company’s green energy czar Bill Weihl told Reuters Global Climate and Alternative Energy Summit in San Francisco on Wednesday.
Google in late 2007 said it would invest in companies and do research of its own to produce affordable renewable energy within a few years.
The company’s engineers have been focused on solar thermal technology, in which the sun’s energy is used to heat up a substance that produces steam to turn a turbine. Mirrors focus the sun’s rays on the heated substance.