General Discussion
This is bad... very bad.
France has been passing some very draconian laws in the name of stopping terrorism.
Now they want to jail tech people who do what Apple is doing.
The bill, adopted Tuesday by 474 votes to 32 in the National Assembly, will now have to be debated by the Senate.
One measure would punish executives in companies like Apple and Google with a fine of up to 350,000 euros ($386,000) and a five-year prison sentence if they deny prosecutors access to a suspect's encrypted data.
During the debate, French lawmakers referred to the case pitting Apple against the U.S. government over an iPhone used by the shooter who killed 14 people in San Bernardino, California.
VMA131Marine
(4,138 posts)
These companies will stop selling in France (maybe even all of Europe).
Angel Martin
(942 posts)
HuckleB
(35,773 posts)
lapfog_1
(29,199 posts)
The companies would NOT be able to give the unencrypted data to anyone.
Caveat: installing a secret key logger and capturing everything entered MIGHT do the trick, and the companies could install one on your device without your knowledge. To combat this, use an external device to generate the key pair and encrypt using the generated public key; the private key is never stored on the device.
That leaves brute-force decryption. While it is possible to make the encryption so strong that brute force might take years and millions of dollars of compute capability, it is likely that the NSA (using quantum computers) could break it... however, effective quantum computing generally available to even the FBI is still years away.
Of course, encryption is just the start... creating a one-time-pad-style code (switching words for other words and never repeating) can be effective against most spying, especially if the volume of encrypted data is small... however, that is a huge inconvenience to YOU when accessing your own coded data.
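For illustration (not from the original post), here's a minimal sketch of the classical XOR form of a one-time pad; the post describes a word-substitution variant, but the byte-level version shows the same idea. Function names are my own:

```python
import secrets

def otp_encrypt(plaintext: bytes, pad: bytes) -> bytes:
    # The pad must be truly random, at least as long as the message,
    # and NEVER reused -- those three conditions are what make it unbreakable.
    if len(pad) < len(plaintext):
        raise ValueError("pad too short")
    return bytes(p ^ k for p, k in zip(plaintext, pad))

# XOR is its own inverse, so decryption is the exact same operation.
otp_decrypt = otp_encrypt

message = b"meet at dawn"
pad = secrets.token_bytes(len(message))   # one-time use only
ciphertext = otp_encrypt(message, pad)
assert otp_decrypt(ciphertext, pad) == message
```

The inconvenience the post mentions is visible here: you must store and protect a pad as large as everything you ever encrypt.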
My favorite technique is steganography... this is where you take something like the bottom two bits of every pixel value in a picture and replace them with two bits of your encrypted data. Take something like 100,000 photos of random things (don't use stock photos, because someone might be able to compare them to the originals), store your encrypted data in them (again, using a throwaway external device to hold the app and keys), and for all intents and purposes it appears that you have no encrypted data... you're just an avid photographer.
Of course, if they find the device that holds the app that unravels what you have stored... you are screwed. But first they have to figure out that this is what you have done. Changing the lower 2 bits of each color of each 24-bit pixel won't be visible to the human (or computer) eye (unless there is an original to compare it to, of course).
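A minimal sketch of the 2-bit LSB embedding described above (function names and the flat pixel buffer are illustrative assumptions; in practice the bytes would come from a real image's RGB channels):

```python
def embed(pixels: bytearray, payload: bytes) -> bytearray:
    """Hide payload in the low 2 bits of each pixel byte (4 pixel bytes per payload byte)."""
    if len(payload) * 4 > len(pixels):
        raise ValueError("image too small for payload")
    out = bytearray(pixels)
    for i, byte in enumerate(payload):
        for j in range(4):                      # split each byte into 4 two-bit chunks
            chunk = (byte >> (2 * j)) & 0b11
            out[i * 4 + j] = (out[i * 4 + j] & 0b11111100) | chunk
    return out

def extract(pixels: bytearray, length: int) -> bytes:
    """Recover `length` payload bytes from the low 2 bits of the pixel bytes."""
    payload = bytearray()
    for i in range(length):
        byte = 0
        for j in range(4):
            byte |= (pixels[i * 4 + j] & 0b11) << (2 * j)
        payload.append(byte)
    return bytes(payload)

# Stand-in for flattened 24-bit RGB pixel data (one byte per color channel).
pixels = bytearray(range(256)) * 4
secret = b"hello"
stego = embed(pixels, secret)
assert extract(stego, len(secret)) == secret
```

Note that only the bottom 2 bits of each byte ever change, which is exactly why the altered image looks identical to the eye.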
dixiegrrrrl
(60,010 posts)That's the whole point of the Apple issue...the FBI wants them to CREATE code to get around the encryption.
It is killing the snoops that there is something they cannot snoop on.
This is a much bigger issue than the current case.
lapfog_1
(29,199 posts)
The FBI is asking Apple to create a version of its "unlock" program that accepts any password (or no password at all) to allow access to the phone data.
That's actually an easy thing to do if you have the source to the password-checking program. I think it would take around 10 to 15 minutes to create the new version and load it onto the phone to be hacked.
What I'm talking about is actual public-key encryption, where the private key (known only to the person who wants to retrieve the data) is needed by the program to actually decrypt the data.
https://en.wikipedia.org/wiki/Public-key_cryptography
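For the curious, a toy sketch of public-key encryption using the textbook RSA parameters from that Wikipedia article. Keys this small are trivially breakable; real systems use 2048-bit-plus moduli and proper padding, so this only illustrates the public/private split:

```python
# Toy RSA with the classic textbook parameters (far too small for real use).
p, q = 61, 53
n = p * q                      # 3233: the public modulus
phi = (p - 1) * (q - 1)        # 3120
e = 17                         # public exponent, shared with the world
d = pow(e, -1, phi)           # 2753: private exponent, kept off the device

def encrypt(m: int) -> int:
    # Anyone holding the public key (n, e) can encrypt...
    return pow(m, e, n)

def decrypt(c: int) -> int:
    # ...but only the holder of the private key d can decrypt.
    return pow(c, d, n)

m = 65
c = encrypt(m)
assert decrypt(c) == m
```

This is the point of the poster's scheme: if the private key lives only on an external device, the phone (and the phone's maker) can encrypt data but can never decrypt it.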
awake
(3,226 posts)
What Apple has done with its encryption is allow the customer to own and protect their own data. Apple has stated that it has no interest in getting access to your data, so if it is never in their hands to begin with, they will not be refusing to share it.