Amazon Echo - anyone have one?

I think tech like this will come about and show a true generational divide as far as personal privacy and security are concerned.

Also... I just don't get it :p
 
I had £55 in my Amazon account from my birthday, so I've ordered an Echo Dot. If it streams from the TuneIn radio app well then I'll be impressed. It could just be a gimmick, or Amazon and app developers could really expand its potential.
 
Mr Robot did a really good job of cross-advertising this. The whole dialogue between the FBI agent and Alexa was one of the deepest plugs I've seen in a TV show.

It made me want one, but I just don't need it as I don't use Spotify or IoT home devices.
 
Not sure why people are so worried about privacy.

Sure, Amazon will be aggregating huge amounts of data and using it for targeted advertising.

But it's not like Jeff Bezos is sitting there listening to you and the mrs chatting every night after work.
 
Not sure why people are so worried about privacy.

Sure, Amazon will be aggregating huge amounts of data and using it for targeted advertising.

Because some of that could be health information, and then a few years down the line people find that their private health insurance is refused or very expensive.

You don't need much to form a picture of someone that could really prejudice them - a telephone number would do - but this is on another level.

Political beliefs known, health info known, little secrets known, fears and hopes known, contacts known, etc - yep I totally get why people are so worried.
 
People tend to get a bit irrational about personal data. For example, there are plenty of innovative healthcare startups and established tech companies out there ready to potentially make revolutionary steps in the diagnosis and treatment of various conditions, but one of their biggest issues is getting hold of the required data. Does it really matter if 'your' anonymous MRI scan, along with thousands of others, is put into a database and made use of by companies researching diagnostic tools? It isn't like it can be linked back to you, yet when this sort of thing happens the press have a field day and people are up in arms about 'muh data'. It is just slowing progress and delaying potential advancements in healthcare by a decade or two.
 
I'd rather we fixed the terrible attitudes that most organisations have towards security before deciding that mass retention of personal data is desirable, than go ahead with the data sharing bit and then try to push a security agenda at a later date.
 
I don't really see the security issue with the previously mentioned example, at least as far as individuals are concerned. So the database gets compromised - so what? Hackers get some anonymous MRI images.

Frankly maybe that sort of thing ought to be openly available, forget security altogether and put the information out there to make developments in this area even more feasible for smaller companies.
 
It isn't like it can be linked back to you

Except I've seen info that is passed back and looking at the demographic part it's either me or the bloke at No 13 because no-one else in that postcode fits the criteria. So yes it can be linked back to you. And in this case Amazon know damn well who to link it back to. I think you're being incredibly naive here. Whether that bothers you is another thing.
 
I was once in Screwfix and queueing behind a guy who wanted to buy a tool but when they asked for his name he completely refused to give it to them and instantly started with

"I am a very private man" and caused a huge fuss about not wanting the "big corporations" to have his "information"

It wouldn't have surprised me if he regularly donned a tinfoil hat :p
 
Except I've seen info that is passed back and looking at the demographic part it's either me or the bloke at No 13 because no-one else in that postcode fits the criteria. So yes it can be linked back to you. And in this case Amazon know damn well who to link it back to. I think you're being incredibly naive here. Whether that bothers you is another thing.

I was making a general point; perhaps I am being naive, but in that case can you explain what relevance a postcode would have for MRI imaging data used to train, say, a deep learning algo for a diagnostic tool? Why include it?
 
I don't really see the security issue with the previously mentioned example, at least as far as individuals are concerned. So the database gets compromised - so what? Hackers get some anonymous MRI images.

Frankly maybe that sort of thing ought to be openly available, forget security altogether and put the information out there to make developments in this area even more feasible for smaller companies.

Because you're relying on a company to actually anonymise that image properly and keep the non-anonymised version secure until it can be deleted. Maybe their systems are poorly configured and the deletions aren't actually happening, but the application is abstracted far enough from the underlying storage that nobody realises. Or are you going to rely on the operators of MRI scanners to anonymise the data before passing it across to a third party? Now that third party ingests the data and stores it based on where it came from, and it's not so anonymous any more.
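
Just to put the first point concretely - a rough sketch of the kind of de-identification step you're trusting someone to get right (using pydicom; the tags below are only the obvious ones, nowhere near an exhaustive list):

Code:
import pydicom

def strip_identifiers(in_path, out_path):
    """Blank the directly identifying tags and drop private ones."""
    ds = pydicom.dcmread(in_path)
    for keyword in ("PatientName", "PatientID", "PatientBirthDate", "PatientAddress"):
        if hasattr(ds, keyword):
            setattr(ds, keyword, "")   # blank the element rather than leave it in place
    ds.remove_private_tags()           # private tags often carry scanner/site details
    ds.save_as(out_path)

And that's before you consider identifying text burned into the pixel data itself, which is exactly the sort of thing that gets missed.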
 
In the case of the UK the biggest operator of MRI scanners is the NHS - that the data came from some NHS trust isn't particularly informative. So far the issues highlighted seem to be relatively trivial to solve in the grand scheme of things.
 
I was making a general point; perhaps I am being naive, but in that case can you explain what relevance a postcode would have for MRI imaging data used to train, say, a deep learning algo for a diagnostic tool? Why include it?

Abstract data is very rarely useful without demographic details. Let's say your tool was looking at lesion prevalence, location and change for the management of MS. Interesting and useful info that could also shape the picture at that time would be bloodwork, age group, gender, ethnicity etc, and a postcode would add something too, as it hints at a socio-economic group. Now I could scan that image in no time and instantly see areas of whitening (i.e. the lesions) and compare and contrast with a previous version - that's a doddle - what I can't do, though, is trawl through all the other factors and compute how they may be related. From that we can get patterns - now we've moved beyond a simple diagnostic tool, which in fairness does us no favours and doesn't improve efficiency or efficacy, and into the realms of epidemiology, where we can look at where and why the disease pattern may be occurring.

Now there are times when raw result analysis would be good, e.g. Demis Hassabis's lot looking at creatinine and urea levels as early indicators of worsening renal failure, but even then, without extra information, you're really wasting your time. Now I don't know whether you are medical or not, but you tell me which would tell you the most.

Case 1

Urea and creatinine levels increasing steadily over time (these being indicators of effective renal function).

Case 2

Urea and creatinine levels increasing steadily over time in a teenage boy from a lower socio-economic postcode who has previously had a kidney transplant.

Now Case 1 would maybe tell you there is an issue, but that's pretty obvious - stuff is getting worse. Case 2, though, may get flagged up as an important aberration in an age group that is known to have compliance issues with taking medicine post-transplant. That gives you a solution.

So in answer to your question it's quite simple - how can the tool learn without context? The first thing your doctor will ask at a hospital is your history - that gives you context. 'Doctor, my leg hurts' is rather banal - 'Doctor, my leg hurts because I am a muppet who jumped off a 3 m wall' would give them a good clue where to look! Otherwise we are getting a machine to do a job it shouldn't and needn't be doing, rather than working to its actual strengths.

Edit: If you are just talking about, say, classification, e.g. tumour grading etc, then that is really a narrow subset of the actual passed-on data.
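
A toy sketch of the Case 1 vs Case 2 point, if it helps - the field names and thresholds are entirely made up, it's just to show why the context changes what the flagging logic can usefully say:

Code:
# Same rising creatinine/urea trend, with and without demographic context.
def flag_renal_concern(record):
    rising = record["creatinine_trend"] > 0 and record["urea_trend"] > 0
    if not rising:
        return "no flag"
    context = record.get("context")
    if not context:
        # Case 1: trend alone - all we can say is that things are getting worse.
        return "worsening markers (obvious, little added value)"
    if context.get("post_transplant") and context.get("age_band") == "13-19":
        # Case 2: context turns the same trend into an actionable alert.
        return "ALERT: possible post-transplant non-compliance - review meds"
    return "worsening markers - review"

case1 = {"creatinine_trend": 1.2, "urea_trend": 0.8}
case2 = {"creatinine_trend": 1.2, "urea_trend": 0.8,
         "context": {"age_band": "13-19", "post_transplant": True}}
print(flag_renal_concern(case1))   # worsening markers (obvious, little added value)
print(flag_renal_concern(case2))   # ALERT: possible post-transplant non-compliance - review meds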
 
I'm not medical; I am, however, interested in machine learning. Your postcode issue is solvable in that instance by replacing it with a parameter reflecting socioeconomic status (assuming it is useful in that context anyway).
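
Something along these lines (the lookup values below are made up; in the UK you'd presumably use the published IMD deciles or similar rather than the raw postcode):

Code:
# Replace the identifying postcode with a coarse deprivation decile before the
# data leaves the source. Postcodes and deciles here are invented for the example.
POSTCODE_TO_IMD_DECILE = {
    "AB1 2CD": 3,
    "EF3 4GH": 8,
}

def generalise(record):
    out = dict(record)
    postcode = out.pop("postcode", None)                       # drop the identifying field
    out["imd_decile"] = POSTCODE_TO_IMD_DECILE.get(postcode)   # keep only the socio-economic signal
    return out

print(generalise({"postcode": "AB1 2CD", "age_band": "40-49", "lesion_count": 7}))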

I'm not sure you're correct to dismiss the usefulness of, say, a diagnostic tool to aid radiologists and perhaps speed up their work and/or help prevent mistakes (though I'm not going to argue that point as you're the medic). But for any tools (whether including additional data or simply looking at the images) to be developed by the group you mentioned (or by similar groups using similar methods - though that group has by far the most researchers), rather large data sets are required.
 
I'm not sure you're correct to dismiss the usefulness of, say, a diagnostic tool to aid radiologists and perhaps speed up their work and/or help prevent mistakes (though I'm not going to argue that point as you're the medic). But for any tools (whether including additional data or simply looking at the images) to be developed by the group you mentioned (or by similar groups using similar methods - though that group has by far the most researchers), rather large data sets are required.

I am not dismissing them - they are the future. Given the later edit I made as an example, they could and will be invaluable. But people must be protected.

It doesn't matter what you do; at the end of the day, if this data is to be meaningful it needs to have enough added to it, and at that point it is at risk of giving away details that may have far-reaching implications.

There was a good study done recently using just phone numbers - not call contents - and establishing exactly what you could then find out about a person.

It's a balancing act at the end of the day. For example, if you had data analysis on cause of death, age and postcode, would Harold Shipman have been able to do what he did for so long? Probably not. But we have to decide what we give up in the process.

What I was doing was countering the 'well, what are you worried about?' attitude. Well, actually, quite a lot: if people can get anonymous data from Amazon, Google, etc and cross-reference it, they may find out far more about us than we would actually like, and if they do cross-reference it they will easily get past its anonymous nature, because at the end of the day the data has to be identifiable in some way to be useful. Get enough of that data and you see the real person. It's like playing Cluedo - rule enough out and only one answer remains. Except it won't be someone playing Cluedo, will it; it will be a health insurance provider. If that doesn't bother you then fair enough - me, I find it rather insidious that people will willingly sign up to this with a flippant attitude. If you make that choice, good for you, but do it from a position of knowledge.

The last data set I used had age (10-year banding), ethnicity code, sex, postcode, various medical markers, and the actual thing being looked at. Now, like I said, ethnicity, age band and sex would narrow that data down, where I live, to 2 people: me and someone else.
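
That narrowing down is trivial to do programmatically, too - a quick sketch with made-up records, counting how many people share the same quasi-identifiers; when a group comes back as 1 or 2, the 'anonymous' row is as good as named:

Code:
# Count how many records share each combination of quasi-identifiers.
# Records are invented; in k-anonymity terms a count of 1-2 is a giveaway.
from collections import Counter

records = [
    {"age_band": "40-49", "sex": "M", "ethnicity": "A", "postcode": "XY1 2AB"},
    {"age_band": "40-49", "sex": "M", "ethnicity": "A", "postcode": "XY1 2AB"},
    {"age_band": "70-79", "sex": "F", "ethnicity": "B", "postcode": "XY1 2AB"},
]

quasi = ("age_band", "sex", "ethnicity", "postcode")
groups = Counter(tuple(r[k] for k in quasi) for r in records)

for combo, count in groups.items():
    print(combo, "->", count, "matching record(s)")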
 
Well, my attitude certainly isn't 'well, what are you worried about?' in the general case, but I'm critical of the knee-jerk objections often found in the media and among various groups relating to pretty much *any* personal data. If you're talking about data that can specifically be used to identify people (or at least narrow them down to that extent), then that perhaps needs greater care than, say, my hypothetical example of anonymous imaging data.
 
People will always fear change though, so you will always get this knee-jerk reaction. What you can't do - and you seem to agree - is just disregard those concerns as being without value.
 