WhatsApp Down?

Soldato
Joined
28 Oct 2006
Posts
12,456
Location
Sufferlandria
I'd like to know how they get Forums like OCUK to put Facebook code on every page.

You're looking at it the wrong way. Facebook don't need to convince OcUK to let them track you on here - OcUK want to track you. If you click on a link or advert on Facebook for a product on here, they want to know more than just how many people clicked on the link. They track you so they know whether you made a purchase, what you purchased, whether you searched for something or used the menu to find what you needed, whether you bought the linked product or were looking for something else, and whether you bought other items as well. All of that can then be matched up to the facebook adverts, and OcUK can see how successful their advert was.

*This is ignoring the fact that the facebook code you've picked out is some sort of display code (probably for like/share buttons on posts, or for embedding directly from facebook). I can't see any facebook tracking on the forums. The main shop site has this to load the facebook tracking code: <script type="text/javascript" async="" src="https://connect.facebook.net/en_US/fbevents.js"></script>
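For context, a typical shop-side install of that tracking script looks roughly like this (a simplified, generic Facebook Pixel example - the pixel ID and values are invented, and it omits the small stub that normally queues the calls until fbevents.js has loaded):

<script async src="https://connect.facebook.net/en_US/fbevents.js"></script>
<script>
  fbq('init', '000000000000000');   // the site's pixel ID (invented here)
  fbq('track', 'PageView');         // ties the visit back to the advert click
  fbq('track', 'Purchase', {        // fired on the order confirmation page
    value: 249.99,                  // invented order value
    currency: 'GBP'
  });
</script>

That Purchase event is the bit that lets the shop and Facebook match a sale back to the original advert.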
 
Soldato
Joined
1 Mar 2010
Posts
21,892
I'd like to know how they get Forums like OCUK to put Facebook code on every page.

courtesy of ublockO - I do block facebook script execution on the OC forums (saves some CPU cycles/electricity) and block 3rd-party cookies (rough filter rules below)
... but OC could still track my IP address to establish where I spend money in the shop, so ublockO is not a panacea
maybe the purchase value is what triggered access to the hidden forums, stock picking etc.
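The filter rules mentioned above are roughly these (my own generic examples, not taken from any particular filter list):

||facebook.net^$third-party
||facebook.com^$third-party
||connect.facebook.net^

with third-party cookies blocked separately in the browser's own privacy settings.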
 
Man of Honour
Joined
29 Mar 2003
Posts
56,808
Location
Stoke on Trent
I can't see any facebook tracking on the forums.

At the very top of this page this is written in the source code!!
If it's innocent then fair enough, I don't know about such things.

<!DOCTYPE html>
<html id="XenForo" lang="en-US" dir="LTR" class="Public NoJs LoggedIn NoSidebar Responsive" xmlns:fb="http://www.facebook.com/2008/fbml">
<head>

I've just clicked on a thread that has only just been started, with only one post, and it has the same code!
 
Soldato
Joined
28 Oct 2006
Posts
12,456
Location
Sufferlandria
At the very top of this page this is written in the source code!!
If it's innocent then fair enough, I don't know about such things.

:confused: I explained what that is in the part of my post which you didn't quote? It's some sort of display code. I'd guess it's used for displaying a like/share button on posts (which this forum doesn't have enabled) or for embedding content from facebook into a post.
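For what it's worth, the sort of display code I mean usually looks something like this (a generic example of Facebook's JS SDK and a like button, not anything taken from this forum's source; the data-href URL is made up):

<!-- Generic example of Facebook "display" code (like/share button via their JS SDK) -->
<div id="fb-root"></div>
<script async defer crossorigin="anonymous"
        src="https://connect.facebook.net/en_US/sdk.js#xfbml=1"></script>
<div class="fb-like" data-href="https://www.example.com/some-thread" data-layout="button_count"></div>

The xmlns:fb="http://www.facebook.com/2008/fbml" attribute you spotted is just the namespace declaration for the older <fb:like>-style tags - on its own it doesn't load or send anything. Something like the above (or the fbevents.js script from my earlier post) would actually have to be on the page for Facebook to be involved at all.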
 
Soldato
Joined
30 Nov 2003
Posts
3,384
No. Rolling back a DNS/BGP router configuration will not impact data in any way, unless you really have no idea how to do IT

I wasn't specifically on about the DNS rollback itself. The articles I read said it was a full-blown configuration change that locked out multiple systems, not just DNS. For a company that size, I'd assume they host everything themselves in their own data centers.
 
Caporegime
Joined
30 Jun 2007
Posts
68,784
Location
Wales
Yes there's an internal report she has seen that was suppressed.

https://www.wsj.com/articles/the-facebook-files-11631713039

Tbf nearly everything listed there is "people being ********", not really some conspiracy by Facebook.


Like exempting VIPs from the auto-blocking tools - that's going to be because of the big backlashes when the automatic tool accidentally bans someone famous, and because there are gonna be lots of human reports on those accounts they can rely on instead, or so many fake ones they'd screw up the automatic tool.
 
Soldato
Joined
1 Mar 2010
Posts
21,892
Listening to Frances Haugen presenting to the MPs, she doesn't sound like a credible witness - it's just a fairly vitriolic rant blaming the 'anonymous' algorithms,
telling MPs what they wanted to hear. I'd like to hear to what extent this is generally an AI application with a target goal of maximising profit, or genuinely under human control.

Is it akin to the already-reproached AI-type algorithms showing 'unconscious' colour bias used by (was it?) Google for employment/CV triage?
 
Commissario
Joined
17 Oct 2002
Posts
33,018
Location
Panting like a fiend
Listening to Frances Haugen presenting to the MPs, she doesn't sound like a credible witness - it's just a fairly vitriolic rant blaming the 'anonymous' algorithms,
telling MPs what they wanted to hear. I'd like to hear to what extent this is generally an AI application with a target goal of maximising profit, or genuinely under human control.

Is it akin to the already-reproached AI-type algorithms showing 'unconscious' colour bias used by (was it?) Google for employment/CV triage?
A lot of what I heard (admittedly little) did make sense, as a lot of the time the people doing the programming do so based on their own experiences, or without quite realising how it's going to work in the real world. Things like algorithms that are designed to maximise views are generally going to do so by promoting either the most controversial or the most popular posts, and those like them, and if you're aiming them at people who have already liked/seen similar content you get a feedback loop: the algorithm sees it's succeeding by the original design goals, but not in the way that was intended.
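In rough toy-code terms the loop looks something like this (purely illustrative, nothing to do with Facebook's actual systems):

// Toy sketch of an engagement-maximising ranker (illustrative only)
function rankFeed(posts, history) {
  const score = post => {
    // how much similar content this user has already engaged with
    const similar = history.filter(h => h.topic === post.topic && h.engaged).length;
    // popular/controversial posts start with a higher base engagement,
    // and every past click on similar content pushes them up further
    return post.baseEngagement * (1 + similar);
  };
  return [...posts].sort((a, b) => score(b) - score(a));
}
// Whatever the user engages with goes back into 'history', so the next ranking
// leans even harder in the same direction - the metric says "success", the feed says otherwise.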
She also made a very good point that tools designed to help spot problematic content won't even work across the same basic language unless you allow for differences in how it's used. American English is different enough from UK English that it causes confusion even amongst humans who speak one or the other version; unless you have people who speak all three main English variants (UK, US and Aus), any "English" tool is likely to be useless for a good portion of the users who speak the language. This is especially true if you don't have enough support from native English speakers for that version, although UK English speakers seem to have slightly fewer issues than US ones, mainly because in the UK we get so exposed to the US use of language*.
You can see this in things like captchas, where invariably they'll use American images and wording - AFAIK we don't have "fire hydrants" in the UK, certainly not in the NY style of nice red ones above ground, but that's what you see as one common example (and Americans probably wouldn't have a clue about pelican or zebra crossings, as they seem to just call them crosswalks).
To get such tools to work properly requires "localisation" (something routinely done in books published in the US/UK by the respective publishers), which in turn requires a more diverse group than is often found.

Software and design has a very, very long history of mistakes being baked in because programmers or designers didn't think to go outside their circle of friends or experiences. It's why things like facial recognition software can be laughably bad (the image sets used are typically very much based on the likes of American models/historical photos, so the further away you get from that the worse it gets**), or why the sensors that are meant to kick in when someone puts their hand under a tap/dryer don't work, because they were never calibrated with anything other than a "pale" skin tone in mind.
Even things as common as your average car are often partly designed around data sets from the 50's and 60's that made assumptions about who would be driving, and about anatomy, that are now known to be incorrect - things as simple as not allowing for the typical height difference between men and women, or for the fact that people tend to be bigger today (height and weight) than in the 60's. IIRC they only started to commonly make crash test dummies that actually accounted for differences in anatomy between men and women a decade or two back (before that they just used different sizes).
Even the availability of shoes is often based on information that is now incorrect, because the data used to select which sizes to make (and how many) is out of date (I say this as a 12w/13 shoe wearer who is utterly fed up of trying to find shoes that fit*** - and I have been since I hit size 11 in my teens :p)
IIRC even the basics of photography have issues, because at least one standard was set for a very specific use back in something like 1930, which means you have to allow for that in any software you want to use widely.

*I've had to explain a few times to Americans on an old UO forum why we (Brits and Aussies) were finding some of their phrasing confusing/funny - they'd not considered that something they'd heard out of context from a Brit was actually quite rude, or could mean something very different depending on the exact context, or that the US names for things were different to those used elsewhere and could be seen as slightly rude/have very different meanings.

**IIRC there are now companies who specialise in nothing but trying to get good photos, from multiple angles, of subjects from a host of backgrounds and countries, rather than basically jobbing actors in LA or the people who tended to get good photos taken back when they were expensive (and many of the newspaper archives used will skew strongly towards what celebs looked like).

***I basically can't buy them retail, as many stores seem to stop stocking at about a 10 or 11 in most styles (or so my experience shows), even if the manufacturer makes them larger.

[edit]
sorry long carry on sentences.
 
Caporegime
Joined
30 Jun 2007
Posts
68,784
Location
Wales
Listening to Frances Haugen presenting to the MPs, she doesn't sound like a credible witness - it's just a fairly vitriolic rant blaming the 'anonymous' algorithms,
telling MPs what they wanted to hear. I'd like to hear to what extent this is generally an AI application with a target goal of maximising profit, or genuinely under human control.

Is it akin to the already-reproached AI-type algorithms showing 'unconscious' colour bias used by (was it?) Google for employment/CV triage?


Amazon found an AI for recruitment selected men over women, as they had higher performance scores (and fewer negatives like sick days etc).

So they anonymised the CVs for the AI - no name, no gender.

The AI was still able to work out, from things like education and employment background, which CVs were women's, and disregarded them.

In the end they ditched it and went back to humans doing it.
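To illustrate the mechanism (a made-up toy example, not Amazon's actual system or data): stripping the obvious fields doesn't remove the signal, because other fields correlate with it.

// Made-up toy example: name and gender removed, but other fields still give it away
const cv = {
  // name: removed, gender: removed
  university: "Smith College",            // a women's college
  society: "Women's chess club captain",  // the sort of phrase a model latches onto
  previousRole: "Software engineer"
};
// A model trained on historically male-dominated "good hire" data learns to penalise
// exactly these proxy features, so the bias comes straight back despite the anonymisation.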
 
Soldato
Joined
1 Mar 2010
Posts
21,892
A lot of what I heard (admittedly little) did make sense, as a lot of the time the people doing the programming do so based on their own experiences, or without quite realising how it's going to work in the real world. Things like algorithms that are designed to maximise views are generally going to do so by promoting either the most controversial or the most popular posts, and those like them, and if you're aiming them at people who have already liked/seen similar content you get a feedback loop: the algorithm sees it's succeeding by the original design goals, but not in the way that was intended.

That seems to be the nub of the issue: in the limit, Facebook is achieving, all on its own, the radicalisation that terrorists strive to generate, impacting children and susceptible adults.
Listening to the experiences of parents with young children, there seems to be an inherent trust by children that what they see on the internet is true, and the rules of the physical world - not speaking to strangers etc. - get paused in the digital world, or never transposed.
Maybe the online harms legislation we could now pass, outside of EU constraints, can be a positive and redress the balance.

(Not that you were disputing it, but) computer programmers/engineers are the same the world over, so them baking their own experiences into the algorithms seems a fait accompli; they probably need to monitor the behaviour of people outside the social media ecosystem to see whether the algorithms are natural (I'll volunteer).
 