I wanted to post this in GD because I thought it would get a lot more responses from a broader range of people who don't necessarily check out the "Music & Box Office" forum.
For quite some time it's felt as if British television has been dominated by American shows. I'm not complaining; as a kid I watched British TV because I didn't know any better, but as a teen I discovered Hollywood movies, which led me to American TV shows, and the rest is history.
But still, I couldn't really answer my own question. I have an opinion, but that doesn't mean I'm right. If more people weigh in, maybe I'll get a better understanding of it.
Is it because American television is better? America's a big place. Big enough to find some decent talent that knows how to write, act, direct, all that malarkey.
Is it because they have more money? I assume they do, given the number of big-budget movies and TV shows they put out compared to the UK.
So what's the answer? Why does American television and cinema dominate our screens? Is it something that will change? Do we actually have any quality TV worth mentioning?
I have heard that they take some of our ideas for shows and remake them with American actors, because apparently no one over there wants to watch an all-Brit show and they can't understand our humour.
Thoughts?
