World News

YouTube TV adds ViacomCBS, raises price to $64.99 per month

Coronavirus lockdowns set off the golden age of streaming: Investor

Jeff Sica of Circle Squared Alternative Investments argues the old model of relying on movie studios and theaters is broken.

YouTube TV is now one of the priciest live TV streaming services, with a monthly asking price of $64.99.



The company announced the $15 price hike in a blog post on Tuesday, attributing the increase to its expanding channel package. In May, YouTube TV shared that its parent company, Google, had inked a deal with ViacomCBS to bring more channels onto the streaming service’s platform, including BET, CMT, Comedy Central, MTV, Nickelodeon, Paramount Network, TV Land and VH1 – all of which debuted June 30.

GOOGL (ALPHABET INC.): 1,418.05, +20.88 (+1.49%)


"We're excited to launch ViacomCBS' portfolio on YouTube TV this summer," said Lori Conkling, YouTube TV’s global head of partnerships, in May. "Our expanded partnership delivers on our promise to offer a premium portfolio of content to our YouTube TV subscribers, as well as across the YouTube platforms."

Prior to the announcement, YouTube TV had cost cord-cutters $49.99 per month, and before that, the service cost $35 when it first launched in 2017. Similar to the ViacomCBS deal, those price jumps resulted from YouTube TV adding popular channels like PBS and Discovery’s HGTV and Food Network.


Meanwhile, Hulu + Live TV and Fubo TV start at $54.99 per month, AT&T Now Plus starts at $55 per month and AT&T Now Max starts at $80 per month with HBO Max included. Sling TV, on the other hand, offers two packages – Sling Blue and Sling Orange – for $30 each or $45 for both combined.

Despite its competitors’ offerings, YouTube TV now offers more than 85 channels and has added new features to its app, including segment jumping, extra DVR controls in addition to unlimited DVR space, a dark mode menu option, a “Mark as Watched” selection for organizing shows and a refreshed live guide.


YouTube also offers 17 original series that can only be watched on its streaming service.

The price change will be reflected in YouTube TV subscriber bills after July 30.




YouTube not playing videos for some people – how to fix it now

WEB browser Microsoft Edge is causing some major problems for users just trying to watch YouTube videos.

The problem reportedly occurs when a user is also running AdBlock or AdBlock Plus.

According to 9to5Google, if you're running the ad blocker on the browser and trying to load a YouTube video you'll just be met with a black screen.

This message is also said to appear: "An error occurred. Please try again later."

Fortunately, Microsoft has acknowledged the problem and has issued a fix via a blog post.

It said: "Thankfully, the team has also identified an easy workaround while they investigate this further.

"If you’re experiencing issues with YouTube and have either Adblock extension enabled, they recommend disabling it then reloading the web page.

"Give it a try and let us know if it works in the comments below!"

Microsoft also asks users to tell it if they're experiencing this issue without an AdBlock extension installed, or if the workaround doesn't work.

It said: "Please submit feedback through Microsoft Edge by holding down Shift+Alt+I within the browser, or navigating to the '…' menu, selecting 'Help and feedback', then choosing 'Send feedback'.

"Please include a detailed description of what you’re encountering, and select the checkbox to include diagnostic data."

One of Edge's most appealing features is that it can run AdBlock extensions originally designed for Google Chrome.

In other news, a Windows 10 update has been bugging users with false security warnings about apps that don't exist.

Android users have been warned to delete a rogue app called SnapTube.

And, scammers are using Google Alerts to send out links to malware.





YouTube says it'll ban Nazi accounts and Sandy Hook deniers

New York (CNN Business) Six days after YouTube said it would ban supremacist content and remove videos that deny well-documented atrocities like the Holocaust, accounts belonging to some of the most prominent purveyors of hate in the US, such as white supremacist Richard Spencer and former KKK leader David Duke, are still on the platform.

YouTube has taken some action against Duke’s account, which he uses to, among other things, rail against what he calls the “Zio” media — “Zio” is a code word he uses for “Jewish” — and post bizarre fitness videos with advice on how to avoid shrunken testicles. Features like comments and sharing have been removed from the channel, and YouTube has added a warning that his videos contain “inappropriate” or “offensive” content. But a YouTube spokesperson told CNN Business that those actions predated the company’s announcement last week.
The majority of videos on the account for the National Policy Institute, a white supremacist group that Spencer runs, do not contain any content warnings and most of the videos can still be shared and commented on. Spencer, who helped found the alt-right movement, was one of the leaders of the Unite the Right rally in Charlottesville, Virginia, in August 2017. Violence at that rally led to dozens of injuries and the death of counterprotester Heather Heyer.

One video that has a content warning and other restrictions shows Spencer interviewing Maram Susli, a YouTube creator known as “Syrian Girl,” who has contributed to conspiracy site InfoWars.
In its blog post on Wednesday, YouTube said it was prohibiting “videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status.” YouTube also said it would remove hundreds of thousands of videos that it had not previously considered to be in violation of its policies.

A YouTube spokesperson declined to comment on specific accounts, but said that enforcement of the updated policy will take time and that the company will expand its coverage of the new rules over the next several months. The spokesperson also said accounts are removed after they have repeatedly violated YouTube’s “Community Guidelines” or if the channel is dedicated to violating YouTube’s policies.
How effectively YouTube will enforce its new policy is an open question. CNN Business found on Thursday that one Nazi channel that YouTube had twice before deleted was back up, and making no attempt to hide itself or its connection to the two previously banned accounts.
The channel was first taken down in April 2018 in the wake of a CNN investigation which found that ads from over 300 companies and organizations ran on YouTube channels promoting white nationalists, Nazis, pedophilia, conspiracy theories and North Korean propaganda. Run by Brian Ruhe, who had emphasized to CNN in 2018 that he did not want to be referred to as a “neo-Nazi” because he thinks of himself as a “real, genuine and sincere Nazi,” the account had over 3,300 subscribers when it was deleted on Wednesday. Earlier this year, Ruhe had posted to the channel a video of himself and friends celebrating Adolf Hitler’s birthday, complete with a cake featuring a swastika made out of icing and “Heil Hitler!” salutes.
Even though that account was deleted, a new Brian Ruhe account was already up on the site and posting videos on Wednesday, only hours after YouTube’s policy announcement.
After CNN Business asked YouTube about the new account, the company took it down. Ruhe confirmed to CNN Business that both accounts belonged to him. He said YouTube told him the accounts were taken down for “severe or frequent violations” of YouTube’s policy prohibiting hate speech. But Ruhe claimed: “I deny that I have hate or that I use hate speech.”
YouTube’s policies and its enforcement of them can be vague and inconsistent.
The company says its rules are based on content, and not the person behind the content.
But CNN Business’ attempts to get answers as to YouTube’s actions and thinking regarding several channels apparently owned by Ruhe were met with vague answers and new actions by YouTube that contradicted its previous positions.
In addition to the new account Ruhe started Wednesday, CNN Business found two other accounts belonging to him. One focused on his brand of Buddhism; the other, which was dedicated to him livestreaming, contained only two lengthy videos, one of which included mentions of Adolf Hitler and Nazi ideology.
After CNN Business asked YouTube about the accounts, it removed the livestreaming account and the account that Ruhe had started after the new policy was announced last week, though not the account about Buddhism.
A cursory review of the account Ruhe started last week, though, did not reveal any content in obvious violation of YouTube’s policies. When CNN Business asked YouTube why it was removed, since neither it nor the Buddhism account immediately seemed to be in violation, YouTube responded by removing the Buddhism account. A YouTube spokesperson declined to provide any further explanation of these decisions.
Asked why YouTube hadn’t caught an account that had been banned twice before and was making no effort to hide what it was, the YouTube spokesperson said the platform relies on a combination of machine learning and user flags to address banned users making new accounts. The spokesperson also said YouTube removes reuploads of videos when flagged by its systems or users. The spokesperson declined to provide more details about how YouTube will address this issue in the future or how Ruhe was able to create multiple accounts.
The company has long faced criticism for letting misinformation, conspiracy theories and extremist views spread on its platform.
The company takes action on videos that violate its policies in several ways. It says it has four “pillars” for protecting users from harmful content: deleting videos, restricting features on “borderline content,” promoting authoritative voices, and rewarding trusted creators with the ability to make money from their channels while demonetizing those who violate its hate speech policies.
YouTube deletes videos for violating its guidelines, including uploading pornography, copyrighted material or content whose primary purpose is inciting hatred.
For videos that are what the company calls “borderline content,” it can opt to restrict certain features, such as removing the sidebar that appears to the right of most videos and recommends other content, or blocking the questionable videos from appearing on the “recommended” tab on the YouTube homepage. It can also add a content warning or disable comments.
While YouTube’s community guidelines forbid “racial, ethnic, religious, or other slurs where the primary purpose is to promote hatred,” it has resisted removing Duke’s page, which includes, among other things, a video in which he rails against the “Zionist Matrix of Power” that he falsely claims “controls Media, Politics and Banking.” Instead, YouTube has chosen to strip away several features from his videos, such as disabling comments, removing the recommendation sidebar and adding a content warning filter.

YouTube can also take action on a channel by cutting off its ability to make money, such as through ads running on its videos. It’s unclear whether Duke and Spencer have the ability to monetize their channels, but neither channel appears to have ads running on it.
YouTube’s new policies come after Facebook (FB) said in March it was banning white supremacist content from its platforms. Facebook’s ban came after the suspect in the terror attack at two New Zealand mosques livestreamed part of the massacre on its platform.