Google’s cloud platform improves its free tier
Google today quietly launched an improved always-free tier and trial program for its Cloud Platform. The free tier, which now offers enough power to run a small app in Google’s cloud, is offered in addition to an expanded free trial program.
This free trial gives you $300 in credits that you can use over the course of 12 months. Previously, Google also offered the $300 in credits, but those had to be used within 60 days.
The free tier, which the company never really advertised, now allows for free usage of a small (f1-micro) instance in Compute Engine, as well as Cloud Pub/Sub, Google Cloud Storage and Cloud Functions. In total, the free tier now includes 15 services.
The addition of the Compute Engine instance and 5GB of free Cloud Storage usage is probably the most important update here because those are, after all, the services that are at the core of most cloud applications.
It’s worth noting that the free tier is only available in Google’s us-east1, us-west1 and us-central1 regions. The idea here is clearly to get people comfortable with Google’s platform.
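Under those constraints, spinning up free-tier resources is a matter of picking the right machine type and region. Here is a minimal sketch using the gcloud and gsutil CLIs; the instance and bucket names are placeholders, not anything from the article.

```shell
# Launch an always-free f1-micro instance. The free tier only applies
# in us-east1, us-west1 and us-central1, so pick a zone in one of those.
# "my-free-vm" is a hypothetical name.
gcloud compute instances create my-free-vm \
    --machine-type=f1-micro \
    --zone=us-east1-b

# The 5GB of free Cloud Storage also falls under the tier; a regional
# bucket in an eligible region keeps it covered ("my-free-bucket" is a
# placeholder and must be globally unique).
gsutil mb -l us-east1 gs://my-free-bucket/
```

Both commands assume an authenticated gcloud SDK with a default project already configured.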
Astro raises an $8 million Series A for its AI-powered email solution for teams
The real story is that the startup, backed with a new $8 million Series A led by Redpoint, is gearing up to pitch enterprises on its collaboration platform that combines AI, social graphs and integrations with common CRM, ticketing and group messaging tools.
Astro, evoking images of Jetsons-era futurism, is a standalone mobile and desktop client for email. It has all the bells and whistles of a high-end Gmail replacement — think snooze, send later, priority inbox, unsubscribe and custom notifications.
But in the face of the bot-pocalypse, Astro is trying to sell users on the idea of an email bot that can act as the conversational interface for many common workflows like archiving messages and setting reminders.
The reality for Astro and its competitors is that new email solutions crop up and fail faster than we can learn to incorporate them into our workflows — we’re still getting over the death of Mailbox.
But Astro plans to supplement its revenue in a very serious way by servicing enterprises.
Google’s compute engine now offers machines with up to 64 CPU cores, 416GB of RAM
Google is doubling the maximum number of CPU cores developers can use with a single virtual machine on its Compute Engine service from 32 to 64.
These high-power machines are now available across all of Google’s standard configurations and as custom machine types, which allow you to select exactly how many cores and how much memory you want.
If you opt to use 64 cores in Google’s range of high-memory machine types, you’ll also get access to 416GB of RAM.
That’s also twice as much memory as Compute Engine previously offered for a single machine and enough for running most memory-intensive applications, including high-end in-memory databases.
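Concretely, the new sizes are reachable either through a predefined high-memory machine type or by specifying the core count and memory directly as a custom machine type. A sketch with the gcloud CLI follows; the instance names are placeholders.

```shell
# Predefined high-memory shape: 64 vCPUs paired with 416GB of RAM.
# "big-vm" is a hypothetical instance name.
gcloud compute instances create big-vm \
    --machine-type=n1-highmem-64 \
    --zone=us-central1-a

# Custom machine type: choose cores and memory independently
# instead of accepting a predefined ratio.
gcloud compute instances create custom-vm \
    --custom-cpu=64 \
    --custom-memory=416GB \
    --zone=us-central1-a
```

Custom machine types cap memory per vCPU, so 416GB is the ceiling for a 64-core custom configuration as well.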
Facebook’s instant articles now let sites show more ads
Facebook’s Instant Articles were always a bad deal for news outlets. While quick to load, which drove more readers, the hosted-on-Facebook mobile web format sterilized publishers’ designs and severely limited how many ads and other business-critical units they could display.
Publishers need paying subscribers, event attendees and loyal daily readers, but they traded those for preferred status and referral traffic from Facebook because if they didn’t, their competitors would.
Now Facebook is cutting publishers a slightly better arrangement, allowing them to put a few more ads in each Instant Article. Ads can now appear every 250 words, instead of every 350.
This is one of the first launches out of Facebook’s new Journalism Project, which sees it trying to work more closely with publishers as it’s quickly become one of their top sources of traffic, and is also absorbing many of the ad dollars that used to go to them.
To Facebook’s credit, it’s been throwing the news business some bones, but Instant Articles still feel unfair.
Here’s the list of initiatives the Journalism Project promised:
• Story packages that bundle multiple articles
• Investments in local news
• Subscription trials that let users sign up with their Facebook payment info
• Facebook + publisher team hackathons
• Facebook journalism training courses for reporters, such as how to use Live
• Ability to designate non-admins as contributors who can broadcast Live from a Page
• Bringing the Live video API’s capabilities to user profiles
• Free CrowdTangle access
• Expanded partnership with First Draft Partner Network for finding eyewitnesses
• Public service announcements promoting news literacy
• Additional features to fight fake news
Twitter tests a feature that warns users of profiles with ‘potentially sensitive content’
Twitter confirmed it’s testing a new feature that flags users’ profiles as potentially including “sensitive content.”
When you click on one of these profiles from a link on Twitter, or if you visit the profile’s web page directly, you won’t be immediately shown the user’s tweets. Instead, a warning message displays, reading “Caution: This profile may include sensitive content.”
When you click a link to the profile on Twitter, the message appears in a pop-up window. And if you visit the profile directly, the warning message is all that displays until you agree to view the content by clicking the “Yes, view profile” button.
A reporter at Mashable first spotted the feature when trying to view the profile of technology analyst Justin Warren, but could not determine how the content was flagged.
According to Twitter, the new feature works similarly to how other sensitive content on Twitter gets flagged, based on users’ settings. Users can choose to mark themselves as someone who tweets sensitive content through their “Privacy and Safety” settings.
In addition, other Twitter users can report tweets to the Twitter team for review. In this case, if the tweet is determined to be potentially sensitive, Twitter will label the content appropriately — or remove it, if it’s a live video.
It may also adjust your account settings for you, if it deems that necessary, so your future tweets are marked accordingly.
Google makes it easier for companies to transfer data to its cloud
At Google’s Cloud Next conference, the company announced a series of new tools to assist users with data preparation and integration.
The updates bolster both the power and agility of Google Cloud for businesses. The first of these releases is the new private beta of Google Cloud Dataprep. Dataprep makes the data preparation process more visual.
The tool includes anomaly detection and employs machine learning to suggest data transformations that can improve the quality of data.
In an attempt to democratize the process, Google prioritized cleanliness of its interface, opting to enable control via drag-and-drop. Dataprep is optimized to be integrated with GCP, meaning it can create pipelines in Google Cloud Dataflow for easy export to BigQuery.
BigQuery itself also got attention from Google, with a new BigQuery Data Transfer Service. The idea behind the release is to simplify the process of merging data from multiple sources.
Support for commercial data sets from Xignite, HouseCanary, Remind, AccuWeather and Dow Jones extends these capabilities, and pairing the results with visualization services like Tableau lets users seamlessly prepare and display analytics.
BigQuery will now support Cloud Bigtable for larger projects so users don’t have to waste time copying data from one system to the next.
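For a sense of what the Data Transfer Service looks like in practice, here is a hypothetical sketch using the bq command-line tool. The data source, dataset name and parameters are illustrative assumptions, not details from the announcement.

```shell
# Create a scheduled transfer configuration that pulls an external
# source into a BigQuery dataset. "marketing_data" and the customer ID
# are placeholders; available --data_source values depend on which
# connectors your project has enabled.
bq mk --transfer_config \
    --data_source=adwords \
    --target_dataset=marketing_data \
    --display_name="AdWords import" \
    --params='{"customer_id": "123-456-7890"}'
```

Once created, the service runs the transfer on a schedule, so the merging happens without hand-written export and load jobs.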