InVision acquires design visibility tool Wake

Posted on 3 April, 2018

InVision, the NY-based design platform focused on collaboration, has today announced the acquisition of Wake.

Wake is a design tool focused squarely on supporting design visibility throughout an organization. Wake allows companies to share design assets and view work in progress as designers build out their screens, logos, or other designs. Design team leaders, or other higher-ups at the company, can upvote certain design projects or give feedback on specific tweaks.

InVision CEO Clark Valberg said that one of the most attractive features of Wake is that sharing on the Wake platform is implicit, whereas on InVision designers have to take an extra step to upload their prototypes.

Wake will continue to operate independently within InVision, though Valberg has plans to integrate some of the Wake tools into the InVision core product. Moreover, as part of the deal, Wake will be introducing a free tier.

“We’re in the midst of a shift,” said Valberg. “The screen is the most important place in the world. Every company is now a digital product company. The world of design is growing and the Wake product represents a very interesting philosophical vector of that market.”

The entire Wake team will join InVision. Wake was founded in 2013 by Chris Kalani and Johan Bakken, with a customer list that includes Capital One, Spotify, Palantir, Stripe, and Airbnb. In fact, Valberg said that Wake’s customer overlap with InVision was one of the first things that put Wake on the company’s radar.

Wake has raised a total of $3.8 million, with investments from FirstMark Capital and Design Fund.

The terms of the deal were not disclosed.

Posted Under: Tech News
JPMorgan’s blockchain head is leaving to start her own business

Posted on 3 April, 2018

JPMorgan’s key blockchain executive is departing the bank for the world of startups, it has emerged.

Amber Baldet heads up JPMorgan’s Blockchain Center of Excellence, which explores the development of distributed ledger technology and use cases for blockchain across the firm’s business. A high-profile figure in the blockchain space in her own right, she is leaving to start her own venture, according to Reuters.

Baldet set up JPMorgan’s blockchain strategy and headed up its enterprise-focused Quorum blockchain, which is reportedly being considered for a spinout. As Baldet’s six-year tenure at the bank ends, she will be replaced by Christine Moy, who led blockchain services across the bank’s Investor Services and Capital Markets segments, according to Fortune.

The exit is an example of a talent drain beginning to take shape in banking and financial services, with engineers and business execs moving over to blockchain and crypto projects that are seen to have serious growth potential.

That’s despite a rocky year to date for crypto, at least in terms of valuations.

Bitcoin reached nearly $20,000 per coin in December but spent much of March below $10,000. As of today, one bitcoin is worth $7,387, according to Coinmarketcap.com. Top tokens like Ethereum, Litecoin and Ripple are also down significantly from their record December and January prices.

There’s a strong case to be made that a more stable crypto market is for the best, even if there is a loss in value. High prices led to high transaction fees, which made life difficult for developers whilst also adding uncertainty for market speculators and token collectors.

Note: The author owns a small amount of cryptocurrency. Enough to gain an understanding, not enough to change a life. (That’s definitely the case lately.)

Posted Under: Tech News
Apple, in a very Apple move, is reportedly working on its own Mac chips

Posted on 2 April, 2018

Apple is planning to use its own chips for its Mac devices, which could replace the Intel chips currently running on its desktop and laptop hardware, according to a report from Bloomberg.

Apple already designs a lot of custom silicon, including chipsets like the W-series for its Bluetooth headphones, the S-series in its watches and its A-series iPhone chips, as well as a customized GPU for the new iPhones. In that sense, Apple has in a lot of ways built its own internal fabless chip firm, which makes sense as it looks for its devices to tackle more and more specific use cases and to reduce its reliance on third parties for components. Apple is already in the middle of a very public spat with Qualcomm over royalties, and while the Mac is something of a tertiary product in its lineup, it still contributes a significant portion of the company’s revenue.

Creating an entire suite of custom silicon could do a lot of things for Apple, not least bringing the Mac into a system where its devices can talk to each other more efficiently. Apple already has a lot of tools to shift user activities between all its devices, but making that more seamless makes it easier to lock users into the Apple ecosystem. If you’ve ever compared pairing headphones with a W1 chip to an iPhone against pairing typical Bluetooth headphones, you’ve probably seen the difference, and that could become even more pronounced with Apple’s own chipset. Bloomberg reports that Apple may implement the chips as soon as 2020.

Intel may be the clear loser here, and the market is reflecting that: Intel’s stock is down nearly 8% after the report came out, as the move would mark a clear shift away from the company’s architecture, where it has long held its ground, toward Apple’s own custom designs. Apple is also not the only company looking to design its own silicon; Amazon is looking into building its own AI chips for Alexa in another move to create lock-in for its ecosystem. And while the biggest players are looking at their own architectures, there’s an entire crop of startups getting a lot of funding to build custom silicon geared toward AI.

Apple declined to comment.

Posted Under: Tech News
Microsoft launches 2 new Azure regions in Australia

Posted on 2 April, 2018

Microsoft continues its steady pace of opening up new data centers and launching new regions for its Azure cloud. Today, the company announced the launch of two new regions in Australia. To deliver these new regions, Azure Australia Central and Central 2, Microsoft entered a strategic partnership with Canberra Data Centers and unsurprisingly, the regions are located in the country’s capital territory around Canberra. These new central regions complement Microsoft’s existing data center presence in Australia, which previously focused on the business centers of Sydney and Melbourne.

Given the location in Canberra, it’s also no surprise that Microsoft is putting an emphasis on its readiness for handling government workloads on its platform. Throughout its announcement, the company also stresses that all of its Australian data centers are a good choice for its customers in New Zealand.

Julia White, Microsoft corporate VP for Azure, told me last month that the company’s strategy around its data center expansion has always been about offering a lot of small regions to allow it to be close to its customers (and, in return, to allow its customers to be close to their own customers, too). “The big distinction is the number of regions we have,” White said. “Microsoft started its infrastructure approach focused on enterprise organizations and built lots of regions because of that. We didn’t pick this regional approach because it’s easy or because it’s simple, but because we believe this is what our customers really want.”

Azure currently consists of 50 available or announced regions. Over time, more of these will also feature multiple availability zones within each region, though for now this recently announced feature is only present in two regions.

Posted Under: Tech News
Activist investor Elliott snags 10.3 percent stake in Commvault

Posted on 2 April, 2018

Elliott Management, an investment firm long known for its activist streak, set its sights on Commvault today, purchasing a 10.3 percent stake and nominating four Elliott-friendly members to the company’s board of directors. It likely means that Elliott is ready to push the company to change direction and cut costs, if it sticks to its regular MO.

As an older public company founded in 1988 with a strong product, but weak stock performance, Commvault represents just the kind of company Elliott tends to target. In its letter outlining why it acquired its stake in Commvault, it presented a stark picture of a company in decline.

As just one small example, Elliott discussed the stock performance and it didn’t pull punches or mince words when it stated:

“Commvault’s strategy, operations, execution and leadership over the past eight years have failed to generate returns to shareholders, despite a leadership position in a growing market with a product set that customers like and competitors respect. Commvault’s underperformance has been so profound that an investor would have been better off buying the NASDAQ index instead of Commvault’s stock on 99% of trading days in the last eight years. …”

Ouch.

As it is wont to do, Elliott has bought a stake and is forcing its way onto the board of directors, and this deal is no different, with the firm nominating four members:

“Given the long-term issues at the Company, we believe the Board would benefit from fresh perspectives, primarily in the area of operational execution, software go-to-market experience and current technology expertise. The level of required change at the Company is significant and requires a Board with new and relevant experiences to guide the Company’s turnaround. We have been involved in dozens of similar situations and have worked constructively with many companies to add top-tier, C-suite executives and experienced Board members to these companies. For Commvault, we are submitting a group of highly qualified director nominees with what we believe is the right experience to help guide the Company on its path forward.”

As some examples of that past experience it alluded to in the letter, Elliott bought a stake in EMC in 2014 and began to pressure the Board to sell its stake in VMware. The company turned back the attempt and eventually sold out to Dell for $67 billion, still giving Elliott a nice return on its one percent investment in the company, no doubt.

More recently, it bought a 6.5 percent stake in Akamai in December. At the next earnings call in February, the company announced it was laying off 400 employees, which accounted for almost 5 percent of its worldwide workforce. The layoffs are consistent with the cost cutting that tends to happen when Elliott buys a stake in a company.

What happens next for Commvault is difficult to say, but investors clearly expect some movement: the stock is up over 11 percent as of this writing. Given Elliott’s track record, chances are they are onto something.

Posted Under: Tech News
SiFive gets $50.6M to help companies get their custom chip designs out the door

Posted on 2 April, 2018

With the race to next-generation silicon in full swing, the waterfall of venture money flowing into custom silicon startups is already showing enormous potential for more flexible hardware in a rapidly changing technology landscape — and Naveed Sherwani hopes to tap that for everyone else.

That’s the premise of SiFive, a startup designed to help entrepreneurs — or any company — come up with a custom-designed chip for their needs. But rather than requiring them to raise tens of millions of dollars from a venture firm or to have a massive production system in place, SiFive aims to get that piece of silicon into the developer’s hands quickly, built from a set of basic hardware and IP blocks it offers, so they can see if it actually works and then figure out when and how to move it into full-scale production. The company starts by offering templates and then lets customers make modifications to what eventually ends up as a piece of RISC-V silicon in their hands. SiFive today said it has raised $50.6 million in venture financing in a round led by Sutter Hill Ventures, Spark Capital and Osage University Partners.

“The way we view it, is that we think we should not depend on people learning special languages and things of that nature to be able to modify the architecture and enhance the architecture,” Sherwani said. “What we believe is there could be a high-level interface, which is what we’re building, which will allow people to take existing cores, bring them into their design space, and then apply a configuration. Moving those configurations, you can modify the core, and then you can get the new modified core. That’s the approach we take, we don’t have to learn a special language or be an expert, it’s the way we present the core. We’d like to start with cores that are verified, and each of these modifications does not cause [it] to become non-verifiable.”
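Sherwani’s description boils down to: start from a verified core template, apply a configuration through a high-level interface, and reject any change that would push the design outside its verified space. SiFive hasn’t published that interface here, so the following is only a hypothetical sketch of the idea; every name and parameter in it is invented for illustration.

```python
# Hypothetical illustration of "take an existing core, apply a configuration,
# get a modified core" -- not SiFive's actual tooling or API.
VERIFIED_RANGES = {
    "cache_kb": (8, 64),        # allowed data-cache sizes
    "pipeline_stages": (3, 8),  # allowed pipeline depths
    "has_fpu": (False, True),   # optional floating-point unit (booleans compare as 0/1)
}

def configure_core(template: dict, changes: dict) -> dict:
    """Return a modified copy of a core template, refusing any change
    that would leave the verified configuration space."""
    core = dict(template)
    for key, value in changes.items():
        low, high = VERIFIED_RANGES[key]
        if not (low <= value <= high):
            raise ValueError(f"{key}={value} is outside the verified range {low}..{high}")
        core[key] = value
    return core

base_core = {"name": "example-riscv-core", "cache_kb": 16, "pipeline_stages": 5, "has_fpu": False}
custom_core = configure_core(base_core, {"cache_kb": 32, "has_fpu": True})
```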

SiFive is based on a design framework for silicon called RISC-V. You could consider it a kind of open-source analog to the designs of the major chip firms, but the goal for RISC-V chips is to lean on the decades of experience since the original pieces of silicon came out of Intel and develop something less messy that still gets the right tasks done. Sherwani says that RISC-V chips have just over 50 instructions, while common chips can have more than 1,000. By nature, they aren’t at the scale of an Intel, so the kinds of efficiencies those firms enjoy don’t exist. But SiFive hopes to serve a wide array of varying needs rather than mass-producing a single style of silicon.

There are two flows for developers looking to build out silicon using SiFive. First is the prototype flow, where developers get a chance to spec out their silicon and figure out their specific needs. The goal there is to get something into the hands of the developer that they can use to showcase their ideas or technology, and SiFive works with IP vendors and other supply chain partners — during this time, developers aren’t paying for IP. Once the case is proved out (and the startup has, perhaps, raised money on the strength of that idea), they can switch to a production flow with SiFive, where they start paying for IP and services. There’s also a potential marketplace element as more and more people come up with novel ideas for operational cores.

“For any segment in the market there will be a few templates available,” Sherwani said. “We’ll have some tools and methodologies there, and among all the various templates available [we can] show what would be the best for [that customer]. We also have an app store — we are expecting people who have designed cores who are willing to share it, because they don’t need it to be proprietary. If anyone uses that template, then whatever price they can put on it, they can make some money doing that. This whole idea of marketplaces will get more people excited.”

As there is an intense rush to develop new customized silicon, it may be that services like the ones offered by SiFive become more and more necessary. But there’s another element to the bet behind SiFive: making the chip itself less ambiguous and trying to remove black boxes. That doesn’t necessarily make it wildly more secure than the one next to it, but at the very least, it means when there is a major security flaw like Intel’s Spectre problems, there may be a bit more tolerance from the developer community because there are fewer black boxes.

“All these complications are there and unless you have all this expertise, you can’t do a chip,” Sherwani said. “Our vision is that we deliver the entire chip experience to that platform and people can be able to log in. They don’t need a team, any tools, they don’t need FPGAs because all those will be available on the web. As a result the cost goes down because it’s a shared economy, they’re sharing tools, and that is how we think dramatically you can do chips at much lower cost.”

While there is a lot of venture money flowing into the AI chip space — with many different interpretations of what that hardware looks like — Sherwani said the benefit of working with SiFive is to be able to rapidly adapt an idea to a changing algorithm. Developers have already proven out a lot of different tools and frameworks, but once a piece of silicon is in production it’s not easy to change on the fly. Should those best practices or algorithms change, developers will have an opportunity to reassess and redesign the chip as quickly as possible.

The idea of custom silicon is going to be a big theme going forward as more and more use cases emerge that could be easier with a customized piece of hardware. Already there are startups like Mythic and SambaNova Systems, which have raised tens of millions of dollars and specialize in the rapid-fire calculations typical of AI processes. But this kind of technology is now showing up in devices ranging from autonomous vehicles to fridges, and each use case may carry different needs. Intel and other chip design firms probably can’t hit every niche, and one-size-fits-all (or even something more modular, like an FPGA from Intel) might not hit each sweet spot. That, in theory, is the hole that a company like SiFive could fill.

Posted Under: Tech News
Atlassian’s two-year cloud journey

Posted on 2 April, 2018

A couple of years ago, Dropbox shocked a lot of people when it decided to mostly drop the public cloud and build its own datacenters. More recently, Atlassian did the opposite, closing most of its datacenters and moving to the cloud. Companies make these choices for a variety of reasons. When Atlassian CTO Sri Viswanath came on board in 2016, he made the decision to move the company’s biggest applications to AWS.

In part, this is a story of technical debt — the idea that over time your applications become encumbered by layers of crusty code, making them harder to update and even harder to maintain. For Atlassian, which was founded in 2002, that bill came due in 2016 when Viswanath came to work for the company.

Atlassian already knew they needed to update the code to move into the future. One of the reasons they brought Viswanath on board was to lead that charge, but the thinking was already in place even before he got there. A small team was formed back in 2015 to work out the vision and the architecture for the new cloud-based approach, but they wanted to have their first CTO in place to carry it through to fruition.

Shifting to microservices

He put the plan into motion, giving it the internal code name Vertigo — maybe because the thought of moving most of their software stack to the public cloud made the engineering team dizzy to even consider. The goal of the project was to rearchitect the software, starting with the company’s biggest products, Jira and Confluence, in such a way that it would lay the foundation for the company for the next decade — no pressure or anything.

They spent a good part of 2016 rewriting the software and getting it set up on AWS. They concentrated on turning their 15-year-old code into microservices, which in the end resulted in a smaller code base. He said the technical debt issues were very real, but they had to be careful not to reinvent the wheel, changing only what needed to be changed wherever possible.

“The code base was pretty large and we had to go in and do two things. We wanted to build it for multi-tenant architecture and we wanted to create microservices,” he said. “If there was a service that could be pulled out and made self-contained we did that, but we also created new services as part of the process.”
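Atlassian hasn’t published the internals of those services, but the essence of a multi-tenant service is that one shared deployment holds data for many customers, partitioned by tenant, instead of each customer running their own install. A minimal, generic sketch of that pattern (the names and in-memory storage are illustrative assumptions, not Atlassian’s code):

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class IssueStore:
    """One shared service instance serving many tenants, with every
    read and write scoped to a tenant ID."""
    issues_by_tenant: Dict[str, List[dict]] = field(default_factory=dict)

    def create_issue(self, tenant_id: str, title: str) -> dict:
        issues = self.issues_by_tenant.setdefault(tenant_id, [])
        issue = {"id": len(issues) + 1, "title": title}
        issues.append(issue)
        return issue

    def list_issues(self, tenant_id: str) -> List[dict]:
        # A tenant only ever sees its own rows.
        return list(self.issues_by_tenant.get(tenant_id, []))

store = IssueStore()
store.create_issue("acme", "Upgrade build pipeline")
store.create_issue("globex", "Fix login redirect")
assert store.list_issues("acme") == [{"id": 1, "title": "Upgrade build pipeline"}]
```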

Migrating customers on the fly

Last year was the migration year, and it was indeed a full year-long project to migrate every last customer over to the new system. It started in January and ended in December and involved moving tens of thousands of customers.

First of all, they automated whatever they could and they also were very deliberate in terms of the migration order, being conscious of migrations that might be more difficult. “We were thoughtful in what order to migrate. We didn’t want to do easiest first and hardest at the end. We didn’t want to do just the harder ones and not make progress. We had to blend [our approaches] to fix bugs and issues throughout the project,” he said.
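Atlassian hasn’t described its scheduling logic beyond that quote, so the following is only an illustrative sketch of what “blending” easy and hard migrations might look like, assuming each customer record carries an estimated difficulty score:

```python
from typing import Dict, List

def blended_migration_plan(customers: List[Dict], batch_size: int = 100) -> List[List[Dict]]:
    """Order customers so each batch mixes easy and hard migrations,
    rather than saving every hard one for the end."""
    ranked = sorted(customers, key=lambda c: c["difficulty"])
    easy, hard = ranked[: len(ranked) // 2], ranked[len(ranked) // 2:]

    interleaved = []
    for easy_customer, hard_customer in zip(easy, hard):
        interleaved += [easy_customer, hard_customer]
    # Append any leftover customers when the halves are uneven.
    interleaved += easy[len(hard):] + hard[len(easy):]

    # Split the interleaved ordering into migration batches.
    return [interleaved[i:i + batch_size] for i in range(0, len(interleaved), batch_size)]
```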

Viswanath stated that the overarching goal was to move the customers without a major incident. “If you talk to anyone who does migration, that’s a big thing. Everyone has scars doing migrations. We were conscious to do this pretty carefully.” Surprisingly, although it wasn’t perfect, they did manage to complete the entire exercise without a major outage, a point of which the team is justifiably proud. That doesn’t mean that it was always smooth or easy.

“It sounds super easy: ‘we were thoughtful and we migrated,’ but there was warfare every day. When you migrate, you hit a wall and react. It was a daily thing for us throughout the year,” he explained. It took a total team effort involving engineering, product and support. That included having a customer support person involved in the daily scrum meetings so they could get a feel for any issues customers were having and fix them as quickly as possible.

What they gained

As in any cloud project, there are some general benefits to moving an application to the cloud around flexibility, agility and resource elasticity, but there was more than that when it came to this specific project.

First of all, it has allowed faster deployment, with multiple deployments happening at the same time, due in large part to the copious use of microservices. That means they can add new features much faster. During the migration year they held off on new features for the most part, because they wanted to keep things as static as possible for the shift over, but with the new system in place they can move much more quickly to add new features.

They get much better performance, and if they hit a performance bottleneck they can simply add more resources because it’s the cloud. What’s more, they were able to establish a local presence in the EU, which improves performance by putting the applications closer to the end users located there.

Finally, they actually found the cloud to be a more economical option, something that not every company that moves to the cloud finds. By closing the datacenters and reducing the capital costs associated with buying hardware and hiring IT personnel to maintain it, they were able to reduce costs.

Managing the people parts

It was a long, drawn-out project, and as such they really needed to think about the human aspect of it too. They would swap people in and out to make sure the engineers stayed fresh and didn’t burn out helping with the transition.

One thing that helped was the company culture in general, which Viswanath candidly describes as one with open communication and a general “no bullshit” policy. “We maintained open communication, even when things weren’t going well. People would raise their hand if they couldn’t keep up and we would get them help,” he said.

He admitted that there was some anxiety within the company, and for him personally, about implementing a project of this scale, but they knew they needed to do it for the future of the organization. “There was definitely nervousness on what if this project doesn’t go well. It seemed the obvious right direction and we had to do it. The risk was what if we screwed up in execution and we didn’t realize [the] benefits we set out to [achieve].”

In the end, it was a lot of work, but it worked out just fine and they have the system in place for the future. “Now we are set up for the next 10 years,” he said.

Posted Under: Tech News
Red Hat looks beyond Linux

Posted on 31 March, 2018

The Red Hat Linux distribution is turning 25 years old this week. What started as one of the earliest Linux distributions is now the most successful open-source company, and its success was a catalyst for others to follow its model. Today’s open-source world is very different from those heady days in the mid-1990s when Linux looked to be challenging Microsoft’s dominance on the desktop, but Red Hat is still going strong.

To put all of this into perspective, I sat down with the company’s current CEO (and former Delta Air Lines COO) Jim Whitehurst to talk about the past, present and future of the company, and open-source software in general. Whitehurst took the Red Hat CEO position 10 years ago, so while he wasn’t there in the earliest days, he definitely witnessed the evolution of open source in the enterprise, which is now more widespread than ever.

“Ten years ago, open source at the time was really focused on offering viable alternatives to traditional software,” he told me. “We were selling layers of technology to replace existing technology. […] At the time, it was open source showing that we can build open-source tech at lower cost. The value proposition was that it was cheaper.”

At the time, he argues, the market was about replacing Windows with Linux or IBM’s WebSphere with JBoss. And that defined Red Hat’s role in the ecosystem, too, which was less about technological innovation than about packaging. “For Red Hat, we started off taking these open-source projects and making them usable for traditional enterprises,” said Whitehurst.

Jim Whitehurst, Red Hat president and CEO (photo by Joan Cros/NurPhoto via Getty Images)

About five or six years ago, something changed, though. Large corporations, including Google and Facebook, started open-sourcing their own projects because they didn’t see some of the infrastructure technologies they opened up as competitive advantages. Instead, having them out in the open allowed them to profit from the ecosystems that formed around them. “The biggest part is it’s not just Google and Facebook finding religion,” said Whitehurst. “The social tech around open source made it easy to make projects happen. Companies got credit for that.”

He also noted that developers now look at their open-source contributions as part of their resumé. With an increasingly mobile workforce that regularly moves between jobs, companies that want to compete for talent are almost forced to open source at least some of the technologies that don’t give them a competitive advantage.

As the open-source ecosystem evolved, so did Red Hat. As enterprises started to understand the value of open source (and stopped being afraid of it), Red Hat shifted from simply talking to potential customers about savings to showing how open source can help them drive innovation. “We’ve gone from being commoditizers to being innovators. The tech we are driving is now driving net new innovation,” explained Whitehurst. “We are now not going in to talk about saving money but to help drive innovation inside a company.”

Over the last few years, that included making acquisitions to help drive this innovation. In 2015, Red Hat bought IT automation service Ansible, for example, and last month, the company closed its acquisition of CoreOS, one of the larger independent players in the Kubernetes container ecosystem — all while staying true to its open-source roots.

There is only so much innovation you can do around a Linux distribution, though, and as a public company, Red Hat also had to look beyond that core business and build on it to better serve its customers. In part, that’s what drove the company to launch services like OpenShift, for example, a container platform that sits on top of Red Hat Enterprise Linux and — not unlike the original Linux distribution — integrates technologies like Docker and Kubernetes and makes them more easily usable inside an enterprise.

The reason for that? “I believe that containers will be the primary way that applications will be built, deployed and managed,” he told me, and argued that his company, especially after the CoreOS acquisition, is now a leader in both containers and Kubernetes. “When you think about the importance of containers to the future of IT, it’s a clear value for us and for our customers.”

The other major open-source project Red Hat is betting on is OpenStack. That may come as a bit of a surprise, given that popular opinion in the last year or so has shifted against the massive project that wants to give enterprises an open-source, on-premises alternative to AWS and other cloud providers. “There was a sense among big enterprise tech companies that OpenStack was going to be their savior from Amazon,” Whitehurst said. “But even OpenStack, flawlessly executed, put you where Amazon was five years ago. If you’re Cisco or HP or any of those big OEMs, you’ll say that OpenStack was a disappointment. But from our view as a software company, we are seeing good traction.”

Because OpenStack is especially popular among telcos, Whitehurst believes it will play a major role in the shift to 5G. “When we are talking to telcos, […] we are very confident that OpenStack will be the platform for 5G rollouts.”

With OpenShift and OpenStack, Red Hat believes that it has covered both the future of application development and the infrastructure on which those applications will run. Looking a bit further ahead, though, Whitehurst also noted that the company is starting to look at how it can use artificial intelligence and machine learning to make its own products smarter and more secure, but also at how it can use its technologies to enable edge computing. “Now that large enterprises are also contributing to open source, we have a virtually unlimited amount of material to bring our knowledge to,” he said.

 

Posted Under: Tech News
As marketing data proliferates, consumers should have more control

Posted on 30 March, 2018

At the Adobe Summit in Las Vegas this week, privacy was on many people’s minds. It was no wonder with social media data abuse dominating the headlines, GDPR just around the corner, and Adobe announcing the concept of a centralized customer experience record.

With so many high-profile breaches in recent years, putting your customer data in a central record-keeping system would seem to be a dangerous proposition, yet Adobe sees so many positives for marketers that it likely views this as a worthy trade-off.

Which is not to say that the company doesn’t see the risks. Executives speaking at the conference continually insisted that privacy is always part of the conversation at Adobe as they build tools — and that they have built security and privacy safeguards into the customer experience record.

Offering better experiences

The point of the exercise isn’t simply to collect data for data’s sake; it’s to offer consumers a more customized and streamlined experience. How does that work? There was a demo in the keynote illustrating a woman’s experience with a hotel brand.

Brad Rencher, EVP and GM at Adobe Experience Cloud explains Adobe’s Cloud offerings. Photo: Jeff Bottari/Invision for Adobe/AP Images

The mythical woman started a reservation for a trip to New York City, got distracted in the middle and was later “reminded” to return to it via a Facebook ad. She completed the reservation and was later issued a digital key to her room, allowing her to bypass the front-desk check-in. Further, there was a personal greeting on the television in her room with a custom message and suggestions for entertainment based on her known preferences.

As one journalist pointed out at the press event, this level of detail from the hotel is not something that would thrill him (beyond the electronic check-in). Yet there doesn’t seem to be a way to opt out of that data collection (unless you live in the EU and are subject to GDPR rules).

Consumers may want more control

As it turns out, that reporter wasn’t alone. According to a survey conducted last year by The Economist Intelligence Unit in conjunction with ForgeRock, an identity management company, consumers are not the willing sheep that tech companies may think they are.

The survey was conducted last October, with 1,629 consumers participating from eight countries: Australia, China, France, Germany, Japan, South Korea, the UK and the US. It’s worth noting that the survey questions were asked in the context of Internet of Things data, but it seems the results could be applied more broadly to any type of data collection by brands.

There are a couple of interesting data points that brands should perhaps heed as they collect customer data in the fashion outlined by Adobe. In particular, as it relates to what Adobe and other marketing software companies are trying to do in building a central customer profile, respondents were asked to rate the statement, “I am uncomfortable with companies building a ‘profile’ of me to predict my consumer behaviour.” Thirty-nine percent strongly agreed with that statement, and another 35 percent somewhat agreed. That would suggest that consumers aren’t necessarily thrilled with this idea.

When presented with the statement, “Providing my personal information may have more drawbacks than benefits,” 32 percent strongly agreed and 41 percent somewhat agreed.

That would suggest that it is on the brand to make it clearer to consumers that they are collecting that data to provide a better overall experience, because it appears that consumers who answered this survey are not necessarily making that connection.

Perhaps it wasn’t a coincidence that at a press conference after the Day One keynote announcing the unified customer experience record, many questions from analysts and journalists focused on notions of privacy. If Adobe is helping companies gather and organize customer data, what role does it have in how its customers use that data, what role does the brand have, and how much control should consumers have over their own data?

These are questions we seem to be answering on the fly. The technology is here now or very soon will be, and wherever the data comes from, whether the web, mobile devices or the Internet of Things, we need to get a grip on the privacy implications — and we need to do it quickly. If consumers want more control as this survey suggests, maybe it’s time for companies to give it to them.

Posted Under: Tech News
Azure’s availability zones are now generally available

Posted on 30 March, 2018

No matter what cloud you build on, if you want to build something that’s highly available, you’re always going to opt to put your applications and data in at least two physically separated regions. Otherwise, if a region goes down, your app goes down, too. All of the big clouds also offer a concept called ‘availability zones’ in their regions, giving developers the option to host their applications in separate data centers within the same region for a bit of extra resilience. All the big clouds, that is, except for Azure, which is only launching its availability zones feature into general availability today, after first announcing a beta last September.
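To make the “two regions or your app goes down” point concrete, here is a minimal, generic sketch of region-level failover from the application’s side; the endpoint URLs and timeout are illustrative assumptions, not tied to any particular Azure SDK or service:

```python
import urllib.error
import urllib.request

# Hypothetical endpoints for the same service deployed in two regions.
ENDPOINTS = [
    "https://myapp-primary-region.example.com/health",
    "https://myapp-secondary-region.example.com/health",
]

def fetch_with_failover(path: str = "", timeout: float = 2.0) -> bytes:
    """Try each regional endpoint in order and return the first successful response."""
    last_error = None
    for endpoint in ENDPOINTS:
        try:
            with urllib.request.urlopen(endpoint + path, timeout=timeout) as resp:
                return resp.read()
        except (urllib.error.URLError, OSError) as err:
            last_error = err  # region unreachable; try the next one
    raise RuntimeError(f"all regions failed: {last_error}")
```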

Ahead of today’s launch, Julia White, Microsoft’s corporate VP for Azure, told me that the company’s design philosophy behind its data center network was always about servicing commercial customers with the widest possible range of regions to allow them to be close to their customers and to comply with local data sovereignty and privacy laws. That’s one of the reasons why Azure today offers more regions than any of its competitors, with 38 generally available regions and 12 announced ones.

“Microsoft started its infrastructure approach focused on enterprise organizations and built lots of regions because of that,” White said. “We didn’t pick this regional approach because it’s easy or because it’s simple, but because we believe this is what our customers really want.”

Every availability zone has its own network connection and power backup, so if one zone in a region goes down, the others should remain unaffected. A regional disaster could shut down all of the zones in a single region, though, so most businesses will surely want to keep their data in at least one additional region.

Posted Under: Tech News