Posted by Richy George on 11 March, 2019. This post was originally published on this site.
Security researchers have found dozens of companies inadvertently leaking sensitive corporate and customer data because staff are sharing public links to files in their Box enterprise storage accounts that can be easily discovered.
The discoveries were made by Adversis, a cybersecurity firm, which found major tech companies and corporate giants had left data inadvertently exposed. Although data stored in Box enterprise accounts is private by default, users can share files and folders with anyone, making data publicly accessible with a single link. But Adversis said these secret links can be discovered by others. Using a script to scan for and enumerate Box accounts with lists of company names and wildcard searches, Adversis found over 90 companies with publicly accessible folders.
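Adversis has not detailed its scanner in this article, but the approach it describes, combining company names with a wordlist to guess custom shared-link URLs, can be sketched roughly as below. The `https://<company>.app.box.com/v/<name>` URL pattern, the wordlist, and the `examplecorp` subdomain are illustrative assumptions, not details confirmed by Adversis or Box:

```python
from itertools import product

def candidate_urls(companies, words):
    """Build candidate vanity-link URLs for each company subdomain.

    The https://<company>.app.box.com/v/<name> pattern is an assumption
    about how custom Box shared links are structured; the real scanner
    may use different patterns and wildcard logic.
    """
    return [
        f"https://{company}.app.box.com/v/{word}"
        for company, word in product(companies, words)
    ]

def probe(urls):
    """Report which candidate links appear publicly accessible.

    Requires the third-party `requests` package. A 404 suggests the
    vanity name is unused; a 200 usually means the link resolves to
    publicly reachable content.
    """
    import requests
    hits = []
    for url in urls:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        if resp.status_code == 200:
            hits.append(url)
    return hits

urls = candidate_urls(["examplecorp"], ["hr", "finance", "backup"])
print(urls[0])  # https://examplecorp.app.box.com/v/hr
```

The point of the sketch is how little is needed: no authentication is involved, because any guessable public link is, by design, open to anyone holding the URL.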
Not even Box’s own staff were immune from leaking data.
The company said while much of the data is legitimately public and Box advises users how to minimize risks, many employees may not know the sensitive data they share can be found by others.
Worse, some public folders were scraped and indexed by search engines, making the exposed data even easier to find.
In a blog post, Adversis said Box administrators should reconfigure the default access for shared links to “people in your company” to reduce accidental exposure of data to the public.
Adversis said the exposed data included passport photos, bank account and Social Security numbers, passwords, employee lists, financial data like invoices and receipts, and customer data. The company contacted Box to warn of the larger exposures of sensitive data, but noted there was little overall improvement six months after its initial disclosure.
“There is simply too much out there and not enough time to resolve each individually,” the company said.
Adversis provided TechCrunch with a list of known exposed Box accounts. We contacted several of the big companies named, as well as those known to have highly sensitive data, including:
Box, which initially had no comment when we reached out, had several folders exposed. The company exposed signed non-disclosure agreements with its clients, including several U.S. schools, as well as performance metrics for its own staff, the researchers said.
Box spokesperson Denis Ron said in a statement: “We take our customers’ security seriously and we provide controls that allow our customers to choose the right level of security based on the sensitivity of the content they are sharing. In some cases, users may want to share files or folders broadly and will set the permissions for a custom or shared link to public or ‘open’. We are taking steps to make these settings more clear, better help users understand how their files or folders can be shared, and reduce the potential for content to be shared unintentionally, including both improving admin policies and introducing additional controls for shared links.”
The cloud giant said it plans to reduce the unintended discovery of public files and folders.
Amadeus, Apple, Box, Discovery, Herbalife, Edelman and Pointcare all reconfigured their enterprise accounts to prevent access to their leaking files after TechCrunch reached out.
Amadeus spokesperson Alba Redondo said the company decommissioned Box in October and blamed the exposure on an account that was “misconfigured in public mode,” which has since been corrected; external access is now closed. “We continue to investigate this issue and confirm there has been no unauthorized access of our system,” said the spokesperson, without explaining how that was determined. “There is no evidence that confidential information or any information containing personal data was impacted by this issue,” the spokesperson added. We’ve asked Amadeus how it concluded there was no improper access, and will update when we hear back.
Pointcare chief executive Everett Lebherz confirmed its leaking files had been “removed and Box settings adjusted.” Edelman’s global marketing chief Michael Bush said the company was “looking into this matter.”
Herbalife spokesperson Jennifer Butler said the company was “looking into it,” but we did not hear back after several follow-ups. (Butler declared her email “off the record,” a condition that requires both parties to agree in advance; we are printing the reply as we were given no opportunity to reject the terms.)
An Apple spokesperson did not comment by the time of publication.
Discovery, Opportunity International, Schneider Electric, and United Tissue Network did not return a request for comment.
Data “dumpster diving” is not a new hobby for the skilled, but it has become a necessary sub-industry to fix an emerging category of data breaches: leaking, public, and exposed data that shouldn’t be. It’s a space we predicted would grow as more security researchers look to find and report data leaks.
This year alone, we’ve reported data leaks at Dow Jones, Rubrik, NASA, AIESEC, Uber, the State Bank of India, two massive batches of Indian Aadhaar numbers, a huge leak of mortgage and loan data, and several Chinese government surveillance systems.
Adversis has open-sourced and published its scanning tool.
Posted by Richy George on 8 March, 2019.
Ten years after the launch of Foursquare at SXSW, the company is laying its technology bare with a futuristic version of its old app that doesn’t require a check-in at all. The godfather of location apps is returning to the launchpad with Hypertrending, but this time it hopes to learn what developers might do with real-time info about where people are and where they aren’t.
Hypertrending uses Foursquare’s Pilgrim technology, which is baked into Foursquare’s apps and offered as a third-party enterprise tool, to show where phones are in real time over the course of SXSW in Austin, TX.
This information is relayed through dots on a map. The size of those dots is a reflection of the number of devices in that place at a given time. Users can filter the map by All places, Food, Nightlife, and Fun (events and parties).
Hypertrending also has a Top 100 list that is updated in real time to show which places are super popular, with arrows to show whether a place is trending up or down.
Before you throw up your hands in outrage, the information on Hypertrending is aggregated and anonymized (just like it is within Pilgrim), and there are no trails showing the phone’s route from one place to another. Dots only appear on the map when the phone arrives at a destination.
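The aggregation step described above, device pings collapsed into per-venue counts with no identifiers retained, can be illustrated with a minimal sketch. This is a simplified model of the behavior described, not Foursquare's actual Pilgrim pipeline; the data shapes and names are assumptions:

```python
def hypertrending_snapshot(pings):
    """Aggregate raw (device_id, venue) pings into per-venue counts.

    Only the count per venue is retained: device IDs are discarded in the
    output, so nothing links a phone's successive locations into a trail.
    """
    devices_at = {}
    for device_id, venue in pings:
        # Track distinct devices per venue so repeat pings aren't double-counted.
        devices_at.setdefault(venue, set()).add(device_id)
    # The "dot size" on the map is just the distinct-device count.
    return {venue: len(devs) for venue, devs in devices_at.items()}

pings = [
    ("phone-a", "Taco Stand"),
    ("phone-b", "Taco Stand"),
    ("phone-a", "Taco Stand"),   # same device pinging again: not double-counted
    ("phone-c", "Convention Center"),
]
print(hypertrending_snapshot(pings))  # {'Taco Stand': 2, 'Convention Center': 1}
```

Comparing two successive snapshots of these counts is all that's needed for the up/down trend arrows in the Top 100 list.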
Hypertrending was cooked up in Foursquare’s skunkworks division, Foursquare Labs, led by the company’s cofounder Dennis Crowley.
The feature is only available during SXSW and in the Austin area, and thus far Foursquare has no plans to launch this publicly. So… what’s the deal?
First and foremost, Hypertrending is about showing off the technology. In many ways, Hypertrending isn’t new at all, in that it runs off of the Pilgrim technology that has powered Foursquare since around 2014.
Pilgrim is the tech that recognizes you’ve just sat down at a restaurant and offers up a tip about the menu on Foursquare City Guide, and it’s the same tech that notices you’ve just touched down in a new city and makes some recommendations on places to go. In Swarm, it’s the tech that offers up a list of all the places you’ve been in case you want to retroactively check in to them.
That sounds rather simple, but the combination of Foursquare’s 10 years’ worth of location data and Pilgrim’s hyper-precision is unparalleled when it comes to accuracy, according to Crowley.
Whereas other location tech might not understand the difference between you being in the cafe on the first floor or the salon on the second floor, or the bar that shares a wall with both, Pilgrim does.
This is what led Foursquare to build out the Pilgrim SDK, which now sees more than 100 million user-confirmed visits per month. Apps that use the Pilgrim SDK offer users the ability to opt in to Foursquare’s always-on location tracking for its mobile app panel in the U.S., which has grown to 10 million devices.
These 10 million phones provide the data that powers Hypertrending.
Now, the data itself might not be new, per se. But Foursquare has never visualized the information quite like this, even for enterprise customers.
Whereas customers of the Foursquare Place Insights, Pinpoint and Attribution get snapshots into their own respective audiences, Hypertrending represents on a large scale just what Foursquare’s tech is capable of in not only knowing where people are, but where people aren’t.
This brings us back to SXSW, which happens to be the place where Foursquare first launched back in 2009.
“This week has felt a little nostalgic as we try to get this thing ready to go,” said Crowley. “It’s not that dissimilar to when we went to SXSW in 2009 and showed off Foursquare 1.0. There is this curious uncertainty and my whole thing is to get a sense of what people think of it.”
Crowley recalled his first trip to SXSW with cofounder Naveen Selvadurai. They couldn’t afford an actual pass to the show so they just went from party to party showing people the app and hearing what they thought. Crowley said that he doesn’t expect Hypertrending to be some huge consumer app.
“I want to show off what we can do with the technology and the data and hopefully inspire developers to do interesting stuff with this raw visualization of where phones are at,” said Crowley. “What would you do if you had access to this? Would you make something cool and fun or make something obnoxious and creepy?”
Beyond the common tie of SXSW, Hypertrending brings Foursquare’s story full circle in the fact that it’s potentially the most poignant example of what Crowley always wanted Foursquare to be. Location is one of the most powerful pieces of information about an individual. One’s physical location is, in many ways, the most purely truthful piece of information about them in a sea of digital clicks and scroll-bys.
If this data could be harnessed properly, without any work on the side of the consumer, what possibilities might open up?
“We’ve long talked about making ‘a check-in button you never had to press’,” said Crowley in the blog post. “Hypertrending is part of that vision realized, spread across multiple apps and services.”
Crowley also admits in the blog post that Hypertrending walks a fine line between creepy and cool, which is another reason for the ephemeral nature of the feature. It’s also the exact reason he wants to open it up to everyone.
From the blog post:
After 10 years, it’s clear that we (Foursquare!) are going to play a role in influencing how contextual-aware technologies shape the future – whether that’s apps that react to where you are and where you’ve been, smarter virtual assistants (e.g Alexa, Siri, Marsbot) that understand how you move through cities, or AR objects that need to appear at just the right time in just the right spot. We want to build a version of the future that we’re proud of, and we want your input as we get to work building it.
We made Hypertrending to show people how Foursquare’s panel works in terms of what it can do (and what it will not do), as well as to show people how we as a company think about navigating this space. We feel the general trend with internet and technology companies these days has been to keep giving users a more and more personalized (albeit opaquely personalized) view of the world, while the companies that create these feeds keep the broad “God View” to themselves. Hypertrending is one example of how we can take Foursquare’s aggregate view of the world and make it available to the users who make it what it is. This is what we mean when we talk about “transparency” – we want to be honest, in public, about what our technology can do, how it works, and the specific design decisions we made in creating it.
We asked Crowley what would happen if brands and marketers loved the idea of Hypertrending, but general consumers were freaked out.
“This is an easy question,” said Crowley. “If this freaks people out, we don’t build stuff with it. We’re not ready for it yet. But I’d go back to the drawing board and ask ‘What do we learn from people that are freaked out about it that would help us communicate to them’, or ‘what are the changes we could make to this that would make people comfortable’, or ‘what are the things we could build that would illustrate the value of this that this view didn’t communicate?’”
As mentioned above, Hypertrending is only available during the SXSW conference in the Austin area. Users can access Hypertrending through both the Foursquare City Guide app and Swarm by simply shaking their phone.
Posted by Richy George on 8 March, 2019.
In a blog post announcing the Azuqua acquisition, Okta co-founder and COO Frederic Kerrest described the combination of the two companies as a way to move smoothly between applications in a complex workflow without having to constantly present your credentials.
“With Okta and Azuqua, IT teams will be able to use pre-built connectors and logic to create streamlined identity processes and increase operational speed. And, product teams will be able to embed this technology in their own applications alongside Okta’s core authentication and user management technology to build…integrated customer experiences,” Kerrest wrote.
In a modern enterprise, people and work are constantly shifting between applications and services, and combining automation software with identity and access management could offer a seamless way to move between them.
This represents Okta’s largest acquisition to date, and follows its purchases of Stormpath almost exactly two years ago and ScaleFT last July. Taken together, you can see a company that is trying to become a more comprehensive identity platform.
Azuqua, which had raised $16 million since it launched in 2013, appears to have given investors a pretty decent return. When the deal closes, Okta intends to bring its team on board and leave them in place in their Bellevue offices, creating a Northwest presence for the San Francisco company. Azuqua customers include Airbnb, McDonald’s, VMware and HubSpot.
Okta was founded in 2009 and raised over $229 million before going public in April 2017.
Posted by Richy George on 8 March, 2019.
Salesforce is celebrating its 20th anniversary today. The company that was once a tiny irritant going after giants in the 1990s Customer Relationship Management (CRM) market, such as Oracle and Siebel Systems, has grown into a full-fledged SaaS powerhouse. With an annual run rate exceeding $14 billion, it is by far the most successful pure cloud application ever created.
Twenty years ago, it was just another startup with an idea, hoping to get a product out the door. By now, a legend has built up around the company’s origin story, not unlike Zuckerberg’s dorm room or Jobs’ garage, but it really did all begin in 1999 in an apartment in San Francisco, where a former Oracle executive named Marc Benioff teamed with a developer named Parker Harris to create a piece of business software that ran on the internet. They called it Salesforce.com.
None of the handful of employees who gathered in that apartment on the company’s first day in business in 1999 could possibly have imagined what it would become 20 years later, especially when you consider the start of the dot-com crash was just a year away.
It all began on March 8, 1999 in the apartment at 1449 Montgomery Street in San Francisco, the site of the first Salesforce office. The original gang of four employees consisted of Benioff and Harris and Harris’s two programming colleagues Dave Moellenhoff and Frank Dominguez. They picked the location because Benioff lived close by.
It would be inaccurate to say Salesforce was the first to market with Software as a Service, a term, by the way, that would not actually emerge for years. In fact, there were a bunch of other fledgling enterprise software startups trying to do business online at the time, including NetLedger, which later changed its name to NetSuite and was eventually sold to Oracle for $9.3 billion in 2016.
Other online CRM competitors included Salesnet, RightNow Technologies and Upshot. All would be sold over the next several years. Only Salesforce survived as a stand-alone company. It would go public in 2004 and eventually grow to be one of the top 10 software companies in the world.
Co-founder and CTO Harris said recently that he had no way of knowing that any of that would happen, although having met Benioff, he thought there was potential for something great to happen. “Little did I know at that time, that in 20 years we would be such a successful company and have such an impact on the world,” Harris told TechCrunch.
It wasn’t entirely a coincidence that Benioff and Harris had connected. Benioff had taken a sabbatical from his job at Oracle and was taking a shot at building a sales automation tool that ran on the internet. Harris, Moellenhoff and Dominguez had been building salesforce automation software solutions, and the two visions meshed. But building a client-server solution and building one online were very different.
You have to remember that in 1999, there was no concept of Infrastructure as a Service. It would be years before Amazon launched Amazon Elastic Compute Cloud in 2006, so Harris and his intrepid programming team were on their own when it came to building the software and providing the servers for it to scale and grow.
“I think in a way, that’s part of what made us successful because we knew that we had to, first of all, imagine scale for the world,” Harris said. It wasn’t a matter of building one CRM tool for a large company and scaling it to meet that individual organization’s demand, then another, it was really about figuring out how to let people just sign up and start using the service, he said.
“I think in a way, that’s part of what made us successful because we knew that we had to, first of all, imagine scale for the world.” Parker Harris, Salesforce
That may seem trivial now, but it wasn’t a common way of doing business in 1999. The internet in those years was dominated by a ton of consumer-facing dot-coms, many of which would go bust in the next year or two. Salesforce wanted to build an enterprise software company online, and although it wasn’t alone in doing that, it did face unique challenges being one of the early adherents.
“We created a software that was what I would call massively multi-tenant where we couldn’t optimize it at the hardware layer because there was no Infrastructure as a Service. So we did all the optimization above that — and we actually had very little infrastructure early on,” he explained.
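The shared-table pattern commonly associated with this kind of massively multi-tenant design can be sketched as follows. This is a generic illustration of the technique, not Salesforce's actual schema; the table and column names are assumptions:

```python
import sqlite3

# A minimal sketch of shared-table multi-tenancy: all tenants' rows live
# in one table, and every query is scoped by an org ID. Isolation is
# enforced in the application layer rather than in separate hardware or
# separate databases per customer.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE accounts (org_id TEXT, name TEXT)")
db.executemany(
    "INSERT INTO accounts VALUES (?, ?)",
    [("org-1", "Acme"), ("org-1", "Globex"), ("org-2", "Initech")],
)

def accounts_for(org_id):
    # Scoping every query by org_id is what keeps tenants isolated
    # while they share the same storage and compute.
    rows = db.execute(
        "SELECT name FROM accounts WHERE org_id = ? ORDER BY name", (org_id,)
    )
    return [name for (name,) in rows]

print(accounts_for("org-1"))  # ['Acme', 'Globex']
```

The appeal of this design for a 1999-era startup is that adding a customer is just adding rows, which is what made "sign up and start using the service" feasible without per-customer infrastructure.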
From the beginning, Benioff had the vision and Harris was charged with building it. Tien Tzuo, who would go on to be co-founder at Zuora in 2007, was employee number 11 at Salesforce, starting in August of 1999, about five months after the apartment opened for business. At that point, there still wasn’t an official product, but they were getting closer when Benioff hired Tzuo.
As Tzuo tells it, he had fancied a job as a product manager, but when Benioff saw his Oracle background in sales, he wanted him in account development. “My instinct was, don’t argue with this guy. Just roll with it,” Tzuo relates.
As Tzuo pointed out, in a startup with a handful of people, titles mattered little anyway. “Who cares what your role was. All of us had that attitude. You were a coder or a non-coder,” he said. The coders were stashed upstairs with a view of San Francisco Bay and strict orders from Benioff to be left alone. The remaining employees were downstairs working the phones to get customers.
“Who cares what your role was. All of us had that attitude. You were a coder or a non-coder.” Tien Tzuo, early employee
The first Wayback Machine snapshot of Salesforce.com is from November 15, 1999. It wasn’t fancy, but it showed all of the functionality you would expect to find in a CRM tool: Accounts, Contacts, Opportunities, Forecasts and Reports, with each category represented by a tab.
The site officially launched on February 7, 2000 with 200 customers, and they were off and running.
Every successful startup needs a visionary behind it, pushing it, and for Salesforce that person was Marc Benioff. When he came up with the concept for the company, the dot-com boom was in high gear. In a year or two, much of it would come crashing down, but in 1999 anything was possible and Benioff was bold and brash and brimming with ideas.
But even good ideas don’t always pan out for so many reasons, as many a failed startup founder knows only too well. For a startup to succeed it needs a long-term vision of what it will become, and Benioff was the visionary, the front man, the champion, the chief marketer. He was all of that — and he wouldn’t take no for an answer.
Paul Greenberg, managing principal at The 56 Group and author of multiple books about the CRM industry including CRM at the Speed of Light (the first edition of which was published in 2001), was an early user of Salesforce, and says that he was not impressed with the product at first, complaining about the early export functionality in an article.
A Salesforce competitor at the time, Salesnet, got wind of Greenberg’s post, and put his complaint on the company website. Benioff saw it, and fired off an email to Greenberg: “I see you’re a skeptic. I love convincing skeptics. Can I convince you?” Greenberg said that being a New Yorker, he wrote back with a one-line response. “Take your best shot.” Twenty years later, Greenberg says that Benioff did take his best shot and he did end up convincing him.
“I see you’re a skeptic. I love convincing skeptics. Can I convince you?” Early Marc Benioff email
Laurie McCabe, who is co-founder and partner at SMB Group, was working for a consulting firm in Boston in 1999 when Benioff came by to pitch Salesforce to her team. She says she was immediately impressed with him, but also with the notion of putting enterprise software online, effectively putting it within reach of many more companies.
“He was the ringmaster I believe for SaaS or cloud or whatever we want to call it today. And that doesn’t mean some of these other guys didn’t also have a great vision, but he was the guy beating the drum louder. And I just really felt that in addition to the fact that he was an exceptional storyteller, marketeer and everything else, he really had the right idea that software on prem was not in reach of most businesses,” she said.
One of the ways that Benioff put the company in the public eye in the days before social media was guerrilla marketing techniques. He came up with the idea of “no software” as a way to describe software on the internet. He sent some of his early employees to “protest” at the Siebel Conference, taking place at the Moscone Center in February, 2000. He was disrupting one of his major competitors, and it created enough of a stir to attract a television news crew and garner a mention in the Wall Street Journal. All of this was valuable publicity for a company that was still in its early stages.
Brent Leary, who had left his job as an industry consultant in 2003 to open his current firm, CRM Essentials, said this ability to push the product was a real differentiator for the company and certainly got his attention. “I had heard about Salesnet and these other ones, but these folks not only had a really good product, they were already promoting it. They seemed to be ahead of the game in terms of evangelizing the whole “no software” thing. And that was part of the draw too,” Leary said of his first experiences working with Salesforce.
Leary added, “My first Dreamforce was in 2004, and I remember it particularly because it was actually held on Election Day 2004 and they had a George W. Bush look-alike come and help open the conference, and some people actually thought it was him.”
Greenberg said that the “no software” campaign was brilliant because it brought this idea of delivering software online to a human level. “When Marc said, ‘no software’ he knew there was software, but the thing with him is, that he’s so good at communicating a vision to people.” Software in the 90s and early 2000s was delivered mostly in boxes on CDs (or 3.5 inch floppies), so saying no software was creating a picture that you didn’t have to touch the software. You just signed up and used it. Greenberg said that campaign helped people understand online software at a time when it wasn’t a common delivery method.
One of the big differentiators for Salesforce as a company was the culture it built from Day One. Benioff had a vision of responsible capitalism and included the charitable 1-1-1 model in the company’s earliest planning documents. The idea was to give one percent of Salesforce’s equity, one percent of its product and one percent of its employees’ time to the community. As Benioff once joked, they didn’t have a product and weren’t making any money when they made the pledge, but they have stuck to it, and many other companies have adopted the model Salesforce built.
Bruce Cleveland, a partner at Wildcat Ventures, who has written a book with Geoffrey Moore of Crossing the Chasm fame called Traversing the Traction Gap, says that it is essential for a startup to establish a culture early on, just as Benioff did. “A CEO has to say, these are the standards by which we’re going to run this company. These are the things that we value. This is how we’re going to operate and hold ourselves accountable to each other,” Cleveland said. Benioff did that.
Another element of this was building trust with customers, a theme that Benioff continues to harp on to this day. As Harris pointed out, people still didn’t trust the internet completely in 1999, so the company had to overcome objections to entering a credit card online. Even more than that though, they had to get companies to agree to share their precious customer data with them on the internet.
“We had to not only think about scale, we had to think about how do we get the trust of our customers, to say that we will protect your information as well or better than you can,” Harris explained.
The company was able to overcome those objections, of course, and more. Todd McKinnon, who is currently co-founder and CEO at Okta, joined Salesforce as VP of Engineering in 2006, as the company was ramping up toward $100 million in revenue, and he says there were some growing pains in that period.
When he arrived, they were running on three mid-tier Sun servers in a hosted co-location facility. McKinnon said that it was not high-end by today’s standards. “There was probably less RAM than what’s in your MacBook Pro today,” he joked.
When he came on board, the company still had only 13 engineers and the actual infrastructure requirements were still very low. While that would change during his six-year tenure, it was working fine when he got there. Within five years, he said, that changed dramatically as they were operating their own data centers and running clusters of Dell X86 servers — but that was down the road.
Before they did that, they went back to Sun one more time and bought four of the biggest boxes it sold at the time and proceeded to transfer all of the data. The problem was that the Oracle database wasn’t working well, so as McKinnon tells it, they got on the phone with Larry Ellison of Oracle, who upon hearing about the setup asked them straight out why they were doing that; the way they had it set up simply didn’t work.
They were able to resolve it all and move on, but it’s the kind of crisis that today’s startups probably wouldn’t have to deal with because they would be running their company on a cloud infrastructure service, not their own hardware.
About this same time, Salesforce began a strategy to grow through acquisitions. In 2006, it acquired the first of 55 companies when it bought a small wireless technology company called Sendia for $15 million. As early as 2006, the year before the first iPhone, the company was already thinking about mobile.
Last year it made its 52nd acquisition, and the most costly so far, when it purchased MuleSoft for $6.5 billion, giving it a piece of software that could help Salesforce customers bridge the on-prem and cloud worlds. As Greenberg pointed out, this brought a massive change in messaging for the company.
“With the Salesforce acquisition of MuleSoft, it allows them pretty much to complete the cycle between back and front office and between on-prem and the cloud. And you notice, all of a sudden, they’re not saying ‘no software.’ They’re not attacking on-premise. You know, all of this stuff has gone by the wayside,” Greenberg said.
No company is going to be completely consistent as it grows and priorities shift, but if you are a startup looking for a blueprint on how to grow a successful company, Salesforce would be a pretty good company to model yourself after. Twenty years into this, they are still growing and still going strong and they remain a powerful voice for responsible capitalism, making lots of money, while also giving back to the communities where they operate.
One other lesson that you could learn is that you’re never done. Twenty years is a big milestone, but it’s just one more step in the long arc of a successful organization.
Posted by Richy George on 6 March, 2019.
Clari started out as a company that wanted to give sales teams more information about their sales process than could be found in the CRM database. Today, the company announced a much broader platform, one that can provide insight across sales, marketing and customer service to give a more unified view of a company’s go-to-market operations, all enhanced by AI.
Company co-founder and CEO Andy Byrne says this involves pulling together a variety of data and giving each department the insight to improve their mission. “We are analyzing large volumes of data found in various revenue systems — sales, marketing, customer success, etc. — and we’re using that data to provide a new platform that’s connecting up all of the different revenue departments,” Byrne told TechCrunch.
For sales, that would mean driving more revenue. For marketing, it would involve more targeted plans to drive more sales, and for customer success it would be about increasing customer retention and reducing churn.
The company’s original idea when it launched in 2012 was to look at a range of data that touched the sales process, such as email, calendars and the CRM database, to build a broader view of sales than you could get by looking at the basic customer data stored in the CRM alone. The Clari data could tell reps things like which deals would be most likely to close and which ones were at risk.
“We were taking all of these signals that had been historically disconnected from each other and we were connecting it all into a new interface for sales teams that’s very different than a CRM,” Byrne said.
Over time, that involved using AI and machine learning to make connections in the data that humans might not have been seeing. The company also found that customers were using the product to look at processes adjacent to sales, and they decided to formalize that and build connectors to relevant parts of the go-to-market system like marketing automation tools from Marketo or Eloqua and customer tools such as Dialpad, Gong.io and Salesloft.
With Clari’s approach, companies can get a unified view without manually pulling all this data together. The goal is to provide customers with a broad view of the go-to-market operation that isn’t possible looking at siloed systems.
The company has experienced tremendous growth over the last year, leaping from 80 customers to 250. These include Okta and Alteryx, two companies that went public in recent years. Clari is based in the Bay Area and has around 120 employees. It has raised over $60 million. The most recent round was a $35 million Series C last May led by Tenaya Capital.
Posted by Richy George on 5 March, 2019. This post was originally published on this site.
The growth of augmented and virtual reality applications and hardware is ushering in a new age of digital media and imaging technologies, and startups that are putting themselves at the center of that are attracting interest.
TechCrunch has learned and confirmed that Matterport, which started out making cameras but has since diversified into a wider platform to capture, create, search and utilise 3D imagery of interior and enclosed spaces in immersive real estate, design, insurance and other B2C and B2B applications, has raised $48 million. Sources tell us the money came at a pre-money valuation of around $325 million, although the company is not commenting on that.
From what we understand, the funding is coming ahead of a larger growth round from existing and new investors, to tap into what they see as a big opportunity for building and providing (as a service) highly accurate 3D images of enclosed spaces.
The company in December appointed a new CEO, RJ Pittman — who had been the chief product officer at eBay, and before that held executive roles at Apple and Google — also to help fill out that bigger strategy.
Matterport had raised just under $63 million prior to this and had been valued at around $207 million, according to PitchBook estimates. This current round is coming from existing backers, which include Lux Capital, DCM, Qualcomm Ventures and more.
Matterport’s roots are in high-end cameras built to capture multiple images to create 3D interior imagery for a variety of applications from interior design and real estate to gaming. Changing tides in the worlds of industry and hardware have somewhat shifted its course.
On the hardware side, we’ve seen a rise in the functionality of smartphone cameras, as well as a proliferation of specialised 3D cameras at lower price points. So while Matterport still sells its own high-end cameras, it is also starting to work with less expensive devices with spherical lenses — such as the Ricoh Theta, which is nearly 10 times less expensive than Matterport’s Pro2 camera — and smartphones.
Using an AI engine — which it has been building for some time — packaged into a service it calls Matterport Cloud 3.0, it converts 2D panoramic and 360-degree images into 3D ones. (Matterport Cloud 3.0 is currently in beta and will be launching fully on the 18th of March, initially supporting the Ricoh Theta V, the Theta Z1, the Insta360 ONE X, and the Leica Geosystems BLK360 laser scanner.)
Matterport is further using this technology to grow its wider database of images. It already has racked up 1.6 million 3D images and millions of 2D images, and at its current growth rate, the aim is to expand its library to 100 million in the coming years, positioning it as a Getty for 3D enclosed images.
These, in turn, will be used in two ways: to feed Matterport’s machine learning to train it to create better and faster 3D images; and to become part of a wider library, accessible to other businesses by way of a set of APIs.
And, from what I understand, the objective will not just be to use images as they are: people would be able to manipulate the images to, for example, remove all the furniture in a room and re-stage it completely without needing to physically do that work ahead of listing a house for sale. Another example is adding immersive interior shots into mapping applications like Google’s Street View.
“We are a data company,” RJ Pittman told me when I met him for coffee last month.
The ability to convert 2D into 3D images using artificial intelligence to help automate the process is a potentially big area that Matterport, and its investors, believe will be in increasing demand. That’s not just because people still think there will one day be a bigger market for virtual reality headsets, which will need more interesting content; but because we as consumers already have come to expect more realistic and immersive experiences today, even when viewing things on regular screens; and because B2B and enterprise services (for example design or insurance applications) have also grown in sophistication and now require these kinds of images.
(That demand is driving the creation of other kinds of 3D imaging startups, too. Threedy.ai launched last week with a seed round from a number of angels and VCs to perform a similar kind of 2D-to-3D mapping technique for objects rather than interior spaces. It is already working with a number of e-commerce sites to bypass some of the costs and inefficiencies of more established, manual methods of 3D rendering.)
While Matterport is doubling down on its cloud services strategy, it’s also been making hires to take the business to its next steps. In addition to Pittman, these have included Dave Lippman, formerly design head at eBay, as its chief design officer; and engineering veteran Lou Marzano as its VP of hardware, R&D and manufacturing, with more hires to come.
Posted by Richy George on 5 March, 2019. This post was originally published on this site.
SurveyMonkey announced today that it has acquired Usabilla, an Amsterdam-based website and app survey company, for $80 million in cash and stock.
Zander Lurie, CEO at SurveyMonkey, said Usabilla filled in a missing piece in its survey toolkit. “A key product that we identified that we really wanted to add to the portfolio, which is really adjacent to our VOC (voice of the customer) solution is a website feedback collector helping people on the web or on mobile apps really understand what users are doing on their site,” Lurie told TechCrunch.
Usabilla CEO Marc van Agteren says his company is adding a complementary product to SurveyMonkey. “If you compare us to the SurveyMonkey enterprise solution where you create surveys that you need to send out via social media or email, our software sits on a website and instantly provides feedback,” he said. For example, if there is a bug on the page, the user can click the Usabilla tool, capture the area of the page that’s problematic as a screenshot, and send it with a comment to the website or app owner for review.
Conversely, the website or app owner could display a question for the visitor to answer before he or she exits. This provides a way to get immediate feedback about the design, or about why visitors are leaving without finishing a transaction.
Qualtrics, another survey company, was about to go public last fall when it was acquired by SAP for $8 billion, but Lurie doesn’t necessarily see this move as a reaction to that. He said that today’s acquisition was really about enhancing the company’s enterprise product.
As for Qualtrics, he says that with the acquisition, it is more aligned with SAP now and therefore really being marketed to SAP customers. He sees plenty of room in the survey market with customers of Adobe, Salesforce and Microsoft and others, whom he says probably aren’t looking for an SAP solution.
With Usabilla, SurveyMonkey gains a stronger foothold in the EU, as the company’s headquarters in Amsterdam will become SurveyMonkey’s largest EU office. The transaction also adds 130 new employees to the SurveyMonkey family, bringing the total number to over 1,000. In addition, it can now access Usabilla’s 450 customers, which include Lufthansa, Philips and Vodafone. Lurie said there is some customer overlap, but given that the majority of Usabilla’s customers are outside the U.S., there would likely be a net customer gain from the purchase.
SurveyMonkey was founded in 1999 and went public last September. This is the company’s sixth acquisition and the first in three years, according to Lurie. Usabilla was founded in 2009 and raised a modest $1 million along the way.
The deal is subject to the normal regulatory approval process and is expected to close some time in the second quarter this year.
Posted by Richy George on 5 March, 2019. This post was originally published on this site.
Salesforce has been using the notion of trailblazers as a learning metaphor for several years, since it created a platform to teach customers Salesforce skills called Trailhead. Today, the company announced the availability of myTrailhead, a similar platform that enables companies to create branded, fully customizable training materials based on the Trailhead approach.
It’s worth noting that the company originally announced this idea at Dreamforce in November 2017, and after testing it on 13 pilot customers (including itself) over the past year, it is making the product generally available today.
While Trailhead is all about teaching Salesforce skills, myTrailhead is about building on that approach to teach whatever other skills a company might find desirable with its own culture, style, branding and methodologies.
It builds on the whole Trailhead theme of blazing a learning trail, providing a gamified approach to self-paced training, where users are quizzed throughout to reinforce the lessons, awarded badges for successfully completing modules and given titles like Ranger for successfully completing a certain number of courses.
By gamifying the approach, Salesforce hopes people will have friendly competition within companies, but it also sees these skills as adding value to an employee’s resume. If a manager is looking for an in-house hire, they can search by skills in myTrailhead and find candidates who match their requirements. Additionally, employees who participate in training can potentially advance their careers with their enhanced skill sets.
While you can continue to teach Salesforce skills in myTrailhead, it’s really focused on the customization and what companies can add on top of the Salesforce materials to make the platform their own. Salesforce envisions companies using the platform for new employee onboarding, sales enablement or customer service training, but if a company is ambitious, it could use this as a broader training tool.
There is an analytics component in myTrailhead, so management can track when employees complete required training modules, understand how well they are doing as they move through a learning track or recognize when employees have updated their skill sets.
The idea is to build on the Trailhead platform to provide companies with a methodology for creating a digital approach to learning, which Salesforce sees as an essential ingredient of becoming a modern company. The product is available immediately.
Posted by Richy George on 4 March, 2019. This post was originally published on this site.
Massive-scale predictive analytics is a relatively new phenomenon, one that challenges both decades of law as well as consumer thinking about privacy.
As a technology, it may well save thousands of lives in applications like predictive medicine, but if it isn’t used carefully, it may prevent thousands from getting loans, for instance, if an underwriting algorithm is biased against certain users.
I chatted with Dennis Hirsch a few weeks ago about the challenges posed by this new data economy. Hirsch is a professor of law at Ohio State and head of its Program on Data and Governance. He’s also affiliated with the university’s Risk Institute.
“Data ethics is the new form of risk mitigation for the algorithmic economy,” he said. In a post-Cambridge Analytica world, every company has to assess what data it has on its customers and mitigate the risk of harm. How to do that, though, is at the cutting edge of the new field of data governance, which investigates the processes and policies through which organizations manage their data.
“Traditional privacy regulation asks whether you gave someone notice and gave them a choice,” he explains. That principle is the bedrock for Europe’s GDPR law, and for the patchwork of laws in the U.S. that protect privacy. It’s based around the simplistic idea that a datum — such as a customer’s address — shouldn’t be shared with, say, a marketer without that user’s knowledge. Privacy is about protecting the address book, so to speak.
The rise of “predictive analytics,” though, has completely demolished such privacy legislation. Predictive analytics is a fuzzy term, but it essentially means interpreting raw data and drawing new conclusions through inference. This is the story of the famous Target case, where the retailer recommended pregnancy-related goods to women who had certain patterns of purchases. As Charles Duhigg explained at the time:
Many shoppers purchase soap and cotton balls, but when someone suddenly starts buying lots of scent-free soap and extra-big bags of cotton balls, in addition to hand sanitizers and washcloths, it signals they could be getting close to their delivery date.
Predictive analytics is difficult to predict. Hirsch says “I don’t think any of us are going to be intelligent enough to understand predictive analytics.” Talking about customers, he said “They give up their surface items — like cotton balls and unscented body lotion — they know they are sharing that, but they don’t know they are giving up their pregnancy status. … People are not going to know how to protect themselves because they can’t know what can be inferred from their surface data.”
In other words, the scale of those predictions completely undermines notice and consent.
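The inference Duhigg describes can be made concrete with a toy sketch. This is not Target’s actual model; the product names and weights below are invented for illustration, and a real system would learn them from millions of purchase histories:

```python
# Toy inference sketch: scoring a hidden attribute (pregnancy) from
# "surface" purchase data. Weights and product names are invented.
SIGNAL_WEIGHTS = {
    "unscented lotion": 0.3,
    "cotton balls (large)": 0.2,
    "hand sanitizer": 0.2,
    "washcloths": 0.2,
}

def inferred_score(purchases):
    """Sum the weights of signal products found in a purchase history."""
    return sum(SIGNAL_WEIGHTS.get(item, 0.0) for item in purchases)

# The shopper knowingly shared only the items; the score is an
# inference she never consented to.
history = ["bread", "unscented lotion", "cotton balls (large)", "washcloths"]
score = inferred_score(history)
print(score >= 0.5)  # prints True: flagged as likely expecting
```

The point of the sketch is that no single purchase reveals anything; only the aggregation does, which is exactly what notice-and-consent regimes fail to capture.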
Even though the law hasn’t caught up to this exponentially more challenging problem, companies themselves seem to be responding in the wake of Target and Facebook’s very public scandals. “What we are hearing is that we don’t want to put our customers at risk,” Hirsch explained. “They understand that this predictive technology gives them really awesome power and they can do a lot of good with it, but they can also hurt people with it.” The key actors here are corporate chief privacy officers, a role that has cropped up in recent years to mitigate some of these challenges.
Hirsch is spending significant time trying to build new governance strategies to allow companies to use predictive analytics in an ethical way, so that “we can achieve and enjoy its benefits without having to bear these costs from it.” He’s focused on four areas: privacy, manipulation, bias, and procedural unfairness. “We are going to set out principles on what is ethical and what is not,” he said.
Much of that focus has been on how to help regulators build policies that can manage predictive analytics. Since people can’t understand the extent that inferences can be made with their data, “I think a much better regulatory approach is to have someone who does understand, ideally some sort of regulator, who can draw some lines.” Hirsch has been researching how the FTC’s Unfairness Authority may be a path forward for getting such policies into practice.
He analogized this to the Food and Drug Administration. “We have no ability to assess the risks of a given drug [so] we give it to an expert agency and allow them to assess it,” he said. “That’s the kind of regulation that we need.”
Hirsch overall has a balanced perspective on the risks and rewards here. He wants analytics to be “more socially acceptable,” but at the same time sees the need for careful scrutiny and oversight to ensure that consumers are protected. Ultimately, he sees that as incredibly beneficial to companies, which can take the value out of this tech without provoking consumer ire.
Talking about data ethics, Europe is in the middle of a superpower pincer. China’s telecom giant Huawei has made expansion on the continent a major priority, while the United States has been sending delegation after delegation to convince its Western allies to reject Chinese equipment. The dilemma was quite visible last week at MWC-Barcelona, where the two sides each tried to make their case.
It’s been years since the Snowden revelations showed that the United States was operating an enormous eavesdropping infrastructure targeting countries throughout the world, including across Europe. Huawei has reiterated its stance that it does not steal information from its equipment, and has repeated its demands that the Trump administration provide public proof of flaws in its security.
There is an abundance of moral relativism here, but I see this as increasingly a litmus test of the West on China. China has not hidden its ambitions to take a prime role in East Asia, nor has it hidden its intentions to build a massive surveillance network over its own people or to influence the media overseas.
Those tactics, though, are straight out of the American playbook, which lost its moral legitimacy over the past two decades from some combination of the Iraq War, Snowden, Wikileaks, and other public scandals that have undermined trust in the country overseas.
Security and privacy might have been a competitive advantage for American products over their Chinese counterparts, but that advantage has been weakened for many countries to near zero. We are increasingly going to see countries choose a mix of Chinese and American equipment in sensitive applications, if only to ensure that if one country is going to steal their data, it might as well be balanced.
To every member of Extra Crunch: thank you. You allow us to get off the ad-laden media churn conveyor belt and spend quality time on amazing ideas, people, and companies. If I can ever be of assistance, hit reply, or send an email to firstname.lastname@example.org.
This newsletter is written with the assistance of Arman Tabatabai from New York.
Posted by Richy George on 4 March, 2019. This post was originally published on this site.
Scytale, a startup that wants to bring identity and access management to application-to-application activities, announced a $5 million Series A round today.
The round was led by Bessemer Venture Partners, a returning investor that led the company’s previous $3 million round in 2018. Bain Capital Ventures, TechOperators and Work-Bench are also participating in this round.
The company wants to bring the same kind of authentication that individuals are used to having with a tool like Okta to applications and services in a cloud native environment. “What we’re focusing on is trying to bring to market a capability for large enterprises going through this transition to cloud native computing to evolve the existing methods of application-to-application authentication, so that it’s much more flexible and scalable,” Sunil James, the company’s CEO, told TechCrunch.
To help with this, the company has developed the open source, cloud native project, Spiffe, that is managed by the Cloud Native Computing Foundation (CNCF). The project is designed to provide identity and access management for application-to-application communication in an open source framework.
The idea is that as companies transition to a containerized, cloud native approach to application delivery, there needs to be a smooth, automated way for applications and services to prove they are legitimate very quickly, in much the same way individuals provide a username and password to access a website. This could happen, for example, as applications pass through API gateways, or as automation drives the use of multiple applications in a workflow.
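Spiffe’s core primitive is a structured identity: per the open SPIFFE specification, every workload is named by a URI of the form spiffe://&lt;trust-domain&gt;/&lt;workload-path&gt;. Below is a minimal sketch of parsing that format; the trust domain and path are invented for illustration, and real deployments verify cryptographically signed identity documents (SVIDs) rather than trusting a bare string:

```python
from urllib.parse import urlparse

def parse_spiffe_id(uri):
    """Split a SPIFFE-style ID (spiffe://<trust-domain>/<path>) into its parts."""
    parsed = urlparse(uri)
    if parsed.scheme != "spiffe" or not parsed.netloc:
        raise ValueError(f"not a SPIFFE ID: {uri!r}")
    return parsed.netloc, parsed.path.lstrip("/")

# A hypothetical workload identity for a billing API in a "prod" trust domain.
domain, workload = parse_spiffe_id("spiffe://prod.example.com/billing/api")
print(domain, workload)  # prints: prod.example.com billing/api
```

Naming workloads this way, instead of with shared secrets, is what lets two services in different clusters or clouds authenticate each other automatically.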
Webscale companies like Google and Netflix have developed mechanisms to make this work in-house, but it’s been out of reach of most large enterprise companies. Scytale wants to bring this capability to authenticate services and applications to any company.
In addition to the funding announcement, the company also announced Scytale Enterprise, a tool that provides a commercial layer on top of the open source tools the company has developed. The enterprise version helps companies that might not have the personnel to run the open source version on their own by providing training, consulting and support services.
Bain Capital Ventures’ Enrique Salem sees a startup solving a big problem for companies that are moving to cloud native environments and need this kind of authentication. “In an increasingly complex and fragmented enterprise IT environment, Scytale has not only built Spiffe’s amazing open-source community but has also delivered a commercial offering to address hybrid cloud authentication challenges faced by Fortune 500 identity and access management engineering teams,” Salem said in a statement.
The company, which is based in the Bay Area, launched in 2017 and currently has 24 employees.