TechCrunch Disrupt offers plenty of options for attendees with an eye on the enterprise

Posted on 22 September, 2019

We may have just wrapped a full-day program devoted entirely to the enterprise at TechCrunch Sessions: Enterprise last week, but that doesn’t mean we plan to sell the subject short at TechCrunch Disrupt next month in San Francisco. In fact, we have something for everyone: speakers from startups, established public companies and everything in between, along with investors and industry luminaries, all on hand to discuss all things enterprise.

SaaS companies have played a major role in enterprise software over the last decade, and we are offering a full line-up of SaaS company executives to provide you with the benefit of their wisdom. How about Salesforce chairman, co-CEO and co-founder Marc Benioff for starters? Benioff will be offering advice on how to build a socially responsible, successful startup.

If you’re interested in how to take your startup public, we’ll have Box CEO Aaron Levie, who led his company to an IPO in 2015, and PagerDuty CEO Jennifer Tejada, who did the same just this year. The two executives will discuss the trials and tribulations of the IPO process and what happens after you finally go public.

Meanwhile, Cal Henderson, co-founder and CTO of Slack (another SaaS company that recently went public), will discuss how to build great products with Megan Quinn of Spark Capital, a Slack investor.

Speaking of investors, Neeraj Agrawal, a general partner at Battery Ventures, joins a panel with Whitney Bouck, COO at HelloSign, and Jyoti Bansal, CEO and founder of Harness (and former CEO and co-founder of AppDynamics, which Cisco acquired in 2017 for $3.7 billion just before it was set to IPO). They will discuss what it takes to build a billion-dollar SaaS business.

Not enough SaaS for you? How about Diya Jolly, chief product officer at Okta, discussing how to iterate on your product?

If you’re interested in security, we have Dug Song from Duo, whose company was sold to Cisco in 2018 for $2.35 billion, explaining how to develop a secure startup. We will also welcome Nadav Zafrir from Israeli security incubator Team 8 to talk about the intriguing subject of when spies meet security on our main stage.

You probably want to hear from some enterprise company executives too. That’s why we are bringing Frederic Moll, chief development officer for the digital surgery group at Johnson & Johnson, to talk about robots; Marillyn A. Hewson, chairman, president and CEO of Lockheed Martin, to discuss the space industry; and Verizon CEO Hans Vestberg to go over the opportunity around 5G.

We’ll also have seasoned enterprise investors, Mamoon Hamid from Kleiner Perkins and Michelle McCarthy from Verizon Ventures, acting as judges at the TechCrunch Disrupt Battlefield competition.

If that’s not enough for you, there will also be enterprise startups involved in the Battlefield and Startup Alley. If you love the enterprise, there’s something for everyone. We hope you can make it.

Still need tickets? You can pick those up right here.

Facebook has acquired Servicefriend, which builds ‘hybrid’ chatbots, for Calibra customer service

Posted on 21 September, 2019

As Facebook prepares to launch its new cryptocurrency Libra in 2020, it’s putting the pieces in place to help it run. In one of the latest developments, it has acquired Servicefriend, a startup that built bots — chat clients for messaging apps based on artificial intelligence — to help customer service teams, TechCrunch has confirmed.

The news was first reported in Israel, where Servicefriend is based, after one of its investors, Roberto Singler, alerted local publication The Marker about the deal. We reached out to Ido Arad, one of the co-founders of the company, who referred our questions to a team at Facebook. Facebook then confirmed the acquisition with an Apple-like non-specific statement:

“We acquire smaller tech companies from time to time. We don’t always discuss our plans,” a Facebook spokesperson said.

Several people, including Arad, his co-founder Shahar Ben Ami and at least one other employee, indicate on their LinkedIn profiles that they now work at Facebook within the Calibra digital wallet group. Their jobs at the social network started this month, meaning this acquisition closed in recent weeks. (Several others still list Servicefriend as their employer, though they, too, may have made the move.)

Although Facebook isn’t specifying what they will be working on, the most obvious area will be in building a bot — or more likely, a network of bots — for the customer service layer for the Calibra digital wallet that Facebook is developing.

Facebook’s plan is to build a range of financial services for people to use Calibra to pay out and receive Libra — for example, to send money to contacts, pay bills, top up their phones, buy things and more.

It remains to be seen just how much people will trust Facebook as a provider of all of these services, which is where having a “human,” accessible customer service experience will be essential.

“We are here for you,” Calibra notes on its welcome page, where it promises 24-7 support in WhatsApp and Messenger for its users.

Servicefriend has worked on Facebook’s platform in the past: specifically, it built “hybrid” bots for Messenger that companies could use to complement teams of human agents and better scale their services on messaging platforms. In one Messenger bot that Servicefriend built for Globe Telecom in the Philippines, the hybrid approach brought “agent hours” down to under 20 hours for every 1,000 customer interactions.
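
For a sense of scale, that works out to a little over a minute of human-agent time per conversation. (The per-interaction breakdown below is our own back-of-the-envelope arithmetic; only the 20-hour figure comes from the Globe Telecom example.)

```python
# Back-of-the-envelope math on the Globe Telecom figure cited above.
agent_hours_per_1000 = 20          # upper bound reported for the hybrid bot
interactions = 1_000

minutes_per_interaction = agent_hours_per_1000 * 60 / interactions
print(f"~{minutes_per_interaction:.1f} minutes of human-agent time per interaction")
# prints: ~1.2 minutes of human-agent time per interaction
```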

Bots have been a relatively problematic area for Facebook. The company launched a personal assistant called M in 2015, and then, in 2016, bots on Messenger that let users talk to businesses, both with quite some fanfare. In reality, neither worked as well as promised, and in some cases they worked significantly worse than the services they aimed to replace.

While AI-based assistants such as Alexa have become synonymous with how a computer can carry on a conversation and provide information to humans, the consensus around bots these days is that the most workable way forward is to build services that complement, rather than completely replace, teams.

For Facebook, getting its customer service on Calibra right can help it build and expand its credibility (note: another area where Servicefriend has built services is in using customer service as a marketing channel). Getting it wrong could mean issues not just with customers, but with partners and possibly regulators.

Chef CEO says he’ll continue to work with ICE in spite of protests

Posted on 20 September, 2019

Yesterday, software development tool maker Chef found itself in the middle of a firestorm after a tweet called the company out for doing business with DHS/ICE. Eventually it led to an influential open-source developer removing a couple of key pieces of software from the project, bringing down parts of Chef’s commercial business.

Chef intends to fulfill its contract with ICE, in spite of calls to cancel it. In a blog post published this morning, Chef CEO Barry Crist defended the decision. “I do not believe that it is appropriate, practical, or within our mission to examine specific government projects with the purpose of selecting which U.S. agencies we should or should not do business.”

He stood by the company’s decision this afternoon in an interview with TechCrunch, while acknowledging that it was a difficult and emotional decision for everyone involved. “For some portion of the community, and some portion of our company, this is a super, super-charged lightning rod, and this has been very difficult. It’s something that we spent a lot of time on, and I want to represent that there are portions of [our company] that do not agree with this, but I as a leader of the company, along with the executive team, made a decision that we would honor the contracts and those relationships that were formed and work with them over time,” he said.

He added, “I think our challenge as leadership right now is how do we collectively navigate through times like this, and through emotionally-charged issues like the ICE contract.”

The deal with ICE, a $95,000-a-year contract for software development tools, dates back to the Obama administration, when the then-DHS CIO wanted to move the department toward more modern agile/DevOps development workflows, according to Crist.

For people who might think it’s a purely economic decision, he said, the money represents a fraction of the company’s more than $50 million in annual revenue (according to Crunchbase data); rather, it’s about a long-term business arrangement with the government that transcends the policies of any individual administration. “It’s not about the $100,000, it’s about decisions we’ve made to engage the government. And I appreciate that not everyone in our world feels the same way or would make that same decision, but that’s the decision that we made as a leadership team,” Crist said.

Shortly after word of Chef’s ICE contract appeared on Twitter, according to a report in The Register, former Chef employee Seth Vargo removed a couple of key pieces of open source software from the repository, telling The Register that “software engineers have to operate by some kind of moral compass.” This move brought down part of Chef’s commercial software and it took them 24 hours to get those services fully restored, according to Chef CTO Corey Scobie.

Crist says he wants to be clear that his decision does not mean he supports current ICE policies. “I certainly don’t want to be viewed as I’m taking a strong stand in support of ICE. What we’re taking a strong stand on is our consistency with working with our customers, and again, our work with DHS  started in the previous administration on things that we feel very good about,” he said.

Vianai emerges with $50M seed and a mission to simplify machine learning tech

Posted on 20 September, 2019

You don’t see a startup get a $50 million seed round all that often, but such was the case with Vianai, an early stage startup launched by Vishal Sikka, former Infosys managing director and SAP executive. The company launched recently with a big check and a vision to transform machine learning.

Just this week, the startup had a coming out party at Oracle Open World where Sikka delivered one of the keynotes and demoed the product for attendees. Over the last couple of years, since he left Infosys, Sikka has been thinking about the impact of AI and machine learning on society and the way it is being delivered today. He didn’t much like what he saw.

It’s worth noting that Sikka got his Ph.D. from Stanford with a specialty in AI in 1996, so this isn’t new territory for him. What’s changed, as he points out, is the growing compute power and increasing amounts of data, all fueling the current AI push inside business. When he began exploring how companies implement AI and machine learning today, what he saw was a lot of tooling that, in his view, was far more complex than it needed to be.

He saw dense Jupyter notebooks filled with code. He said that if you looked at a typical machine learning model, and stripped away all of the code, what you found was a series of mathematical expressions underlying the model. He had a vision of making that model-building more about the math, while building a highly visual data science platform from the ground up.
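
As a generic illustration of that point (a textbook example, not Vianai’s software): strip the notebooks and glue code away from a simple classifier and what remains is a single expression, y = sigmoid(w·x + b).

```python
import numpy as np

# Generic illustration: beneath the tooling, a simple classifier reduces to one
# mathematical expression, y = sigmoid(w.x + b). Weights and input are made up.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([0.8, -1.5, 0.3])   # learned weights
b = 0.1                          # learned bias
x = np.array([1.0, 0.5, 2.0])    # one input example

y = sigmoid(w @ x + b)           # the whole "model" is this expression
print(round(float(y), 3))        # score for the positive class
```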

The company has been iterating on a solution over the last year with two core principles in mind: explorability and explainability, which involve interacting with the data and presenting it in a way that helps users reach their goals faster than the current crop of model-building tools allows.

“It is about making the system reactive to what the user is doing, making it completely explorable, while making it possible for the developer to experiment with what’s happening in a way that is incredibly easy. To make it explainable means being able to go back and forth with the data and the model, using the model to understand the phenomenon that you’re trying to capture in the data,” Sikka told TechCrunch.

He says the tool isn’t just aimed at data scientists; it’s about business users and data scientists sitting down and iterating together to get the answers they are seeking, whether that’s finding a way to reduce user churn or discovering fraud. These models do not live in a data science vacuum. They all have a business purpose, and he believes the only way to be successful with AI in the enterprise is to have both business users and data scientists at the same table, working with the software to solve a specific problem while taking advantage of one another’s expertise.

For Sikka, this means refining the actual problem you are trying to solve. “AI is about problem solving, but before you do the problem solving, there is also a [challenge around] finding and articulating a business problem that is relevant to businesses and that has a value to the organization,” he said.

He is very clear that he isn’t looking to replace humans; instead, he wants to use AI to augment human intelligence to solve actual human problems. He points out that this product is not automated machine learning (AutoML), which he considers a deeply flawed idea. “We are not here to automate the jobs of data science practitioners. We are here to augment them,” he said.

As for that massive seed round, Sikka knew it would take a big investment to build a vision like this, and with his reputation and connections, he felt it would be better to get one big investment up front so he could concentrate on building the product and the company. He says he was fortunate to have investors who believe in the vision, even though, as he puts it, no early business plan survives the test of reality.

For now, the company has a new product and plenty of money in the bank to get to profitability, which he states is his ultimate goal. Sikka could have taken a job running a large organization, but like many startup founders, he saw a problem, and he had an idea how to solve it. That was a challenge he couldn’t resist pursuing.

Google is investing $3.3B to build clean data centers in Europe

Posted on 20 September, 2019

Google announced today that it was investing 3 billion euro (approximately $3.3 billion USD) to expand its data center presence in Europe. What’s more, the company pledged the data centers would be environmentally friendly.

This new investment is in addition to the $7 billion the company has invested since 2007 in the EU, but today’s announcement was focused on Google’s commitment to building data centers running on clean energy, as much as the data centers themselves.

In a blog post announcing the new investment, CEO Sundar Pichai made it clear that the company is focused on running these data centers on carbon-free fuels, pointing out that he was in Finland today to discuss building sustainable economic development, alongside a carbon-free future, with Prime Minister Antti Rinne.

Of the 3 billion euros the company plans to spend, it will invest 600 million to expand its presence in Hamina, Finland, which he wrote “serves as a model of sustainability and energy efficiency for all of our data centers.” Further, the company announced 18 new renewable energy deals earlier this week, encompassing a total of 1,600 megawatts in the US, South America and Europe.

In the blog post, Pichai outlined how the new data center projects in Europe would include some of these previously announced projects:

Today I’m announcing that nearly half of the megawatts produced will be here in Europe, through the launch of 10 renewable energy projects. These agreements will spur the construction of more than 1 billion euros in new energy infrastructure in the EU, ranging from a new offshore wind project in Belgium, to five solar energy projects in Denmark, and two wind energy projects in Sweden. In Finland, we are committing to two new wind energy projects that will more than double our renewable energy capacity in the country, and ensure we continue to match almost all of the electricity consumption at our Finnish data center with local carbon-free sources, even as we grow our operations.

The company is also investing in new skills training, so that people have the tools to handle the new types of jobs these data centers and other high-tech employers will require. The company claims it has already trained 5 million people in Europe for free in crucial digital skills, and it recently opened a Google skills hub in Helsinki.

It’s obviously not a coincidence that the company is making an announcement related to clean energy on Global Climate Strike Day, when people around the world are walking out of schools and off their jobs to encourage world leaders and businesses to take action on the climate crisis. Google is attempting to answer that call with these announcements.

Quilt Data launches from stealth with free portal to access petabytes of public data

Posted on 19 September, 2019

Quilt Data‘s founders, Kevin Moore and Aneesh Karve, have been hard at work for the last four years building a platform to search for data quickly across vast repositories on AWS S3 storage. The idea is to give data scientists a way to find data in S3 buckets, and then package that data in forms that a business can use. Today, the company launched out of stealth with a free data search portal that not only proves what they can do, but also provides valuable access to 3.7 petabytes of public data across 23 S3 repositories.

The public data repository includes publicly available Amazon review data along with satellite images and other high-value public information. The product works like any search engine, where you enter a query, but instead of searching the web or an enterprise repository, it finds the results in S3 storage on AWS.
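
Quilt hasn’t detailed its indexing approach, but the raw material it searches is simply objects sitting in S3. For comparison, here is what finding data in a public bucket looks like with plain boto3; the bucket name and keyword filter are hypothetical, and this is not Quilt’s own API.

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Plain-boto3 sketch of scanning a public S3 bucket for matching objects.
# Bucket name and keyword are hypothetical; Quilt's indexing is not public.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
paginator = s3.get_paginator("list_objects_v2")

matches = []
for page in paginator.paginate(Bucket="example-public-dataset", Prefix="reviews/"):
    for obj in page.get("Contents", []):
        if "2018" in obj["Key"]:                  # naive keyword filter
            matches.append((obj["Size"], obj["Key"]))

for size, key in sorted(matches, reverse=True)[:10]:
    print(f"{size:>12}  {key}")
```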

The results include not only the data you are looking for, but also the information around that data, such as Jupyter notebooks, the standard workspace that data scientists use to build machine learning models. Data scientists can then use these as the basis for building their own machine learning models.

The public data, which includes over 10 billion objects, is a resource that data scientists should greatly appreciate, but the company is offering access to this data out of more than pure altruism. It’s doing so because it wants to show what the platform is capable of, and in the process it hopes to get companies to use the commercial version of the product.

Quilt Data search results with data about the data found. Image: Quilt Data

Customers can try Quilt Data for free or subscribe to the product in the Amazon Marketplace. The company charges a flat rate of $550 per month for each S3 bucket. It also offers an enterprise version with priority support, custom features and education and on-boarding for $999 per month for each S3 bucket.

The company was founded in 2015 and was a member of the Y Combinator Summer 2017 cohort. It has received $4.2 million in seed money so far from Y Combinator, Vertex Ventures, Fuel Capital and Streamlined Ventures, along with other unnamed investors.

New Relic launches platform for developers to build custom apps

Posted on 19 September, 2019

When Salesforce launched Force.com in 2007 as a place for developers to build applications on top of Salesforce, it was a pivotal moment for the concept of SaaS platforms. Since then, it’s been said that every enterprise SaaS company wants to be a platform play. Today, New Relic achieved that goal when it announced the New Relic One Observability Platform at the company’s FutureStack conference in New York City.

Company co-founder and CEO Lew Cirne explained that a platform, by definition, is something that other people can build software on. “What we are shipping is a set of capabilities to enable our customers and partners to build their own observability applications on the very same platform that we’ve built our product,” Cirne told TechCrunch.

He sees these third-party developers building applications to enable additional innovations on top of the New Relic platform that perhaps New Relic’s engineers couldn’t because of time and resource constraints. “There are so many use cases for this data, far more than the engineers that we have at our company could ever do, but a community of people who can do this together can totally unlock the power of this data,” Cirne said.

Like many platform companies, New Relic found that as it expanded its own offering, it needed a common set of services its developers could build against. As it built out that internal platform, it became possible to open it up so that external developers could access the same set of services as the New Relic engineering team.

“What we have is metrics, logs, events and traces coming from our customers’ digital software. So they have access to all that data in real time to build applications, measure the health of their digital business and build applications on top of that. Just as Force.com was the thing that really transformed Salesforce as a company into being a strategic vendor, we think the same thing will happen for us with what we’re offering,” he said.

As a proof point for the platform, the company is releasing a dozen open source tools built on top of the New Relic platform today in conjunction with the announcement. One example is an application to help identify where companies could be over-spending on their AWS bills. “We’re actually finding 30-40% savings opportunities for them where they’re provisioning larger servers than they need for the workload. Based on the data that we’re analyzing, we’re recommending what the right size deployment should be,” Cirne said.
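
New Relic hasn’t published the logic behind that application, but the underlying idea is simple enough to sketch: compare each host’s observed peak utilization against what is provisioned and flag hosts that are consistently oversized. The hosts and thresholds below are invented for illustration.

```python
# Illustrative right-sizing check: flag hosts provisioned well beyond peak demand.
# Host data and thresholds are invented; this is not New Relic's actual app.
hosts = [
    {"name": "web-1",   "vcpus": 16, "peak_cpu_pct": 22},
    {"name": "web-2",   "vcpus": 16, "peak_cpu_pct": 71},
    {"name": "batch-1", "vcpus": 32, "peak_cpu_pct": 18},
]

HEADROOM = 1.5       # keep 50% headroom above the observed peak
OVERSIZED = 0.6      # flag if needed capacity is under 60% of what's provisioned

for h in hosts:
    needed = h["vcpus"] * (h["peak_cpu_pct"] / 100) * HEADROOM
    if needed < h["vcpus"] * OVERSIZED:
        savings = 1 - needed / h["vcpus"]
        print(f"{h['name']}: {h['vcpus']} vCPUs provisioned, "
              f"~{needed:.0f} would cover peak ({savings:.0%} potential saving)")
```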

The New Relic One Observability Platform and the 12 free apps will be available starting today.

Tableau update uses AI to increase speed to insight

Posted on 18 September, 2019

Tableau was acquired by Salesforce earlier this year for $15.7 billion, but long before that, the company had been working on its fall update, and today it announced several new tools, including a new feature called ‘Explain Data’ that uses AI to get to insights more quickly.

“What Explain Data does is it moves users from understanding what happened to why it might have happened by automatically uncovering and explaining what’s going on in your data. So what we’ve done is we’ve embedded a sophisticated statistical engine in Tableau, that when launched automatically analyzes all the data on behalf of the user, and brings up possible explanations of the most relevant factors that are driving a particular data point,” Tableau chief product officer, Francois Ajenstat explained.

He added that what this really means is that it saves users time by doing the analysis for them automatically, and it should help them do better analysis by removing biases and diving deep into the data in an automated fashion.

Image: Explain Data surfacing an extreme value in the Superstore sample data. Source: Tableau

Ajenstat says this is a major improvement, in that previously users would have had to do all of this work manually. “So a human would have to go through every possible combination, and people would find incredible insights, but it was manually driven. Now with this engine, they are able to essentially drive automation to find those insights automatically for the users,” he said.

He says this has two major advantages. First, because it’s AI-driven, it can deliver meaningful insights much faster; it also gives a more rigorous perspective on the data.
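
Tableau hasn’t disclosed how the statistical engine works, but the general shape of this kind of analysis is easy to sketch: score each candidate factor by how unusual the flagged data point looks along that dimension, and surface the ones that stand out. The records and threshold below are invented.

```python
import statistics

# Toy sketch of "explain this data point": score each candidate factor by how
# far the flagged record sits from the rest. Records and threshold are invented;
# this is not Tableau's statistical engine.
records = [
    {"discount": 0.05, "shipping_days": 2, "profit": 120},
    {"discount": 0.10, "shipping_days": 3, "profit": 95},
    {"discount": 0.45, "shipping_days": 9, "profit": -210},   # the flagged point
    {"discount": 0.08, "shipping_days": 2, "profit": 110},
]
flagged = records[2]
others = [r for r in records if r is not flagged]

for factor in ("discount", "shipping_days"):
    vals = [r[factor] for r in others]
    mean, spread = statistics.mean(vals), statistics.pstdev(vals)
    z = (flagged[factor] - mean) / spread if spread else 0.0
    if abs(z) > 2:   # crude "relevant factor" cutoff
        print(f"{factor} is unusual for this record (z = {z:.1f})")
```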

In addition, the company announced a new Catalog feature, which provides breadcrumbs back to the source of the data, so users know where it came from and whether it’s relevant or trustworthy.

Finally, the company announced a new server management tool that helps companies with broad Tableau deployment across a large organization to manage those deployments in a more centralized way.

All of these features are available starting today for Tableau customers.

Aliro comes out of stealth with $2.7M to ‘democratize’ quantum computing with developer tools

Posted on 18 September, 2019

It’s still early days for quantum computing, but we’re nonetheless seeing an interesting group of startups emerging that are helping the world take advantage of the new technology now. Aliro Technologies, a Harvard startup that has built a platform for developers to code more easily for quantum environments — “write once, run anywhere” is one of the startup’s mottos — is today coming out of stealth and announcing its first funding of $2.7 million to get it off the ground.

The seed round is being led by Flybridge Capital Partners, with participation from Crosslink Ventures and Samsung NEXT’s Q Fund, a fund the corporate investor launched last year dedicated specifically to emerging areas like quantum computing and AI.

Aliro is wading into the market at a key moment in the development of quantum computing.

Vendors continue to build new quantum hardware to tackle the kinds of complex calculations that current binary-based machines cannot handle, for example around medicine discovery or multi-variable forecasting; just today, IBM announced plans for a 53-qubit device. Even so, it’s widely acknowledged that the computers built so far face a number of critical problems that will hamper wide adoption.

The interesting development of recent times is the emergence of startups that are tackling these specific critical problems, dovetailing that progress with that of building the hardware itself. Take the fact that quantum machines so far have been too prone to error when used for extended amounts of time: last week, I wrote about a startup called Q-CTRL that has built firmware that sits on top of the machines to identify when errors are creeping in and provide fixes to stave off crashes.

The specific area that Aliro is addressing is the fact that quantum hardware is still very fragmented: each machine has its own proprietary language and operating techniques and sometimes even purpose for which it’s been optimised. It’s a landscape that is challenging for specialists to engage in, let alone the wider world of developers.

“We’re at the early stage of the hardware, where quantum computers have no standardisation, even those based on same technology have different qubits (the basic building block of quantum activity) and connectivity. It’s like digital computing in 1940s,” said CEO and chairman Jim Ricotta. (The company is co-founded by Harvard computational materials science professor Prineha Narang along with Michael Cubeddu and Will Finegan, who are actually still undergraduate students at the university.)

“Because it’s a different style of computing, software developers are not used to quantum circuits,” and engaging with them is “not the same as using procedural languages. There is a steep on-ramp from high-performance classical computing to quantum computing.”

While Aliro is coming out of stealth, the company is not sharing specifics about how its platform actually works. The basic idea is that Aliro’s platform will essentially be an engine that lets developers work in the languages they know and define the problems they would like to solve; it will then assess the code, optimise it into quantum-ready form and suggest the best machine to process the task.
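
Purely as an illustration of that “write once, run anywhere” idea (none of this is Aliro’s actual engine, and the backends are made up), a hardware-neutral circuit can be rewritten into whichever gate set a given machine supports:

```python
# Illustrative only: a hardware-neutral circuit rewritten into a target machine's
# gate set. Backends and decomposition rules are simplified stand-ins, not
# Aliro's engine or any vendor's real instruction set.
circuit = [("H", 0), ("CNOT", 0, 1)]          # gates listed in time order

BACKENDS = {
    "machine_a": {"native": {"H", "CNOT"}, "rewrites": {}},
    "machine_b": {
        "native": {"RY", "RZ", "CZ"},
        "rewrites": {
            # H = RY(pi/2) * RZ(pi), up to a global phase
            "H": lambda q: [("RZ", q, "pi"), ("RY", q, "pi/2")],
            # CNOT = (H on target) CZ (H on target), with each H expanded as above
            "CNOT": lambda c, t: [("RZ", t, "pi"), ("RY", t, "pi/2"),
                                  ("CZ", c, t),
                                  ("RZ", t, "pi"), ("RY", t, "pi/2")],
        },
    },
}

def compile_for(circuit, backend):
    spec, out = BACKENDS[backend], []
    for gate, *qubits in circuit:
        if gate in spec["native"]:
            out.append((gate, *qubits))
        else:
            out.extend(spec["rewrites"][gate](*qubits))
    return out

print(compile_for(circuit, "machine_a"))   # passes through unchanged
print(compile_for(circuit, "machine_b"))   # expressed in RY/RZ/CZ only
```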

The development points to an interesting way that we may well see quantum computing develop, at least in its early stages. Today, we have a handful of companies building and working on quantum computers, but there is still a question mark over whether these kinds of machines will ever be widely deployed, or if — like cloud computing — they will exist among a smaller number of providers who offer access to them on demand, SaaS-style. Such a model would seem to fit with how much computing is sold today in the form of instances, and would open the door to large cloud names like Amazon, Google and Microsoft playing a big role in how this would be disseminated.

Such questions are still theoretical, of course, given some of the underlying problems that have yet to be fixed, but the march of progress seems inevitable, with forecasts predicting that quantum computing is likely to be a $2.2 billion industry by 2025, and if this is a route that is taken, the middlemen like Aliro could play an important role.

“I have been working with the Aliro team for the past year and could not be more excited about the opportunity to help them build a foundational company in Quantum Computing software,” said David Aronoff, general partner at Flybridge, in a statement. “Their innovative approach and unique combination of leading Quantum researchers and a world-class proven executive team make Aliro a formidable player in this exciting new sector.”

“At Samsung NEXT we are focused on what the world will look like in the future, helping to make that a reality,” said Ajay Singh of Samsung NEXT’s Q Fund, in a statement. “We were drawn to Prineha and her team by their impressive backgrounds and extent of research into quantum computing. We believe that Aliro’s unique software products will revolutionize the entire category, by speeding up the inflection point where quantum becomes as accessible as classical computing. This could have implications on anything from drug discovery, materials development or chemistry. Aliro’s ability to map quantum circuits to heterogeneous hardware in an efficient way will be truly transformative and we’re thrilled to be on this journey with them.”

Salesforce brings AI power to its search tool

Posted on 18 September, 2019

Enterprise search tools have always suffered from the success of Google. Users wanted to find the content they needed internally in the same way they found it on the web. Enterprise search has never been able to meet those lofty expectations, but today Salesforce announced Einstein Search, an AI-powered search tool for Salesforce users that is designed to point them to the exact information they are looking for.

Will Breetz, VP of product management at Salesforce, says that enterprise search has suffered over the years for a variety of reasons. “Enterprise search has gotten a bad rap, but deservedly so. Part of that is because in many ways it is more difficult than consumer search, and there’s a lot of headwinds,” Breetz explained.

To solve these issues, the company decided to bring the power of its Einstein artificial intelligence engine to bear on the problem. For starters, while it might not know the popularity of a given topic the way Google does, it can learn an individual’s behaviors and use that person’s profile, including geography and past activity, to deliver a more meaningful answer.

Image: Einstein Search personalization. Source: Salesforce

Next, it allows you to enter natural language search phrasing to find the exact information you need, and the search tool understands the request and delivers the results. For instance, you could enter “my open opportunities in Boston,” and using natural language understanding, the tool can translate that into the exact set of results you are looking for: your open opportunities in Boston. You could use conventional search and click a series of checkboxes to narrow the list of results to only Boston, but this is faster and more efficient.
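
Salesforce hasn’t described the parsing internals, but the general idea of translating such a phrase into structured filters can be sketched in a few lines; the field names and matching rules below are illustrative, not Salesforce’s schema.

```python
import re

# Toy sketch of turning a natural-language query into structured filters.
# Field names and matching rules are illustrative, not Salesforce's schema.
KNOWN_CITIES = {"boston", "chicago", "denver"}

def parse_query(text, current_user):
    text = text.lower()
    filters = {}
    if text.startswith("my "):
        filters["owner_id"] = current_user
    if "open" in text.split():
        filters["status"] = "Open"
    if "opportunit" in text:
        filters["object"] = "Opportunity"
    for city in KNOWN_CITIES:
        if re.search(rf"\b{re.escape(city)}\b", text):
            filters["city"] = city.title()
    return filters

print(parse_query("my open opportunities in Boston", current_user="u-123"))
# {'owner_id': 'u-123', 'status': 'Open', 'object': 'Opportunity', 'city': 'Boston'}
```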

Finally, based on what the intelligence engine knows about you, and on your search parameters, it can predict the most likely actions you want to take and provide quick action buttons in the results to help you do that, reducing the time to action. It may not seem like much, but each reduced workflow adds up throughout a day, and the idea is to anticipate your requirements and help you get your work done more quickly.

Salesforce appears to have flipped the enterprise search problem. Instead of a limited set of data being a handicap, the company is taking advantage of it, applying AI to deliver more meaningful results. For now, it works with a limited set of record types, such as accounts, contacts and opportunities, but the company plans to add more options over time.
