A CogWorld Custom Feature
February 26, 2017
By Lisa Wood
"It costs less to avoid getting into trouble than to pay for getting out of it." --Louis M. Brown, USC Law Professor (1909-1996)
The average cost of a commercial tort lawsuit against a business is about $350,000 per case, according to Intraspexion's research. Between settlements, defense attorneys' fees, verdicts and administrative costs, even when you win a case, you lose. So if a company can prevent just three lawsuits annually, its bottom-line profit grows by over $1 million per year.
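The arithmetic behind that claim can be checked in a few lines. The $350,000 average and the three-lawsuits figure come from the article; the script itself is only an illustrative sketch:

```python
# Illustrative check of the article's arithmetic: Intraspexion's estimated
# average cost per commercial tort lawsuit, times the number of lawsuits
# a company manages to prevent each year.
AVG_COST_PER_LAWSUIT = 350_000   # dollars, per Intraspexion's research
lawsuits_prevented_per_year = 3

annual_savings = AVG_COST_PER_LAWSUIT * lawsuits_prevented_per_year
print(f"Annual savings: ${annual_savings:,}")  # prints: Annual savings: $1,050,000
```

At three avoided suits the savings already clear the $1 million mark the article cites.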
That fits a key goal established by the Association of Corporate Counsel (ACC), which wants lawyers to target ways to drive down their companies’ annual legal spend by 25 percent. In response, many corporate legal departments have launched cost-cutting measures such as LEAN/Six Sigma training, standardized RFPs, flat-fee arrangements and moving contract work offshore.
So with business processes for legal services already being significantly streamlined, what’s left to cut?
The remaining way to cut legal costs meaningfully is to attack the biggest expense: defending against lawsuits. In federal court alone, the average total expense for commercial tort litigation over the past decade was $160 billion per year, according to Intraspexion. Clearly, the best way to reduce corporate legal costs is to avoid lawsuits in the first place.
Enter Nick Brestoff. Preventing litigation is what drove Brestoff to retire after 38 years of practicing law as a litigation specialist in California and to form his software company, Intraspexion. With engineering degrees from UCLA and Caltech, Brestoff earned his law degree at the University of Southern California, where he studied under Professor Louis Brown, known as the father of “preventive law.”
~ ~ ~
A CogWorld Custom Feature
AGI … Still 50 Years Out? Let’s See.
Of course, a lot depends on how one defines it.
November 21, 2016
By Peter Fingar
Artificial General Intelligence (AGI) is often shrouded in confusion … what exactly is it? Consider “intelligence” in general, meaning the ability to reason, use strategy, solve puzzles, and make judgments under uncertainty; represent knowledge, including commonsense knowledge; plan; learn; communicate in natural language; and integrate all these skills toward common goals. Estimates of the time needed before a truly flexible AGI is built vary from 10 years to over a century, but the consensus in the AGI research community seems to be that the timeline discussed by Ray Kurzweil in The Singularity Is Near (i.e., between now and 2045) is plausible. Many of the long-range estimates are based on the top-down strategy of simulating the human brain via neuroscience and human cognition. We reached out to a man who is in the business of early forms of AGI in the here and now, based on a more bottom-up approach. Here’s our report after discussing this with him.
Mounir Shita is the founder and CEO of Kimera Systems, one of a number of hot new artificial intelligence startups. What makes Kimera Systems unique is its claim to have developed something the vast majority of AI experts say will remain impossible for years, possibly decades to come: artificial general intelligence, or AGI.
AGI is often defined as machine intelligence that can successfully perform any intellectual task that a human being can. To reach this goal, venture capitalists and technology giants like Google and Intel have poured billions of dollars into research and development to digitally recreate the human brain. Shita believes this approach is fundamentally flawed, which is why six decades of R&D into this effort – beginning with Alan Turing – has not made any significant progress, and why the experts believe it could be as long as 100 years before it becomes possible to build the very technology Kimera claims to have launched onto Android smartphones in a closed beta in August of this year.
“I have some problems with how the AI industry defines ‘intelligence,’” says Shita. “First, there really isn’t an accepted scientific definition. I ask AI scientists all the time how they define this term, and all too often they cannot really express it. But when they do, more often than not they come back to the brain, biology and neuroscience. The industry is stuck trying to digitize the human brain.”
Shita began his AI research in 2005, purposefully taking a different approach: quantum mechanics. Shita argues that intelligence is a force exerted by all things over directional time, one that affects all events and outcomes. As Shita defines it, intelligence (as it pertains to AGI) increases the likelihood of reaching a particular outcome by manipulating a chain of events to make that outcome possible. Using quantum mechanics instead of neuroscience as his divining rod, Shita successfully tested AGI use cases even before Kimera was formally founded in 2012.
~ ~ ~
A CogWorld Custom Feature
Big Data: Needles and Haystacks
October 24, 2016
By Peter Fingar
We’ve been loaded (and overloaded) with information about Big Data and, more recently, Cognitive Computing. I’ve researched and written a lot about these subjects. But recently, I reached out to a CEO who is “in the business of Big Data” to update my thinking. I talked at length with Boyd Davis, CEO at Kogentix Inc., which uses cognitive computing and machine learning to build systems that draw on Big Data to learn about your business, make timely recommendations, and become more accurate with every iteration.
Here’s my recap of that enlightening conversation.
Boyd, what’s really new about A.I. (which has been around for 50+ years), and what’s behind this new buzz about “Big Data”?
Some form of AI has been around for decades, and much of the vision for AI is still years away. However, the emergence of commodity hardware, open source software, and cloud computing has created a unique opportunity for organizations to leverage AI for business value. We are at an inflection point, and now is the time to act.
The goal is to integrate cognitive computing and machine learning to build systems that learn about your business, make timely recommendations to improve KPIs, and become increasingly accurate with every iteration. The key is to integrate Artificial Intelligence with Big Data. Intelligence, artificial or otherwise, starts with data. The foundation of the needed solutions is an integrated data hub, bringing together data of disparate types – structured and unstructured, current and retrospective, real-time and batch. To meet enterprise requirements, the data must be secure, persistent, and well governed.
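The “integrated data hub” idea – one landing place for structured and unstructured records alike – can be sketched in miniature. Everything below (the record envelope, the field names, the sample sources) is hypothetical and purely illustrative, not Kogentix’s actual design:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any

@dataclass
class HubRecord:
    """Common envelope for disparate data landing in the hub."""
    source: str       # originating system, e.g. a CRM or ticketing tool
    kind: str         # "structured" or "unstructured"
    payload: Any      # a typed row, or raw text held for later NLP
    ingested_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

def ingest_structured(source: str, row: dict) -> HubRecord:
    # Structured data arrives as typed rows (warehouse tables, feeds).
    return HubRecord(source=source, kind="structured", payload=row)

def ingest_unstructured(source: str, text: str) -> HubRecord:
    # Unstructured data (emails, tickets, logs) is stored raw so that
    # downstream machine-learning steps can mine it later.
    return HubRecord(source=source, kind="unstructured", payload=text)

hub: list[HubRecord] = [
    ingest_structured("crm", {"customer_id": 42, "churn_risk": 0.18}),
    ingest_unstructured("support", "Ticket: billing page times out."),
]
print([(r.source, r.kind) for r in hub])
# [('crm', 'structured'), ('support', 'unstructured')]
```

The point of the common envelope is governance: every record, whatever its shape, carries its source and ingestion time, which is what makes the hub auditable and “well governed” in the sense Davis describes.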
So, is this like IBM’s Watson that beat two reigning champions at Jeopardy, going through massive amounts of data to produce the answer (the needle in the haystack) in 3 seconds?
Imagine hundreds of thousands of haystacks being thrown at you, some at rapidly increasing speed, others lobbed at a more casual pace. We might think we need to search for the needles, but what if there is more to mine from those haystacks? What if we could understand the value of the haystacks themselves? And most importantly, what if the needles themselves aren’t pertinent to us? Your organization needs not only to mine big data, but also to repurpose it and find new ways of using it to boost your enterprise. The goal is to build systems that work with your existing technology to unlock potential goldmines and reveal actionable insights.
That sounds very interesting, but very futuristic. When do we need to act?
AI is real today – the intersection of cloud computing, big data, mature algorithms and the broader convergence of the physical and digital worlds makes now the right time to act. This convergence is leading to an unprecedented change in the competitive environment. For example, in connected vehicles, we see vehicle manufacturers competing with insurers competing with telecommunications service providers, all competing with a new breed of Silicon Valley startups. Success will depend on a combination of innovation, speed, and scale. Larger, more established organizations can leverage their scale, brand, and route-to-market advantage to win, but only if they take risks, fail fast, and innovate at the pace the market demands. Organizations are already using AI to build new data-driven services, both for internal and external consumption. The game isn’t about the most sophisticated or exotic application of deep learning or neural nets – it’s the practical delivery of real solutions that derive insights and value from physical and digital data.
~ ~ ~