Latest News

Wednesday, October 31, 2018

Could Blockchain Innovations Improve Latency And Inefficiency In Banking?


Ah, latency. That seven-letter word takes nearly seven seconds to say and what feels like forever to settle. In simple terms, latency is the delay before data is moved. In the banking world, latency is the time it takes for money to move across a ledger, which is a book of accounts where transactions are balanced.
According to Merriam-Webster, the word "ledger" comes from the English dialect form leggen, meaning "to lay." I don't believe there's any denying that money sometimes sits, lies or ostensibly takes a nap when we need it most. Think of all those times you've deposited a check into your bank account and had to wait days to access your money. Issues like these are painful. In fact, as MarketWatch reported, it's likely you'll only be able to access a fraction of the checks you deposit until the respective banks release their holds.
Another area where I believe the banking industry is lagging is money management services. It's still common for brick-and-mortar banks to have someone in a suit selling financial services and advice. I feel people should just as easily be able to log in to their mobile device and download a financial app, eliminating the need for that type of in-person service.

Similarly, some insurance companies are still using expensive, commission-driven salespeople, while artificial intelligence (AI) and smart contracts could potentially determine the needs of a consumer and spin up a contract, payments and the delivery of benefits faster, cheaper and better.
With that, you as an innovator can ask yourself: How can I work to speed up and improve the movement of money, information, and services to eliminate the wait? Let’s take a look.

Digital And Mobile Banking Revolution

Around the Great Recession of 2008, many millennials were coming of age and expressing their dissatisfaction with the status quo through movements like Occupy Wall Street. At that time, one could also observe a rise of distrust in banks. Why? In my experience, it's partly because there's a perception that banks profit from the status quo.
The opportunity for businesses lies in innovations that eliminate latency, inefficiencies, costs, unnecessary processes and procedures, error resolution, data corruption, and security hacks. The results of these innovations could include restored trust, which would be a win-win for both financial institutions and their customers.
Over the past decade, the financial industry has made a gargantuan pivot from traditional banking methodologies to financial-technology-driven applications. Millennials, with Gen Z on their heels, may become the reigning economic powerhouse over the medium term; Pew reported that their adult numbers will likely outnumber baby boomers' in 2019. They're also thinking differently about how they bank. In a world where financial institutions are competing with innovations that let you order a ride or takeout at the push of a button, they need to be thinking about how to make financial solutions available just as quickly.
While the $200-billion-plus cryptocurrency market catapults us light-years into the future technologically, there's still work to be done. For example, large banks, brokers, securities firms, clearing companies, money transmitters and so on, seem to be at the early stages of tapping the power of blockchain technology, distributed ledger technology (DLT), smart contracts, artificial intelligence, machine learning and other technology to change the future and security of money.
Blockchain In Insurance
One of the areas where innovators can begin thinking about using technologies like blockchain is the insurance industry, as noted above: to create reliable registries, expand the payment infrastructure, offer microinsurance, automate contracts and payments, and evaluate claims. The benefits would be a boost in customer trust and speed, and the ability to drive down and compress costs at an operational level. At the end of the day, it could also reduce fraud. (See below.)
I believe blockchain in insurance is a lot further along in ideation than reality. That’s where innovators come in: to harness the power of this technology and close the gap between ideas and implementation.
Blockchain In Finance
Imagine blockchain implementation in a system like the Automated Clearing House (ACH) as a shared distributed ledger technology that all banks could access. If that happened, it would basically become an interledger — a protocol for connecting blockchains and ledgers — that could introduce immutable data and eliminate all kinds of latency.
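To make this concrete, here is a minimal, purely illustrative sketch in Python (not ACH's or any bank's actual design; the participants and amounts are invented) of why a shared, append-only ledger yields immutable data: every entry carries the hash of the previous entry, so a retroactive change by any participant is immediately detectable by all the others.

```python
import hashlib
import json
import time

def entry_hash(entry: dict) -> str:
    """Deterministic fingerprint of a ledger entry."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

class SharedLedger:
    """Toy append-only ledger: each entry links to the hash of the previous one."""

    def __init__(self):
        self.entries = []

    def append(self, payer: str, payee: str, amount_cents: int) -> dict:
        prev = entry_hash(self.entries[-1]) if self.entries else "0" * 64
        entry = {
            "payer": payer,
            "payee": payee,
            "amount_cents": amount_cents,
            "timestamp": time.time(),
            "prev_hash": prev,
        }
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Any participant can recompute the chain and detect tampering."""
        return all(
            self.entries[i]["prev_hash"] == entry_hash(self.entries[i - 1])
            for i in range(1, len(self.entries))
        )

ledger = SharedLedger()
ledger.append("Bank A", "Bank B", 125_00)
ledger.append("Bank B", "Bank C", 90_00)
print(ledger.verify())                      # True
ledger.entries[0]["amount_cents"] = 999_00  # a retroactive edit...
print(ledger.verify())                      # False: the change is detectable
```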
Blockchain In Identity Protection
We’ve all heard about the Equifax hack, and we know identity verification is one of the biggest challenges we face globally. Now, imagine a world where blockchain stored all your information. I believe blockchain could not only store a person’s ID in the future but that it could also be an immutable solution that integrates a lot of information (like biometrics and addresses) into one system, which could make it harder for a person’s identity to be stolen. This is one of the most powerful blockchain applications I see that innovators should be thinking about.
The dark side of this, however, is that if someone did figure out how to break in, it could be catastrophic. That is our great unknown.
Next Steps
The most interesting aspect of blockchain to me is its structure, which is different from what we have experienced over the past 20 years with the explosion of the internet. A single, stand-alone protocol, HTML, was used to create the container that held all the websites on the internet; the web was protocol-thin and application-heavy. Blockchain is different to me: it is protocol-thick and thin at the application level, so it may require a re-engineering of the mindset to implement. I believe we must think of ways to innovate using blockchain because we're trained to build at the consumer (application) layer, while blockchain is all about establishing protocols.
To adopt a blockchain mindset, imagine this: You step away from your LaFerrari and start operating a tunnel-boring machine. The blockchain is not about innovating beauty; it’s about building a balanced foundation that can change the world.

Source: https://www.forbes.com/sites/forbesfinancecouncil/2018/10/30/could-blockchain-innovations-improve-latency-and-inefficiency-in-banking/#39e98fbc480a

Thursday, October 11, 2018

Blockbid partners with Whale Tech and TMA Solutions to create Blockchain Development Centre



Cryptocurrency exchange Blockbid has teamed up with Whale Tech and TMA Solutions to offer end-to-end blockchain development solutions to external companies.
The partnership will cover key blockchain services, such as smart contract creation, token launches and listings on the Blockbid exchange, smart contract audits, wallet auditing and custodial services, as well as custom blockchain product development.
Blockbid, Whale Tech and TMA Solutions have recognised there is a global shortage of developers who understand blockchain and can build for it. This partnership is therefore aimed at helping foster the adoption of blockchain technology and making qualified developers readily available for external projects.
Whale Tech, founded by experienced blockchain developer Bernard Peh, who has 20 years of software development experience on government and large commercial projects, helps governments, companies and individuals adapt to the blockchain revolution by providing quality blockchain development services and education.
TMA Solutions, established in 1997, provides software outsourcing services to leading companies worldwide. It is one of the largest software outsourcing companies in Vietnam with 2,400 engineers and has clients from 27 different countries.
The combination of the two, plus the expertise in crypto and trading that Blockbid provides, will make for an invaluable offering for the many companies looking to benefit from the advantages of blockchain technology in the coming years.
David Sapper, COO at Blockbid says: “The partnership with TMA Solutions allows us to scale-up our team almost instantly with a pool of readily available Blockchain developers of the highest possible standard, as educated by Bernard Peh who will be facilitating the upskilling and coursework”.
Bernard Peh, Founder of Whale Tech, says: “I’m very excited about the Blockchain Development Center because it is something that the global market desperately need. The BDC will be an all-in-one go to shop for anyone who needs help to turn their Blockchain ideas into reality. Most importantly, the center is backed by real industry experts.”
Dr. Nguyen Le, Chairman of TMA Solutions, says: “The Blockchain Development Centre will leverage TMA’s 21 years of experience in building enterprise software solutions and engineering talents to support companies to develop end-to-end Blockchain solutions more quickly”.

Source: https://coinrivet.com/

Friday, October 5, 2018

The Blockchain-Enabled Intelligent IoT Economy

I. Setting the stage
The IoT and consumer hardware industries have seen multiple failures and a few exits over the last 12–18 months (while the B2B side has been doing a bit better overall), and the industry has recently drawn criticism and calls to slow down.
In spite of the current pushback, though, the sector is still growing and attracting capital and talent. Clearly, there are multiple reasons why this is the case, but I firmly believe that one of them is the convergence of IoT and artificial intelligence with blockchain as the infrastructural backbone, which is unlocking the next step not only on the tech side but also on the business side.
The industry has indeed evolved from merely creating products, to creating networks of products (namely, the Internet of Things), to eventually creating intelligent networks of products (I-IoT). The transition between the first and the second class was straightforward: it was enough to create more and different products and link them together. This generated many new possibilities, but it was clear from day one that it came with a series of hard-to-tackle issues, such as security/privacy, validation/authentication, and connectivity bottlenecks.
This is where AI and blockchain come in. The second transition is made possible through a combination of improvements in computing power, device miniaturization, ubiquitous wireless connectivity and efficient algorithms (Porter and Heppelmann, 2014). The new class of smart products will be (and, to some extent, already is) able to monitor, control, optimize, and automate processes and products with an accuracy previously not imaginable.
Of course, as often happens, the bonus of integrating those fundamental technologies is that they end up modifying IoT as much as IoT impacts them in turn.
This convergence is, however, not accidental, but rather an inevitable necessity almost designed by default: AI needs data, IoT needs intelligence and insights, and both need security and transparent marketplaces.
The magnitude of this convergence is so great that it will affect several sectors, ranging from energy and manufacturing to the home environment, robotics and drones, supply chain, logistics, and healthcare. Every field that is historically data-rich but information-poor will be touched (or should I say brutally hit?) by these technologies.
I will explore how in the next few sections.

II. How Blockchain is changing IoT
Blockchain as a technology is basically providing the IoT stack with a secure data infrastructure to capture and validate data. As simple as that. At least it is a simple statement that contains three different nuances:
  • Securing data better: The first nuance is the concept of storing data securely. We know that blockchain protocols are not designed for heavy data storage (they are ledgers, not databases), but they can provide “control points” to monitor data access (Outlier Ventures, 2018); a minimal sketch of this idea follows after this list.
  • Creating the right incentive structure: A blockchain can create the right incentive structure to share IoT data, which is something we are currently missing. Cross-sectional data have been proven to have the most disruptive impact when applied across different industries, but the problem of how and why to share data in the first place remains. Blockchain (and tokenization) can be used to solve this economic dilemma, and once data are shared, they can be more easily validated, authenticated and secured.
  • Creating a network of computers: Distributing the workload and implementing parallel computing tasks is something usually attributed to new AI or high-performance computing (HPC) applications, but a blockchain would be essential in this development for authenticating and validating the single nodes of those networks. Some companies working on this problem are Golem, iExec, Onai, Hadron, Hypernet, DeepBrain Chain, etc.
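As a purely illustrative sketch of the first nuance (no specific protocol is implied, and the device and field names are hypothetical): the raw IoT readings stay off-chain with the device or gateway, and only a hash of each batch is anchored on the ledger as a "control point," so anyone later granted access to the data can check that it has not been altered.

```python
import hashlib
import json

# Off-chain storage: the raw readings stay with the device or gateway.
readings = [{"sensor": "temp-01", "ts": 1540000000 + i, "celsius": 21.5 + i * 0.1}
            for i in range(10)]

def batch_fingerprint(batch: list) -> str:
    """Hash a batch of readings; only this fingerprint goes on-chain."""
    return hashlib.sha256(json.dumps(batch, sort_keys=True).encode()).hexdigest()

# On-chain "control point": a tiny record that anchors the off-chain batch.
control_point = {
    "device_id": "temp-01",
    "batch_id": 42,
    "data_hash": batch_fingerprint(readings),
}

def verify_batch(batch: list, anchor: dict) -> bool:
    """A data consumer checks the raw batch against the on-chain anchor."""
    return batch_fingerprint(batch) == anchor["data_hash"]

print(verify_batch(readings, control_point))   # True
readings[0]["celsius"] = 99.9                  # a tampered reading...
print(verify_batch(readings, control_point))   # False
```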
III. How Blockchain can change AI
As I have already previously mentioned, blockchain can affect AI in multiple ways:
  • Help AI explain itself (and make us believe it): The AI black box suffers from an explainability problem. Having a clear audit trail can improve the trustworthiness of the data as well as of the models, and it also provides a clear route to trace back the machine decision process: where the data came from, who wrote the original algorithm, what data were used for training, etc. It can establish the foundations for “algorithm standards,” for example recording which main algorithms, packages, and frameworks have been developed using a specific training set. This is also essential in machine-to-machine interactions and transactions (Outlier Ventures, 2017), and it provides a secure way to share data and coordinate decisions, as well as a robust mechanism to reach a quorum. This is extremely relevant for swarm robotics and multi-agent scenarios, as noted by Rob May, a tech investor and Talla's CEO. A minimal sketch of such an audit trail follows after this list.
  • Increase AI effectiveness: Secure data sharing means more data (and more training data), and therefore better models, better actions, better results… and better new data. A network effect is all that matters at the end of the day. An example of a multi-application intelligence that uses different sets of data is provided by AIBlockchain.
  • Lower the market barriers to entry: Let’s go step by step. Blockchain technologies can secure your data. So why shouldn’t you store all your data privately and maybe sell it? Well, you probably will. First of all, then, blockchain will foster the creation of cleaner and more organized personal data. Second, it will allow the emergence of new marketplaces: a data marketplace, which is the low-hanging fruit currently being pursued by companies such as Ocean Protocol, OpenMined, Neuromation, BurstIQ, AtMatrix, Effect.ai, Datum, Streamr, Deuro, Datawallet, etc.; a models marketplace (e.g., Dbrain); and finally even an AI marketplace, which companies like SingularityNET, Fetch.ai, doc.ai, Computable Labs, Agorai and similar players are trying to build. Hence, easy data sharing and new marketplaces, jointly with blockchain data verification, will provide a more fluid integration that lowers the barriers to entry for smaller players and shrinks the competitive advantage of the tech giants. In the effort of lowering the barriers to entry, we are actually solving two problems: providing wider data access and a more efficient data-monetization mechanism. It is also possible that a blockchain-enabled AI will eventually create new organizational structures for intelligent agents to cooperate or compete.
  • Reduce catastrophic risk scenarios: An AI coded into a DAO with specific smart contracts will be able to perform only those actions, and nothing more, because it will have a limited action space.
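To illustrate the audit-trail idea from the first bullet (a toy sketch, not any production system; every field name here is hypothetical), each step of a model's lifecycle can be recorded as a hash-linked event, so the decision process (where the data came from, who wrote the code, what the model was trained on) can be traced back and checked for tampering later.

```python
import hashlib
import json

def digest(obj) -> str:
    """Stable fingerprint for datasets, code versions and model parameters."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()

audit_trail = []  # in practice these events would be written to a shared ledger

def record(event_type: str, **details) -> None:
    prev = audit_trail[-1]["event_hash"] if audit_trail else "0" * 64
    event = {"type": event_type, "details": details, "prev_hash": prev}
    event["event_hash"] = digest(event)  # hash computed before the field is added
    audit_trail.append(event)

# Trace the lifecycle: data provenance, the training run, and a prediction.
record("data_ingested", source="sensor-fleet-7", data_hash=digest([1, 2, 3]))
record("training_run", author="team-ml", code_version="a1b2c3",
       training_data_hash=audit_trail[0]["details"]["data_hash"])
record("prediction", model_hash=digest({"weights": [0.1, 0.9]}), output="anomaly")

def verify(trail) -> bool:
    """Replay the chain and confirm no event has been rewritten."""
    prev = "0" * 64
    for event in trail:
        body = {k: v for k, v in event.items() if k != "event_hash"}
        if event["prev_hash"] != prev or digest(body) != event["event_hash"]:
            return False
        prev = event["event_hash"]
    return True

print(verify(audit_trail))  # True
```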

IV. How AI can change IoT
AI is feeding itself with the new stream of data coming from the physical world and the billions (if not trillions) of sensors and “things” that are capturing and monitoring everything we do.
At the same time though, as soon as an AI starts making sense of IoT data flows, it will:
  • Increase data efficiency: An AI will inform those sensors on what data should be captured and stored, and above all where those sensors should be placed to be both more efficient and more effective.
  • Save costs: It is fair to think that an algorithm's performance should be tested continuously, and once the optimal level is reached, with the marginal return on data approaching zero - in other words, a point at which adding more data does not improve the prediction outcome - an AI will stop capturing and storing more data, resulting in energy, server, computation, cloud, and infrastructure savings. In addition to that, unplanned-downtime prediction is a second cost-saving possibility an AI opens up for an IoT ecosystem.
  • Increase security: An AI could not only fight potential external threats to an IoT network but even predict them. AnChain is doing some interesting work in this field.
  • Compute on-the-fly: Edge/fog computing is quickly becoming a hot topic since it allows on-device computation, which in turn reduces the response time for an action, limits exposure to privacy and compliance issues and solves the huge connectivity bottleneck problem. A few startups are already working in this direction, for example Foghorn, Mythic, Neureal, SONM and Nebula AI, as well as big incumbents such as Google, which recently released, in addition to federated learning, an entire stack made of an Edge TPU and a Cloud IoT Edge platform. However, things will likely change here due to the rapid development of specialized training and inference chips and the forthcoming introduction of 5G. The cloud is still necessary for computationally intensive operations and to store data centrally to guarantee an extra layer of security (especially in case of "network disasters"), but custom chips and edge computing algorithms can do most of the operations the final customer needs directly on the device.
V. How AI can change Blockchain
Although extremely powerful, a blockchain has its own limitations as well. Some of these are technology-related, while others come from the old-minded culture inherited from the financial services sector, but all of them can be affected by AI in one way or another:
  • Consensus mechanisms: Proof of work and proof of stake were the first consensus mechanisms created, but they are definitely neither the only ones nor the most efficient. AION has recently created a new consensus mechanism called “Proof of Intelligence,” in which validators are asked to train a neural network and use the parameters of that network as proof of computation.
  • Energy consumption: Mining is an incredibly hard task that requires a ton of energy and money to be completed (O’Dwyer and David Malone, 2014). AI has already proven to be very efficient in optimizing energy consumption, so I believe similar results can be achieved for the blockchain as well. This would probably also result in lower investments in mining hardware.
  • Scalability: The blockchain is growing at a steady pace of 1MB every 10 minutes and it already adds up to 85GB. Nakamoto (2008) first mentioned “blockchain pruning” (i.e., deleting unnecessary data about fully spent transactions in order to not hold the entire blockchain on a single laptop) as a possible solution, but AI can introduce new decentralized learning systems such as federated learning, for example, or new data sharding techniques to make the system more efficient. Matrix AI is a company that is leveraging AI to fix some of the intrinsic limits of the blockchain.
  • Security: Even if the blockchain itself is almost impossible to hack, its upper layers and applications are not so secure - see what happened with the DAO, Mt Gox, Bitfinex, etc. The incredible progress made by machine learning in the last two years makes AI a fantastic ally for the blockchain in guaranteeing secure application deployment, especially given the fixed structure of the system. Have a look at what NuCypher, for example, is doing in this space.
  • Privacy: The privacy issue of owning personal data raises regulatory and strategic concerns for competitive advantages (Unicredit, 2016). Homomorphic encryption (performing operations directly on encrypted data), the Enigma project (Zyskind et al., 2015) and the Zerocash project (Sasson et al., 2014) are definitely potential solutions, but I see this problem as closely connected to the previous two, i.e., scalability and security, and I think they will go side by side.
  • Efficiency: Deloitte (2016) estimated the total running costs associated with validating and sharing transactions on the blockchain to be as much as $600 million a year. An intelligent system might eventually be able to compute on the fly the likelihood that specific nodes will be the first to perform a certain task, giving other miners the possibility to shut down their efforts for that specific transaction and cutting down the total costs. Furthermore, even if some structural constraints are present, better efficiency and lower energy consumption may reduce the network latency, allowing faster transactions.
  • Hardware: Miners - not necessarily companies but also individuals - have poured an incredible amount of money into specialized hardware components. Since energy consumption has always been a key issue, many solutions have been proposed and many more will be introduced in the future. As the system becomes more efficient, some pieces of hardware might be converted for neural-network use. The mining colossus Bitmain is already doing exactly this.
  • Lack of talent: This is a leap of faith, but in the same way we are trying to automate data science itself (unsuccessfully, to my current knowledge), I don't see why we couldn't create virtual agents that can create new ledgers themselves, and even interact with and maintain them.
  • Data gates: In a future where all our data will be available on a blockchain and companies will be able to directly buy them from us, we will need help to grant access, track data usage, and generally make sense of what happens to our personal information at a computer speed. This is a job for (intelligent) machines.
VI. How IoT is affecting AI
The generation and analysis of data that were not available earlier open a new spectrum of possibilities for an AI to:
  • Become more efficient: This is pretty straightforward, but new data, both structured and unstructured, can feed an AI and be used for new use cases or to achieve better performance on existing ones.
  • Improve existing design: Products and services are going to be designed differently from how we know them given the new data an algorithm can digest and analyze.
  • Change the buyer-seller dynamic: The internet of things shifts the attention completely, and perhaps counterintuitively, from the hardware to the software. The sensors (and their costs) are becoming irrelevant, and the post-sales improvements a manufacturer can make without changing the hardware are the real secret sauce for making an AI more efficient.
VII. How IoT could change blockchain
If there is a clear trend emerging, it is that decentralized systems are hard to work with and expensive to maintain. Although the relationship is less intuitive than other more direct links, IoT can help blockchain in:
  • The nodes structure: IoT devices often act as lightweight nodes of the chain, which are those nodes that simply pass data to the full nodes that instead store the data, create new blocks, and ensure validity. Better and more powerful devices, possibly powered by AI, can turn every lightweight node into a full one.
  • Reducing energy consumption: A network of more efficient hardware devices could indeed help to reduce the current energy consumption of blockchain stacks.
  • Reduce bandwidth and data burden: There are multiple ways to design an IoT-blockchain architecture (Reyna et al., 2018), and at least some of them result in a system where IoT devices communicate and share information among themselves and load only the relevant data onto the blockchain, therefore reducing both bandwidth and data burden; see the sketch below.
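A toy sketch of that last point (the threshold, the field names and the list standing in for the ledger are all hypothetical): the device summarizes its readings locally and escalates only the anomalous ones, instead of streaming every raw value onto the chain.

```python
from statistics import mean

raw_readings = [20.9, 21.0, 21.1, 20.8, 35.6, 21.0, 21.2, 20.9]  # one device, one hour

ANOMALY_THRESHOLD = 5.0  # degrees away from the batch average (made-up value)

def summarize_and_filter(readings):
    """Keep a compact hourly summary plus only the readings worth escalating."""
    avg = mean(readings)
    anomalies = [r for r in readings if abs(r - avg) > ANOMALY_THRESHOLD]
    return {"count": len(readings), "mean": round(avg, 2), "anomalies": anomalies}

chain = []  # stand-in for the shared ledger

# Instead of eight raw readings, only one compact record goes upstream.
chain.append(summarize_and_filter(raw_readings))
print(chain)  # [{'count': 8, 'mean': 22.81, 'anomalies': [35.6]}]
```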
VIII. Conclusions
As you might have noticed, the edges of the impact of one technology on the others, and vice versa, often blur; this is not by chance but an inevitable consequence of technologies that are born and developed to create an “intelligence flywheel.”
In addition to unlocking a set of new technological scenarios, the integration of blockchain, IoT and AI has generated powerful new business models. The shift from product to service and from ownership to access is the key to understanding the magnitude of the changes in the tech ecosystem. Even more radically, product-as-a-service and product-sharing business models are emerging and winning in almost every market, leaving the manufacturer in charge of ownership as well as with full responsibility for the operation of the product and service.
It is counterintuitive and even a bit absurd, if you think about it, that the surge in the hardware industry is in fact shifting the attention toward a “servitization” model (Porter and Heppelmann, 2014), which clearly makes more sense where the cost of service is a significant part of the greater cost of ownership (that is the case in the current technology landscape).
This integration does not come without issues, as we have seen, both technical and commercial, as much as of design. Data democratization may also soon erode the data moat barrier AI companies are nowadays building their empires on. Software and algorithms are no longer private but rather open-source. Computational power is now affordable and will be processed directly on-device. What does it all mean for the evolution of the industry? Who knows. I have no idea of how these phenomena will shape our businesses and lives, but I am sure that the changes will happen at an exponential rate.

Source: https://www.forbes.com/sites/cognitiveworld/2018/10/04/the-blockchain-enabled-intelligent-iot-economy/#73b94a652a59

IoT poses special cyber risks



 Internet-connected devices pose special risks for federal agencies, and the National Institute of Standards and Technology is developing guidance to meet the need.
Connected sensors, smart-building technology, drones and autonomous vehicles can't be managed in the same way as traditional IT, according to a NIST draft publication, Considerations for Managing Internet of Things (IoT) Cybersecurity and Privacy Risks. The document points out that basic cybersecurity capabilities often aren't available in IoT devices.
Federal agencies must “consider that IoT presents challenges in achieving those [cybersecurity] outcomes or there are challenges that IoT may present in achieving security controls -- and we wanted to highlight those,” Katerina Megas, program manager for NIST's Cybersecurity for Internet of Things program, told FCW at the Internet of Things Global Summit on Oct. 4.

"We felt putting out something initial on IoT was the most important -- to get something out as quickly as possible," she said. "There will be plans in the future to get more focused, more specialized."
One of NIST's next steps is to develop a potential baseline of cybersecurity standards for IoT devices, she said.
NIST is accepting comments on the draft through Oct. 24. Before a final version is published, Megas said, "we plan on starting to release iterative discussion documents to talk about if there were a baseline for IoT devices."
Robert S. Metzger, a government contracting attorney at Rogers Joseph O'Donnell, said that the federal government is exposed to the security and privacy risks of the IoT ecosystem through relationships with vendors.
"The IoT is all over us whether we know it or not,"  Metzger said. "Even if government is not buying it, so many surfaces upon which government depends are using it. Vendors are using it, and so the government becomes, if you will, not so much a hostage but among those exposed to the IoT deployment by commercial enterprises."
Although the IoT creates new and more attack surfaces for potential bad actors, and it opens up both networks and hardware to potential threats, that doesn’t mean it should be shunned, Metzger said at the conference.
One place the government can begin to ask for better security is in the procurement process for these technologies, according to Tom McDermott, the deputy assistant secretary of cyber policy at the Department of Homeland Security.
"We are always looking to think about how we can use federal procurement authority and federal procurement power to drive better cybersecurity outcomes," McDermott said.
A bill proposed by Sens. Mark Warner (D-Va.) and Cory Gardner (R-Colo.) last year would impose basic cybersecurity standards on IoT devices procured by the federal government, including changeable passwords and a requirement that software and firmware be patchable. So far, the bill hasn't advanced, although a companion measure was introduced in the House of Representatives.
Separately, NIST put out a call in April for ideas on lightweight encryption, with an eye to developing security measures that could be deployed on resource-constrained IoT devices.

Source: https://fcw.com/articles/2018/10/04/iot-nist-cyber-leonard.aspx

Thursday, October 4, 2018

A Two-Minute Guide To Artificial Intelligence



If you keep hearing about artificial intelligence but aren’t quite sure what it means or how it works, you’re not alone. 
There’s been much confusion among the general public about the term, not helped by dramatic news stories about how “AI” will destroy jobs, or companies that overstate their abilities to “use AI.”
A lot of that confusion comes from the misuse of terms like AI and machine learning. So here’s a short text-and-video guide to explain them:


What’s the difference between AI and machine learning?
Think of it like the difference between economics and accounting.
Economics is a field of study, but you wouldn’t hire a Nobel Prize-winning economist to do your taxes. Likewise, artificial intelligence is the field of science covering how computers can make decisions as well as humans. But machine-learning refers to the popular, modern-day technique for creating software that learns from data.
The difference becomes important when money is at stake. Venture capital investors often dismiss AI as full of hype because they’ve got skin in the game. They prefer startups that make machine-learning software with a clear, commercial application, like a platform that can filter company emails with natural language processing or track customers in a store with facial recognition (these are real businesses).
On the other hand, universities and some large tech companies like Facebook and Google have large labs carrying out research that drives the wider field of AI forward. A lot of the tools they invent, like TensorFlow from Google or Pytorch from Facebook, are freely available online.

Why does the term “learning” (e.g., deep learning) crop up everywhere? 
Because the most exciting application of AI today gives computers the ability to “learn” how to carry out a task from data, without being programmed to do that task.
The terminology is confusing because this involves a mishmash of different techniques, many of which also have the word “learning” in their names.
There are, for instance, three core types of machine learning, which can all be carried out in different ways: unsupervised, supervised and reinforcement learning. They can also be used with statistical machine learning, Bayesian machine learning or symbolic machine learning.
You don’t really need to be clued up on these though, since the most popular applications of machine learning use a neural network.

What’s a neural network?
It’s a computer system loosely inspired by the human brain that’s been going in and out of fashion for more than 70 years.

So what is “deep learning”? 
That’s a specific approach to using a neural network—essentially, a (deep) neural network with lots of layers. The technique has led to popular services we use today, including speech-recognition on smartphones and Google’s automatic translation. 
In practice, each layer can represent increasingly abstract features. A social media company might, for instance, use a “deep neural network” to recognize faces. One of the first layers describes the dark edges around someone’s head, another describes the edges of a nose and mouth, and another describes blotches of shading. The layers become increasingly abstract, but put together they can represent an entire face.
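As a rough illustration of "lots of layers" (a toy sketch in plain Python/NumPy, not how production face-recognition systems are built), here is a tiny deep network in which each stacked layer transforms the previous layer's output into a somewhat more abstract representation:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    """Simple non-linearity applied between layers."""
    return np.maximum(0.0, x)

# A tiny "deep" network: 64 input values (say, pixel intensities)
# passed through three hidden layers down to a single score.
layer_sizes = [64, 32, 16, 8, 1]
weights = [rng.normal(scale=0.1, size=(m, n))
           for m, n in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    """Each layer re-represents its input; deeper layers are more abstract."""
    activation = x
    for w in weights[:-1]:
        activation = relu(activation @ w)
    return activation @ weights[-1]  # final score, e.g. "is this a face?"

fake_image = rng.random(64)  # stand-in for an 8x8 patch of pixel intensities
print(forward(fake_image))
```

In a real system the weights would be learned from labeled data rather than drawn at random; the point here is only the layered structure.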

What does a neural network look like on a screen—a jumble of computer code? 
Basically, yes. Engineers at Google’s AI subsidiary DeepMind write nearly all their code in Python, a general purpose programming language first released in 1991.
Python has been used to develop all sorts of programs, both basic and highly complex, including some of the most popular services on the Web today: YouTube, Instagram and Google. You can learn the basics of Python here.

Does everyone agree that deep-learning neural networks are the best approach to AI?
No. While neural networks combined with deep learning are seen as the most promising approach to AI today, that could all change in five years.

Source: https://www.forbes.com/sites/parmyolson/2018/10/03/a-two-minute-guide-to-artificial-intelligence/#af5b6cb61c0a

Test Automation Patterns and Good Practices

 


Test automation, or automating the execution of your tests, provides several advantages: it saves your team time, resources, and the headache of having to do everything manually. However, test automation alone doesn't guarantee success. If you don't set up tests in the right way, you end up with the same bad results (just delivered a whole lot faster!).
Here are some patterns and best practices for test automation, so you can be sure your team is getting maximum value from your automation efforts.

Test Code Review

Tests (at the unit, API, UI, and performance level) should be reviewed to analyze their quality and to find mismatches or bad practices. For example, a set of tests could be fulfilling its coverage criteria, sufficiently invoking the intended code sections, but if proper assertions aren't written, the tests will be useless in identifying problems (that is, they will only execute the code without checking whether what it does is correct).
The practice of test code review also allows another person with a fresh perspective to add more tests that evaluate other aspects of the system, as well as verify that system requirements are clear and met.

Apply Assertions

As mentioned, assertions make the difference between a test that merely exercises the code and one that really identifies problems. Every test automation tool provides a way to verify responses and determine whether a test has passed or failed.
It's also best to pay special attention to how assertions are defined in order to avoid false negatives and false positives, since an overly generic assertion can make a test pass or fail incorrectly.
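As a minimal sketch (using pytest and a hypothetical discount function, not code from any particular project), compare a test that merely executes the code with one that actually asserts on the result:

```python
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical function under test."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

def test_discount_weak():
    # Executes the code, so it counts toward coverage,
    # but it would still pass if the math were wrong.
    apply_discount(100.0, 25)

def test_discount_meaningful():
    # Specific assertions catch real regressions and document intent.
    assert apply_discount(100.0, 25) == 75.0
    assert apply_discount(19.99, 0) == 19.99
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```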

Behavior-Driven Development (BDD)

BDD stands for Behavior-Driven Development. As the name implies, this isn't a testing technique, but rather a development strategy (like TDD, test-driven development). A common language should be established between the business and the engineers, to be used as a starting point for development and testing. Typically, user stories are defined and later used as input for test case creation. Beyond the possibility of automation, the focus should be on capturing business requirements that can later be translated into acceptance tests.
One obvious advantage of BDD is that developers have clear specifications, so they can focus on what needs to be done - and in what way - at the business level. Just as TDD gives you a method along with its tests, BDD gives you an "acceptance test" at the business level.
Another advantage of the BDD approach is that it brings testability to the system. Thinking about functionalities in an independent way (as complete and involving all the layers of the system) guides a type of development that facilitates testing.
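As a brief sketch of how a business-readable scenario maps to automated steps (using the pytest-bdd library purely as an example; the feature text, step names and login logic are all hypothetical):

```python
# features/login.feature (Gherkin, written in business language):
#   Feature: Account login
#     Scenario: Successful login
#       Given a registered user "alice"
#       When she logs in with the correct password
#       Then she should see her dashboard

# Step definitions binding the business language to executable code:
from pytest_bdd import scenario, given, when, then, parsers

context = {}  # simple shared state between steps, enough for this sketch

@scenario("features/login.feature", "Successful login")
def test_successful_login():
    pass

@given(parsers.parse('a registered user "{name}"'))
def registered_user(name):
    context["user"] = {"name": name, "password": "s3cret"}

@when("she logs in with the correct password")
def login_with_correct_password():
    user = context["user"]
    context["logged_in"] = user["password"] == "s3cret"  # stand-in for real auth

@then("she should see her dashboard")
def sees_dashboard():
    assert context["logged_in"] is True
```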

Page Object Pattern for UI Testing

When working with automation at any level, it's essential to focus on a solution's scalability and maintainability. That is, it shouldn't take more time to resolve conflicts or update an automation framework than the time saved by the automatic execution. To achieve this, the correct use of design patterns is fundamental.
When writing automated tests, one begins to generate code that contains a lot of knowledge about the structure of the application, like IDs of fields, buttons, and links, etc. Over time, the application will scale and be modified, so it's important that tests can be maintained with minimal effort.
The Page Objects pattern allows you to reuse this knowledge of the application while making each change in the pattern have minimal impact on tests. It consists of completely decoupling what concerns the structure of the user interface from the test code, and encapsulating this information in objects that represent application interface elements. In this way, the Page Object pattern strengthens tests.
When following a BDD approach, these types of patterns should also be applied, since they are complementary practices. This also applies to any automated verification at the graphical interface level, regardless of whether Selenium (for web), Appium (for mobile) or any other tool is used.
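A minimal sketch of the pattern with Selenium for Python (the page URL and element locators are hypothetical): the page object owns all knowledge of the UI structure, so the test only expresses intent.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

class LoginPage:
    """Encapsulates the structure of a (hypothetical) login page."""
    URL = "https://example.com/login"
    USERNAME = (By.ID, "username")
    PASSWORD = (By.ID, "password")
    SUBMIT = (By.CSS_SELECTOR, "button[type='submit']")

    def __init__(self, driver):
        self.driver = driver

    def open(self):
        self.driver.get(self.URL)
        return self

    def login_as(self, username, password):
        self.driver.find_element(*self.USERNAME).send_keys(username)
        self.driver.find_element(*self.PASSWORD).send_keys(password)
        self.driver.find_element(*self.SUBMIT).click()
        return self

# The test knows nothing about IDs or selectors; if the UI changes,
# only LoginPage needs updating.
def test_valid_login():
    driver = webdriver.Chrome()
    try:
        LoginPage(driver).open().login_as("demo_user", "demo_pass")
        assert "dashboard" in driver.current_url
    finally:
        driver.quit()
```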

Data-Driven Testing

While data-driven testing goes beyond a design pattern and involves a whole testing approach, we include it in this post as part of good programming practices.
Essentially, automated tests are parameterized so that the same steps can be executed with diverse amounts of test data. That way, simply adding data to the data file creates new test cases.
The greatest benefit of doing this comes when the same sequence of steps is executed with different data sets, parameterizing both the input data and the expected output data.
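A short sketch with pytest (the discount function is the same hypothetical example used earlier; in practice the rows could just as well be loaded from a CSV or spreadsheet): parameterizing one test body turns every data row into its own test case.

```python
import pytest

def apply_discount(price: float, percent: float) -> float:
    """Hypothetical function under test."""
    return round(price * (1 - percent / 100), 2)

# Each row is (input price, input percent, expected result);
# adding a row adds a test case without touching the test code.
CASES = [
    (100.00, 25, 75.00),
    (19.99, 0, 19.99),
    (50.00, 100, 0.00),
    (10.00, 10, 9.00),
]

@pytest.mark.parametrize("price, percent, expected", CASES)
def test_apply_discount(price, percent, expected):
    assert apply_discount(price, percent) == expected
```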

Parallel Execution of Tests

Unit tests and API-level tests generally run faster than UI tests. Automated tests at the graphical interface level, on the other hand, require access to the end-to-end system and handling of interface elements, so they need more execution time. This is why strategies like parallel execution have become really useful.
If you work on a web application and automate with Selenium (the most commonly used option), it's good practice to use Selenium Grid. With its multiple nodes, Selenium Grid has the advantage of easily combining operating systems with several browsers accessing the application. This allows several tests to run in parallel on the Grid, reducing testing time.
When combining Selenium Grid with a continuous integration tool such as Jenkins, tests are triggered by connecting to the Grid and running on each node. Results are stored in association with the task (a "job" in Jenkins terminology), keeping track of executions.
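A minimal sketch of pointing tests at a Grid hub with Selenium's Remote driver (the hub URL and target site are hypothetical); running the suite with a parallel runner such as pytest-xdist then spreads the tests across the Grid's nodes.

```python
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

GRID_HUB = "http://selenium-hub.internal:4444/wd/hub"  # hypothetical hub URL

def make_driver():
    """Each test gets a browser session on whichever Grid node is free."""
    options = Options()
    options.add_argument("--headless")
    return webdriver.Remote(command_executor=GRID_HUB, options=options)

def test_home_page_title():
    driver = make_driver()
    try:
        driver.get("https://example.com/")
        assert "Example" in driver.title
    finally:
        driver.quit()

# Example: run the suite on four workers in parallel with
#   pytest -n 4
```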
These are just some of the patterns and best practices you can use to improve your test automation.
With the right combination of tools and practices, you can move forward knowing test automation is propelling your team in the right direction.

Source: https://dzone.com/articles/test-automation-patterns-and-good-practices
