Posts

Showing posts from December, 2023

Microchip to Boost Edge AI with NVIDIA Holoscan

The NVIDIA Holoscan AI sensor processing platform, with its SDK and development ecosystem, has helped streamline the design and deployment of AI and high-performance computing (HPC) applications at the edge for real-time insights. Now, FPGAs are unlocking new edge-to-cloud applications for this AI platform, enabling AI/ML inferencing and easing the adoption of AI in the medical, industrial and automotive markets.

To support developers building artificial intelligence (AI)-driven sensor processing systems, Microchip Technology has released its PolarFire FPGA Ethernet Sensor Bridge, which works with the NVIDIA Holoscan sensor processing platform. The new bridge lets developers create real-time solutions with NVIDIA's edge AI and robotics platforms, opening up sensor interfaces across a wide range of applications.

Accelerating Real-time Edge AI with NVIDIA Holoscan

The Pola...
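
If it helps to picture what a Holoscan application looks like, here is a minimal sketch using the SDK's Python Application/Operator classes. The operator names and the dummy payload are hypothetical stand-ins (the real sensor bridge delivers sensor data over Ethernet through NVIDIA-provided operators), so treat this as an illustration rather than Microchip's or NVIDIA's actual code.

```python
# Minimal sketch of a Holoscan-style pipeline with hypothetical operators.
from holoscan.conditions import CountCondition
from holoscan.core import Application, Operator, OperatorSpec


class FakeSensorOp(Operator):
    """Emits a dummy frame; stands in for an Ethernet sensor-bridge source."""
    def setup(self, spec: OperatorSpec):
        spec.output("frame")

    def compute(self, op_input, op_output, context):
        op_output.emit({"pixels": [0] * 16}, "frame")  # placeholder payload


class InferenceStubOp(Operator):
    """Pretends to run AI inference on each incoming frame."""
    def setup(self, spec: OperatorSpec):
        spec.input("frame")

    def compute(self, op_input, op_output, context):
        frame = op_input.receive("frame")
        print(f"ran inference on a frame with {len(frame['pixels'])} pixels")


class SensorBridgeApp(Application):
    def compose(self):
        # Emit three frames, then stop (CountCondition bounds the source).
        source = FakeSensorOp(self, CountCondition(self, 3), name="sensor_source")
        infer = InferenceStubOp(self, name="inference")
        self.add_flow(source, infer)  # wire source output into inference input


if __name__ == "__main__":
    SensorBridgeApp().run()
```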

Eyes of the machine: Computer Vision and its unravelling wonders

One of the most powerful and compelling types of AI is computer vision, which you've almost surely experienced in any number of ways without even knowing it. Here's a look at what it is, how it works, and why it's so awesome (and is only going to get better).

Computer vision is the field of computer science that focuses on replicating parts of the complexity of the human vision system, enabling computers to identify and process objects in images and videos the way humans do. Until recently, computer vision worked only in a limited capacity. Thanks to advances in artificial intelligence and innovations in deep learning and neural networks, the field has taken great leaps in recent years and can now surpass humans in some tasks related to detecting and labeling objects.

One of the driving factors behind the growth of computer vision is the amount of data we generate today, which is then used to train computer vision models and make them better. Along with a tremendous am...
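
As a concrete illustration of how a trained vision model labels an image, here is a minimal sketch using a pretrained torchvision classifier. The image path is a placeholder, and this is just one of many possible toolchains, not the one the post describes.

```python
# Minimal sketch: label an image with a pretrained classifier (torchvision).
# Assumes torch/torchvision are installed; "photo.jpg" is a placeholder path.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.DEFAULT          # ImageNet-pretrained weights
model = models.resnet50(weights=weights).eval()    # inference mode
preprocess = weights.transforms()                  # matching resize/normalize

image = Image.open("photo.jpg").convert("RGB")
batch = preprocess(image).unsqueeze(0)             # shape: (1, 3, H, W)

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

top = probs.argmax().item()
print(f"{weights.meta['categories'][top]}: {probs[top]:.1%}")
```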

Understanding Algorithms in Computer Science

An algorithm is a specific procedure for solving a well-defined computational problem. The development and analysis of algorithms is fundamental to all aspects of computer science: artificial intelligence, databases, graphics, networking, operating systems, security, and so on.

Algorithm development is more than just programming. It requires an understanding of the alternatives available for solving a computational problem, including the hardware, networking, programming language, and performance constraints that accompany any particular solution. It also requires understanding what it means for an algorithm to be "correct" in the sense that it fully and efficiently solves the problem at hand.

Algorithms: their meaning in computer science

In computer science, an algorithm is a set of step-by-step instructions used to solve problems or perform tasks, chosen with an understanding of the available alternatives. Algorithms are more than just programming; they are specifications for pe...
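
To make the ideas of correctness and efficiency concrete, here is a small, self-contained example (not from the post): binary search, an algorithm that finds a value in a sorted list using O(log n) comparisons.

```python
# Binary search: a classic algorithm that is "correct" (always finds the target
# if it is present in a sorted list) and efficient (O(log n) comparisons).
def binary_search(sorted_items, target):
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2          # middle index of the remaining range
        if sorted_items[mid] == target:
            return mid                   # found: return its position
        elif sorted_items[mid] < target:
            low = mid + 1                # target must be in the upper half
        else:
            high = mid - 1               # target must be in the lower half
    return -1                            # not present


print(binary_search([2, 5, 8, 12, 16, 23, 38], 23))  # prints 5
print(binary_search([2, 5, 8, 12, 16, 23, 38], 7))   # prints -1
```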

IBM Releases First-Ever 1000-Qubit Quantum Chip

The International Business Machines Corporation, nicknamed Big Blue, is an American multinational technology corporation headquartered in Armonk, New York, and operating in over 175 countries. IBM is known for its hardware and software products, including computers, servers, storage systems and networking equipment. It also provides consulting, technology and business services, such as cloud computing, data analytics and artificial intelligence (AI).

IBM has unveiled the first quantum computer with more than 1,000 qubits, the quantum equivalent of the digital bits in an ordinary computer. But the company says it will now shift gears and focus on making its machines more error-resistant rather than larger.

For years, IBM has followed a quantum-computing road map that roughly doubled the number of qubits every year. The chip unveiled on 4 December, called Condor, has 1,121 superconducting qubits arranged in a honeycomb pattern. It follows on from its other record-setting, bird-...
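
For readers new to qubits, here is a minimal sketch of how two qubits behave differently from two ordinary bits. It uses Qiskit purely as an illustration and has nothing to do with the Condor chip itself.

```python
# Minimal sketch: two qubits in a Bell state, illustrating how qubits differ
# from classical bits. Uses Qiskit as an example toolkit, unrelated to Condor.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

bell = QuantumCircuit(2)
bell.h(0)       # put qubit 0 into an equal superposition of 0 and 1
bell.cx(0, 1)   # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(bell)
print(state.probabilities_dict())  # ~{'00': 0.5, '11': 0.5}: correlated outcomes
```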