Devoogle currently has 14,562 indexed resources related to software development.

CNNs, neural networks specialized for Computer Vision tasks, are used in sensitive contexts and exposed in the wild. While extremely accurate, they are also sensitive to imperceptible perturbations that cannot be detected by the human eye. For this reason they have been targeted by attackers who implement AI-based techniques for malicious purposes. In this presentation I explain defense strategies to mitigate the effect of such attacks and make neural networks more robust to them, while keeping the impact on model accuracy and implementation costs minimal. #BIGTH20 #AI #MachineLearning #DeepLearning #ComputerVision #Security Session presented at Big Things Conference 2020 by Guglielmo Iozzia, Associate Director - Business Tech Analysis, IT & AI at MSD. 16th November 2020, Home Edition.
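The imperceptible perturbations described above can be illustrated with the classic Fast Gradient Sign Method (FGSM). The sketch below applies it to a toy logistic "classifier" in plain NumPy; the model, input and epsilon are all illustrative, not the specific attacks or defenses covered in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image" classifier: logistic regression on a flattened 8x8 input.
w = rng.normal(size=64)
b = 0.0

def predict(x):
    """Probability that input x belongs to class 1."""
    return 1.0 / (1.0 + np.exp(-(x @ w + b)))

def fgsm(x, y, eps=0.05):
    """Perturb x along the sign of the loss gradient (FGSM)."""
    p = predict(x)
    grad_x = (p - y) * w  # d(log-loss)/dx for logistic regression
    return np.clip(x + eps * np.sign(grad_x), 0.0, 1.0)

x = rng.uniform(size=64)   # a clean input with "pixels" in [0, 1]
y = 1.0                    # its true label
x_adv = fgsm(x, y)

# Each pixel moves by at most eps, yet the predicted probability drops.
print(np.abs(x_adv - x).max(), predict(x), predict(x_adv))
```

A common defense the talk alludes to, adversarial training, simply mixes such perturbed inputs into the training set so the model learns to resist them.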
The idea of computation is without doubt the most important intellectual advancement of this century. But what does it ultimately mean, and how can we best take advantage of it? This talk is about computational language: how it allows the power of computation to be tapped, and how it makes ubiquitous computational intelligence possible. The talk includes live demos of the latest computational intelligence in the Wolfram Language, which now powers the world's major intelligent assistants as well as many innovative R&D initiatives. Four hundred years ago the invention of mathematical notation opened up the possibility of algebra, calculus and the growth of mathematical science. Learn how computational language as implemented in the Wolfram Language is now opening up "computational X" fields, and how the Wolfram Language defines a bridge between the power of the computational universe and the description of human goals. The talk discusses how computational language and the computational paradigm have recently made possible the breakthrough Wolfram Physics Project, which is finding a fundamental theory of physics and whose formalism can be applied to redefine distributed computing in physics-based terms. The talk also discusses how computational language makes possible computational contracts and computational constitutions for AI ethics. The concept of computational language, implemented over the past 34 years in the Wolfram Language, is destined to be a core element of the future of technology and much more. Many of its implications are still far in the future, but it is immediately applicable today to create "artifacts from the future", as an increasing number of organizations are now doing. #BIGTH20 #AI #Cloud #DataScience #BigData #MachineLearning Session presented at Big Things Conference 2020 by Stephen Wolfram, Founder and CEO of Wolfram Research. 2020, Home Edition.
Business transformation: Cloud Process History is a Software as a Service (SaaS) offering in CEPSA's digital catalogue, designed using the latest Cloud technology and used for the capture, storage, visualization and advanced real-time analysis of relevant information from industrial facilities or corporate data. Main features:
a) Ingestion, processing and storage: ingestion, enrichment and storage, with unlimited capacity, of event typologies and/or devices that emit information with disparate structure and content.
b) Dynamic configuration and enrichment: parameterized configuration via Web of event typologies, process units and locations.
c) Monitoring, data quality and real-time visualization: real-time monitoring, time-series and dashboard display, data governance and quality.
d) Advanced analytics and DataLabs: connection to analytical platforms oriented to data scientists; creation of scalable laboratory environments for high-volume information; WorkSpaces for business analysts and standard users.
#BIGTH20 #Cloud #BigData #IoT #Analytics #Visualization Session presented at Big Things Conference 2020 by Alberto García Hernández, Big Data & Analytics Expert at CEPSA. 16th November 2020, Home Edition.
Natural Language Processing (NLP) is nowadays one of the main focus areas of artificial intelligence and machine learning. While conversational agents such as Siri or Alexa are the most visible representatives of NLP, the field finds wide application in search engines, chatbots, customer service, opinion mining, and so on. The high levels of success that such NLP solutions have achieved in recent years are mostly fueled by three factors: the public availability of very large datasets (corpora) of web text, the fast upscaling of specialized hardware capabilities (GPUs and TPUs), and the improvement of deep learning models adapted for language. Focusing on this last point, the so-called "language models" have proven quite effective at leveraging the large datasets available. A language model is a deep artificial neural network trained on unlabeled corpora with the aim of modelling the distribution of words (or word pieces) in a particular language. In this way, and while trained in an unsupervised fashion, a language model is able to perform NLP tasks such as filling gaps in sentences or generating text following a cue. #BIGTH20 #AI #NLP #DeepLearning #MachineLearning Session presented at Big Things Conference 2020 by Álvaro Barbero Jiménez, Chief Data Scientist at IIC. 17th November 2020, Home Edition.
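The gap-filling and text-generation abilities described above can be sketched, at a vastly reduced scale, with a simple bigram model over a toy corpus. The corpus and function names below are illustrative only; real language models are deep networks trained on web-scale corpora, but the idea of modelling the distribution of next words is the same:

```python
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the dog sat on the rug".split()

# Count bigrams: a (very) small model of word co-occurrence.
bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def most_likely_next(word):
    """Fill the gap after `word` with its highest-probability continuation."""
    return bigrams[word].most_common(1)[0][0]

def generate(start, n=4):
    """Generate text following a cue by repeatedly sampling the top word."""
    out = [start]
    for _ in range(n):
        out.append(most_likely_next(out[-1]))
    return " ".join(out)

print(most_likely_next("sat"))   # -> "on" ("sat on" occurs twice)
print(generate("sat", 3))        # -> "sat on the cat"
```

Swapping the counting step for a neural network trained on billions of words is, conceptually, all that separates this toy from the models the talk covers.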
Medicsen presented at last year's Big Things conference a glucose-prediction software that generated plenty of interest. Since then, we have created new functionalities based on relevant tech innovations to improve healthcare outcomes and satisfaction. We have created a software that predicts the risk of a patient currently having a disease and their future likelihood of developing a chronic condition (initially focused on diabetes), all based on GDPR-compliant data and oriented towards improving patients' health and providers' cost of services. Optimize business and patient outcomes:
• Cost savings (understand workflow and reduce risk)
• Differential value of service (personalized approach)
• New business opportunities (predicted future costs)
Improve patients' health and satisfaction, reduce the cost of care, and open new business opportunities with an algorithm that tracks data in real time, predicts future problems and offers valuable insights and recommendations through a multi-platform interactive dashboard: better understanding workflows, calculating the total cost of problems and efficiently prioritizing interventions by their return-on-investment potential. Accurate, safe and affordable. EXPERT ANALYSIS + PREDICTIVE ALGORITHMS + SUPREME DATA VISUALIZATION:
• CLASSIFICATION of patients into risk groups according to their current health state and the potential for future chronic conditions arising
• PREDICTION of the likelihood of future disease based on GDPR-compliant, non-biometric data
• Integration with ANALYSIS of business variables (costs, times …)
• MEDICAL ACTS as the only necessary data points
• ADAPTED to the particularities of the health provider
• Graphic and interactive DISPLAY of the information through an online dashboard that can be accessed from multiple platforms.
We know that GDPR is a real liability for these companies, so we added a differential value point: our technology DOES NOT USE BIOMETRIC DATA.
This means that we don't need to know the results of a blood test, only that the patient took it, so we only need data that healthcare companies CAN use without compliance teams having trouble. If additional data is available, we can use it, but it is not required.
OBSERVE: automatic data tracking connected to the current databases of the healthcare provider to create an enriched layer of aggregated data, plus a visualization platform (dashboard) hosted in the cloud and working on any device.
DISCOVER:
• Uncover hidden insights that drive patients' health and cost generation
• Classify patients into groups according to the most relevant variables for each case
• Individual patient or patient-group health risk score and costs; likelihood of having a certain disease
PREDICT:
• Model the complex interplay of disease progression and service utilization to forecast the future progression of individual and group risk and costs, anticipating chronic illnesses and consumption of services
• Who might get sick? Predict population health based on future clinical patterns
• Disease detection/prediction algorithms
DECIDE: obtain recommendations and insights to guide decisions on:
o Optimal policy to insure
o Creating new plans for new verticals of clients
o Optimizing resource allocation among patients and centers
o Interventions to reduce future patient and group risk and costs
Medicsen has validated this technology with international healthcare providers, and we are now working on standardizing data access and analysis to decrease the cost of adaptation. #BIGTH20 #AI #Cloud #DataScience #BigData #MachineLearning Session presented at Big Things Conference 2020 by Eduardo W. Jørgensen, CEO, Medicsen. 2020, Home Edition.
Semantic segmentation is the classification of every pixel in an image or video. Segmentation partitions a digital image into multiple objects to simplify or change the representation of the image into something more meaningful and easier to analyze. The technique has a wide variety of applications, ranging from perception in autonomous driving scenarios to cancer-cell segmentation for medical diagnosis. Exponential growth in the datasets that require such segmentation is driven by improvements in the accuracy and quality of the sensors generating the data, and is further compounded by exponential advances in cloud technologies enabling the storage and compute available for such applications. Semantically segmented datasets are a key requirement for improving the accuracy of the inference engines built upon them. #BIGTH20 #AI #Cloud #Analytics #MachineLearning #BigData #DeepLearning #Visualization #ComputerVision Session presented at Big Things Conference 2020 by Arvind Hosagrahara, Chief Solutions Architect at MathWorks. 17th November 2020, Home Edition.
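Because semantic segmentation classifies every pixel, its output has the same shape as the input image, and quality is commonly scored with Intersection-over-Union (IoU). A minimal sketch on a toy 4x4 class map (the data is illustrative; real masks come from trained networks):

```python
import numpy as np

# Ground-truth and predicted per-pixel class maps for a 4x4 "image"
# (0 = background, 1 = object). Every pixel gets a class label.
truth = np.array([[0, 0, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 0, 0],
                  [0, 0, 0, 0]])
pred  = np.array([[0, 1, 1, 1],
                  [0, 0, 1, 1],
                  [0, 0, 1, 0],
                  [0, 0, 0, 0]])

def iou(a, b, cls=1):
    """Intersection-over-Union for one class: overlap / combined area."""
    inter = np.logical_and(a == cls, b == cls).sum()
    union = np.logical_or(a == cls, b == cls).sum()
    return inter / union

print(iou(truth, pred))  # 4 pixels overlap out of 6 in the union: 0.666...
```

In practice this score is averaged over all classes (mIoU) to benchmark segmentation models on the large datasets the talk discusses.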
Voice technologies are becoming ubiquitous today, but there was a time when the only option was to let our fingers do the talking on keyboards. User interfaces have come a long way, and recent technological breakthroughs have allowed us to effectively "talk" to our technology to get things done. In the era of voice assistants, people use voice naturally to get directions, find recipes, listen to music, hear stories, play games, relax and more. Thanks to machine learning, voice recognition is now accurate enough to make it a powerful option for building clever voice assistants, besides a wide range of other tasks. #BIGTH20 #AI #MachineLearning Session presented at Big Things Conference 2020 by Germán Viscuso, Sr. Technical Evangelist, Amazon. 27th November 2020, Home Edition.
Application of Computer Vision techniques together with advanced analytical techniques, strongly supported by industrial knowledge, that allow detecting the probability of interior and exterior corrosion in pipes, integrating different data sources such as flow rates, compositions, thickness measurements, … Built for 30 of the 3,500 lines at the Tarragona refinery for internal corrosion, and for 7 lines at the Tarragona refinery and 8 lines at the Coruña refinery for external corrosion. It allows inspections to be performed more accurately and effectively. #BIGTH20 #AI #MachineLearning #DeepLearning #DataScience #Analytics #Visualization #ComputerVision #BigData Session presented at Big Things Conference 2020 by Emilio Martín Gallardo, Senior Data Scientist at Repsol, and Elena Tomas, Senior Data Scientist at Repsol Data & Analytics Hub. 17th November 2020, Home Edition.
Nowadays, an increasing number of business problems rely on the analysis of real-time metrics. Typical use cases range from credit fraud detection to predictive maintenance. We are also moving towards an era where all sensors and devices are connected to the internet (i.e. the IoT) and monitor the performance of different KPIs. For this reason, it is crucial to extend and refine real-time analytics on streaming data sources to reach fast-developing sectors such as Smart Cities, Industry 4.0, Smart Healthcare, etc. In this talk we will focus on unsupervised real-time anomaly detection. For this type of setup, it is standard practice to set thresholds for the detection of anomalies. #BIGTH20 #AI #Analytics #IoT #MachineLearning #DeepLearning #DataScience Session presented at Big Things Conference 2020 by Aitor Landete, Data Scientist at Telefónica, and Pablo Mateos, Data Scientist at Telefónica. 17th November 2020, Home Edition.
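The threshold-based detection mentioned above can be sketched as a rolling z-score detector over a stream of metric values. The window size and threshold below are common illustrative defaults, not values from the talk:

```python
import statistics

def zscore_anomalies(stream, window=20, threshold=3.0):
    """Flag points further than `threshold` std-devs from the rolling mean."""
    anomalies = []
    for i in range(window, len(stream)):
        hist = stream[i - window:i]          # recent history only
        mu = statistics.fmean(hist)
        sd = statistics.pstdev(hist)
        if sd == 0:
            is_anomaly = stream[i] != mu     # any change in a flat signal
        else:
            is_anomaly = abs(stream[i] - mu) / sd > threshold
        if is_anomaly:
            anomalies.append(i)
    return anomalies

# A flat sensor reading with one spike at index 30.
signal = [10.0] * 50
signal[30] = 25.0
print(zscore_anomalies(signal))  # -> [30]
```

Fixed thresholds like this are simple and fast, which is why they are the standard baseline that more adaptive, unsupervised methods are measured against.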
The aviation sector is responsible for 2% of global greenhouse gas emissions (CO2), and the sector is committed to reducing its net CO2 emissions by 2050 to 50% of what they were in 2005. Considering that by 2037 the number of passengers will have doubled, at a constant 3.7% annual growth, the aviation sector must improve its fuel efficiency by 2% every year to achieve the goal through: better air operations (air navigation procedures), better technology (engines and materials) and better fuels (biofuels). In addition, airlines must offset their emissions to be CO2 neutral with environmental projects through offsetting schemes, for example EU-ETS in Europe and CORSIA, a new ongoing programme promoted by the United Nations and IATA. #BIGTH20 #AI #Visualization #Analytics #BigData #TechforGood Session presented at Big Things Conference 2020 by Pedro García, Consultant at Isdefe, and Alberto Uceda, Consultant at Isdefe. 17th November 2020, Home Edition.
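The quoted rates compound multiplicatively, which is how 3.7% annual growth yields a doubling by 2037. A quick check (the 19-year horizon is an assumption inferred from the 2037 target; the talk itself gives only the rate and the end year):

```python
# Passengers at constant 3.7% annual growth over ~19 years:
growth = 1.037 ** 19
print(round(growth, 2))  # ~1.99, i.e. roughly double

# A constant 2%/yr fuel-efficiency improvement compounds the same way:
efficiency_gain = 1.02 ** 19
print(round(efficiency_gain, 2))  # ~1.46, a ~46% cumulative gain
```

The gap between doubled traffic and a ~46% efficiency gain is why offsetting schemes such as EU-ETS and CORSIA are needed on top of operational and technological improvements.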