Search results for: Artificial Neural network
3350 Translation Quality Assessment in Fansubbed English-Chinese Swearwords: A Corpus-Based Study of the Big Bang Theory
Authors: Qihang Jiang
Abstract:
Fansubbing, a blend of 'fan' and 'subtitling', is one of the main branches of Audiovisual Translation (AVT) and has kindled growing interest among researchers in the AVT field in recent decades. In particular, the quality of so-called non-professional translation seems questionable due to the non-transparent qualifications of subtitlers in a huge community network. This paper attempts to figure out how YYeTs, aka 'ZiMuZu', the largest fansubbing group in China, translates swearwords from English to Chinese for fans of the popular American sitcom The Big Bang Theory, taking cultural, social and political elements in the Chinese context into account. By building a bilingual corpus containing both the source and target texts, this paper found that most of the original swearwords were translated in a toned-down manner, probably due to Chinese audiences' cultural and social network features as well as strict government censorship. Additionally, House's (2015) newly revised model of Translation Quality Assessment (TQA) was applied and examined. Results revealed that most of the subtitled swearwords achieved their pragmatic functions and exerted a communicative effect on audiences. In conclusion, this paper enriches the empirical research concerning House's new TQA model, gives a full picture of the subtitling of swearwords in the AVT field, and provides a practical guide for practitioners in their subtitling careers.
Keywords: corpus-based approach, fansubbing, pragmatic functions, swearwords, translation quality assessment
Procedia PDF Downloads 149
3349 Global Navigation Satellite System and Precise Point Positioning as Remote Sensing Tools for Monitoring Tropospheric Water Vapor
Authors: Panupong Makvichian
Abstract:
Global Navigation Satellite Systems (GNSS) are nowadays a common technology that improves navigation functions in our lives. Additionally, GNSS is increasingly being employed as an accurate atmospheric sensor. Meteorology is a practical application of GNSS that goes unnoticed in the background of people's lives. GNSS Precise Point Positioning (PPP) is a positioning method that requires data from a single dual-frequency receiver and precise information about satellite positions and satellite clocks. In addition, careful attention to mitigating various error sources is required. All the above data are combined in a sophisticated mathematical algorithm. This research demonstrates how GNSS and the PPP method are capable of providing high-precision estimates, such as 3D positions or Zenith Tropospheric Delays (ZTDs). ZTDs combined with pressure and temperature information allow us to estimate the water vapor in the atmosphere as precipitable water vapor (PWV). If the process is replicated for a network of GNSS sensors, we can create thematic maps from which water content information can be extracted at any location within the network area. All of the above is possible thanks to advances in GNSS data processing. Therefore, we are able to use GNSS data for climatic trend analysis and for acquiring further knowledge about atmospheric water content.
Keywords: GNSS, precise point positioning, Zenith tropospheric delays, precipitable water vapor
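The ZTD-to-PWV conversion described in this abstract can be illustrated with a short sketch. This is a minimal illustration assuming the standard Saastamoinen hydrostatic-delay model and the Bevis et al. mean-temperature approximation and refractivity constants; it is not the authors' actual processing chain, and the sample inputs are arbitrary.

```python
import math

# A minimal sketch, assuming standard GNSS-meteorology constants (Bevis et al.):
# convert a Zenith Tropospheric Delay into precipitable water vapor.
def ztd_to_pwv_mm(ztd_m, pressure_hpa, temp_k, lat_deg, height_m):
    # Zenith Hydrostatic Delay from surface pressure (Saastamoinen model)
    zhd = 0.0022768 * pressure_hpa / (
        1.0 - 0.00266 * math.cos(2.0 * math.radians(lat_deg)) - 2.8e-7 * height_m
    )
    zwd = ztd_m - zhd                              # Zenith Wet Delay (m)
    tm = 70.2 + 0.72 * temp_k                      # weighted mean temperature (K)
    k2_prime, k3 = 0.221, 3.739e3                  # K/Pa and K^2/Pa
    rv, rho_w = 461.5, 1000.0                      # J/(kg K), kg/m^3
    pi_factor = 1e6 / (rho_w * rv * (k3 / tm + k2_prime))  # ~0.16, dimensionless
    return pi_factor * zwd * 1000.0                # PWV in mm

print(ztd_to_pwv_mm(2.45, 1013.25, 290.0, 40.0, 50.0))  # ~23 mm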
Procedia PDF Downloads 202
3348 Design of a Real Time Closed Loop Simulation Test Bed on a General Purpose Operating System: Practical Approaches
Authors: Pratibha Srivastava, Chithra V. J., Sudhakar S., Nitin K. D.
Abstract:
A closed-loop system comprises a controller, a response system, and an actuating system. The controller, which is the system under test here, excites the actuators based on feedback from the sensors in a periodic manner. The sensors should provide the feedback to the System Under Test (SUT) within a deterministic time after excitation of the actuators. Any delay or miss in the generation of response or acquisition of excitation pulses may lead to control-loop computation errors, which can be catastrophic in certain cases. Such systems are categorised as hard real-time systems and need special strategies. The real-time operating systems available in the market may be the best solutions for such simulations, but they pose limitations such as the unavailability of the X Window System, graphical interfaces, and other user tools. In this paper, we present strategies that can be used on a general purpose operating system (bare Linux kernel) to achieve a deterministic deadline and hence gain the added advantages of a GPOS with real-time features. Techniques are discussed for making the time-critical application run with the highest priority in an uninterrupted manner, reducing network latency for distributed architectures, and handling real-time data acquisition, data storage and retrieval, user interactions, etc.
Keywords: real time data acquisition, real time kernel preemption, scheduling, network latency
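As a concrete illustration of the priority and scheduling techniques mentioned, the sketch below (Linux-only, requires root privileges) pins a periodic loop to the SCHED_FIFO real-time scheduling class on a dedicated core. The period, priority value, and core number are arbitrary assumptions, not the paper's configuration.

```python
import os
import time

# A minimal sketch: run a periodic control task under SCHED_FIFO so ordinary
# time-sharing workloads on the GPOS cannot preempt it.
PERIOD_S = 0.001  # 1 kHz control loop (assumed rate)

os.sched_setscheduler(0, os.SCHED_FIFO, os.sched_param(80))  # high RT priority
os.sched_setaffinity(0, {2})  # isolate the loop on one dedicated CPU core

next_deadline = time.monotonic()
for _ in range(5000):
    # ... read sensors, compute control law, drive actuators ...
    next_deadline += PERIOD_S
    delay = next_deadline - time.monotonic()
    if delay > 0:
        time.sleep(delay)        # coarse; clock_nanosleep(TIMER_ABSTIME) is tighter
    else:
        print("deadline miss")   # log overruns instead of failing silently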
Procedia PDF Downloads 151
3347 Analysis of Histogram Asymmetry for Waste Recognition
Authors: Janusz Bobulski, Kamila Pasternak
Abstract:
Despite many years of effort and research, the problem of waste management is still current. So far, no fully effective waste management system has been developed. Many programs and projects improve the statistics on the percentage of waste recycled every year. In these efforts, it is worth using modern computer vision techniques supported by artificial intelligence. In this article, we present a method of identifying plastic waste based on asymmetry analysis of the histogram of the image containing the waste. The method is simple but effective (94%), which allows it to be implemented on devices with low computing power, in particular on microcomputers. Such devices can be used both at home and in waste sorting plants.
Keywords: waste management, environmental protection, image processing, computer vision
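The histogram-asymmetry idea can be sketched in a few lines. The skewness measure below is the standard third standardized moment; the decision threshold and file name are hypothetical placeholders, not the authors' tuned values.

```python
import cv2
import numpy as np

# A minimal sketch: measure the asymmetry (skewness) of a grayscale histogram
# and use it as a cheap one-feature classifier suitable for microcomputers.
def histogram_skewness(image_path: str) -> float:
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    values = gray.ravel().astype(np.float64)
    mu, sigma = values.mean(), values.std()
    return float(np.mean(((values - mu) / sigma) ** 3))  # third standardized moment

skew = histogram_skewness("waste_sample.png")
label = "plastic" if abs(skew) > 0.5 else "other"  # hypothetical decision rule
print(f"skewness={skew:.3f} -> {label}")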
Procedia PDF Downloads 124
3346 Assignment of Legal Personality to Robots: A Premature Meditation
Authors: Solomon Okorley
Abstract:
With the emergence of artificial intelligence, a proposition that has been made with increasing conviction is the need to assign legal personhood to robots. A major problem that arises when dealing with robots is the issue of liability: whom do we hold liable when a robot causes harm? The suggestion to assign legal personality to robots has been made to aid in the assignment of liability. This paper contends that it is premature to assign legal personhood to robots. The paper employed the doctrinal and comparative research methodologies. The paper first discusses the various theories that underpin the granting of legal personhood to juridical personalities to ascertain whether these theories can aid the proposition to assign legal personhood to robots. These theories include the fiction theory, the aggregate theory, the realist theory, and the organism theory. Except for the aggregate theory, the fiction theory, the realist theory, and the organism theory provide a good foundation for the proposal that legal personhood be assigned to robots. The paper then considers whether robots should be assigned legal personhood from a jurisprudential approach. The legal positivists assert that no metaphysical presuppositions are needed to determine who could be a legal person: the sole deciding factor is engagement in legal relations, and this prerequisite could be fulfilled by robots. However, rationalists, religionists, and naturalists assert that the satisfaction of metaphysical criteria is the basis of legal personality, and since robots do not possess this feature, they cannot be assigned legal personhood. This differing perspective shows that the jurisprudential school of thought to which one belongs influences the decision whether to assign legal personhood to robots. The paper makes arguments for and against assigning legal personhood to robots. Assigning legal personhood to robots is necessary for the assignment of liability, and since robots are independent in their operation, they should be assigned legal personhood. However, it is argued that their degree of autonomy is insufficient: robots do not understand legal obligations, they do not have a will of their own, and the purported autonomy that they possess is an 'imputed autonomy'. A crucial question to ask is 'whether it is desirable to confer legal personhood on robots', not 'whether legal personhood should be assigned to robots'. This is due to the subjective nature of responses to such a question as well as the peculiarities of individual countries in responding to it. The main argument in support of assigning legal personhood to robots is to aid in assigning liability. However, it is argued that conferring legal personhood on robots is not the only way to deal with liability issues. Since any of the stakeholders involved with the robot system can be held liable for an accident, it is not desirable to assign legal personhood to robots. It is forecasted that in the epoch of strong artificial intelligence, granting robots legal personhood will be plausible; however, in the current era, it is premature.
Keywords: autonomy, legal personhood, premature, jurisprudential
Procedia PDF Downloads 75
3345 Homeless Population Modeling and Trend Prediction Through Identifying Key Factors and Machine Learning
Authors: Shayla He
Abstract:
Background and Purpose: According to Chamie (2017), it is estimated that no fewer than 150 million people, or about 2 percent of the world's population, are homeless. The homeless population in the United States has grown rapidly in the past four decades. In New York City, the sheltered homeless population increased from 12,830 in 1983 to 62,679 in 2020. Knowing the trend of the homeless population is crucial in helping states and cities make affordable housing plans and other community service plans ahead of time, to better prepare for the situation. This study utilized data from New York City, examined the key factors associated with homelessness, and developed systematic modeling to predict future homeless populations. Using the best model developed, named HP-RNN, an analysis of the homeless population change during the months of 2020 and 2021, which were impacted by the COVID-19 pandemic, was conducted. Moreover, HP-RNN was tested on data from Seattle. Methods: The methodology involves four phases in developing robust prediction methods. Phase 1 gathered and analyzed raw data on homeless populations and demographic conditions from five urban centers. Phase 2 identified the key factors that contribute to the rate of homelessness. In Phase 3, three models were built, using Linear Regression, Random Forest, and a Recurrent Neural Network (RNN), respectively, to predict the future trend of society's homeless population. Each model was trained and tuned on the dataset from New York City, with accuracy measured by Mean Squared Error (MSE). In Phase 4, the final phase, the best model from Phase 3 was evaluated using data from Seattle that was not part of the model training and tuning process in Phase 3. Results: Compared to the Linear Regression based model used by HUD et al. (2019), HP-RNN significantly improved the prediction metrics, raising the Coefficient of Determination (R2) from -11.73 to 0.88 and reducing MSE by 99%. HP-RNN was then validated on the data from Seattle, WA, which showed a peak percentage error of 14.5% between the actual and the predicted count. Finally, the modeling results were used to predict the trend during the COVID-19 pandemic. They show a good correlation between the actual and the predicted homeless population, with a peak percentage error of less than 8.6%. Conclusions and Implications: This is the first work to apply an RNN to model the time series of homelessness-related data. The model shows a close correlation between the actual and the predicted homeless population. There are two major implications of this result. First, the model can be used to predict the homeless population for the next several years, and the prediction can help states and cities plan ahead on affordable housing allocation and other community services to better prepare for the future. Moreover, this prediction can serve as a reference for policy makers and legislators as they seek to make changes that may impact the factors closely associated with the future homeless population trend.
Keywords: homeless, prediction, model, RNN
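A minimal sketch of an HP-RNN-style predictor is shown below. The LSTM architecture, 12-month window, and synthetic series are assumptions standing in for the model details and the New York City data described above.

```python
import numpy as np
from tensorflow import keras

WINDOW = 12  # predict next month from the previous 12 months (assumed window)
months = np.arange(456)  # ~38 years of monthly counts, 1983-2020
series = 12.8 + (62.7 - 12.8) * months / 455 + np.random.normal(0, 0.5, 456)
# synthetic sheltered-count series in thousands, loosely spanning 12,830 -> 62,679

X = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])[..., None]
y = series[WINDOW:]

model = keras.Sequential([
    keras.layers.LSTM(32, input_shape=(WINDOW, 1)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=16, verbose=0)

forecast = model.predict(series[-WINDOW:][None, :, None], verbose=0)
print(f"next-month forecast: {forecast[0, 0] * 1000:.0f} people")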
Procedia PDF Downloads 123
3344 Gesture-Controlled Interface Using Computer Vision and Python
Authors: Vedant Vardhan Rathour, Anant Agrawal
Abstract:
The project aims to provide a touchless, intuitive interface for human-computer interaction, enabling users to control their computer using hand gestures and voice commands. The system leverages advanced computer vision techniques, using the MediaPipe framework and OpenCV, to detect and interpret real-time hand gestures, transforming them into mouse actions such as clicking, dragging, and scrolling. Additionally, the integration of a voice assistant powered by the Speech Recognition library allows for seamless execution of tasks like web searches, location navigation, and gesture control on the system through voice commands.
Keywords: gesture recognition, hand tracking, machine learning, convolutional neural networks
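The hand-tracking-to-mouse pipeline can be sketched with the libraries named in the abstract. The fingertip-to-cursor mapping below is a common MediaPipe pattern and an assumption about the authors' exact gesture logic.

```python
import cv2
import mediapipe as mp
import pyautogui

# A minimal sketch: track the index fingertip with MediaPipe Hands and move
# the mouse cursor accordingly.
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
screen_w, screen_h = pyautogui.size()
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)  # mirror view feels natural to the user
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        tip = results.multi_hand_landmarks[0].landmark[8]  # index fingertip
        pyautogui.moveTo(int(tip.x * screen_w), int(tip.y * screen_h))
    cv2.imshow("gesture control", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()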
Procedia PDF Downloads 25
3343 Use of Artificial Intelligence and Two Object-Oriented Approaches (k-NN and SVM) for the Detection and Characterization of Wetlands in the Centre-Val de Loire Region, France
Authors: Bensaid A., Mostephaoui T., Nedjai R.
Abstract:
Nowadays, wetlands are the subject of contradictory debates opposing scientific, political, and administrative meanings. Indeed, given their multiple services (drinking water, irrigation, hydrological regulation, mineral, plant, and animal resources...), wetlands concentrate many socio-economic and biodiversity issues. In some regions, they can cover vast areas (>100 thousand ha) of the landscape, such as the Camargue area in the south of France, inside the Rhone delta. The high biological productivity of wetlands, the strong natural selection pressures, and the diversity of aquatic environments have produced many species of plants and animals that are found nowhere else. These environments are tremendous carbon sinks and biodiversity reserves; depending on their age, composition, and surrounding environmental conditions, wetlands play an important role in global climate projections. Covering more than 3% of the earth's surface, wetlands have experienced since the beginning of the 1990s a tremendous revival of interest, which has resulted in the multiplication of inventories, scientific studies, and management experiments. The geographical and physical characteristics of the wetlands of the Centre-Val de Loire region conceal a large number of natural habitats that harbour a great biological diversity. These wetlands are still influenced by human activities, especially agriculture, which affect their layout and functioning. In this perspective, decision-makers need to delimit spatial objects (natural habitats) in a certain way to be able to take action. Wetlands are no exception to this rule, even if delimiting a type of environment whose main characteristic is often to occupy the transition between aquatic and terrestrial environments seems a difficult exercise. However, it is possible to map wetlands with databases derived from the interpretation of photos and satellite images, such as the European Corine Land Cover database, which allows the characteristic wetland types to be quantified and characterized for each place. Scientific studies have shown limitations when using high spatial resolution images (SPOT, Landsat, ASTER) for the identification and characterization of small wetlands (1 hectare), as these wetlands generally represent spatially complex features. Indeed, the use of very high spatial resolution images (>3 m) is necessary to map both small and large areas. Moreover, with the recent evolution of artificial intelligence (AI), deep learning methods for satellite image processing have shown much better performance compared to traditional processing based only on pixel structures. Our research work is based on spectral and textural analysis of very high resolution images (SPOT and IRC orthoimages) using two object-oriented approaches: the k-nearest neighbour (k-NN) approach and the Support Vector Machine (SVM) approach. The k-NN approach gave good results for the delineation of wetlands (wet marshes and moors, ponds, artificial wetlands, water body edges, mountain wetlands, river edges, and brackish marshes), with a kappa index higher than 85%.
Keywords: land development, GIS, sand dunes, segmentation, remote sensing
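The object-oriented classification comparison can be sketched as follows. The simulated features are stand-ins for per-object spectral and textural attributes extracted from the imagery, and the kappa index is computed as in the reported evaluation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import cohen_kappa_score

# A minimal sketch: compare k-NN and SVM on simulated object features
# (labels 0/1 = non-wetland/wetland).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))              # e.g., band means + texture measures
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf", C=1.0))]:
    clf.fit(X_tr, y_tr)
    kappa = cohen_kappa_score(y_te, clf.predict(X_te))
    print(f"{name}: kappa = {kappa:.3f}")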
Procedia PDF Downloads 76
3342 Upgrades for Hydric Supply in Water System Distribution: Use of the Bayesian Network and Technical Expedients
Authors: Elena Carcano, James Ball
Abstract:
This work details the strategies adopted by Italian water utilities for the distribution of water in emergency conditions, which range from earthquakes and droughts to floods and fires. Several water bureaus located across the national territory have been interviewed, and the collected information has been used to build a database of potential interventions. The work discusses the actions adopted by water utilities. These are generally prioritized in order to minimize the social, temporal, and economic burden that the damaged and nearby areas have to support. Actions are defined using the Bayesian Network approach, which constitutes the hard core of any decision support system: the Bayesian Networks suggest interventions for real and most likely risky cases. The added value of this research consists in supplying the national bureau in charge of managing havoc and catastrophic situations, namely Protezione Civile, with a univocal plot outline so as to be able to handle actions uniformly despite the different local laws or contradictory customs that would otherwise squander recovery conditions, proper technical service, and economic aid. The paper is organized as follows: in section 1, the introduction is stated; section 2 provides a brief discussion of Bayesian Networks (BNs); section 3 introduces the adopted methodology; and in the last sections, results are presented and conclusions are drawn.
Keywords: hierarchical process, strategic plan, water emergency conditions, water supply
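A toy Bayesian Network of the kind described can be sketched with the pgmpy library. The structure, variables, and probabilities below are illustrative assumptions, not the elicited values from the interviewed water bureaus.

```python
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# A minimal sketch of a decision-support BN for water emergencies.
model = BayesianNetwork([("Earthquake", "PipeDamage"), ("Drought", "LowPressure"),
                         ("PipeDamage", "LowPressure")])
model.add_cpds(
    TabularCPD("Earthquake", 2, [[0.95], [0.05]]),
    TabularCPD("Drought", 2, [[0.8], [0.2]]),
    TabularCPD("PipeDamage", 2, [[0.99, 0.3], [0.01, 0.7]],
               evidence=["Earthquake"], evidence_card=[2]),
    TabularCPD("LowPressure", 2,
               [[0.9, 0.4, 0.2, 0.05], [0.1, 0.6, 0.8, 0.95]],
               evidence=["Drought", "PipeDamage"], evidence_card=[2, 2]),
)
infer = VariableElimination(model)
# Observing low pressure without drought points toward pipe damage:
print(infer.query(["PipeDamage"], evidence={"LowPressure": 1, "Drought": 0}))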
Procedia PDF Downloads 167
3341 An Extended Domain-Specific Modeling Language for Marine Observatory Relying on Enterprise Architecture
Authors: Charbel Aoun, Loic Lagadec
Abstract:
A Sensor Network (SN) can be considered as an operation with two phases: (1) observation/measuring, which means the accumulation of the gathered data at each sensor node; (2) transferring the collected data to some processing center (e.g., fusion servers) within the SN. Therefore, an underwater sensor network can be defined as a sensor network deployed underwater to monitor underwater activity. The deployed sensors, such as hydrophones, are responsible for registering underwater activity and transferring it to more advanced components. The process of data exchange between the aforementioned components perfectly defines the Marine Observatory (MO) concept, which provides information on ocean state, phenomena, and processes. The first step towards the implementation of this concept is defining the environmental constraints and the required tools and components (marine cables, smart sensors, data fusion servers, etc.). The logical and physical components that are used in these observatories perform critical functions such as the localization of underwater moving objects. These functions can be orchestrated with other services (e.g., military or civilian reaction). In this paper, we present an extension to our MO meta-model that is used to generate a design tool (ArchiMO). We propose new constraints to be taken into consideration at design time, and we illustrate our proposal with an example from the MO domain. Additionally, we generate the corresponding simulation code using our self-developed domain-specific model compiler. On the one hand, this illustrates our approach of relying on an Enterprise Architecture (EA) framework that respects multiple views, the perspectives of stakeholders, and domain specificity. On the other hand, it helps reduce both the complexity of and the time spent in the design activity, while preventing design modeling errors when porting this activity to the MO domain. In conclusion, this work aims to demonstrate that we can improve the design of complex systems based on the use of MDE technologies and a domain-specific modeling language with the associated tooling. The major improvement is providing an early validation step via a models-and-simulation approach to consolidate the system design.
Keywords: smart sensors, data fusion, distributed fusion architecture, sensor networks, domain specific modeling language, enterprise architecture, underwater moving object, localization, marine observatory, NS-3, IMS
Procedia PDF Downloads 182
3340 Pedestrian Areas, Development Stimulus in Urban Old Fabrics; Analyzing Stroget, Pedestrian Street in Copenhagen
Authors: Kiomars Habibi, Mostafa Behzadfar, Airin Jaberi
Abstract:
Designing appropriate places for the comfort of pedestrians is one of the most important aspects of modern urbanism and a stimulus for the renovation and rehabilitation of urban old fabrics. Cities designed for pedestrians, with a complete network of streets without cars, can be considered among the best habitations in the world. The number of such cities, with networks of streets and squares in which beauty, enjoyment, and comfort are designed chiefly for pedestrians, is increasing around the world; examples include Stockholm, Copenhagen, Munich, Frankfurt, Venice, and Rome. In this paper, we explain the factors behind the efficiency of these cities by examining one of the most important pedestrian ways of the world: Strøget, a car-free zone in Copenhagen, Denmark. This popular tourist attraction in the center of town is the longest pedestrian shopping area in Europe. Analyses indicate that worldwide experience with the renovation and rehabilitation of old fabrics shows many advantages in exploiting the idea of the pedestrian way for the regeneration of old fabrics. Transforming streets into appropriate places for the comfort of pedestrians, expanding public spaces such as city squares, and decreasing building masses, together with the comfort and peace these measures bring, are the main reasons for the success of the Strøget pedestrian street in the urban old fabric of Copenhagen. Hypothesis: The Strøget pedestrian street has been the development stimulus in Copenhagen and, as a result, in the development of its urban old fabric.
Keywords: development, stimulus, pedestrian street, urban landscape, Stroget
Procedia PDF Downloads 115
3339 Investigations of the Crude Oil Distillation Preheat Section in Unit 100 of Abadan Refinery and Its Recommendation
Authors: Mahdi GoharRokhi, Mohammad H. Ruhipour, Mohammad R. ZamaniZadeh, Mohsen Maleki, Yusef Shamsayi, Mahdi FarhaniNejad, Farzad FarrokhZadeh
Abstract:
Possessing massive resources of natural gas and petroleum, Iran holds a special place among oil-producing countries, according to international energy institutions. In order to use these resources, the development and optimization of refineries and industrial units is mandatory. The heat exchanger is one of the most important and strategic pieces of equipment, whose key role in the production process is clear to everyone. For instance, if the temperature of a process fluid is not set as needed by the heat exchangers, the specifications of the desired product can change profoundly. Crude oil enters a network of heat exchangers in the atmospheric distillation section before getting into the distillation tower; consequently, the proper functioning of these heat exchangers can significantly affect the operation of the distillation tower. In this paper, different scenarios for the preheating of the crude oil are studied using oil and gas simulation software, and the results are discussed. Having reviewed the various scenarios, we propose adding a heat exchanger to the preheat network as the most efficient measure for improving all governing parameters of the tower, i.e., temperature, pressure, and reflux rate. This exchanger is embedded in the crude oil's path: crude oil enters the exchanger after E-101 and exchanges heat with the kerosene pump-around discharge from E-136. As depicted in the results, this will efficiently assist the improvement of process operation and reduce side expenses.
Keywords: atmospheric distillation unit, heat exchanger, preheat, simulation
Procedia PDF Downloads 664
3338 Transforming Breast Density Measurement with Artificial Intelligence: Population-Level Insights from BreastScreen NSW
Authors: Douglas Dunn, Richard Walton, Matthew Warner-Smith, Chirag Mistry, Kan Ren, David Roder
Abstract:
Introduction: Breast density is a risk factor for breast cancer, both because increased fibroglandular tissue can harbor malignancy and because dense tissue can mask lesions on mammography. Evaluation of breast density measurement is therefore useful for risk stratification at both the individual and population level. This study investigates the performance of Lunit INSIGHT MMG for automated breast density measurement. We analyze the reliability of Lunit compared to breast radiologists, explore density variations across the BreastScreen NSW population, and examine the impact of breast implants on density measurements. Methods: 15,518 mammograms were utilized for a comparative analysis of intra- and inter-reader reliability between Lunit INSIGHT MMG and breast radiologists. Subsequently, Lunit was used to evaluate 624,113 mammograms to investigate density variations according to age and country of birth, providing insights into diverse population subgroups. Finally, we compared breast density in 4,047 clients with implants to clients without implants, controlling for age and country of birth. Results: The inter-reader agreement between Lunit and breast radiologists yielded a weighted kappa coefficient of 0.72 (95% CI 0.71-0.73). The highest breast densities were seen in women with a North-East Asian background, whilst those of Aboriginal background had the lowest density. Across all backgrounds, density was shown to decrease with age, though at different rates according to country of birth. Clients with implants had higher density relative to the age-matched no-implant strata. Conclusion: Lunit INSIGHT MMG demonstrates reasonable inter- and intra-observer reliability for automated breast density measurement. The scale of this study is significantly larger than any previous study assessing breast density, owing to the ability to process large volumes of data using AI. As a result, it provides valuable insights into population-level density variations. Our findings highlight the influence of age, country of birth, and breast implants on density, emphasizing the need for personalized risk assessment and screening approaches. The large-scale and diverse nature of this study enhances the generalisability of our results, offering valuable information for breast cancer screening programs internationally.
Keywords: breast cancer, screening, breast density, artificial intelligence, mammography
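The agreement statistic reported above can be reproduced on simulated grades. Whether the evaluation used linear or quadratic weighting is not stated in the abstract, so the quadratic choice below is an assumption, and the grades are simulated rather than BreastScreen NSW data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# A minimal sketch: weighted kappa between AI-assigned and radiologist-assigned
# density grades (e.g., BI-RADS a..d encoded as 0..3).
rng = np.random.default_rng(42)
radiologist = rng.integers(0, 4, size=500)
noise = rng.integers(-1, 2, size=500) * (rng.random(500) < 0.25)  # occasional +/-1 disagreement
ai_reader = np.clip(radiologist + noise, 0, 3)

kappa = cohen_kappa_score(radiologist, ai_reader, weights="quadratic")
print(f"weighted kappa = {kappa:.2f}")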
Procedia PDF Downloads 19
3337 Reducing Hazardous Materials Releases from Railroad Freights through Dynamic Trip Plan Policy
Authors: Omar A. Abuobidalla, Mingyuan Chen, Satyaveer S. Chauhan
Abstract:
Railroad transportation of hazardous materials freight is important to the North American economy, supporting the national supply chain. This paper introduces several extensions of the dynamic hazardous materials trip plan problem. The problem captures most of the operational features of real-world railroad transportation systems that dynamically initiate a set of blocks and assign each shipment to a single block path or multiple block paths. The dynamic hazardous materials trip plan policies have distinguishing features: they integrate the blocking plan and the block activation decisions. We also present a non-linear mixed integer programming formulation for each variant and present managerial insights based on a hypothetical railroad network. The computational results reveal that the dynamic car scheduling policies are not only able to take advantage of the capacity of the network but are also capable of diminishing population and environmental risks by rerouting the active blocks along the least risky train services, without sacrificing the cost advantage of the railroad. The empirical results of this research illustrate that the issue of integrating the blocking plan with the train makeup of hazardous materials freight must receive closer attention.
Keywords: dynamic car scheduling, planning and scheduling hazardous materials freights, airborne hazardous materials, gaussian plume model, integrated blocking and routing plans, box model
Procedia PDF Downloads 209
3336 Web-Based Decision Support Systems and Intelligent Decision-Making: A Systematic Analysis
Authors: Serhat Tüzün, Tufan Demirel
Abstract:
Decision Support Systems (DSS) have been investigated by researchers and technologists for more than 35 years. This paper analyses the developments in the architecture and software of these systems, provides a systematic analysis of different Web-based DSS approaches and Intelligent Decision-making Technologies (IDT), and offers suggestions for future studies. The Decision Support Systems literature begins with the building of model-oriented DSS in the late 1960s, theory developments in the 1970s, and the implementation of financial planning systems and Group DSS in the early and mid-80s. It then documents the origins of Executive Information Systems, online analytical processing (OLAP), and Business Intelligence. The implementation of Web-based DSS occurred in the mid-1990s. With the beginning of the new millennium, intelligence became the main focus of DSS studies. Web-based technologies are having a major impact on the design, development, and implementation processes for all types of DSS, and are being utilized for the development of DSS tools by leading developers of decision support technologies. Major companies are encouraging their customers to port DSS applications, such as data mining, customer relationship management (CRM), and OLAP systems, to a web-based environment. Similarly, real-time data fed from manufacturing plants are now helping floor managers make decisions regarding production adjustment, to ensure that high-quality products are produced and delivered. Web-based DSS are being employed by organizations as decision aids for employees as well as customers; a common usage has been to assist customers in configuring products and services according to their needs. These systems allow individual customers to design their own products by choosing from a menu of attributes, components, prices, and delivery options. The Intelligent Decision-making Technologies (IDT) domain is a fast-growing area of research that integrates various aspects of computer science and information systems, including intelligent systems, intelligent technology, intelligent agents, artificial intelligence, fuzzy logic, neural networks, machine learning, knowledge discovery, computational intelligence, data science, big data analytics, inference engines, recommender systems or engines, and a variety of related disciplines. Innovative applications that emerge using IDT often have a significant impact on decision-making processes in government, industry, business, and academia in general. This is particularly pronounced in finance, accounting, healthcare, computer networks, real-time safety monitoring, and crisis response systems. Similarly, IDT is commonly used in military decision-making systems, security, marketing, stock market prediction, and robotics. Even though many research studies have been conducted on Decision Support Systems, a systematic analysis of the subject is still missing. To address this need, this paper surveys recent articles about DSS. The literature has been deeply reviewed, and by classifying previous studies according to their preferences, a taxonomy for DSS has been prepared. With the aid of the taxonomic review and the recent developments in the field, this study analyzes future trends in decision support systems.
Keywords: decision support systems, intelligent decision-making, systematic analysis, taxonomic review
Procedia PDF Downloads 283
3335 Integrating Inference, Simulation and Deduction in Molecular Domain Analysis and Synthesis with Peculiar Attention to Drug Discovery
Authors: Diego Liberati
Abstract:
Standard molecular modeling is traditionally done through the Schrödinger equation, with the help of powerful tools that manage it atom by atom, often requiring High Performance Computing. Here, a full portfolio of new tools is offered as a quite general-purpose approach, conjugating statistical inference in the so-called eXplainable Artificial Intelligence framework (in the form of Machine Learning of understandable rules) with the more traditional modeling, simulation, and control theory of mixed logic-dynamic hybrid processes, exemplified on a popular set of chemical physics problems.
Keywords: understandable rules ML, k-means, PCA, PieceWise Affine Auto Regression with eXogenous input
Procedia PDF Downloads 35
3334 An Evaluative Microbiological Risk Assessment of Drinking Water Supply in the Carpathian Region: Identification of Occurrent Hazardous Bacteria with Quantitative Microbial Risk Assessment Method
Authors: Anikó Kaluzsa
Abstract:
The author aims to introduce and analyze the microbiological safety hazards that indicate the presence of secondary contamination in the water supply system. Since drinking water is a primary food and a basic condition of life, special attention should be paid to its quality. Among the microbiological features that can be found in water, there are indicators whose presence is clear evidence of water contamination; based on these, there is no need to perform other diagnostics, because they adequately prove the contamination of the given water supply section. Laboratory analysis can help, both technologically and temporally, to identify contamination, but it matters how long the removal takes and whether the disinfection process takes place in time. Identifying the factors that often occur in the same places, or whose chance of occurrence is greater than average, facilitates this work. The pathogen microbiological risk assessment determines, with the help of several features, the microbiological hazards most likely to occur in the Carpathian basin. Among all the microbiological indicators recommended by the World Health Organization as targets for routine inspection, the appearance of Escherichia coli in the water network is of paramount importance, as its presence indicates the potential presence of enteric pathogens or other contaminants in the network. In addition, the author presents the steps of microbiological risk assessment, analyzing the pathogenic micro-organisms registered as the most critical.
Keywords: drinking water, E. coli, microbiological indicators, risk assessment, water safety plan
Procedia PDF Downloads 338
3333 A Machine Learning Approach for Classification of Directional Valve Leakage in the Hydraulic Final Test
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Due to increasing cost pressure in global markets, artificial intelligence is becoming a technology that is decisive for competition. Predictive quality enables machinery and plant manufacturers to ensure product quality by using data-driven forecasts via machine learning models as a decision-making basis for test results. The use of cross-process Bosch production data along the value chain of hydraulic valves is a promising approach to classifying the quality characteristics of workpieces.
Keywords: predictive quality, hydraulics, machine learning, classification, supervised learning
Procedia PDF Downloads 235
3332 Duality of Leagility and Governance: A New Normal Demand Network Management Paradigm under Pandemic
Authors: Jacky Hau
Abstract:
The prevalence of emerging technologies disrupts various industries as well as consumer behavior. Data collection is now at our fingertips, enabled by Internet-of-Things (IoT) devices. Big data analytics (BDA) has become possible and allows real-time demand network management (DNM) through a leagile supply chain. To further enhance its resilience and predictability, governance is examined as a means to promote supply chain transparency and trust in an efficient manner. Leagility combines lean thinking and agile techniques in supply chain management. It aims at reducing costs and waste, as well as maintaining responsiveness to volatile consumer demand, by adjusting the decoupling point where the product flow changes from push to pull. Leagility will only be successful when a collaborative planning, forecasting, and replenishment (CPFR) process, or the like, is in place throughout the supply chain's business entities. The governance and procurement of the supply chain, however, are crucial and challenging for the execution of CPFR, as every entity has to walk the talk generously for the sake of the overall benefit of supply chain performance, not to mention the complexity of exercising the policies both within and across the various supply chain business entities on account of organizational behavior and mutual trust. Empirical survey results showed that the effective timespan for demand forecasting has drastically shortened, from a planning horizon of months to one of weeks; thus agility shall come first, preferably followed by a lean approach, in a timely manner.
Keywords: governance, leagility, procure-to-pay, source-to-contract
Procedia PDF Downloads 117
3331 State Estimator Performance Enhancement: Methods for Identifying Errors in Modelling and Telemetry
Authors: M. Ananthakrishnan, Sunil K Patil, Koti Naveen, Inuganti Hemanth Kumar
Abstract:
The state estimation output of an EMS forms the base case for all other advanced applications used in real time by a power system operator. Tuning the state estimator is a repeated process and cannot be abandoned once a good solution is obtained. This paper demonstrates methods to improve the state estimator solution by identifying incorrect modelling and telemetry inputs to the application. In this work, the identification of database topology modelling errors by plotting the static network from node-to-node connection details is demonstrated with examples. Analytical methods to identify wrong transmission parameters, incorrect limits, and mistakes in pseudo load and generator modelling are explained through various observed cases. Further, methods for active and reactive power tuning using the bus summation display, the reactive power absorption summary, and transformer tap correction are also described. In a large power system, verifying all network static data and modelling parameters on a regular basis is difficult. The proposed tuning methods can easily be used by operators to quickly identify errors and obtain the best possible state estimation performance. This, in turn, can lead to improved decision-support capabilities, ultimately enhancing the safety and reliability of the power grid.
Keywords: active power tuning, database modelling, reactive power, state estimator
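The node-to-node topology check can be sketched with networkx. The bus names and branch list below are illustrative, showing how an islanded pair stands out as a likely modelling error when the static network is plotted.

```python
import networkx as nx
import matplotlib.pyplot as plt

# A minimal sketch: build the static network from node-to-node connection
# details and flag disconnected islands, which often indicate topology errors.
branches = [("BUS1", "BUS2"), ("BUS2", "BUS3"), ("BUS3", "BUS1"),
            ("BUS4", "BUS5")]  # BUS4-BUS5 left islanded -> likely a data error

grid = nx.Graph(branches)
islands = list(nx.connected_components(grid))
print(f"{len(islands)} island(s):", islands)

nx.draw_networkx(grid, with_labels=True, node_color="lightsteelblue")
plt.show()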
Procedia PDF Downloads 16
3330 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detecting student engagement involve periodic human observations that are subject to inter-rater reliability. Our solution uses real-time multimodal multisensor data, labeled by objective performance outcomes, to infer the engagement of students. The study involved four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement, with objective labels derived from the continuous performance test outcomes. To achieve this, a new type of continuous performance test is introduced, the Seek-X type. Nine features were extracted, including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best results. Using random forest, 93.3% classification accuracy for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-Nearest Neighbor, naïve Bayes, neural network, and support vector machine. We showed that the multisensor approach achieved higher accuracy than features from any reduced set of sensors, and that high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature for classifying engagement and distraction was shown to be eye gaze. It has been shown that we can accurately predict the level of engagement of students with learning disabilities with a real-time approach that is not subject to inter-rater reliability or human observation, and is not reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement
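The winning evaluation protocol can be sketched as below. The random features are stand-ins for the nine gaze/EEG/pose/interaction features, and the 59 samples mirror the number of sessions; the forest size is an assumption.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

# A minimal sketch: random forest evaluated with leave-one-out cross-validation.
rng = np.random.default_rng(7)
X = rng.normal(size=(59, 9))               # 59 sessions x 9 features
y = rng.integers(0, 2, size=59)            # 1 = engaged, 0 = disengaged

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.3f}")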
Procedia PDF Downloads 97
3329 A Comparative Study of Deep Learning Methods for COVID-19 Detection
Authors: Aishrith Rao
Abstract:
COVID-19 is a pandemic which has resulted in thousands of deaths around the world and a huge impact on the global economy. Testing is a major issue, as test kits have limited availability and are expensive to manufacture. Using deep learning methods on radiology images for the detection of the coronavirus, as these images contain information about the spread of the virus in the lungs, is extremely economical and time-saving, and can be used in areas with a lack of testing facilities. This paper focuses on binary classification and multi-class classification of COVID-19 and other diseases such as pneumonia, tuberculosis, etc. Different deep learning methods, such as VGG-19, COVID-Net, ResNet + SVM, Deep CNN, DarkCovidNet, etc., have been used, and their accuracy has been compared using a chest X-ray dataset.
Keywords: deep learning, computer vision, radiology, COVID-19, ResNet, VGG-19, deep neural networks
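A transfer-learning baseline of the VGG-19 kind compared in the paper can be sketched as follows. The input size, classification head, and commented training call are common defaults, not the paper's exact configuration, and the dataset objects are assumed to be supplied separately.

```python
from tensorflow import keras
from tensorflow.keras.applications import VGG19

# A minimal sketch: reuse ImageNet features from VGG-19 and train only a small
# classification head for the binary COVID-19 vs. normal task.
base = VGG19(weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False

model = keras.Sequential([
    base,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(2, activation="softmax"),  # COVID-19 vs. normal
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=5)  # hypothetical chest X-ray datasets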
Procedia PDF Downloads 165
3328 Leveraging xAPI in a Corporate e-Learning Environment to Facilitate the Tracking, Modelling, and Predictive Analysis of Learner Behaviour
Authors: Libor Zachoval, Daire O Broin, Oisin Cawley
Abstract:
E-learning platforms such as Blackboard have two major shortcomings: limited data capture, a consequence of the limitations of SCORM (Shareable Content Object Reference Model), and a lack of incorporation of Artificial Intelligence (AI) and machine learning algorithms, which could lead to better course adaptations. With the recent development of the Experience Application Programming Interface (xAPI), a large number of additional types of data can be captured, and that opens a window of possibilities from which online education can benefit. In a corporate setting, where companies invest billions in the learning and development of their employees, some learner behaviours can be troublesome, for they can hinder a learner's knowledge development. Behaviours that hinder knowledge development also raise ambiguity about a learner's knowledge mastery, specifically those related to gaming the system. Furthermore, a company receives little benefit from its investment if employees pass courses without possessing the required knowledge, and potential compliance risks may arise. Using xAPI and rules derived from a state-of-the-art review, we identified three learner behaviours, primarily related to guessing, in a corporate compliance course. The identified behaviours are: trying each option for a question, specifically for multiple-choice questions; selecting a single option for all the questions on the test; and continuously repeating tests upon failing, as opposed to going over the learning material. These behaviours were detected in learners who repeated the test at least 4 times before passing the course. These findings suggest that gauging the mastery of a learner from multiple-choice test scores alone is a naive approach. Thus, next steps will consider the incorporation of additional data points, knowledge estimation models to model a learner's knowledge mastery more accurately, and analysis of the data for correlations between knowledge development and the identified learner behaviours. Additional work could explore how learner behaviours could be utilised to make changes to a course: for example, the course content may require modification (certain sections of learning material may be shown not to be helpful to many learners in mastering the intended learning outcomes) or the course design (such as the type and duration of feedback).
Keywords: artificial intelligence, corporate e-learning environment, knowledge maintenance, xAPI
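One of the identified behaviours, selecting a single option for every question, can be detected with a simple rule over xAPI-style statements. The simplified statement fields below are assumptions about what a platform emits; real xAPI statements carry actor/verb/object/result structures whose exact shape varies by provider.

```python
from collections import defaultdict

# A minimal sketch of rule-based behaviour detection over simplified
# xAPI-style answer records.
statements = [
    {"actor": "alice", "question": "q1", "response": "b", "success": False},
    {"actor": "alice", "question": "q2", "response": "b", "success": False},
    {"actor": "alice", "question": "q3", "response": "b", "success": True},
]

answers = defaultdict(list)
for s in statements:
    answers[s["actor"]].append(s["response"])

for learner, responses in answers.items():
    if len(responses) >= 3 and len(set(responses)) == 1:
        print(f"{learner}: same option chosen for every question -> possible guessing")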
Procedia PDF Downloads 126
3327 A Network Economic Analysis of Friendship, Cultural Activity, and Homophily
Authors: Siming Xie
Abstract:
In social networks, the term homophily refers to the tendency of agents with similar characteristics to link with one another, and it is robustly observed across many contexts and dimensions. The starting point of this research is the observation that the 'type' of an agent is not a single exogenous variable. Agents, despite their differences in race, religion, and other hard-to-alter characteristics, may share interests and engage in activities that cut across those predetermined lines. This research aims to capture the interaction of homophily effects in a model where agents have two-dimensional characteristics (i.e., race and a personal hobby such as basketball, which one either likes or dislikes) and where there are biases in meeting opportunities and in favor of same-type friendships. A novel feature of the model is a matching process with biased meeting probabilities on the different dimensions, which helps in understanding the structuring process in multidimensional networks without missing layer interdependencies. The main contribution of this study is a welfare-based matching process for agents with multi-dimensional characteristics. In particular, this research shows that biases in meeting opportunities on one dimension lead to the emergence of homophily on the other dimension. The objective is to determine the pattern of homophily in network formation, which sheds light on our understanding of segregation and its remedies. By constructing a two-dimensional matching process, this study explores a method of describing agents' homophilous behavior in a multidimensional social network and constructs a game in which minorities and majorities play different strategies in a society. It also shows that the optimal strategy is determined by relative group size, and that a society suffers more from social segregation when the two racial groups are of similar size. The research also has political implications: cultivating shared characteristics among agents helps diminish social segregation, but only if the minority group is small enough. This research includes both theoretical models and empirical analysis. Starting from the friendship formation model, the author first uses MATLAB to perform iterative calculations, then derives the corresponding mathematical proofs of the previous results, and finally shows that the model is consistent with empirical evidence on high school friendships. The anonymized data come from The National Longitudinal Study of Adolescent Health (Add Health).
Keywords: homophily, multidimension, social networks, friendships
Procedia PDF Downloads 174
3326 From Ride-Hailing App to Diversified and Sustainable Platform Business Model
Authors: Ridwan Dewayanto Rusli
Abstract:
We show how prisoner's dilemma-type competition problems can be mitigated through rapid platform diversification and ecosystem expansion. We analyze a ride-hailing company in Southeast Asia, Gojek, whose network grew to more than 170 million users, comprising consumers, partner drivers, merchants, and complementors, within a few years, and which has already achieved higher contribution margins than its ride-hailing peers Uber and Lyft. Its ecosystem integrates ride-hailing, food delivery and logistics, merchant solutions, e-commerce, marketplace and advertising, payments, and fintech offerings. The company continues growing its network of complementors and app developers, expanding content, and gaining critical mass in consumer data analytics and advertising. We compare the company's growth and diversification trajectory with those of its main international rivals and peers. The company's rapid growth and future potential are analyzed using Cusumano's (2012) Staying Power and Six Principles, Hax and Wilde's (2003) and Hax's (2010) Delta Model, and Santos' (2016) home-market advantages frameworks. The recently announced multi-billion-dollar merger with one of Southeast Asia's largest e-commerce majors lends additional support to the above arguments.
Keywords: ride-hailing, prisoner's dilemma, platform and ecosystem strategy, digital applications, diversification, home market advantages, e-commerce
Procedia PDF Downloads 101
3325 Assessing Climate-Induced Species Range Shifts and Their Impacts on the Protected Seascape on Canada's East Coast Using Species Distribution Models and Future Projections
Authors: Amy L. Irvine, Gabriel Reygondeau, Derek P. Tittensor
Abstract:
Marine protected areas (MPAs) within Canada's exclusive economic zone help ensure the conservation and sustainability of marine ecosystems and the continued provision of ecosystem services to society (e.g., food, carbon sequestration). With ongoing and accelerating climate change, however, MPAs may become less effective at fulfilling these outcomes. Many populations of species, especially those at their thermal range limits, may shift to cooler waters or become extirpated due to climate change, resulting in new species compositions and ecological interactions within static MPA boundaries. While Canadian MPA management follows international guidelines for marine conservation, no consistent approach exists for adapting MPA networks to climate change and the resulting altered ecosystem conditions. To fill this gap, projected climate-driven shifts in species distributions on Canada's east coast were analyzed to identify when native species emigrate from, and novel species immigrate into, the network, and how high-mitigation and high-carbon-emission scenarios influence these timelines. Indicators of the ecological changes that these species shifts cause in the biological community were also developed. Overall, our research provides projections of climate change impacts and helps to guide adaptive management responses within the Canadian east coast MPA network.
Keywords: climate change, ecosystem modeling, marine protected areas, management
Procedia PDF Downloads 108
3324 Advancements in AI Training and Education for a Future-Ready Healthcare System
Authors: Shamie Kumar
Abstract:
Background: Radiologists and radiographers (RR) need to educate themselves and their colleagues to ensure that AI is integrated safely and usefully, and in a way that always benefits patients. AI education and training are fundamental to how RR work and interact with it, such that they feel confident using it as part of their clinical practice in a way they understand. Methodology: This exploratory research outlines the current education and training gaps for radiographers and radiologists in AI radiology diagnostics. It reviews the status, skills, and challenges of education and training, the understanding of artificial intelligence within daily clinical practice, why such understanding is fundamental, and the justification for why learning about AI is essential for wider adoption. Results: Current knowledge among RR is very sparse and country-dependent, and with radiologists being the majority of the end-users of AI, their targeted training and AI learning opportunities surpass those available to radiographers. Many papers suggest there is a lack of knowledge, understanding, and training in AI in radiology amongst RR, and because of this, they are unable to comprehend exactly how AI works and integrates, the benefits of using it, and its limitations. There is an indication that they wish to receive specific training; however, both professions need to actively engage in learning about it and develop the skills that enable them to use it effectively. Variability is to be expected within the professions in their degree of commitment to AI, as most do not understand its value; this only adds to the need to train and educate RR. Currently, there is little AI teaching in either undergraduate or postgraduate study programs, and it is not readily available. Beyond this, other training programs, courses, workshops, and seminars are available; most are short, single sessions rather than a continuation of learning, and cover a basic understanding of AI and peripheral topics such as ethics, legal issues, and the potential of AI. There appears to be an obvious gap between the content these training programs offer and what RR need and want to learn. Because of this, there is a risk of ineffective learning outcomes and of attendees feeling a lack of clarity and depth in their understanding of the practicality of using AI in a clinical environment. Conclusion: Education, training, and courses need to have defined learning outcomes with relevant concepts, ensuring theory and practice are taught as a continuation of the learning process, based on use cases specific to a clinical working environment. Undergraduate and postgraduate courses should be developed robustly, ensuring they are delivered by experts in the field; in addition, training and other programs should be delivered as continuing professional development and aligned with accredited institutions for a degree of quality assurance.
Keywords: artificial intelligence, training, radiology, education, learning
Procedia PDF Downloads 92
3323 Hidro-IA: An Artificial Intelligent Tool Applied to Optimize the Operation Planning of Hydrothermal Systems with Historical Streamflow
Authors: Thiago Ribeiro de Alencar, Jacyro Gramulia Junior, Patricia Teixeira Leite
Abstract:
The area of the electricity sector that deals, in a coordinated manner, with meeting energy needs using hydroelectric plants is called Operation Planning of Hydrothermal Power Systems (OPHPS). Its purpose is to find an operating policy for providing electrical power to the system over a given period, with reliability and minimal cost. Therefore, it is necessary to determine an optimal generation schedule for each hydroelectric plant, in each interval, so that the system meets the demand reliably, avoiding rationing in years of severe drought, and so that the expected cost of operation during the planning period is minimized, defining an appropriate strategy for thermal complementation. Several optimization algorithms specifically applied to this problem have been developed and are in use. Although they provide solutions to the various problems encountered, these algorithms have some weaknesses: difficulties in convergence, simplification of the original formulation of the problem, or the complexity of the objective function. An alternative to these challenges is the development of simulation-optimization techniques that are more sophisticated and reliable and can assist the planning of the operation. Thus, this paper presents the development of a computational tool, namely Hydro-IA, for solving the optimization problem identified, while providing the user with easy handling. The intelligent optimization technique adopted is the Genetic Algorithm (GA), and the programming language is Java. First, the chromosomes were modeled; then the evaluation function of the problem and the operators involved were implemented; and finally, the graphical interfaces for user access were drafted. The results with the Genetic Algorithm were compared with the nonlinear programming (NLP) optimization technique. Tests were conducted with seven hydraulically interconnected hydroelectric plants using historical streamflow from 1953 to 1955. The comparison between the GA and NLP techniques shows that the operating cost found by the GA becomes increasingly smaller than that of NLP as the number of interconnected hydroelectric plants increases. The program achieved coherent performance in problem resolution without the need for simplification of the calculations, together with ease of manipulation of the simulation parameters and visualization of the output results.
Keywords: energy, optimization, hydrothermal power systems, artificial intelligence and genetic algorithms
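A GA of the kind adopted can be sketched on a one-reservoir toy problem. The chromosome encoding (releases per stage), penalty weights, and quadratic thermal cost below are illustrative assumptions, not Hydro-IA's actual formulation, which handles seven interconnected plants.

```python
import numpy as np

# A minimal GA sketch: evolve a release schedule that minimizes thermal
# complementation cost while respecting reservoir storage bounds via penalties.
T, POP, GEN = 12, 60, 200          # stages, population size, generations
inflow = np.full(T, 80.0)          # inflows per stage (illustrative units)
demand, v0, vmax = 120.0, 500.0, 1000.0
rng = np.random.default_rng(1)

def cost(release):                 # thermal cost + storage-violation penalties
    thermal = np.maximum(demand - release, 0.0)
    storage = v0 + np.cumsum(inflow - release)
    penalty = np.sum(np.maximum(-storage, 0) + np.maximum(storage - vmax, 0))
    return np.sum(thermal ** 2) + 1e3 * penalty

pop = rng.uniform(0, demand, size=(POP, T))   # chromosome = releases per stage
for _ in range(GEN):
    fitness = np.array([cost(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:POP // 2]]          # elitist selection
    cut = rng.integers(1, T, size=POP // 2)
    kids = np.array([np.r_[parents[i][:c], parents[-i - 1][c:]]
                     for i, c in enumerate(cut)])           # one-point crossover
    kids += rng.normal(0, 2.0, kids.shape) * (rng.random(kids.shape) < 0.1)
    pop = np.vstack([parents, np.clip(kids, 0, demand)])    # mutation + clip

best = pop[np.argmin([cost(ind) for ind in pop])]
print("best schedule cost:", cost(best))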
Procedia PDF Downloads 424
3322 The Postcognitivist Era in Cognitive Psychology
Authors: C. Jameke
Abstract:
During the cognitivist era in cognitive psychology, a theory of internal rules and symbolic representations was posited as an account of human cognition. This type of cognitive architecture had its heyday during the 1970s and 80s, but it has now been largely abandoned in favour of subsymbolic architectures (e.g., connectionism), non-representational frameworks (e.g., dynamical systems theory), and statistical approaches such as Bayesian theory. In this presentation, I describe this changing landscape of research and comment on the increasing influence of neuroscience on cognitive psychology. I then briefly review a few recent developments in connectionism and neurocomputation relevant to cognitive psychology, and critically discuss the assumption made by some researchers in these frameworks that higher-level aspects of human cognition are simply emergent properties of massively large distributed neural networks.
Keywords: connectionism, emergentism, postcognitivist, representations, subsymbolic architecture
Procedia PDF Downloads 583
3321 Neural Correlates of Arabic Digits Naming
Authors: Fernando Ojedo, Alejandro Alvarez, Pedro Macizo
Abstract:
In the present study, we explored the electrophysiological correlates of Arabic digit naming to examine the semantic processing of numbers. Participants named Arabic digits grouped by category or intermixed with exemplars of other semantic categories while the N400 event-related potential was examined. Around 350-450 ms after the presentation of the Arabic digits, brain waves were more positive in anterior regions and more negative in posterior regions when stimuli were grouped by category relative to the mixed condition. Contrary to what was found in other studies, the electrophysiological results suggest that the production of numerals involves semantic mediation.
Keywords: Arabic digit naming, event-related potentials, semantic processing, number production
Procedia PDF Downloads 587