Search results for: artificial neural networks controller
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5582

3272 Extension of Moral Agency to Artificial Agents

Authors: Sofia Quaglia, Carmine Di Martino, Brendan Tierney

Abstract:

Artificial Intelligence (A.I.) permeates various aspects of modern life, from the machine learning algorithms predicting stocks on Wall Street to the killing of belligerents and innocents alike on the battlefield. Moreover, the end goal is to create autonomous A.I., meaning that humans will be absent from the decision-making process. The question arises naturally: when an A.I. does something wrong, when its behavior is harmful to the community and its actions go against the law, who is to be held responsible? This research in A.I. and robot ethics focuses mainly on robot rights, and its ultimate objective is to answer the following questions: (i) What is the function of rights? (ii) Who is a right holder, what is personhood, and what requirements are needed to be a moral agent (and therefore accountable)? (iii) Can an A.I. be a moral agent (ontological requirements)? And finally, (iv) ought it to be one (ethical implications)? To answer these questions, the research project was carried out as a collaboration between the School of Computer Science at the Technical University of Dublin, which oversaw the technical aspects of the work, and the Department of Philosophy at the University of Milan, which supervised the philosophical framework and argumentation of the project. Firstly, it was found that all rights are positive and based on consensus; they change over time with circumstances. Their function is to protect the social fabric and avoid dangerous situations. The same holds for the requirements considered necessary to be a moral agent: they are not absolute; in fact, they are constantly redesigned. Hence, the next logical step was to identify which requirements are regarded as fundamental in real-world judicial systems and to compare them with those used in philosophy. Autonomy, free will, intentionality, consciousness, and responsibility were identified as the requirements for being considered a moral agent. The work went on to build a symmetrical system between personhood and A.I. to enable the ontological differences between the two to emerge. Each requirement is introduced, explained through the most relevant theories of contemporary philosophy, and observed in its manifestation in A.I. Finally, after completing the philosophical and technical analysis, conclusions were drawn. As underlined in the research questions, there are two issues regarding the assignment of moral agency to artificial agents: first, whether all the ontological requirements are present, and second, whether, present or not, an A.I. ought to be considered an artificial moral agent. From an ontological point of view, it is very hard to prove that an A.I. could be autonomous, free, intentional, conscious, and responsible. The philosophical accounts are often very theoretical and inconclusive, making it difficult to fully detect these requirements at an experimental level of demonstration. However, from an ethical point of view, it makes sense to consider some A.I. systems as artificial moral agents, and hence responsible for their own actions. When artificial agents are considered responsible, norms that already exist in our judicial system can be applied to them, such as removing them from society and re-educating them in order to re-introduce them to society. This is in line with how the highest-profile correctional facilities ought to work. Notably, this is a provisional conclusion, and research must continue further.
Nevertheless, the strength of the presented argument lies in its immediate applicability to real-world scenarios. To refer to the aforementioned incidents involving the killing of innocents: when this thesis is applied, it is possible to hold an A.I. accountable and responsible for its actions. This entails removing it from society by virtue of its un-usability, re-programming it, and, only when it is properly functioning, re-introducing it successfully.

Keywords: artificial agency, correctional system, ethics, natural agency, responsibility

Procedia PDF Downloads 170
3271 Examining Predictive Coding in the Hierarchy of Visual Perception in the Autism Spectrum Using Fast Periodic Visual Stimulation

Authors: Min L. Stewart, Patrick Johnston

Abstract:

Predictive coding has been proposed as a general explanatory framework for understanding the neural mechanisms of perception. As such, an underweighting of perceptual priors has been hypothesised to underpin a range of differences in inferential and sensory processing in autism spectrum disorders. However, empirical evidence to support this has not been well established. The present study uses an electroencephalography paradigm involving changes of facial identity and person category (actors, etc.) to explore how levels of autistic traits (AT) affect predictive coding at multiple stages in the visual processing hierarchy. The study uses a rapid serial presentation of faces, with hierarchically structured sequences involving both periodic and aperiodic repetitions of different stimulus attributes (i.e., person identity and person category) in order to induce contextual expectations relating to these attributes. It investigates two main predictions: (1) significantly larger and later neural responses to changes of expected visual sequences in high- relative to low-AT, and (2) significantly reduced neural responses to violations of contextually induced expectation in high- relative to low-AT. Preliminary frequency analysis data comparing high and low-AT show larger and later event-related potentials (ERPs) in occipitotemporal and prefrontal areas in high-AT than in low-AT for periodic changes of facial identity and person category, but smaller ERPs over the same areas in response to aperiodic changes of identity and category. The research advances our understanding of how abnormalities in predictive coding might underpin aberrant perceptual experience in the autism spectrum. This is the first stage of a research project that will inform clinical practitioners in developing better diagnostic tests and interventions for people with autism.

Keywords: hierarchical visual processing, face processing, perceptual hierarchy, prediction error, predictive coding

Procedia PDF Downloads 97
3270 Iterative Segmentation and Application of Hausdorff Dilation Distance in Defect Detection

Authors: S. Shankar Bharathi

Abstract:

Inspection of surface defects on metallic components has always been challenging due to their specular property. Defects such as scratches, rust, and pitting commonly occur on metallic surfaces during the manufacturing process. If unchecked, these defects can hamper performance and reduce the lifetime of such components. Many conventional image processing algorithms for detecting surface defects involve segmentation techniques based on thresholding, edge detection, watershed segmentation, and textural segmentation, and later employ other suitable algorithms based on morphology, region growing, shape analysis, or neural networks for classification. In this paper, the work is focused only on detecting scratches. Global and other thresholding techniques were used to extract the defects, but they proved inaccurate in extracting the defects alone. However, this paper does not focus on comparing different segmentation techniques; rather, it describes a novel approach to segmentation combined with the Hausdorff dilation distance. The proposed algorithm is based on the distribution of intensity levels, that is, whether a certain gray level is concentrated or evenly distributed, and it extracts such concentrated pixels. Defective images showed a high concentration of some gray levels, whereas in non-defective images the gray levels were evenly distributed with no concentration. This formed the basis for detecting defects in the proposed algorithm. The Hausdorff dilation distance, based on mathematical morphology, was used to strengthen the segmentation of the defects.
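
As a rough illustration of the distance measure named above (not the authors' implementation), the sketch below counts how many morphological dilations of one binary mask are needed before it covers another; the structuring element, the toy masks, and the iteration cap are assumptions made only for the example.

```python
import numpy as np
from scipy import ndimage

def hausdorff_dilation_distance(reference, candidate, structure=None, max_iter=1000):
    """Directed Hausdorff dilation distance: the number of dilations of `candidate`
    needed until every foreground pixel of `reference` is covered."""
    reference = reference.astype(bool)
    grown = candidate.astype(bool)
    if not reference.any():
        return 0
    for k in range(max_iter + 1):
        if not (reference & ~grown).any():   # reference fully covered after k dilations
            return k
        grown = ndimage.binary_dilation(grown, structure=structure)
    return max_iter  # did not converge within the iteration budget

# Toy example: a segmented scratch mask compared against a reference defect mask
reference = np.zeros((64, 64), dtype=bool); reference[28:36, 8:52] = True
segmented = np.zeros((64, 64), dtype=bool); segmented[30:34, 10:50] = True
print(hausdorff_dilation_distance(reference, segmented))  # small value -> close match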

Keywords: metallic surface, scratches, segmentation, hausdorff dilation distance, machine vision

Procedia PDF Downloads 410
3269 Enhancer: An Effective Transformer Architecture for Single Image Super Resolution

Authors: Pitigalage Chamath Chandira Peiris

Abstract:

A widely researched domain in the field of image processing in recent times has been single image super-resolution, which tries to restore a high-resolution image from a single low-resolution image. Numerous single image super-resolution approaches have been developed using both traditional and deep learning methodologies. Deep learning-based super-resolution methods, in particular, have received significant interest. As of now, the most advanced image restoration approaches are based on convolutional neural networks; nevertheless, only a few efforts have used Transformers, which have demonstrated excellent performance on high-level vision tasks. The effectiveness of CNN-based algorithms in image super-resolution has been impressive; however, these methods cannot completely capture the non-local features of the data. Enhancer is a simple yet powerful Transformer-based approach for enhancing the resolution of images. In this study, a method for single image super-resolution was developed that utilizes an efficient and effective transformer design. The proposed architecture makes use of a locally enhanced window transformer block to alleviate the enormous computational load associated with non-overlapping window-based self-attention. Additionally, it incorporates depth-wise convolution in the feed-forward network to enhance its ability to capture local context. The study is assessed by comparing the results obtained on popular datasets with those obtained by other techniques in the domain.
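
The idea of adding a depth-wise convolution inside a transformer feed-forward network can be sketched in PyTorch as follows; this is a minimal, hypothetical module (the class and parameter names are ours), not the Enhancer architecture itself.

```python
import torch
import torch.nn as nn

class LocallyEnhancedFeedForward(nn.Module):
    """Sketch of a transformer FFN with a depth-wise 3x3 convolution for local context."""
    def __init__(self, dim, hidden_dim):
        super().__init__()
        self.proj_in = nn.Linear(dim, hidden_dim)
        self.dwconv = nn.Conv2d(hidden_dim, hidden_dim, kernel_size=3,
                                padding=1, groups=hidden_dim)   # depth-wise convolution
        self.proj_out = nn.Linear(hidden_dim, dim)
        self.act = nn.GELU()

    def forward(self, x, h, w):
        # x: (batch, h*w, dim) token sequence coming from a window-attention block
        x = self.act(self.proj_in(x))
        b, n, c = x.shape
        x = x.transpose(1, 2).reshape(b, c, h, w)   # tokens -> 2D feature map
        x = self.act(self.dwconv(x))                # local context via depth-wise conv
        x = x.reshape(b, c, n).transpose(1, 2)      # feature map -> tokens
        return self.proj_out(x)

block = LocallyEnhancedFeedForward(dim=64, hidden_dim=128)
tokens = torch.randn(2, 16 * 16, 64)     # a 16x16 window of 64-dimensional tokens
out = block(tokens, h=16, w=16)          # output has the same shape as the input tokens
```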

Keywords: single image super resolution, computer vision, vision transformers, image restoration

Procedia PDF Downloads 91
3268 A Picture is worth a Billion Bits: Real-Time Image Reconstruction from Dense Binary Pixels

Authors: Tal Remez, Or Litany, Alex Bronstein

Abstract:

The pursuit of smaller pixel sizes at ever increasing resolution in digital image sensors is mainly driven by the stringent price and form-factor requirements of sensors and optics in the cellular phone market. Recently, Eric Fossum proposed a novel concept of an image sensor with dense sub-diffraction limit one-bit pixels (jots), which can be considered a digital emulation of silver halide photographic film. This idea has been recently embodied as the EPFL Gigavision camera. A major bottleneck in the design of such sensors is the image reconstruction process, producing a continuous high dynamic range image from oversampled binary measurements. The extreme quantization of the Poisson statistics is incompatible with the assumptions of most standard image processing and enhancement frameworks. The recently proposed maximum-likelihood (ML) approach addresses this difficulty, but suffers from image artifacts and has impractically high computational complexity. In this work, we study a variant of a sensor with binary threshold pixels and propose a reconstruction algorithm combining an ML data fitting term with a sparse synthesis prior. We also show an efficient hardware-friendly real-time approximation of this inverse operator. Promising results are shown on synthetic data as well as on HDR data emulated using multiple exposures of a regular CMOS sensor.
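
For intuition only, the sketch below shows the closed-form maximum-likelihood intensity estimate for idealised threshold-1 binary jots under Poisson statistics; the block size and the emulated sensor are assumptions, and the paper's sparse synthesis prior and hardware-friendly approximation are not reproduced here.

```python
import numpy as np

def ml_intensity_from_jots(binary_jots, block=(4, 4), eps=1e-6):
    """ML light-intensity estimate from oversampled one-bit pixels.
    With threshold-1 jots, P(jot fires) = 1 - exp(-lambda), so within each block
    the ML estimate is lambda_hat = -ln(1 - fraction_of_fired_jots)."""
    h, w = binary_jots.shape
    bh, bw = block
    blocks = binary_jots[: h - h % bh, : w - w % bw].reshape(h // bh, bh, w // bw, bw)
    fired_fraction = blocks.mean(axis=(1, 3))
    # eps guards against blocks in which every jot fired
    return -np.log(np.clip(1.0 - fired_fraction, eps, 1.0))

# Emulate a jot sensor: a true intensity ramp, Poisson photon counts, binary threshold at one photon
rng = np.random.default_rng(0)
true_lambda = np.tile(np.linspace(0.05, 3.0, 256), (256, 1))
jots = (rng.poisson(true_lambda) >= 1).astype(np.uint8)
estimate = ml_intensity_from_jots(jots, block=(8, 8))
print(estimate.shape, float(estimate.min()), float(estimate.max()))
```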

Keywords: binary pixels, maximum likelihood, neural networks, sparse coding

Procedia PDF Downloads 185
3267 Argon/Oxygen Plasma Surface Modification of Biopolymers for Improvement of Wettability and Wear Resistance

Authors: Binnur Sagbas

Abstract:

Artificial joint replacements such as total knee and total hip prostheses are applied to patients affected by osteoarthritis. Although different material combinations are used in these joints, biopolymers are the most commonly preferred materials, especially for the acetabular cup of hip joints and the tibial component of knee joints. The main limitation that shortens the service life of these prostheses is wear. Wear is a complicated phenomenon and must be considered together with friction and lubrication. In this study, microwave (MW) induced argon + oxygen plasma surface modification was applied to ultra-high molecular weight polyethylene (UHMWPE) and vitamin E blended UHMWPE (VE-UHMWPE) biopolymer surfaces to improve their wettability and wear resistance. The contact angle measurement method was used to determine wettability. A ball-on-disc wear test was applied under 25% bovine serum lubrication conditions. The results show that the surface wettability and wear resistance of both materials were increased by plasma surface modification.

Keywords: artificial joints, plasma surface modification, UHMWPE, vitamin E, wear

Procedia PDF Downloads 295
3266 Capturing Public Voices: The Role of Social Media in Heritage Management

Authors: Mahda Foroughi, Bruno de Anderade, Ana Pereira Roders

Abstract:

Social media platforms have been increasingly used by locals and tourists to express their opinions about buildings, cities, and built heritage in particular. Most recently, scholars have been using social media to conduct innovative research on built heritage and heritage management. Still, the application of artificial intelligence (AI) methods to analyze social media data for heritage management is seldom explored. This paper investigates the potential of short texts (sentences and hashtags) shared through social media as a data source, together with artificial intelligence methods for data analysis, for revealing the cultural significance (values and attributes) of built heritage. The city of Yazd, Iran, was taken as a case study, with a particular focus on windcatchers, key attributes conveying outstanding universal value, as inscribed on the UNESCO World Heritage List. This paper has three phases: 1) a state of the art on the intersection of public participation in heritage management and social media research; 2) the methodology of data collection and data analysis related to coding people's voices from Instagram and Twitter into values of windcatchers over the last ten years; 3) preliminary findings on the comparison between the opinions of locals and tourists, sentiment analysis, and its association with the values and attributes of windcatchers. Results indicate that the age value is recognized as the most important value by all interest groups, while the political value is the least acknowledged. Besides, negative sentiments (e.g., critiques) are scarcely reflected in social media. The results confirm the potential of social media for heritage management in terms of (de)coding and measuring the cultural significance of built heritage for windcatchers in Yazd. The methodology developed in this paper can be applied to other attributes in Yazd and also to other case studies.

Keywords: social media, artificial intelligence, public participation, cultural significance, heritage, sentiment analysis

Procedia PDF Downloads 100
3265 Automation of Pneumatic Seed Planter for System of Rice Intensification

Authors: Tukur Daiyabu Abdulkadir, Wan Ishak Wan Ismail, Muhammad Saufi Mohd Kassim

Abstract:

Seed singulation and accuracy in seed spacing are the major challenges associated with the adoption of mechanical seeders for the system of rice intensification. In this research, the metering system of a pneumatic planter was modified and automated for increased precision to meet the demands of the system of rice intensification (SRI). The chain and sprocket mechanism of a conventional vacuum planter was replaced with an electromechanical system made up of a set of servo motors, a limit switch, a microcontroller, and a wheel divided into 10 equal angles. The circumference of the planter wheel was determined, based on which the seed spacing was computed and mapped to the angles of the metering wheel. A program was then written and uploaded to an Arduino microcontroller, which automatically turns the seed plates for seeding upon covering the required distance. The servo motor was calibrated with the aid of LabVIEW. The machine was then calibrated using a grease belt, varying the servo speed through voltage variation between 37 rpm and 47 rpm until an optimum value of 40 rpm was obtained at a forward speed of 5 kilometers per hour. A pressure of 1.5 kPa was found to be optimum, under which no skips or doubles were recorded. Precision in spacing (coefficient of variation), miss index, multiple index, doubles, and skips were investigated. No skips or doubles were recorded at either the laboratory or field level. The operational parameters under consideration were evaluated both in the laboratory and in the field. Even though there was little variation between the laboratory and field values of precision in spacing, multiple index, and miss index, the difference is not significant, as both laboratory and field values fall within the acceptable range.
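
The mapping from target seed spacing to metering-wheel step angle and rotation rate described above can be illustrated with a simplified calculation; this is not the authors' Arduino firmware, and the wheel diameter and example spacing are assumed values.

```python
import math

def metering_schedule(ground_wheel_diameter_m, seed_spacing_m, forward_speed_kmh, cells_per_wheel=10):
    """Map a target in-row seed spacing to the metering-wheel step angle and the
    wheel speed required at a given forward travel speed."""
    circumference = math.pi * ground_wheel_diameter_m       # distance covered per ground-wheel turn
    seeds_per_rev = circumference / seed_spacing_m          # seeds dropped per ground-wheel revolution
    step_angle_deg = 360.0 / cells_per_wheel                # the metering wheel is divided into equal angles
    speed_ms = forward_speed_kmh / 3.6
    seeding_rate_hz = speed_ms / seed_spacing_m             # seeds (wheel steps) per second
    metering_rpm = seeding_rate_hz * 60.0 / cells_per_wheel # full metering-wheel revolutions per minute
    return step_angle_deg, seeds_per_rev, seeding_rate_hz, metering_rpm

# Example: 0.5 m ground wheel, 25 cm seed spacing, 5 km/h forward speed (illustrative numbers)
step_deg, seeds_rev, rate_hz, rpm = metering_schedule(0.5, 0.25, 5.0)
print(f"step angle: {step_deg:.0f} deg, seeds/rev: {seeds_rev:.1f}, "
      f"seed rate: {rate_hz:.2f} Hz, metering wheel: {rpm:.1f} rpm")
```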

Keywords: automation, calibration, pneumatic seed planter, system of rice intensification

Procedia PDF Downloads 624
3264 Innovative Predictive Modeling and Characterization of Composite Material Properties Using Machine Learning and Genetic Algorithms

Authors: Hamdi Beji, Toufik Kanit, Tanguy Messager

Abstract:

This study aims to construct a predictive model proficient in foreseeing the linear elastic and thermal characteristics of composite materials, drawing on a multitude of influencing parameters. These parameters encompass the shape of inclusions (circular, elliptical, square, triangular), their spatial coordinates within the matrix, orientation, volume fraction (ranging from 0.05 to 0.4), and variations in contrast (spanning from 10 to 200). A variety of machine learning techniques are deployed, including decision trees, random forests, support vector machines, k-nearest neighbors, and an artificial neural network (ANN), to build this predictive model. Moreover, this research goes beyond the predictive aspect by delving into an inverse analysis using genetic algorithms. The intent is to unveil the intrinsic characteristics of composite materials by evaluating their thermomechanical responses. The foundation of this research lies in the establishment of a comprehensive database that accounts for the array of input parameters mentioned earlier. This database, enriched with this diversity of input variables, serves as a bedrock for the creation of machine learning and genetic algorithm-based models. These models are meticulously trained not only to predict but also to elucidate the mechanical and thermal behavior of composite materials. Remarkably, the coupling of machine learning and genetic algorithms has proven highly effective, yielding predictions with high accuracy, with scores ranging between 0.97 and 0.99. This achievement marks a significant breakthrough, demonstrating the potential of this innovative approach in the field of materials engineering.
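
A minimal sketch of the forward (predictive) part of such a workflow is given below, comparing the listed regressors on a synthetic stand-in for the homogenisation database; the feature encoding, the toy target function, and the hyperparameters are assumptions, and the genetic-algorithm inverse analysis is not shown.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
# Synthetic stand-in for the database: [volume_fraction, contrast, orientation, shape_code]
X = np.column_stack([
    rng.uniform(0.05, 0.4, 2000),   # inclusion volume fraction
    rng.uniform(10, 200, 2000),     # property contrast
    rng.uniform(0, np.pi, 2000),    # orientation
    rng.integers(0, 4, 2000),       # shape: 0=circle, 1=ellipse, 2=square, 3=triangle
])
# Toy effective-property target (placeholder for the homogenised elastic/thermal response)
y = 1 + 2.5 * X[:, 0] * np.log(X[:, 1]) + 0.1 * np.sin(X[:, 2]) + 0.05 * X[:, 3]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "decision tree": DecisionTreeRegressor(random_state=0),
    "random forest": RandomForestRegressor(n_estimators=200, random_state=0),
    "SVR": SVR(C=10.0),
    "k-NN": KNeighborsRegressor(n_neighbors=5),
    "ANN": MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name:14s} R2 = {r2_score(y_te, model.predict(X_te)):.3f}")
```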

Keywords: machine learning, composite materials, genetic algorithms, mechanical and thermal properties

Procedia PDF Downloads 46
3263 Screening of Rice Genotypes in Methane and Carbon Dioxide Emissions Under Different Water Regimes

Authors: Mthiyane Pretty, Mitsui Toshiake, Nagano Hirohiko, Aycan Murat

Abstract:

Methane and carbon dioxide are among the most significant greenhouse gases released from rice fields. The primary focus of this research was to quantify CH₄ and CO₂ emissions using four rice cultivars and two water regimes, while recording soil moisture and temperature. In this study, we hypothesized that paddy field soils may directly affect soil enzymatic activities and physicochemical properties in the rhizosphere and thereby indirectly affect the activity, abundance, diversity, and community composition of methanogens, ultimately affecting CH₄ flux. The experiment was laid out in a randomized block design with two treatments and three replications for each genotype. The two treatments used paddy field soil and artificial soil. At 35 days after planting (DAP), continuous flooding irrigation and alternate wetting and drying (AWD) were applied during the vegetative stage. The highest recorded soil and environmental parameters were a soil moisture of 76%, a soil temperature of 28.3℃, a bulk EC of 0.99 dS/m, and a pore water EC of 1.25, measured with a HydraGO portable soil sensor system. Gas samples were collected weekly at 09:00 and 12:00 to obtain the mean greenhouse gas flux. Gas chromatography (GC, Shimadzu GC-2010, Japan) was used for the analysis of CH₄ and CO₂. The treatments with paddy field soil had a 1.3℃ higher temperature than those with artificial soil. The overall changes in bulk EC were not significant across the treatments. CH₄ emission patterns were observed in all rice genotypes, although emissions were lower in the AWD treatments with artificial soil, showing that AWD creates oxic conditions in the rice soil. CO₂ was also quantified, but only in minute quantities, as the rice plants were using CO₂ for photosynthesis. The highest tiller number was 7 and the lowest was 3 among the cultivars grown. The rice variety recommended for breeding is Norin 24, which showed a high number of tillers with lower CH₄ emissions.

Keywords: greenhouse gases, methane, morphological characterization, alternating wetting and drying

Procedia PDF Downloads 63
3262 Detection of Safety Goggles on Humans in Industrial Environment Using Faster-Region Based on Convolutional Neural Network with Rotated Bounding Box

Authors: Ankit Kamboj, Shikha Talwar, Nilesh Powar

Abstract:

To deliver products successfully in the market, employees need to be in a safe environment, especially in industrial and manufacturing settings. The consequences of delinquency in wearing safety glasses while working in industrial plants can pose a high risk to employees; hence the need to develop a real-time automatic detection system that detects persons (violators) not wearing safety glasses. In this study, a convolutional neural network (CNN) algorithm called faster region-based CNN (Faster RCNN) with a rotated bounding box has been used for detecting safety glasses on persons; the algorithm has the advantage of detecting safety glasses at different orientation angles. The proposed method of rotated bounding boxes with a convolutional neural network first detects a person in the images and then detects whether the person is wearing safety glasses or not. The video data is captured at the entrance of restricted zones of the industrial environment (manufacturing plant) and converted into images at 2 frames per second. In the first step, a CNN with pre-trained weights on the COCO dataset is used for person detection, and the detections are cropped as images. The safety goggles are then labelled on the cropped images using the image labelling tool roLabelImg, which is used to annotate the ground truth of rotated objects more accurately; the annotations obtained are further modified to yield the four corner coordinates of the rotated rectangular bounding box. Next, the Faster RCNN with rotated bounding box is used to detect safety goggles, and it is compared with the traditional bounding box Faster RCNN in terms of detection accuracy (average precision), which shows the effectiveness of the proposed method for detecting rotated objects. The deep learning benchmarking is done on a Dell workstation with a 16GB Nvidia GPU.
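
For reference, a rotated bounding box is commonly parameterised as (centre, width, height, angle) and converted into four corner coordinates for annotation and overlap computation; the short sketch below shows only that conversion, not the Faster RCNN detector itself.

```python
import numpy as np

def rotated_box_corners(cx, cy, w, h, angle_deg):
    """Convert a rotated box (centre x, centre y, width, height, angle in degrees)
    into its four corner coordinates, in order around the box."""
    theta = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    half = np.array([[-w / 2, -h / 2],
                     [ w / 2, -h / 2],
                     [ w / 2,  h / 2],
                     [-w / 2,  h / 2]])
    return half @ rot.T + np.array([cx, cy])

# A goggle-sized box tilted by 30 degrees around its centre
print(rotated_box_corners(100, 50, 40, 16, 30).round(1))
```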

Keywords: CNN, deep learning, faster RCNN, roLabelImg, rotated bounding box, safety goggle detection

Procedia PDF Downloads 120
3261 Distributed Multi-Agent Based Approach on Intelligent Transportation Network

Authors: Xiao Yihong, Yu Kexin, Burra Venkata Durga Kumar

Abstract:

With the accelerating process of urbanization, the problem of urban road congestion is becoming more and more serious. Intelligent transportation systems combining distributed computing and artificial intelligence have become a research hotspot. As the core development direction of intelligent transportation systems, the Cooperative Intelligent Transportation System (C-ITS) integrates advanced information and communication technologies and realizes the integration of humans, vehicles, roadside infrastructure, and other elements through a multi-agent distributed system. By analyzing the system architecture and technical characteristics of C-ITS, this report proposes a distributed multi-agent C-ITS. The system consists of a Roadside Sub-system, a Vehicle Sub-system, and a Personal Sub-system. At the same time, we explore the scalability of the C-ITS and propose incorporating local rewards into the centralized-training, decentralized-execution paradigm, with the aim of adding a scalable value decomposition method. In addition, we suggest introducing blockchain to improve the safety of the traffic information transmission process. The system is expected to improve vehicle capacity and traffic safety.

Keywords: distributed system, artificial intelligence, multi-agent, cooperative intelligent transportation system

Procedia PDF Downloads 193
3260 A Contemporary Advertising Strategy on Social Networking Sites

Authors: M. S. Aparna, Pushparaj Shetty D.

Abstract:

Nowadays, social networking sites have become so popular that producers and sellers look to these sites as one of the best options for targeting the right audience to market their products. Several tools are available to monitor and analyze social networks. Our task is to identify the right community web pages, analyze the behavior of their members using these tools, and formulate an appropriate strategy to market products or services so as to achieve the set goals. Advertising becomes more effective when information about a product or service comes from a known source, and the strategy therefore exploits the strong buying influence of referral marketing. Our methodology proceeds with a critical budget analysis and promotes viral influence propagation. In this context, we cover the vital elements of budget evaluation, such as the optimal number of seed nodes (primary influential users activated at the onset), an estimate of the coverage spread of nodes, and the maximum influence propagation distance from an initial seed to an end node. Our Buyer Prediction mathematical model arises from the need to perform complex analysis when the probability density estimates of reliable factors are unknown or difficult to calculate. Order statistics and the Buyer Prediction mapping function guarantee the selection of optimal influential users at each level. We use an efficient tactic of exploiting community pages and user behavior to determine product enthusiasts on social networks. Our approach is promising and should be an elementary choice when there is little or no prior knowledge of the distribution of potential buyers on social networks. In this strategy, product news propagates to influential users on or around the networks. By applying the same technique, a user can search for friends who are better placed to give advice or referrals if a product interests them.
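
A generic sketch of budget-constrained seed selection under an independent cascade spread model is shown below for intuition; the graph, forwarding probability, and greedy criterion are illustrative assumptions and do not reproduce the paper's Buyer Prediction mapping function or its order-statistics machinery.

```python
import random

def independent_cascade(graph, seeds, p=0.1, rng=random.Random(0)):
    """Simulate one spread of product news from `seeds` over a directed graph
    {node: [neighbours]}, where each edge forwards the message with probability p."""
    active, frontier = set(seeds), list(seeds)
    while frontier:
        nxt = []
        for u in frontier:
            for v in graph.get(u, []):
                if v not in active and rng.random() < p:
                    active.add(v)
                    nxt.append(v)
        frontier = nxt
    return active

def greedy_seeds(graph, budget, p=0.1, runs=200):
    """Pick `budget` seed users by greedily maximising the average simulated spread."""
    seeds = []
    for _ in range(budget):
        best, best_gain = None, -1.0
        for cand in graph:
            if cand in seeds:
                continue
            gain = sum(len(independent_cascade(graph, seeds + [cand], p, random.Random(r)))
                       for r in range(runs)) / runs
            if gain > best_gain:
                best, best_gain = cand, gain
        seeds.append(best)
    return seeds

# Tiny toy network of community-page members (user -> followers they can influence)
net = {"a": ["b", "c"], "b": ["d"], "c": ["d", "e"], "d": ["f"], "e": ["f"], "f": []}
print(greedy_seeds(net, budget=2, p=0.3, runs=100))
```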

Keywords: viral marketing, social network analysis, community web pages, buyer prediction, influence propagation, budget constraints

Procedia PDF Downloads 244
3259 Transmission Line Congestion Management Using Hybrid Fish-Bee Algorithm with Unified Power Flow Controller

Authors: P. Valsalal, S. Thangalakshmi

Abstract:

There is a widespread changeover in the electrical power industry from an old-style monopolistic structure towards a horizontally distributed competitive structure to meet the demands of rising consumption. When the transmission lines of a deregulated system are incapable of accommodating all service needs, the lines become overloaded or congested. The mediator between customers and power producers, the Independent System Operator (ISO), is charged with lessening congestion without violating transmission line limits. Among the existing approaches to congestion management, the most frequently used are generation rescheduling and load curtailment. There is a limit to rescheduling the generators, and further load cannot be supplied with the prevailing resources unless more private power producers are added to the system, considerably raising the cost. Hence, congestion is relieved by appropriate Flexible AC Transmission Systems (FACTS) devices, which boost the existing transfer capacity of transmission lines. The FACTS device chosen here is the Unified Power Flow Controller (UPFC); its correct placement is vital, and it should be positioned in the most congested line. The weak line is therefore identified using a power flow performance index within a new objective function solved by the proposed hybrid Fish-Bee algorithm. Placing the UPFC in the appropriate line reduces branch loading and minimizes voltage deviation. The power transfer capacity of lines is determined with and without the UPFC in the identified congested line of the IEEE 30-bus system, and the simulated results are compared with prevailing algorithms. It is observed that the transfer capacity of the existing line is increased with the presented algorithm, thus alleviating the congestion.
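
A common loading-based performance index for flagging the weakest (most congested) line takes the form PI_l = (P_l / P_l^max)^(2n); the toy ranking below uses that generic form with made-up flows and limits and is not the paper's exact index or the hybrid Fish-Bee optimisation.

```python
def line_loading_index(line_flows_mw, line_limits_mw, n=2):
    """Per-line contribution to a power-flow performance index,
    PI_l = (P_l / P_l^max)^(2n); the largest value flags the most congested line."""
    return {line: (flow / line_limits_mw[line]) ** (2 * n)
            for line, flow in line_flows_mw.items()}

# Toy flows on three lines of a test system (values are illustrative only)
flows  = {"2-5": 82.0, "6-8": 31.0, "21-22": 17.0}
limits = {"2-5": 90.0, "6-8": 32.0, "21-22": 32.0}
ranking = sorted(line_loading_index(flows, limits).items(), key=lambda kv: -kv[1])
print(ranking[0])   # candidate line for UPFC placement
```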

Keywords: available line transfer capability, congestion management, FACTS device, Hybrid Fish-Bee Algorithm, ISO, UPFC

Procedia PDF Downloads 365
3258 Design Study for the Rehabilitation of a Retaining Structure and Water Intake on Site

Authors: Yu-Lin Shen, Ming-Kuen Chang

Abstract:

In addition to a considerable amount of machinery and equipment, petrochemical plants contain intricate transmission pipelines. Long-term corrosion may lead to pipeline thinning and rupture, causing serious safety concerns. With advances in non-destructive testing technology, rapid and long-range ultrasonic detection techniques are increasingly used for pipeline inspection. Electromagnetic acoustic transducers (EMAT) require no couplant; as a non-contact ultrasonic technique, EMAT is suitable for inspecting lines at elevated temperature or with roughened surfaces. In this study, we prepared artificial defects in a pipeline for electromagnetic acoustic transducer (EMAT) testing to survey the relationship between defect location, defect size, and the EMAT signal. It was found that the EMAT signal amplitude exhibited greater attenuation with larger defect depth and length. In addition, a bigger flat hole diameter produced greater amplitude attenuation. In summary, the EMAT signal amplitude attenuation was affected by the defect depth, the defect length, and the hole diameter.

Keywords: EMAT, artificial defect, NDT, ultrasonic testing

Procedia PDF Downloads 329
3257 A Comparative Soft Computing Approach to Supplier Performance Prediction Using GEP and ANN Models: An Automotive Case Study

Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari

Abstract:

In multi-echelon supply chain networks, optimal supplier selection significantly depends on the accuracy of suppliers’ performance prediction. Different multi-criteria decision-making methods, such as ANN, GA, fuzzy logic, and AHP, have previously been used to predict supplier performance, but the “black-box” characteristic of these methods remains a major concern to be resolved. Therefore, the primary objective of this paper is to implement an artificial intelligence-based gene expression programming (GEP) model and compare its prediction accuracy with that of an ANN. A full factorial design with a 95% confidence interval is initially applied to determine the appropriate set of criteria for supplier performance evaluation. A train-test approach is then utilized for the ANN and GEP separately. The training results are used to find the optimal network architecture, and the testing data determine the prediction accuracy of each method based on the root mean square error (RMSE) and the correlation coefficient (R²). The results of a case study conducted at Supplying Automotive Parts Co. (SAPCO), with more than 100 local and foreign supply chain members, revealed that, in comparison with the ANN, gene expression programming is significantly preferable for predicting supplier performance, as reflected in the respective RMSE and R² values. Moreover, using GEP, a mathematical function was also derived that addresses the black-box limitation of the ANN in modeling the performance prediction.
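
The two accuracy measures used for the comparison can be computed as below; the held-out supplier scores and model predictions in the example are hypothetical.

```python
import numpy as np

def rmse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def r_squared(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Hypothetical held-out supplier scores and the predictions of the two models
actual   = [72, 85, 63, 90, 78, 69]
ann_pred = [70, 80, 70, 84, 74, 75]
gep_pred = [71, 84, 65, 88, 77, 70]
for name, pred in [("ANN", ann_pred), ("GEP", gep_pred)]:
    print(f"{name}: RMSE={rmse(actual, pred):.2f}, R2={r_squared(actual, pred):.3f}")
```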

Keywords: supplier performance prediction, ANN, GEP, automotive, SAPCO

Procedia PDF Downloads 405
3256 A Hybrid MAC Protocol for Delay Constrained Mobile Wireless Sensor Networks

Authors: Hanefi Cinar, Musa Cibuk, Ismail Erturk, Fikri Aggun, Munip Geylani

Abstract:

Mobile Wireless Sensor Networks (MWSNs) carry heterogeneous data traffic with different urgency and quality of service (QoS) requirements. Many studies in the literature address energy efficiency, bandwidth, and communication methods, but delay, throughput, and utility parameters are not well considered. The increasing demand for real-time data transfer makes these parameters more important. In this paper, we design a new delay-constrained MAC protocol that targets improving the delay, utility, and throughput performance of the network and provides solutions to collision and interference problems. The protocol improves QoS by combining TDMA, FDM, and OFDMA in a hybrid, multi-channel communication scheme.

Keywords: MWSN, delay, hybrid MAC, TDMA, FDM, OFDMA

Procedia PDF Downloads 461
3255 Power Energy Management For A Grid-Connected PV System Using Rule-Base Fuzzy Logic

Authors: Nousheen Hashmi, Shoab Ahmad Khan

Abstract:

Active collaboration among green energy sources and the load demand leads to serious issues related to power quality and stability. The growing number of green energy resources and distributed generators requires newer operating strategies to maintain power stability between green energy resources and the micro-grid/utility grid. This paper presents a novel technique for energy and power management in a grid-connected photovoltaic system with energy storage, under a set of constraints including weather conditions, load shedding hours, and peak pricing hours, using a rule-based fuzzy smart grid controller to schedule power coming from multiple sources (photovoltaic, grid, battery). The technique fuzzifies all the inputs and establishes a fuzzy rule set, from which fuzzy outputs are produced before defuzzification. Simulations are run for a 24-hour period, and a rule-based power scheduler is developed. The proposed fuzzy control strategy is able to sense continuous fluctuations in photovoltaic power generation, load demand, the grid (load shedding patterns), and battery state of charge in order to make correct and quick decisions. The suggested fuzzy rule-based scheduler can operate well with vague inputs, thus requiring no exact numerical model, and can handle nonlinearity. The technique provides a framework that can be extended to handle multiple special cases for optimized operation of the system.
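
To make the fuzzify-infer-defuzzify pipeline concrete, the toy sketch below evaluates a few hand-written rules with triangular membership functions and weighted-average defuzzification; the membership ranges, rules, and output variable are invented for illustration and are not the paper's rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def schedule_battery_share(pv_kw, soc_pct, peak_price):
    """Toy Mamdani-style rules: how much of the load the battery should carry (0..1)."""
    pv_low, pv_high = tri(pv_kw, -1, 0, 3), tri(pv_kw, 2, 6, 10)
    soc_low, soc_high = tri(soc_pct, -1, 20, 50), tri(soc_pct, 40, 80, 121)
    price_high = 1.0 if peak_price else 0.0

    # Rule strengths (min as AND), each mapped to a crisp consequent level
    rules = [
        (min(pv_low, soc_high, price_high), 0.9),  # little sun, full battery, peak tariff -> discharge hard
        (min(pv_low, soc_high), 0.6),              # little sun, full battery -> moderate discharge
        (pv_high, 0.1),                            # plenty of PV -> spare the battery
        (soc_low, 0.0),                            # nearly empty battery -> do not discharge
    ]
    num = sum(strength * level for strength, level in rules)
    den = sum(strength for strength, _ in rules)
    return num / den if den else 0.0               # weighted-average defuzzification

print(schedule_battery_share(pv_kw=0.5, soc_pct=85, peak_price=True))
```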

Keywords: photovoltaic, power, fuzzy logic, distributed generators, state of charge, load shedding, membership functions

Procedia PDF Downloads 468
3254 Technological Enhancements in Supply Chain Management Post COVID-19

Authors: Miran Ismail

Abstract:

COVID-19 has caused widespread disruption in all economic sectors and industries around the world. The COVID-19 lockdown measures have resulted in production halts, restrictions on the movement of persons and goods, border closures, logistical constraints, and a slowdown in trade and economic activity. The main subject of this paper is leveraging technology, through the use of artificial intelligence, to manage the supply chain effectively and efficiently. The research methodology is based on empirical data collected through a questionnaire survey. One of the approaches utilized is a case study of industrial organizations that face obstacles such as high operational costs, large inventory levels, a lack of well-established supplier relationships, human behavior, and system issues. The main contribution of this research to the body of knowledge is its empirical insights into supply chain sustainability performance measurement. The results provide guidelines for the selection of advanced technologies to support supply chain processes and for the design of sustainable performance measurement systems.

Keywords: information technology, artificial intelligence, supply chain management, industrial organizations

Procedia PDF Downloads 110
3253 Random Access in IoT Using Naïve Bayes Classification

Authors: Alhusein Almahjoub, Dongyu Qiu

Abstract:

This paper deals with the random access procedure in next-generation networks and presents a solution to reduce the total service time (TST), one of the most important performance metrics in current and future internet of things (IoT) based networks. The proposed solution focuses on the calculation of the optimal transmission probability, which maximizes the success probability and reduces TST. It uses the information on the number of idle preambles in every time slot and, based on it, estimates the number of backlogged IoT devices using Naïve Bayes estimation, a type of supervised learning in the machine learning domain. The estimation of backlogged devices is necessary since the optimal transmission probability depends on it and the eNodeB does not have this information. Simulations carried out in MATLAB verify that the proposed solution gives excellent performance.
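
One schematic way to realise the described estimation (not necessarily the paper's exact model) is to treat each preamble's idle/busy state as conditionally independent given the backlog, compute a posterior over the number of backlogged devices from the observed idle count, and then set the transmission probability from that estimate; the preamble count, prior, and previous-slot probability below are assumptions.

```python
import numpy as np
from math import comb

M = 54                    # contention preambles available in one random access slot (assumed)
N_MAX = 400               # support of the prior over backlogged devices
prior = np.ones(N_MAX + 1) / (N_MAX + 1)   # uninformative prior over the backlog N

def idle_likelihood(idle_count, n, p_tx):
    """P(observing `idle_count` idle preambles | n backlogged devices transmitting w.p. p_tx).
    Each preamble is treated as independent (the 'naive' assumption)."""
    q = (1.0 - p_tx / M) ** n                     # probability a given preamble stays idle
    return comb(M, idle_count) * q**idle_count * (1 - q)**(M - idle_count)

def estimate_backlog(idle_count, p_tx):
    """Posterior-mean estimate of the number of backlogged devices."""
    post = np.array([prior[n] * idle_likelihood(idle_count, n, p_tx) for n in range(N_MAX + 1)])
    post /= post.sum()
    return float(np.dot(np.arange(N_MAX + 1), post))

def optimal_p(n_est):
    """Transmission probability for the next slot, capped at 1."""
    return min(1.0, M / max(n_est, 1.0))

n_hat = estimate_backlog(idle_count=20, p_tx=0.5)   # 20 idle preambles seen with the previous p_tx
print(round(n_hat), optimal_p(n_hat))
```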

Keywords: random access, LTE/LTE-A, 5G, machine learning, Naïve Bayes estimation

Procedia PDF Downloads 133
3252 Task Validity in Neuroimaging Studies: Perspectives from Applied Linguistics

Authors: L. Freeborn

Abstract:

Recent years have seen an increasing number of neuroimaging studies related to language learning as imaging techniques such as fMRI and EEG have become more widely accessible to researchers. By using a variety of structural and functional neuroimaging techniques, these studies have already made considerable progress in terms of our understanding of neural networks and processing related to first and second language acquisition. However, the methodological designs employed in neuroimaging studies to test language learning have been questioned by applied linguists working within the field of second language acquisition (SLA). One of the major criticisms is that tasks designed to measure language learning gains rarely have a communicative function, and seldom assess learners’ ability to use the language in authentic situations. This brings the validity of many neuroimaging tasks into question. The fundamental reason why people learn a language is to communicate, and it is well-known that both first and second language proficiency are developed through meaningful social interaction. With this in mind, the SLA field is in agreement that second language acquisition and proficiency should be measured through learners’ ability to communicate in authentic real-life situations. Whilst authenticity is not always possible to achieve in a classroom environment, the importance of task authenticity should be reflected in the design of language assessments, teaching materials, and curricula. Tasks that bear little relation to how language is used in real-life situations can be considered to lack construct validity. This paper first describes the typical tasks used in neuroimaging studies to measure language gains and proficiency, then analyses to what extent these tasks can validly assess these constructs.

Keywords: neuroimaging studies, research design, second language acquisition, task validity

Procedia PDF Downloads 117
3251 Love and Loss: The Emergence of Shame in Romantic Information Communication Technology

Authors: C. Caudwell, R. Syed, C. Lacey

Abstract:

While the development and advancement of information communication technologies (ICTs) offer powerful opportunities for meaningful connections and relationships, shame is a significant barrier to social and cultural acceptance. In particular, artificial intelligence and socially oriented robots are increasingly becoming partners in romantic relationships with people, offering bonding, support, comfort, growth, and reciprocity. However, these relationships suffer from a hierarchical, anthropocentric shame that is a significant barrier to their success and longevity. This paper will present case studies of relationships between humans and artificially intelligent agents, in the context of internal and external shame, as cultivated, propagated, and communicated through ICT. Using an interdisciplinary methodology, we aim to present a framework for technological shame, building on experimental and emergent psychoanalytical theories of emotion. Our study finds principally that socialization is a powerful factor in the vectors of shame as experienced by humans. On a wider scale, we contribute an understanding of social emotion and the phenomenon of shame proliferated through ICTs, which is at present under-explored but vital, as society and culture are increasingly mediated through this medium.

Keywords: shame, artificial intelligence, romance, society

Procedia PDF Downloads 118
3250 Vibration Control of a Horizontally Supported Rotor System by Using a Radial Active Magnetic Bearing

Authors: Vishnu A., Ashesh Saha

Abstract:

The operation of high-speed rotating machinery in industries is accompanied by rotor vibrations due to many factors. One of the primary instability mechanisms in a rotor system is the centrifugal force induced by the eccentricity of the center of mass away from the center of rotation. These unwanted vibrations may lead to catastrophic fatigue failure, so there is a need to control them. In this work, control of rotor vibrations by using a 4-pole Radial Active Magnetic Bearing (RAMB) as an actuator is analysed. A continuous rotor system model is considered for the analysis. Several important factors, like the gyroscopic effect and the rotary inertia of the shaft and disc, are incorporated into this model. The large deflection of the shaft and the restriction on axial motion of the shaft at the bearings result in nonlinearities in the system governing equation. The rotor system is modeled in such a way that the system dynamics can be related to the geometric and material properties of the shaft and disc. The mathematical model of the rotor system is developed by incorporating the control forces generated by the RAMB. A simple PD controller is used for the attenuation of system vibrations. An analytical expression for the amplitude and phase equations is derived using the Method of Multiple Scales (MMS). Analytical results are verified with the numerical results obtained using an ODE solver built into MATLAB software. The control force is found to be effective in attenuating the system vibrations. The multi-valued solutions leading to the jump phenomenon are also eliminated with a proper choice of control gains. Most interestingly, the shape of the backbone curves can also be altered for certain values of the control parameters.
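
As a much-simplified companion to the continuous rotor model above, the sketch below simulates a single-plane lumped rotor with unbalance forcing and a PD force law standing in for the RAMB; all parameter values are illustrative, and the gyroscopic and nonlinear terms of the paper's model are omitted.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lumped single-plane rotor parameters (illustrative values only)
m, c, k = 2.0, 5.0, 8.0e4          # mass [kg], damping [N s/m], stiffness [N/m]
e, omega = 1e-4, np.sqrt(k / m)    # mass eccentricity [m], spin speed at resonance [rad/s]
kp, kd = 4.0e4, 300.0              # PD gains of the magnetic-bearing controller

def rotor(t, y, control=True):
    x, v = y
    f_unbalance = m * e * omega**2 * np.cos(omega * t)     # centrifugal forcing from eccentricity
    f_amb = -(kp * x + kd * v) if control else 0.0         # RAMB force from the PD law
    return [v, (f_unbalance + f_amb - c * v - k * x) / m]

t_eval = np.linspace(0.0, 2.0, 4000)
for label, ctrl in [("open loop", False), ("PD controlled", True)]:
    sol = solve_ivp(rotor, (0.0, 2.0), [0.0, 0.0], t_eval=t_eval, args=(ctrl,), max_step=1e-3)
    print(f"{label}: late-time amplitude ~ {np.abs(sol.y[0][2000:]).max():.2e} m")
```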

Keywords: rotor dynamics, continuous rotor system model, active magnetic bearing, PD controller, method of multiple scales, backbone curve

Procedia PDF Downloads 69
3249 The Importance of Artificial Intelligence in Various Healthcare Applications

Authors: Joshna Rani S., Ahmadi Banu

Abstract:

Artificial Intelligence (AI) has a significant role to play in the healthcare offerings of the future. In the form of machine learning, it is the primary capability behind the development of precision medicine, widely agreed to be a sorely needed advance in care. Although early efforts at providing diagnosis and treatment recommendations have proven challenging, we expect that AI will ultimately master that domain as well. Given the rapid advances in AI for imaging analysis, it seems likely that most radiology and pathology images will eventually be examined by a machine. Speech and text recognition are already employed for tasks such as patient communication and the capture of clinical notes, and their use will increase. The greatest challenge to AI in these healthcare domains is not whether the technologies will be capable enough to be useful, but rather ensuring their adoption in daily clinical practice. For widespread adoption to occur, AI systems must be approved by regulators, integrated with EHR systems, standardized to a sufficient degree that similar products work alike, taught to clinicians, paid for by public or private payer organizations, and updated over time in the field. These challenges will ultimately be overcome, but doing so will take much longer than it will take for the technologies themselves to mature. We therefore expect to see limited use of AI in clinical practice within 5 years and more extensive use within 10 years. It also seems increasingly clear that AI systems will not replace human clinicians on a large scale, but rather will augment their efforts to care for patients. Over time, human clinicians may move toward tasks and work arrangements that draw on uniquely human skills such as empathy, persuasion, and big-picture integration. Perhaps the only healthcare providers who will risk their careers in the long run are those who refuse to work alongside AI.

Keywords: artificial intelligence, health care, breast cancer, AI applications

Procedia PDF Downloads 165
3248 Wavelength Conversion of Dispersion Managed Solitons at 100 Gbps through Semiconductor Optical Amplifier

Authors: Kadam Bhambri, Neena Gupta

Abstract:

All-optical wavelength conversion is essential in present-day optical networks for transparent interoperability, contention resolution, and wavelength routing. The incorporation of all-optical wavelength converters leads to better utilization of network resources and hence improves the efficiency of optical networks. Wavelength converters that can work with dispersion managed (DM) solitons are attractive due to their superior transmission capabilities. In this paper, wavelength conversion of dispersion managed soliton signals is demonstrated at 100 Gbps using a semiconductor optical amplifier and an optical filter. The wavelength conversion was achieved from a 1550 nm input signal to a 1555 nm output signal. The output signal was evaluated in terms of BER, Q factor, and system margin.

Keywords: all optical wavelength conversion, dispersion managed solitons, semiconductor optical amplifier, cross gain modulation

Procedia PDF Downloads 437
3247 3D Object Model Reconstruction Based on Polywogs Wavelet Network Parametrization

Authors: Mohamed Othmani, Yassine Khlifi

Abstract:

This paper presents a technique for compact three-dimensional (3D) object model reconstruction using wavelet networks. It consists of transforming the input surface vertices into signals and using wavelet network parameters for signal approximation. To this end, we use a wavelet network architecture founded on several mother wavelet families. POLYnomials WindOwed with Gaussians (POLYWOG) wavelet families are used to maximize the probability of selecting the best wavelets, which ensures good generalization of the network. To achieve a better reconstruction, the network is trained over several iterations to optimize the wavelet network parameters until the error criterion is small enough. Experimental results show that the proposed technique can effectively reconstruct irregular 3D object models when using the optimized wavelet network parameters. We also show that reconstruction accuracy depends on the best choice of mother wavelets.

Keywords: 3d object, optimization, parametrization, polywog wavelets, reconstruction, wavelet networks

Procedia PDF Downloads 267
3246 Breeding for Hygienic Behavior in Honey Bees

Authors: Michael Eickermann, Juergen Junk

Abstract:

The Western honey bee (Apis mellifera) is threatened by a number of parasites; in particular, the devastating Varroa mite (Varroa destructor) is responsible for a high level of overwinter mortality, e.g., in Europe and the USA. While the use of synthetic pesticides or organic acids has so far been the preferred way to control this parasite, breeding strategies for less susceptible honey bees are at an early stage. Hygienic behavior can be an important tool for controlling Varroa destructor. Worker bees with a high level of this behavior are able to detect infested brood in the cells under the wax lid during pupation and remove it from the hive. The underlying processes of this behavior are only partly understood, but it is certain that hygienic behavior is heritable and can therefore be integrated into commercial breeding lines. In a first step, breeding lines with a high level of phenotypic hygienic behavior were identified using a bioassay for accurate assessment of this trait, within a long-term national breeding program running in Luxembourg since 2015. Based on the artificial infestation of nucleus colonies with 150 phoretic Varroa destructor mites, the level of phenotypic hygienic behavior was determined by counting the number of mites in all stages twelve days after infestation. Nuclei with a high level of hygienic behavior were overwintered and used for breeding activities in the following years. Artificial insemination was used to combine different breeding lines, with both Buckfast and Carnica lines being used. While the Carnica lines offered only a small increase in hygienic behavior, up to a maximum of 62.5%, the Buckfast lines performed much better, with mean levels above 87.5%; some matings reached a level of 100%. But even at a level of 82.5%, Varroa mites are no longer able to reproduce in the colony. In a final step, nuclei with a high level of hygienic behavior were built up into full colonies and placed at two locations in Luxembourg to establish drone congregation areas. Local beekeepers can bring their nuclei to these locations to mate their queens with drones offering a high level of hygienic behavior.

Keywords: agriculture, artificial insemination, honey bee, varroa destructor

Procedia PDF Downloads 115
3245 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series

Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold

Abstract:

To address the global challenges of climate and environmental changes, there is a need for quantifying and reducing uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET) and their regional counterparts (i.e., OzFlux, AmeriFlux, China Flux, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance in validating process modelling analyses, field surveys, and remote sensing assessments, there are some serious concerns regarding the challenges associated with the technique, e.g., data gaps and uncertainties. To address these concerns, this research has developed an ensemble model to fill the data gaps of CO₂ flux, avoiding the limitations of using a single algorithm and therefore providing less error and reducing the uncertainties associated with the gap-filling process. In this study, the data of five towers in the OzFlux Network (Alice Springs Mulga, Calperum, Gingin, Howard Springs and Tumbarumba) during 2013 were used to develop an ensemble machine learning model, using five feedforward neural networks (FFNN) with different structures combined with an eXtreme Gradient Boosting (XGB) algorithm. The former, the FFNNs, provided the primary estimations in the first layer, while the latter, XGB, used the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and the XGB when each of these methods was used individually, with overall RMSEs of 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹, respectively (3.54 being provided by the best FFNN). The most significant improvement was in the estimation of the extreme diurnal values (during midday and sunrise), as well as nocturnal estimations, which are generally considered among the most challenging parts of CO₂ flux gap-filling. The towers, as well as seasonality, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba showed more sensitivity compared to Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. Besides, the performance difference between the ensemble model and its components used individually was more significant during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) compared to the cold season (Apr, May, Jun, Jul, Aug, and Sep) due to the higher amount of photosynthesis, which led to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy of CO₂ flux gap-filling and the robustness of the model. Therefore, using ensemble machine learning models is potentially capable of improving data estimation and regression outcomes when there seems to be no more room for improvement using a single algorithm.
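
The two-layer structure described (five FFNNs feeding an XGB meta-learner) can be sketched as below on synthetic data; the drivers, network architectures, and xgboost usage are assumptions, and in practice the meta-features would come from out-of-fold predictions and the real tower drivers.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from xgboost import XGBRegressor

rng = np.random.default_rng(1)
# Synthetic stand-in for half-hourly drivers (e.g. radiation, temperature, VPD, soil moisture) and CO2 flux
X = rng.normal(size=(5000, 4))
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + np.sin(X[:, 2]) + 0.3 * rng.normal(size=5000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Layer 1: five feed-forward networks with different structures
architectures = [(32,), (64,), (32, 32), (64, 32), (128, 64)]
ffnns = [MLPRegressor(hidden_layer_sizes=a, max_iter=1000, random_state=i).fit(X_tr, y_tr)
         for i, a in enumerate(architectures)]
# NOTE: a faithful stack would use out-of-fold predictions here to avoid leakage
meta_tr = np.column_stack([m.predict(X_tr) for m in ffnns])
meta_te = np.column_stack([m.predict(X_te) for m in ffnns])

# Layer 2: XGB takes the FFNN outputs as its input and produces the final gap-filling estimate
booster = XGBRegressor(n_estimators=300, learning_rate=0.05).fit(meta_tr, y_tr)
rmse = float(np.sqrt(mean_squared_error(y_te, booster.predict(meta_te))))
print(f"held-out RMSE of the stacked ensemble: {rmse:.3f}")
```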

Keywords: carbon flux, Eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network

Procedia PDF Downloads 120
3244 A Design of the Infrastructure and Computer Network for Distance Education, Online Learning via New Media, E-Learning and Blended Learning

Authors: Sumitra Nuanmeesri

Abstract:

This research focuses on studying, analyzing, and designing a model of the infrastructure and computer networks for distance education, online learning via new media, e-learning, and blended learning. The information collected from the study and analysis process was evaluated using the index of item-objective congruence (IOC) by nine specialists in order to design the model. The evaluation of the model by the sample of nine specialists yielded a mean of 3.85, with the corresponding standard deviation reported. The results showed that the designed infrastructure and computer networks are appropriate to a great extent.

Keywords: blended learning, new media, infrastructure and computer network, tele-education, online learning

Procedia PDF Downloads 389
3243 Cultivating Responsible AI: For Cultural Heritage Preservation in India

Authors: Varsha Rainson

Abstract:

Artificial intelligence (AI) has great potential and can be applied as a powerful tool in various domains and sectors. But with the application of AI comes a wide spectrum of concerns around bias, accountability, transparency, and privacy. Hence, there is a need for responsible AI, which upholds ethical and accountable practices to ensure that things are transparent and fair. The paper brings together AI and cultural heritage preservation, with a particular focus on India because of the rich cultural legacy that it holds. India's cultural heritage contributes to both its identity and its economy. In this paper, along with discussing the impact culture has on the Indian economy, we discuss the threats that this cultural heritage is exposed to due to pollution, climate change, and urbanization. Furthermore, the paper reviews some of the exciting applications of AI in cultural heritage preservation, such as 3D scanning, photogrammetry, and other techniques that have led to the reconstruction of cultural artifacts and sites. The paper then moves to the potential risks and challenges that AI poses in cultural heritage preservation. These include ethical, legal, and social issues that must be addressed by organizations and government authorities. Overall, the paper argues strongly for responsible AI and the important role it can play in preserving India's cultural heritage while upholding values and diversity.

Keywords: responsible AI, cultural heritage, artificial intelligence, biases, transparency

Procedia PDF Downloads 168