Search results for: graphical user interference
782 The Comparison of Limits of Detection of Lateral Flow Immunochromatographic Strips of Different Types of Mycotoxins
Authors: Xinyi Zhao, Furong Tian
Abstract:
Mycotoxins are secondary metabolic products of fungi. They are poisonous, carcinogenic, and mutagenic in nature and pose a serious health threat to both humans and animals, causing severe illnesses and even deaths. Rapid, simple, and cheap methods for detecting mycotoxins are of immense importance and in great demand in the food and beverage industry as well as in agriculture and environmental monitoring. Lateral flow immunochromatographic strips (ICSTs) have been widely used in food safety and environmental monitoring. Forty-six papers, dated 2001-2021, were identified on Google Scholar and Scopus and reviewed for the limits of detection and the nanomaterials used in ICSTs for different types of mycotoxins. Twenty-five papers were compared to identify the lowest limit of detection among different mycotoxins (Aflatoxin B1: 10, Zearalenone: 5, Fumonisin B1: 5, Trichothecene-A: 5). Most of these highly sensitive strips are competitive; sandwich structures are usually used in large-scale detection. In conclusion, the mycotoxin that receives the most research is aflatoxin B1, and its limit of detection is the lowest. Gold-nanoparticle-based immunochromatographic test strips have the lowest limit of detection. Five papers involve smartphone detection, and they all detect aflatoxin B1 with gold nanoparticles. In these papers, quantitative concentration results can be obtained when the user uploads a photograph of the test lines using the smartphone application.
Keywords: aflatoxin B1, limit of detection, gold nanoparticle, lateral flow immunochromatographic strips, mycotoxins
Procedia PDF Downloads 195
781 AI Ethical Values as Dependent on the Role and Perspective of the Ethical AI Code Founder - A Mapping Review
Authors: Moshe Davidian, Shlomo Mark, Yotam Lurie
Abstract:
With the rapid development of technology and the concomitant growth in the capability and power of Artificial Intelligence (AI) systems, the ethical challenges involved in these systems are also evolving and increasing. In recent years, various organizations, including governments, international institutions, professional societies, civic organizations, and commercial companies, have chosen to address these challenges by publishing ethical codes for AI systems. However, despite the apparent agreement that AI should be “ethical,” there is debate about the definition of “ethical artificial intelligence.” This study investigates the various AI ethical codes and their key ethical values. From the vast collection of codes that exist, it analyzes and compares 25 ethical codes that were found to be representative of different types of organizations. In addition, as part of its literature review, the study overviews data collected in three recent reviews of AI codes. The results of the analyses demonstrate a convergence around seven key ethical values. However, the key finding is that the different AI ethical codes ultimately reflect the type of organization that designed the code; i.e., the organization’s role as regulator, user, or developer affects its view of what ethical AI is. The results show a relationship between the organization’s role and the dominant values in its code. The main contribution of this study is the development of a list of the key values for all AI systems and of the specific values that need to shape the development and design of AI systems, while allowing for differences according to the organization for which the system is being developed. This will allow an analysis of AI values in relation to stakeholders.
Keywords: artificial intelligence, ethical codes, principles, values
Procedia PDF Downloads 107
780 Problems and Solutions in the Application of ICP-MS for Analysis of Trace Elements in Various Samples
Authors: Béla Kovács, Éva Bódi, Farzaneh Garousi, Szilvia Várallyay, Áron Soós, Xénia Vágó, Dávid Andrási
Abstract:
In agriculture, flame atomic absorption spectrometers (FAAS), graphite furnace atomic absorption spectrometers (GF-AAS), inductively coupled plasma optical emission spectrometers (ICP-OES), and inductively coupled plasma mass spectrometers (ICP-MS) are routinely applied for the analysis of elements in food, food raw materials, and environmental samples. An inductively coupled plasma mass spectrometer (ICP-MS) is capable of analysing 70-80 elements in multielemental mode from a 1-5 cm³ sample volume, with detection limits of elements in the µg/kg-ng/kg (ppb-ppt) concentration range. All these analytical instruments suffer from different physical and chemical interfering effects when analysing the above types of samples. The smaller the concentration of an analyte and the larger the concentration of the matrix, the larger the interfering effects. Nowadays it is very important to analyse ever smaller concentrations of elements, and of the above instruments, the inductively coupled plasma mass spectrometer is generally capable of analysing the smallest concentrations. The applied ICP-MS instrument also has Collision Cell Technology (CCT). In CCT mode, certain elements have detection limits that are better (smaller) by 1-3 orders of magnitude compared to a normal ICP-MS analytical method; the CCT mode has better detection limits mainly for the analysis of selenium, arsenic, germanium, vanadium, and chromium. To elaborate an analytical method for trace elements with an inductively coupled plasma mass spectrometer, the most important interfering effects (problems) were evaluated: 1) physical interferences; 2) spectral interferences (elemental and molecular isobaric); 3) the effect of easily ionisable elements; 4) memory interferences. When analysing food, food raw materials, and environmental samples, another (new) interfering effect emerged in ICP-MS, namely the effect of various matrices having different evaporation and nebulization effectiveness as well as different carbon content. In our research work, the effects of different water-soluble compounds and of various quantities of carbon content (as sample matrix) on the intensity changes of the applied elements were examined. We could thus find opportunities to decrease or eliminate the error in the analyses of the applied elements (Cr, Co, Ni, Cu, Zn, Ge, As, Se, Mo, Cd, Sn, Sb, Te, Hg, Pb, Bi). To analyse these elements in the above samples, the most appropriate inductively coupled plasma mass spectrometer is a quadrupole instrument applying a collision cell technique (CCT). The extent of the interfering effect of the carbon content depends on the type of compounds. The carbon content significantly affects the measured concentrations (intensities) of the above elements, which can be corrected using different internal standards.
Keywords: elements, environmental and food samples, ICP-MS, interference effects
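For illustration, a minimal Python sketch of the internal-standard correction mentioned in the closing sentence; the function, the linear calibration model, and all numbers are assumptions for illustration, not the authors' actual calibration procedure:

```python
# Minimal sketch of internal-standard matrix/drift correction in ICP-MS.
# Assumes intensities (counts/s) for the analyte and a spiked internal
# standard; names and numbers are illustrative placeholders.

def corrected_concentration(i_analyte, i_istd, i_istd_cal, slope, intercept=0.0):
    """i_analyte   : analyte intensity in the sample
       i_istd      : internal-standard intensity in the sample
       i_istd_cal  : internal-standard intensity in the calibration standards
       slope, intercept : parameters of an assumed external calibration line
                          (intensity = slope * concentration + intercept)"""
    # Scale the analyte signal by the internal-standard recovery,
    # then convert the corrected intensity to concentration.
    recovery = i_istd / i_istd_cal
    corrected_intensity = i_analyte / recovery
    return (corrected_intensity - intercept) / slope

# Example: a carbon-rich matrix suppresses the signal by ~20%.
print(corrected_concentration(i_analyte=8000, i_istd=40000,
                              i_istd_cal=50000, slope=1000.0))  # ~10 µg/L
```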
Procedia PDF Downloads 504
779 Visual Template Detection and Compositional Automatic Regular Expression Generation for Business Invoice Extraction
Authors: Anthony Proschka, Deepak Mishra, Merlyn Ramanan, Zurab Baratashvili
Abstract:
Small and medium-sized businesses receive over 160 billion invoices every year. Since these documents exhibit many subtle differences in layout and text, extracting structured fields such as sender name, amount, and VAT rate from them automatically is an open research question. In this paper, existing work in template-based document extraction is extended, and a system is devised that is able to reliably extract all required fields for up to 70% of all documents in the data set, more than any other previously reported method. Approaches are described for 1) detecting, through visual features, which template a given document belongs to, 2) automatically generating extraction rules for a given new template by composing regular expressions from multiple components, and 3) computing confidence scores that indicate the accuracy of the automatic extractions. The system can generate templates with as little as one training sample and only requires the ground-truth field values instead of detailed annotations, such as bounding boxes, that are hard to obtain. The system is deployed and used inside commercial accounting software.
Keywords: data mining, information retrieval, business, feature extraction, layout, business data processing, document handling, end-user trained information extraction, document archiving, scanned business documents, automated document processing, F1-measure, commercial accounting software
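For illustration, a minimal Python sketch of composing an extraction rule from reusable regular-expression components, in the spirit of approach 2); the component patterns and field names are assumptions, not the system's actual rules:

```python
import re

# Illustrative composition of an "amount" extraction rule from named parts.
COMPONENTS = {
    "label":    r"(?:Total|Amount due|Grand total)",
    "sep":      r"[:\s]*",
    "currency": r"(?:EUR|€|USD|\$)?\s*",
    "number":   r"(\d{1,3}(?:[.,]\d{3})*[.,]\d{2})",
}

def build_rule(*part_names):
    """Concatenate named components into one compiled regex."""
    return re.compile("".join(COMPONENTS[p] for p in part_names), re.IGNORECASE)

amount_rule = build_rule("label", "sep", "currency", "number")

text = "Shipping: 4,90\nGrand Total: EUR 1.234,56"
match = amount_rule.search(text)
if match:
    print(match.group(1))  # 1.234,56
```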
Procedia PDF Downloads 130
778 Feasibility of Simulating External Vehicle Aerodynamics Using Spalart-Allmaras Turbulence Model with Adjoint Method in OpenFOAM and Fluent
Authors: Arpit Panwar, Arvind Deshpande
Abstract:
A study of external vehicle aerodynamics using the Spalart-Allmaras turbulence model with the adjoint method was conducted. The accessibility and ease of working with the Fluent module of ANSYS and with OpenFOAM were considered. The objective of the study was to understand and analyze the possibility of bringing high-level aerodynamic simulation to the average consumer vehicle. A form factor of the BMW M6 vehicle was designed in SolidWorks and analyzed in OpenFOAM and Fluent. Being a single-equation model, the turbulence model provides a much faster convergence rate when clubbed with the adjoint method. Fluent, being commercial software, still does not allow the Spalart-Allmaras turbulence model to be solved using the adjoint method; hence, the turbulence model was solved using the SIMPLE method in Fluent. OpenFOAM, being open source, provides flexibility in simulation but is not user-friendly; it does, however, support solving the defined turbulence model with the adjoint method. The results generated from the simulation give acceptable values of drag when validated against the percentage error in drag values for a notch-back vehicle model from an extensive simulation study presented at the 6th ANSA and μETA conference, Greece. The success of this approach will allow more aerodynamic vehicle body design to be brought to all segments of the automobile market, not limiting it to just high-end sports cars.
Keywords: Spalart-Allmaras turbulence model, OpenFOAM, adjoint method, SIMPLE method, vehicle aerodynamic design
Procedia PDF Downloads 200
777 A Comparative Analysis of the Application and Use of Information and Communication Technologies (ICTs) in Selected Manufacturing Industries for Development in Nigeria
Authors: Kolawole Taiwo Olabode
Abstract:
This is a comparative study of ICT adoption and use in selected manufacturing industries in Nigeria for development. The study was carried out in 2004 and repeated in 2013 (nine years later) using the same selected manufacturing industries, to assess the level, improvement, and extent of ICT facilities used in these companies. The theory of modernization was explored to explain some developmental issues in this study. The same semi-structured questionnaire and IDI were used to elicit data on the subject matter. About 24.9% of the total workers (1,247) were sampled for this study using a quota sampling technique. SPSS was used to analyse the quantitative data, and the qualitative data were used to buttress the quantitative data. Findings indicated that Seven-Up Bottling Company and Frigoglass Glass Industry remained intensive ICT users, while Niger Match Nigeria Limited remained a non-intensive ICT user; unfortunately, Askar Paint Nigeria Limited had gone into liquidation. It is also important to note that only the intensive ICT users improved on relevant ICT facilities, and the existing problems of ICT adoption and use remained the same in Niger Match Limited. The study concluded that for a society to be developed, management and government at all levels must do all things necessary to ensure that all existing organisations are ICT compliant, in order to improve worker and organisational performance and to enhance the nation's development so that it can compete with other companies to a global standard.
Keywords: ICT, intensive ICT-users, entrepreneurial, manufacturing industries, industries and development
Procedia PDF Downloads 302
776 Exploring Public Opinions Toward the Use of Generative Artificial Intelligence Chatbot in Higher Education: An Insight from Topic Modelling and Sentiment Analysis
Authors: Samer Muthana Sarsam, Abdul Samad Shibghatullah, Chit Su Mon, Abd Aziz Alias, Hosam Al-Samarraie
Abstract:
Generative Artificial Intelligence chatbots (GAI chatbots) have emerged as promising tools in various domains, including higher education. However, their specific role within the educational context and the level of legal support for their implementation remain unclear. Therefore, this study aims to investigate the role of Bard, a newly developed GAI chatbot, in higher education. To achieve this objective, English tweets were collected from Twitter's free streaming Application Programming Interface (API). The Latent Dirichlet Allocation (LDA) algorithm was applied to extract latent topics from the collected tweets. User sentiments, including disgust, surprise, sadness, anger, fear, joy, anticipation, and trust, as well as positive and negative sentiments, were extracted using the NRC Affect Intensity Lexicon and SentiStrength tools. This study explored the benefits, challenges, and future implications of integrating GAI chatbots in higher education. The findings shed light on the potential power of such tools, exemplified by Bard, in enhancing the learning process and providing support to students throughout their educational journey.
Keywords: generative artificial intelligence chatbots, Bard, higher education, topic modelling, sentiment analysis
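For illustration, a hedged sketch of the LDA topic-extraction step using scikit-learn; the study's exact toolchain, preprocessing, and topic count are not specified, so the corpus and parameters below are placeholders:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Placeholder tweets standing in for the collected corpus.
tweets = [
    "Bard helped me outline my thesis chapter in minutes",
    "Worried that chatbots will undermine academic integrity",
    "Our university is piloting an AI tutor for first-year courses",
]

vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(tweets)          # bag-of-words counts

lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top words per latent topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-5:][::-1]]
    print(f"Topic {k}: {', '.join(top)}")
```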
Procedia PDF Downloads 83
775 Requirements Management in Agile
Authors: Ravneet Kaur
Abstract:
The concept of Agile Requirements Engineering and Management is not new. However, the struggle to figure out how the traditional Requirements Management Process fits within an Agile framework remains complex. This paper describes a process that can merge an organization’s traditional Requirements Management Process nicely into the Agile Software Development Process. This process provides traceability of the Product Backlog to the external documents on one hand and to User Stories on the other, and it gives sufficient evidence, in the form of various statistics and reports, that the system will deliver the right functionality with good quality. In a nutshell, by overlaying a process on top of Agile without disturbing the agility, we are able to get synergistic benefits in terms of productivity, profitability, reporting, and end-to-end visibility for all stakeholders. The framework can be used for just-in-time requirements definition or to build a repository of requirements for future use. The goal is to make sure that the business (specifically, the product owner) can clearly articulate what needs to be built and define what is of high quality. To accomplish this, the requirements cycle follows a Scrum-like process that mirrors the development cycle but stays two to three steps ahead. The goal is to create a process by which requirements can be thoroughly vetted, organized, and communicated in a manner that is iterative, timely, and quality-focused. Agile is quickly becoming the most popular way of developing software because it fosters continuous improvement, time-boxed development cycles, and quicker delivery of value to the end users. That value will be driven to a large extent by the quality and clarity of the requirements that feed the software development process. An agile, lean, and timely approach to requirements as the starting point will help to ensure that the process is optimized.
Keywords: requirements management, Agile
Procedia PDF Downloads 370
774 Simplified Linear Regression Model to Quantify the Thermal Resilience of Office Buildings in Three Different Power Outage Day Times
Authors: Nagham Ismail, Djamel Ouahrani
Abstract:
Thermal resilience in the built environment reflects a building's capacity to adapt to extreme climate changes. In hot climates, power outages in office buildings pose risks to the health and productivity of workers. Therefore, it is of interest to quantify the thermal resilience of office buildings by developing a user-friendly simplified model. The simplified model begins with an assessment metric of thermal resilience that measures the duration between the power outage and the point at which the thermal habitability condition is compromised, considering different power interruption times (morning, noon, and afternoon). In this context, energy simulations of an office building are conducted for Qatar's summer weather by changing different parameters related to (i) wall characteristics, (ii) glazing characteristics, (iii) load, (iv) orientation, and (v) air leakage. The simulation results are processed using SPSS to derive linear regression equations, aiding stakeholders in evaluating the performance of commercial buildings during different power interruption times. The findings reveal the significant influence of glazing characteristics on thermal resilience, with the morning power outage scenario posing the most detrimental impact in terms of the shortest duration before thermal resilience is compromised.
Keywords: thermal resilience, thermal envelope, energy modeling, building simulation, thermal comfort, power disruption, extreme weather
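For illustration, a minimal sketch of fitting such a simplified linear model to simulation outputs; the predictor set and data below are invented placeholders, and the paper's actual regression equations and coefficients differ:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: [wall U-value, glazing SHGC, internal load W/m2, air leakage ACH]
X = np.array([
    [0.45, 0.25, 20, 0.5],
    [0.30, 0.40, 30, 1.0],
    [0.60, 0.60, 25, 1.5],
    [0.35, 0.30, 15, 0.7],
])
# Target: hours until habitability is compromised after a noon outage.
y = np.array([6.5, 4.2, 2.8, 7.1])

model = LinearRegression().fit(X, y)
print("coefficients:", model.coef_, "intercept:", model.intercept_)

# A stakeholder could then score a new design directly:
print("predicted resilience (h):", model.predict([[0.40, 0.35, 22, 0.8]])[0])
```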
Procedia PDF Downloads 75
773 A Machine Learning Model for Predicting Students’ Academic Performance in Higher Institutions
Authors: Emmanuel Osaze Oshoiribhor, Adetokunbo MacGregor John-Otumu
Abstract:
There has been a need in recent years to predict students' academic achievement prior to graduation, in order to assist them in improving their grades, especially those who have struggled in the past. The purpose of this research is to use supervised learning techniques to create a model that predicts student academic progress. Many scholars have developed models that predict student academic achievement based on characteristics such as smoking, demography, culture, social media, parents' educational background, parents' finances, and family background, to mention a few. These features, as well as the models used, could have misclassified students in terms of their academic achievement. As a prerequisite to predicting whether a student will perform well in the future on related courses, this model is built using a logistic regression classifier with basic features such as the previous semester's course score, class attendance, class participation, and the total number of course materials or resources the student is able to cover per semester. With 96.7 percent accuracy, the model outperformed other classifiers such as Naive Bayes, Support Vector Machine (SVM), Decision Tree, Random Forest, and AdaBoost. The model is offered as a desktop application with user-friendly interfaces for forecasting student academic progress for both teachers and students. As a result, both students and professors are encouraged to use this technique to predict outcomes better.
Keywords: artificial intelligence, ML, logistic regression, performance, prediction
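For illustration, a minimal sketch of the described classifier using scikit-learn; the feature values and labels below are fabricated placeholders, not the study's data:

```python
from sklearn.linear_model import LogisticRegression

# [previous course score %, attendance %, participation %, materials covered %]
X = [
    [85, 95, 80, 90],
    [40, 55, 30, 45],
    [70, 80, 65, 75],
    [50, 60, 40, 50],
]
y = [1, 0, 1, 0]  # 1 = likely to perform well, 0 = at risk

clf = LogisticRegression().fit(X, y)
print(clf.predict([[65, 75, 60, 70]]))        # predicted class
print(clf.predict_proba([[65, 75, 60, 70]]))  # class probabilities
```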
Procedia PDF Downloads 109
772 Describing the Fine Electronic Structure and Predicting Properties of Materials with ATOMIC MATTERS Computation System
Authors: Rafal Michalski, Jakub Zygadlo
Abstract:
We present the concept, scientific methods, and algorithms of our computation system called ATOMIC MATTERS. This is the first presentation of the new computer package, which allows its user to describe the physical properties of atomic localized electron systems subject to electromagnetic interactions. Our solution applies to situations where an unclosed 2p/3p/3d/4d/5d/4f/5f electron subshell interacts with an electrostatic potential of definable symmetry and an external magnetic field. Our methods are based on the Crystal Electric Field (CEF) approach, which takes into consideration the electrostatic ligand field as well as the magnetic Zeeman effect. The application allowed us to predict macroscopic properties of materials, such as magnetic, spectral, and calorimetric properties, as a result of the physical properties of their fine electronic structure. We emphasize the importance of the symmetry of the charge surroundings of the atom/ion, spin-orbit interactions (spin-orbit coupling), and the use of complex-number matrices in the definition of the Hamiltonian. The calculation methods, algorithms, and convention recalculation tools collected in ATOMIC MATTERS were chosen to permit the prediction of the magnetic and spectral properties of materials in isostructural series.
Keywords: ATOMIC MATTERS, crystal electric field (CEF), spin-orbit coupling, localized states, electron subshell, fine electronic structure
Procedia PDF Downloads 319
771 Micro Celebrities in Social Media Instagram and Their Personal Influence in Business Perspective
Authors: Yoga Maulana Putra, Herry Hudrasyah
Abstract:
The Internet has now become an important part of human life; it can be accessed through a computer or even a smartphone, almost anywhere and anytime. The Internet has created many social media platforms, such as Facebook, Twitter, and Instagram. Instagram was acquired by Facebook in 2012 and has been growing fast since then. Instagram is now transforming from a photo-sharing social medium into a business tool, and as a result, some new behavior has been discovered: some Instagram users are becoming popular. These people are called minor celebrities, and companies use them as marketing tools to influence people or promote their products or services. These minor celebrities exist because of their behavior in using Instagram. A company uses the personal influence of a minor celebrity to promote or advertise its product or service, and the minor celebrity gets paid according to their rate card, which is based on their followers and insights. This research uses a qualitative method: interviews were conducted with six minor celebrities from different categories, namely photography, travel blogging, lifestyle, food blogging, fashion, and healthcare. The theory of reasoned behavior is used as the grounded theory to discover the reasons for their behavior, and personal influence is used to describe their way of influencing people. The result of the interviews is that most of the minor celebrities were influenced by their circle of friends in the process of using Instagram. They also had different ways of using their personal influence to affect their followers when companies employed them.
Keywords: humanities and social sciences, Instagram, minor celebrity, social media
Procedia PDF Downloads 166
770 Unstructured-Data Content Search Based on Optimized EEG Signal Processing and Multi-Objective Feature Extraction
Authors: Qais M. Yousef, Yasmeen A. Alshaer
Abstract:
Over the last few years, the amount of data available across the globe has increased rapidly. This coincided with the emergence of recent concepts, such as big data and the Internet of Things, which have furnished a suitable solution for the availability of data all over the world. However, managing this massive amount of data remains a challenge due to the large variety of types and distributions. Therefore, locating a required file, particularly on the first trial, has turned out to be no easy task, due to the large similarity of names for different files distributed on the web. Consequently, the accuracy and speed of search have been negatively affected. This work presents a method using electroencephalography signals to locate files based on their contents. Building on the concept of natural mind-wave processing, this work analyses the mind-wave signals of different people, extracting their most appropriate features using a multi-objective metaheuristic algorithm and then classifying them using an artificial neural network to distinguish among files with similar names. The aim of this work is to provide the ability to find files based on their contents using human thoughts only. Implementing this approach and testing it on real people proved its ability to find the desired files accurately within a noticeably shorter time and retrieve them as a first choice for the user.
Keywords: artificial intelligence, data contents search, human active memory, mind wave, multi-objective optimization
Procedia PDF Downloads 175
769 Routing Medical Images with Tabu Search and Simulated Annealing: A Study on Quality of Service
Authors: Mejía M. Paula, Ramírez L. Leonardo, Puerta A. Gabriel
Abstract:
In telemedicine, the image repository service is important for increasing the accuracy of the diagnostic support given to medical personnel. This study compares two routing algorithms with regard to quality of service (QoS), in order to analyze their optimal performance when uploading and/or downloading medical images. The study focused on comparing the performance of Tabu Search with other heuristic and metaheuristic algorithms that improve QoS in telemedicine services in Colombia. For this, the Tabu Search and Simulated Annealing heuristic algorithms were chosen for their high usability in this type of application; QoS is measured taking into account the following metrics: delay, throughput, jitter, and latency. In addition, routing tests were carried out on ten 40 MB images in Digital Imaging and Communications in Medicine (DICOM) format. These tests were carried out for ten minutes under different traffic conditions, reaching a total of 25 tests, from a server at Universidad Militar Nueva Granada (UMNG) in Bogotá, Colombia, to a remote user at Universidad de Santiago de Chile (USACH), Chile. The results show that Tabu Search presents better QoS performance compared to Simulated Annealing, managing to optimize the routing of medical images, a basic requirement for offering diagnostic image services in telemedicine.
Keywords: medical image, QoS, simulated annealing, Tabu search, telemedicine
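For illustration, a toy sketch of Tabu Search selecting a route by a combined QoS cost; the candidate routes, metric weights, and cost function are assumptions, not the study's test network:

```python
import random

routes = {  # route id -> (delay ms, jitter ms, loss %)
    "A": (120, 15, 1.0), "B": (90, 30, 0.5), "C": (150, 5, 2.0),
    "D": (100, 10, 0.8), "E": (80, 40, 1.5),
}

def qos_cost(r):
    """Lower is better; the weights are illustrative."""
    delay, jitter, loss = routes[r]
    return delay + 2.0 * jitter + 50.0 * loss

def tabu_search(iterations=20, tenure=2, seed=0):
    rng = random.Random(seed)
    current = best = rng.choice(list(routes))
    tabu = []                                  # short-term memory
    for _ in range(iterations):
        neighbours = [r for r in routes if r != current and r not in tabu]
        current = min(neighbours, key=qos_cost)  # best admissible move
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)                          # expire the oldest entry
        if qos_cost(current) < qos_cost(best):
            best = current
    return best

print(tabu_search())  # route with the lowest combined QoS cost
```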
Procedia PDF Downloads 219
768 Proposed Framework Based on Classification of Vertical Handover Decision Strategies in Heterogeneous Wireless Networks
Authors: Shidrokh Goudarzi, Wan Haslina Hassan
Abstract:
Heterogeneous wireless networks are converging towards an all-IP network as part of the so-called next-generation network. In this paradigm, different access technologies need to be interconnected; thus, vertical handovers, or vertical handoffs, are necessary for seamless mobility. In this paper, we conduct a review of existing vertical handover decision-making mechanisms that aim to provide ubiquitous connectivity to mobile users. To offer a systematic comparison, we categorize these vertical handover measurement and decision structures based on their respective methodology and parameters. Subsequently, we analyze several vertical handover approaches in the literature and compare them according to their advantages and weaknesses. The paper compares the algorithms based on their network selection methods, the complexity of the technologies used, and their efficiency, in order to introduce our vertical handover decision framework. We find that vertical handovers on heterogeneous wireless networks suffer from the lack of a standard and efficient method to satisfy both user and network quality-of-service requirements at different levels, including the architectural, decision-making, and protocol levels. Also, the consolidation of the network terminal, cross-layer information, multi-packet casting, and an intelligent network selection algorithm appears to be an optimal solution for achieving seamless service continuity and facilitating seamless connectivity.
Keywords: heterogeneous wireless networks, vertical handovers, vertical handover metric, decision-making algorithms
Procedia PDF Downloads 393
767 On Exploring Search Heuristics for Improving the Efficiency in Web Information Extraction
Authors: Patricia Jiménez, Rafael Corchuelo
Abstract:
Nowadays the World Wide Web is the most popular source of information, relying on billions of on-line documents. Web mining is used to crawl through these documents, collect the information of interest, and process it by applying data mining tools, so that the gathered information can be used in the best interest of a business, which enables companies to promote themselves. Unfortunately, it is not easy to extract the information a web site provides automatically when it lacks an API that transforms the user-friendly data in web documents into a structured, machine-readable format. Rule-based information extractors are tools intended to extract the information of interest automatically and offer it in a structured format that allows mining tools to process it. However, the performance of an information extractor strongly depends on the search heuristic employed, since bad choices regarding how to learn a rule may easily result in loss of effectiveness and/or efficiency. Improving search heuristics with regard to efficiency is of the utmost importance in the field of Web Information Extraction, since typical datasets are very large. In this paper, we employ an information extractor based on a classical top-down algorithm that uses the so-called Information Gain heuristic introduced by Quinlan and Cameron-Jones. Unfortunately, the Information Gain suffers from some well-known problems, so we analyse an intuitive alternative, Termini, which is clearly more efficient; we also analyse other proposals in the literature and conclude that none of them outperforms the former alternative.
Keywords: information extraction, search heuristics, semi-structured documents, web mining
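For illustration, a small sketch of the FOIL-style Information Gain of Quinlan and Cameron-Jones as commonly stated in the rule-learning literature; the example counts are invented:

```python
from math import log2

def information_gain(p0, n0, p1, n1):
    """FOIL-style gain of specialising a rule.
    p0/n0: positive/negative examples covered before the refinement;
    p1/n1: positive/negative examples covered after it."""
    if p1 == 0:
        return 0.0
    t = p1  # positives covered by both the old and the specialised rule
    return t * (log2(p1 / (p1 + n1)) - log2(p0 / (p0 + n0)))

# Specialising a rule from covering 50+/50- to 40+/5- examples:
print(information_gain(50, 50, 40, 5))  # positive gain -> good refinement
```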
Procedia PDF Downloads 335
766 Poli4SDG: An Application for Environmental Crises Management and Gender Support
Authors: Angelica S. Valeriani, Lorenzo Biasiolo
Abstract:
In recent years, the scale of the impact of climate change and its related side effects has become ever more massive and devastating. The Sustainable Development Goals (SDGs), promoted by the United Nations, aim to confront issues related to climate change, among others. In particular, the project CROWD4SDG focuses on a subset of SDGs, since it promotes environmental activities and climate-related issues. In this context, we developed a prototype of an application, currently under advanced web development, that focuses on SDG 13 (the SDG on climate action) by providing users with useful instruments to face environmental crises and climate-related disasters. Our prototype is conceived and structured for both web and mobile development. The main goal of the application, POLI4SDG, is to help users get through to emergency services; to this end, an organized overview and classification prove to be very effective and helpful to people in need. A careful analysis of data related to environmental crises prompted us to integrate user contributions, exploiting a core principle of Citizen Science, into the realization of a public catalog, available for consultation and organized according to typology and specific features. In addition, gender equality and opportunity features are considered in the prototype in order to allow women, often the most vulnerable category, to receive direct support. The overall description of the application's functionalities is detailed, and the implementation features and properties of the prototype are discussed.
Keywords: crowdsourcing, social media, SDG, climate change, natural disasters, gender equality
Procedia PDF Downloads 112
765 Reagentless Detection of Urea Based on ZnO-CuO Composite Thin Film
Authors: Neha Batra Bali, Monika Tomar, Vinay Gupta
Abstract:
A reagentless biosensor for the detection of urea based on a ZnO-CuO composite thin film is presented in the following work. Biosensors have immense potential for varied applications, ranging from environmental and clinical testing to health care and cell analysis. The immense growth in the field of biosensors is due to the huge requirement in today's world for techniques that are both cost-effective and accurate for the prevention of disease manifestation. The human body comprises numerous biomolecules which, at their optimum levels, are essential for its functioning; mismanaged levels of these biomolecules, however, result in major health issues. Urea is one of the key biomolecules of interest. Its estimation is of paramount significance not only for the healthcare sector but also from an environmental perspective. If the level of urea in human blood/serum is abnormal, i.e., above or below the physiological range (15-40 mg/dl), it may lead to diseases like renal failure, hepatic failure, nephritic syndrome, cachexia, urinary tract obstruction, dehydration, shock, burns, gastrointestinal conditions, etc. Various metal nanoparticles, conducting polymers, metal oxide thin films, etc. have been exploited as matrices to immobilize urease for the fabrication of urea biosensors. Amongst them, zinc oxide (ZnO), a semiconductor metal oxide with a wide band gap, is of immense interest as an efficient matrix in biosensors by virtue of its natural abundance, biocompatibility, good electron communication features, and high isoelectric point (9.5). In spite of being such an attractive candidate, ZnO does not possess a redox couple of its own, which necessitates the use of electroactive mediators for electron transfer between the enzyme and the electrode, thereby hindering the realization of integrated and implantable biosensors. In the present work, an effort has been made to fabricate a matrix based on a ZnO-CuO composite prepared by the pulsed laser deposition (PLD) technique, in order to incorporate redox properties into the ZnO matrix and to utilize the same for reagentless biosensing applications. The prepared bioelectrode Urs/(ZnO-CuO)/ITO/glass exhibits high sensitivity (70 µA mM⁻¹ cm⁻²) for the detection of urea (5-200 mg/dl), with high stability (shelf life ˃ 10 weeks) and good selectivity (interference ˂ 4%). The enhanced sensing response obtained for the composite matrix is attributed to the efficient electron exchange between the ZnO-CuO matrix and the immobilized enzymes, and the subsequently fast transfer of the generated electrons to the electrode via the matrix. The response is encouraging for fabricating a reagentless urea biosensor based on a ZnO-CuO matrix.
Keywords: biosensor, reagentless, urea, ZnO-CuO composite
Procedia PDF Downloads 290
764 Challenges and Opportunities of Utilization of Social Media by Business Education Students in Nigeria Universities
Authors: Titus Amodu Umoru
Abstract:
The global economy today is highly sophisticated. All over the world, business and marketing practices are undergoing an unprecedented transformation. In realization of this fact, the federal government of Nigeria has put in place a robust transformation agenda in order to put Nigeria in a better position to be a competitive player and, in the process, transform all sectors of its economy. New technologies, especially the internet, are the driving force behind this transformation. However, technology has inadvertently affected the way businesses are done, thus necessitating the acquisition of new skills. In developing countries like Nigeria, citizens are still battling with the effective application of those technologies. Obviously, students of business education need to acquire relevant business knowledge to be able to transition into the world of work on graduation from school and compete favourably in the labour market. Therefore, effective utilization of social media by both teachers and students can help extensively in empowering students with the needed skills. Social media, described as a group of internet-based applications that build on the ideological foundations of Web 2.0 and allow the creation and exchange of user-generated content, may, if incorporated into the classroom experience, be the needed answer to unemployment and poverty in Nigeria, as beneficiaries can easily connect with existing and potential enterprises and customers, engage with them, and reinforce mutual business benefits. The challenges and benefits of social media use in education in Nigerian universities are revealed in this study.
Keywords: business education, challenges, opportunities, utilization, social media
Procedia PDF Downloads 416
763 talk2all: A Revolutionary Tool for International Medical Tourism
Authors: Madhukar Kasarla, Sumit Fogla, Kiran Panuganti, Gaurav Jain, Abhijit Ramanujam, Astha Jain, Shashank Kraleti, Sharat Musham, Arun Chaudhury
Abstract:
Patients have often chosen to travel for care, making pilgrimages to academic meccas and state-of-the-art hospitals for sophisticated surgery. This culture is still persistent in the landscape of US healthcare, with hundreds of thousands of visitors coming to the shores of the United States to seek high-quality medical care. One of the major challenges in this form of medical tourism has been the language barrier. Thus, an Iraqi patient with an immediate need to communicate their healthcare needs to the treating team in the hospital may face a huge barrier to effective patient-doctor communication, delaying care and at times even reducing its quality. To circumvent these challenges, we propose the use of a state-of-the-art tool, Talk2All, which can translate nearly one hundred international languages (and even sign language) in real time. The tool is an easy-to-download and highly user-friendly app. It builds on machine learning principles to decode different languages in real time. We suggest that the use of Talk2All will tremendously enhance communication in the hospital setting, effectively breaking the language barrier. We propose that vigorous incorporation of Talk2All shall overcome practical challenges in international medical and surgical tourism.
Keywords: language translation, communication, machine learning, medical tourism
Procedia PDF Downloads 214
762 D3Advert: Data-Driven Decision Making for Ad Personalization through Personality Analysis Using BiLSTM Network
Authors: Sandesh Achar
Abstract:
Personalized advertising holds greater potential for higher conversion rates compared to generic advertisements. However, its widespread application in the retail industry faces challenges due to complex implementation processes, and these complexities impede the swift adoption of personalized advertisement on a large scale. Personalized advertisement, being a data-driven approach, necessitates consumer-related data, adding to its complexity. This paper introduces an innovative data-driven decision-making framework, D3Advert, which personalizes advertisements by analyzing personalities using a BiLSTM network. The framework utilizes the Myers-Briggs Type Indicator (MBTI) dataset for development. The employed BiLSTM network, specifically designed and optimized for D3Advert, classifies user personalities into one of the sixteen MBTI categories based on their social media posts. The classification accuracy is 86.42%, with precision, recall, and F1-Score values of 85.11%, 84.14%, and 83.89%, respectively. The D3Advert framework personalizes advertisements based on these personality classifications. Experimental implementation and performance analysis of D3Advert demonstrate a 40% improvement in impressions. D3Advert's innovative and straightforward approach has the potential to transform personalized advertising and foster widespread adoption of personalized advertisement in marketing.
Keywords: personalized advertisement, deep learning, MBTI dataset, BiLSTM network, NLP
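For illustration, a skeleton of a BiLSTM text classifier for the sixteen MBTI classes in Keras; the layer sizes, vocabulary, and sequence length are assumptions, since the paper's tuned architecture is not reproduced here:

```python
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE, SEQ_LEN, NUM_CLASSES = 20000, 200, 16  # assumed hyperparameters

model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN,)),
    layers.Embedding(VOCAB_SIZE, 128),                # token ids -> vectors
    layers.Bidirectional(layers.LSTM(64)),            # read posts both ways
    layers.Dropout(0.3),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # 16 MBTI categories
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(token_ids, mbti_labels, validation_split=0.2, epochs=5)
```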
Procedia PDF Downloads 44
761 A Novel Approach to Design and Implement Context Aware Mobile Phone
Authors: G. S. Thyagaraju, U. P. Kulkarni
Abstract:
Context-aware computing refers to a general class of computing systems that can sense their physical environment and adapt their behaviour accordingly. Context-aware computing makes systems aware of situations of interest, enhances services to users, automates systems, and personalizes applications. Context-aware services have been introduced into mobile devices such as PDAs and mobile phones. In this paper, we present a novel approach used to realize a context-aware mobile phone. The context-aware mobile phone (CAMP) proposed in this paper senses the user's situation automatically and provides the services required by the user's context. The proposed system is developed using artificial intelligence techniques, namely Bayesian networks, fuzzy logic, and a decision table based on rough set theory: a Bayesian network classifies incoming calls (high-priority, low-priority, and unknown calls), fuzzy linguistic variables and membership degrees define the context situations, and decision-table-based rules drive the service recommendation. To exemplify and demonstrate the effectiveness of the proposed methods, the context-aware mobile phone is tested for a college campus scenario including different locations like the library, class room, meeting room, administrative building, and college canteen.
Keywords: context aware mobile, fuzzy logic, decision table, Bayesian probability
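For illustration, a toy sketch of the decision-table layer mapping a sensed context and call-priority class to a recommended service; the rules below are illustrative placeholders, whereas the paper derives its rules from rough set theory:

```python
DECISION_TABLE = {
    # (location, call_priority) -> recommended service
    ("library",      "high"): "vibrate",
    ("library",      "low"):  "silent + SMS auto-reply",
    ("class_room",   "high"): "vibrate",
    ("class_room",   "low"):  "silent",
    ("meeting_room", "high"): "divert to voicemail",
    ("canteen",      "low"):  "ring",
}

def recommend(location, call_priority):
    """Look up a service; fall back to normal ringing for uncovered cases."""
    return DECISION_TABLE.get((location, call_priority), "ring")

print(recommend("library", "high"))   # vibrate
print(recommend("canteen", "high"))   # ring (default)
```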
Procedia PDF Downloads 365
760 Interaction between Cognitive Control and Language Processing in Non-Fluent Aphasia
Authors: Izabella Szollosi, Klara Marton
Abstract:
Aphasia can be defined as a weakness in accessing linguistic information. Accessing linguistic information is strongly related to information processing, which in turn is associated with the cognitive control system. According to the literature, a deficit in the cognitive control system interferes with language processing and contributes to non-fluent speech performance. The aim of our study was to explore this hypothesis by investigating how cognitive control interacts with language performance in participants with non-fluent aphasia. Cognitive control is a complex construct that includes working memory (WM) and the ability to resist proactive interference (PI). Based on previous research, we hypothesized that impairments in domain-general (DG) cognitive control abilities have negative effects on language processing. In contrast, better DG cognitive control functioning supports goal-directed behavior in language-related processes as well. Since stroke itself might slow down information processing, it is important to examine its negative effects on both cognitive control and language processing. Participants (N=52) in our study were individuals with non-fluent Broca's aphasia (N=13), individuals with transcortical motor aphasia (N=13), individuals with stroke damage without aphasia (N=13), and unimpaired speakers (N=13). All participants performed various computer-based tasks targeting cognitive control functions, such as WM and resistance to PI, in both linguistic and non-linguistic domains. Non-linguistic tasks targeted primarily DG functions, while linguistic tasks targeted more domain-specific (DS) processes. The results showed that participants with Broca's aphasia differed from the other three groups in the non-linguistic tasks: they performed significantly worse even in the baseline conditions. In contrast, we found a different performance profile in the linguistic domain, where the control group differed from all three stroke-related groups. The three groups with impairment performed more poorly than the controls but similarly to each other in the verbal baseline condition. In the more complex verbal PI condition, however, participants with Broca's aphasia performed significantly worse than all the other groups. Participants with Broca's aphasia demonstrated the most severe language impairment and the highest vulnerability in tasks measuring DG cognitive control functions. The results support the notion that the more severe the cognitive control impairment, the more severe the aphasia. Thus, our findings suggest a strong interaction between cognitive control and language. Individuals with the most severe and most general cognitive control deficit, participants with Broca's aphasia, showed the most severe language impairment. Individuals with better DG cognitive control functions demonstrated better language performance. While all participants with stroke damage showed impaired cognitive control functions in the linguistic domain, participants with better language skills also performed better in tasks that measured non-linguistic cognitive control functions. The overall results indicate that the level of cognitive control deficit interacts with language functions in individuals along the language spectrum (from severe to no impairment). However, future research is needed to determine any directionality.
Keywords: cognitive control, information processing, language performance, non-fluent aphasia
Procedia PDF Downloads 122
759 Design, Construction, Validation and Use of a Novel Portable Fire Effluent Sampling Analyser
Authors: Gabrielle Peck, Ryan Hayes
Abstract:
Current large-scale fire tests focus on flammability and heat release measurements. Smoke toxicity isn't considered, despite it being a leading cause of death and injury in unwanted fires. A key reason could be that the practical difficulties associated with quantifying the individual toxic components present in a fire effluent often require specialist equipment and expertise. Fire effluent contains a mixture of unreactive and reactive gases, water, organic vapours, and particulate matter, which interact with each other. This interferes with the operation of the analytical instrumentation and must be removed without changing the concentration of the target analyte. To mitigate the need for expensive equipment and time-consuming analysis, a portable gas analysis system was designed, constructed, and tested for use in large-scale fire tests as a simpler and more robust alternative to online FTIR measurements. The novel equipment aimed to be easily portable and able to run on battery or mains electricity; be able to be calibrated at the test site; be capable of quantifying CO, CO2, O2, HCN, HBr, HCl, NOx, and SO2 accurately and reliably; be capable of independent data logging; be capable of automated switchover of 7 bubblers; be able to withstand fire effluents; be simple to operate; allow individual bubbler times to be pre-set; and be capable of being controlled remotely. To test the analyser's functionality, it was used alongside the ISO/TS 19700 Steady State Tube Furnace (SSTF). A series of tests was conducted to assess the validity of the box analyser measurements and the data-logging abilities of the apparatus. PMMA and PA 6.6 were used to assess the validity of the box analyser measurements, and the data obtained from the bench-scale assessments showed excellent agreement. Following this, the portable analyser was used to monitor gas concentrations during large-scale testing using the ISO 9705 room corner test. The analyser was set up, calibrated, and set to record smoke toxicity measurements in the doorway of the test room. The analyser operated without manual interference and successfully recorded data for 12 of the 12 tests conducted in the ISO room tests. At the end of each test, the analyser created a data file (formatted as .csv) containing the measured gas concentrations throughout the test, which does not require specialist knowledge to interpret. This validated the portable analyser's ability to monitor fire effluent without operator intervention on both the bench scale and the large scale. The portable analyser is a validated and significantly more practical alternative to FTIR, proven to work for large-scale fire testing for the quantification of smoke toxicity. The analyser is a cheaper, more accessible option for assessing smoke toxicity, mitigating the need for expensive equipment and specialist operators.
Keywords: smoke toxicity, large-scale tests, ISO 9705, analyser, novel equipment
Procedia PDF Downloads 77
758 Comparative Study between Inertial Navigation System and GPS in Flight Management System Application
Authors: Othman Maklouf, Matouk Elamari, M. Rgeai, Fateh Alej
Abstract:
In modern avionics, the main fundamental component is the flight management system (FMS). An FMS is a specialized computer system that automates a wide variety of in-flight tasks, reducing the workload on the flight crew to the point that modern civilian aircraft no longer carry flight engineers or navigators. The main function of the FMS is the in-flight management of the flight plan, using various sensors such as the Global Positioning System (GPS) and the Inertial Navigation System (INS) to determine the aircraft's position and guide the aircraft along the flight plan. GPS is a satellite-based navigation system, while an INS generally consists of inertial sensors (accelerometers and gyroscopes). GPS is used to locate positions anywhere on earth; it consists of satellites, control stations, and receivers. GPS receivers take the information transmitted from the satellites and use triangulation to calculate a user's exact location. The basic principle of an INS is the integration of the accelerations observed by the accelerometers on board the moving platform; the system accomplishes this task through appropriate processing of the data obtained from the specific force and angular velocity measurements. Thus, an appropriately initialized inertial navigation system is capable of continuous determination of vehicle position, velocity, and attitude without the use of external information. The main objective of this article is to introduce a comparative study between the two systems under different conditions and scenarios, using MATLAB with SIMULINK software.
Keywords: flight management system, GPS, IMU, inertial navigation system
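For illustration, a one-dimensional sketch of the INS principle stated above, integrating measured acceleration once for velocity and again for position; a real strapdown INS also rotates the specific force into the navigation frame and removes gravity, steps omitted here:

```python
import numpy as np

dt = 0.01                         # 100 Hz IMU sampling interval
accel = np.full(1000, 0.2)        # 10 s of constant 0.2 m/s^2 (assumed data)

velocity = np.cumsum(accel) * dt      # first integration: velocity
position = np.cumsum(velocity) * dt   # second integration: position

print(velocity[-1])   # ~2.0 m/s
print(position[-1])   # ~10.0 m (0.5 * a * t^2)
```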
Procedia PDF Downloads 299
757 Video Object Segmentation for Automatic Image Annotation of Ethernet Connectors with Environment Mapping and 3D Projection
Authors: Marrone Silverio Melo Dantas, Pedro Henrique Dreyer, Gabriel Fonseca Reis de Souza, Daniel Bezerra, Ricardo Souza, Silvia Lins, Judith Kelner, Djamel Fawzi Hadj Sadok
Abstract:
The creation of a dataset is time-consuming and often discourages researchers from pursuing their goals. To overcome this problem, we present and discuss two solutions adopted for the automation of this process. Both optimize valuable user time and resources, and both support video object segmentation with object tracking and 3D projection. In our scenario, we acquire images from a moving robotic arm and, for each approach, generate distinct annotated datasets. We evaluated the precision of the annotations by comparing them with a manually annotated dataset, as well as their efficiency in the context of detection and classification problems. For detection support, we used YOLO and obtained, for the projection dataset, F1-Score, accuracy, and mAP values of 0.846, 0.924, and 0.875, respectively. Concerning the tracking dataset, we achieved an F1-Score of 0.861 and an accuracy of 0.932, while mAP reached 0.894. In order to evaluate the quality of the annotated images used for classification problems, we employed deep learning architectures, adopting the accuracy and F1-Score metrics for VGG, DenseNet, MobileNet, Inception, and ResNet. The VGG architecture outperformed the others for both the projection and tracking datasets: it reached an accuracy and F1-Score of 0.997 and 0.993, respectively, on the projection dataset, while for the tracking dataset it achieved an accuracy of 0.991 and an F1-Score of 0.981.
Keywords: RJ45, automatic annotation, object tracking, 3D projection
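For illustration, a minimal sketch of the detection metrics reported above, computed from raw true-positive, false-positive, and false-negative counts; the counts are made up, and a full YOLO evaluation additionally thresholds matches on IoU:

```python
def precision_recall_f1(tp, fp, fn):
    """Standard detection metrics from raw match counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

p, r, f1 = precision_recall_f1(tp=846, fp=120, fn=154)  # illustrative counts
print(f"precision={p:.3f} recall={r:.3f} F1={f1:.3f}")
```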
Procedia PDF Downloads 167
756 Modification of Magneto-Transport Properties of Ferrimagnetic Mn₄N Thin Films by Ni Substitution and Their Magnetic Compensation
Authors: Taro Komori, Toshiki Gushi, Akihito Anzai, Taku Hirose, Kaoru Toko, Shinji Isogami, Takashi Suemasu
Abstract:
Ferrimagnetic antiperovskite Mn₄₋ₓNiₓN thin films exhibit both a small saturation magnetization and a rather large perpendicular magnetic anisotropy (PMA) when x is small. Both are suitable features for application to current-induced domain-wall (DW) motion devices using spin-transfer torque (STT). In this work, we successfully grew 30-nm-thick antiperovskite Mn₄₋ₓNiₓN epitaxial thin films on MgO(001) and STO(001) substrates by MBE in order to investigate their crystalline qualities and their magnetic and magneto-transport properties. The crystalline qualities were investigated by X-ray diffraction (XRD). The magnetic properties were measured by a vibrating sample magnetometer (VSM), and the anomalous Hall effect was measured with a physical properties measurement system; both measurements were performed at room temperature. The temperature dependence of the magnetization was measured by a VSM superconducting quantum interference device. The XRD patterns indicate epitaxial growth of the Mn₄₋ₓNiₓN thin films on both substrates; those on STO(001) especially have higher c-axis orientation thanks to greater lattice matching. According to the VSM measurements, PMA was observed in Mn₄₋ₓNiₓN on MgO(001) when x ≤ 0.25 and on STO(001) when x ≤ 0.5, and MS decreased drastically with x. For example, the MS of Mn₃.₉Ni₀.₁N on STO(001) was 47.4 emu/cm³. From the anomalous Hall resistivity (ρAH) of the Mn₄₋ₓNiₓN thin films on STO(001), measured with the magnetic field perpendicular to the plane, we found that Mr/MS was about 1 when x ≤ 0.25, which suggests large magnetic domains in the samples and features suitable for DW motion device applications. In contrast, such square curves were not observed for Mn₄₋ₓNiₓN on MgO(001), which we attribute to the difference in lattice matching. Furthermore, it is notable that although the sign of ρAH was negative when x = 0 and 0.1, it reversed to positive when x = 0.25 and 0.5. A similar reversal occurred in the temperature dependence of the magnetization: the magnetization of Mn₄₋ₓNiₓN on STO(001) increases with decreasing temperature when x = 0 and 0.1, while it decreases when x = 0.25. We consider that these reversals were caused by a magnetic compensation occurring in Mn₄₋ₓNiₓN between x = 0.1 and 0.25. We expect the Mn atoms of the Mn₄₋ₓNiₓN crystal to have larger magnetic moments than the Ni atoms. The temperature dependence stated above can be explained if we assume that the Ni atoms preferentially occupy the corner sites and that their magnetic moments have a different temperature dependence from the Mn atoms at the face-centered sites. At the compensation point, Mn₄₋ₓNiₓN is expected to show very efficient STT and ultrafast DW motion with a small current density. What's more, if angular momentum compensation is found, the efficiency will be best optimized. In order to prove the magnetic compensation, X-ray magnetic circular dichroism will be performed. Energy-dispersive X-ray spectrometry is a candidate method to analyze the accurate composition ratio of the samples.
Keywords: compensation, ferrimagnetism, Mn₄N, PMA
Procedia PDF Downloads 134
755 Deep Learning Application for Object Image Recognition and Robot Automatic Grasping
Authors: Shiuh-Jer Huang, Chen-Zon Yan, C. K. Huang, Chun-Chien Ting
Abstract:
Since vision system applications for autonomous purposes are in intense demand in industrial environments, image recognition has become an important research topic. Here, a deep learning algorithm is employed in an image system to recognize industrial objects, and the system is integrated with a 7A6 Series manipulator for automatic object-gripping tasks. A PC and a Graphics Processing Unit (GPU) are chosen to construct the 3D vision recognition system. A depth camera (Intel RealSense SR300) is employed to extract the images for object recognition and coordinate derivation. The YOLOv2 scheme is adopted as the convolutional neural network (CNN) structure for object classification and center-point prediction. Additionally, an image processing strategy is used to find the object contour for calculating the object orientation angle. Then, the specified object location and orientation information are sent to the robotic controller. Finally, the six-axis manipulator can grasp the specified object in a random environment based on the user command and the extracted image information. The experimental results show that YOLOv2 has been successfully employed to detect the object location and category with confidence near 0.9 and a 3D position error of less than 0.4 mm. It is useful for future intelligent robotic applications in Industry 4.0 environments.
Keywords: deep learning, image processing, convolutional neural network, YOLOv2, 7A6 series manipulator
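For illustration, a sketch of the orientation step described above, estimating an object's rotation angle from its contour with OpenCV's minimum-area rectangle; the synthetic image and parameters are assumptions, not the paper's pipeline:

```python
import cv2
import numpy as np

# Synthetic binary image containing one rotated rectangular "object".
img = np.zeros((200, 200), dtype=np.uint8)
box = cv2.boxPoints(((100, 100), (80, 30), 25.0)).astype(np.int32)
cv2.fillPoly(img, [box], 255)

# Find the object contour and fit a minimum-area rectangle to it.
contours, _ = cv2.findContours(img, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
largest = max(contours, key=cv2.contourArea)

(cx, cy), (w, h), angle = cv2.minAreaRect(largest)
print(f"grasp centre=({cx:.0f},{cy:.0f}), orientation={angle:.1f} deg")
```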
Procedia PDF Downloads 250
754 Effective Nutrition Label Use on Smartphones
Authors: Vladimir Kulyukin, Tanwir Zaman, Sarat Kiran Andhavarapu
Abstract:
Research on nutrition label use identifies four factors that impede the comprehension and retention of nutrition information by consumers: the label's location on the package, the presentation of information within the label, the label's surface size, and surrounding visual clutter. In this paper, a system is presented that makes nutrition label use more effective for nutrition information comprehension and retention. The system's front end is a smartphone application; its back end is a four-node Linux cluster for image recognition and data storage. Image frames captured on the smartphone are sent to the back end for skewed or aligned barcode recognition. When barcodes are recognized, the corresponding nutrition labels are retrieved from a cloud database and presented to the user on the smartphone's touchscreen. Each displayed nutrition label is positioned centrally on the touchscreen with no surrounding visual clutter. Wikipedia links to important nutrition terms are embedded to improve comprehension and retention of nutrition information. Standard touch gestures (e.g., zoom in/out) available on mainstream smartphones are used to manipulate the label's surface size. The nutrition label database currently includes 200,000 nutrition labels compiled from public web sites by a custom crawler. Stress-test experiments with the node cluster are presented, and implications for proactive nutrition management and food policy are discussed.
Keywords: mobile computing, cloud computing, nutrition label use, nutrition management, barcode scanning
Procedia PDF Downloads 373
753 Using Computational Fluid Dynamics (CFD) Modeling to Predict the Impact of Nuclear Reactor Mixed Tank Flows Using the Momentum Equation
Authors: Joseph Amponsah
Abstract:
This research proposes an equation to predict and determine the momentum source term after factoring in the radial friction between the fluid and the blades and the impeller's propulsive power. The research aims to examine how CFD software can be used, through a momentum source equation, to predict the effect of flows in nuclear reactor stirred tanks and the concentration distribution of tracers introduced into the reactor tanks. The estimated findings, including the dimensionless concentration curves, power and pumping numbers, dimensionless velocity profiles, and mixing times, were contrasted with results from tests in stirred containers. The investigation was carried out, in Part I, for vessels agitated by one impeller on a central shaft. The two types of impellers employed were an ordinary Rushton turbine and a 6-bladed 45° pitched-blade turbine. The simulations made use of multiple reference frame (MRF) techniques and the common k-ε turbulence model. The effect of the grid type was also examined: unstructured, structured, and unique user-defined grids were considered. The CFD model was used to simulate the flow field within the Rushton turbine nuclear reactor stirred tank, and the method was validated using experimental data that were available close to the impeller tip and in the bulk area. Additionally, analyses of the computational efficiency and time using the MRF and sliding mesh (SM) approaches were performed.
Keywords: ANSYS Fluent, momentum equation, CFD, prediction
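For illustration, the generic form into which an impeller is commonly modelled as a momentum source in the swept region, written in LaTeX; the paper's fitted expression for the source term (with radial friction and propulsive-power contributions) is not reproduced here:

```latex
% Incompressible momentum equation with an added impeller source term S_M,
% acting per unit volume in the region swept by the blades (a standard
% modelling device; the paper's specific closure for S_M is not shown).
\frac{\partial (\rho \mathbf{u})}{\partial t}
  + \nabla \cdot (\rho \mathbf{u}\mathbf{u})
  = -\nabla p + \nabla \cdot \boldsymbol{\tau} + \rho \mathbf{g}
  + \mathbf{S}_M
```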
Procedia PDF Downloads 79