Search results for: information processing model
27156 Testing Chat-GPT: An AI Application
Authors: Jana Ismail, Layla Fallatah, Maha Alshmaisi
Abstract:
ChatGPT, a cutting-edge language model built on the GPT-3.5 architecture, has garnered attention for its profound natural language processing capabilities, holding promise for transformative applications in customer service and content creation. This study delves into ChatGPT's architecture, aiming to comprehensively understand its strengths and potential limitations. Through systematic experiments across diverse domains, such as general knowledge and creative writing, we evaluated the model's coherence, context retention, and task-specific accuracy. While ChatGPT excels in generating human-like responses and demonstrates adaptability, occasional inaccuracies and sensitivity to input phrasing were observed. The study emphasizes the impact of prompt design on output quality, providing valuable insights for the nuanced deployment of ChatGPT in conversational AI and contributing to the ongoing discourse on the evolving landscape of natural language processing in artificial intelligence.
Keywords: artificial intelligence, ChatGPT, OpenAI, NLP
Procedia PDF Downloads 77
27155 Automatic Classification of Lung Diseases from CT Images
Authors: Abobaker Mohammed Qasem Farhan, Shangming Yang, Mohammed Al-Nehari
Abstract:
Pneumonia is a kind of lung disease that creates congestion in the chest. Such pneumonic conditions can lead to loss of life because of the severity of the congestion. Pneumonic lung disease is caused by viral pneumonia, bacterial pneumonia, or COVID-19-induced pneumonia. The early prediction and classification of such lung diseases help to reduce the mortality rate. In this paper, we propose an automatic Computer-Aided Diagnosis (CAD) system using a deep learning approach. The proposed CAD system takes raw computerized tomography (CT) scans of the patient's chest as input and automatically predicts the disease classification. We designed a Hybrid Deep Learning Algorithm (HDLA) to improve accuracy and reduce processing requirements. The raw CT scans are first pre-processed to enhance their quality for further analysis. We then apply a hybrid model that consists of automatic feature extraction and classification. A robust 2D Convolutional Neural Network (CNN) model extracts features automatically from each pre-processed CT image, producing an effective 1D feature vector per scan. The output of the 2D CNN model is then normalized using the Min-Max technique. The second step of the proposed hybrid model concerns training and classification using different classifiers. Simulation results on a publicly available dataset demonstrate the robustness and efficiency of the proposed model compared to state-of-the-art algorithms.
Keywords: CT scan, COVID-19, deep learning, image processing, lung disease classification
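A minimal sketch of the second stage described in this abstract: Min-Max normalization of the CNN-derived feature vectors followed by training different classifiers. The feature matrix, labels, and classifier choices below are placeholder assumptions standing in for the 1D features the 2D CNN would produce per CT scan; the paper's actual CNN, dataset, and classifiers are not reproduced.

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
features = rng.normal(size=(300, 128))      # placeholder 1D feature vectors (one per CT scan)
labels = rng.integers(0, 3, size=300)       # placeholder classes: viral / bacterial / COVID-19 pneumonia

X = MinMaxScaler().fit_transform(features)  # Min-Max normalization of the CNN outputs
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.2, random_state=0)

for name, clf in [("SVM", SVC()), ("RandomForest", RandomForestClassifier())]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```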
Procedia PDF Downloads 154
27154 Audio-Visual Co-Data Processing Pipeline
Authors: Rita Chattopadhyay, Vivek Anand Thoutam
Abstract:
Speech is the most acceptable means of communication, through which we can quickly exchange our feelings and thoughts. Quite often, people can communicate orally but cannot interact or work with computers or devices. It is easier and quicker to give speech commands than to type commands to computers, and likewise easier to listen to audio played from a device than to read output from it. Especially with robotics being an emerging market with applications in warehouses, the hospitality industry, consumer electronics, assistive technology, etc., speech-based human-machine interaction is emerging as a lucrative feature for robot manufacturers. Considering this, the objective of this paper is to design the "Audio-Visual Co-Data Processing Pipeline." This pipeline is an integrated version of automatic speech recognition, a natural language model for text understanding, object detection, and text-to-speech modules. Many deep learning models exist for each of these modules, but OpenVINO Model Zoo models are used because the OpenVINO toolkit covers both computer vision and non-computer-vision workloads across Intel hardware, maximizes performance, and accelerates application development. A speech command is given as input that specifies the target objects to be detected and the start and end times of the interval to extract from the video. Speech is converted to text using the QuartzNet automatic speech recognition model. A summary is extracted from the text using the natural language model Generative Pre-Trained Transformer-3 (GPT-3). Based on the summary, the relevant frames are extracted from the video, and the You Only Look Once (YOLO) object detection model detects objects in these extracted frames. Frame numbers that contain target objects (the objects specified in the speech command) are saved as text. Finally, this text (the frame numbers) is converted to speech using a text-to-speech model and played from the device. The project is developed for the 80 YOLO labels, and the user can extract frames based on one or two target labels; the pipeline can easily be extended to more than two target labels by making appropriate changes in the object detection module. The project supports four different speech command formats by including sample examples in the prompt used by the GPT-3 model, and a new command format can be supported by adding examples of that format to the prompt. This pipeline can be used in many projects, such as human-machine interfaces, human-robot interaction, and surveillance through speech commands. Any object detection project can be upgraded with this pipeline so that speech commands can be given and the output played from the device.
Keywords: OpenVINO, automatic speech recognition, natural language processing, object detection, text to speech
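A high-level sketch of the data flow described in this abstract. Every model call below is a stub with a placeholder return value; the paper's implementation uses OpenVINO Model Zoo models (QuartzNet for ASR, GPT-3 for the summary, YOLO for detection), whose APIs are not reproduced here, and the example command and frames are invented.

```python
def speech_to_text(audio_path):
    return "find the dog between second 10 and 12"     # placeholder ASR output

def summarize_command(text):
    return {"labels": ["dog"], "start": 10, "end": 12}  # placeholder GPT-3-style summary

def detect_objects(frame):
    return frame.get("objects", [])                     # placeholder detector output

def text_to_speech(text):
    print("[TTS]", text)                                # placeholder audio playback

def run_pipeline(audio_path, video_frames):
    command = speech_to_text(audio_path)
    summary = summarize_command(command)
    hits = [n for n in range(summary["start"], summary["end"] + 1)
            if set(summary["labels"]) & set(detect_objects(video_frames[n]))]
    text_to_speech("Target objects found in frames " + ", ".join(map(str, hits)))

video_frames = {10: {"objects": ["cat"]}, 11: {"objects": ["dog"]}, 12: {"objects": []}}
run_pipeline("command.wav", video_frames)               # -> "[TTS] Target objects found in frames 11"
```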
Procedia PDF Downloads 80
27153 Classification of Cochannel Signals Using Cyclostationary Signal Processing and Deep Learning
Authors: Bryan Crompton, Daniel Giger, Tanay Mehta, Apurva Mody
Abstract:
The task of classifying radio frequency (RF) signals has seen recent success with deep neural network models. In this work, we present a combined signal processing and machine learning approach to signal classification for cochannel anomalous signals. The power spectral density and cyclostationary signal processing features of a captured signal are computed and fed into a neural network to produce a classification decision. Our combined signal preprocessing and machine learning approach allows for simpler neural networks with fast training times and small computational resource requirements for inference, at the cost of longer preprocessing time.
Keywords: signal processing, machine learning, cyclostationary signal processing, signal classification
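A minimal sketch of the preprocess-then-classify idea: a power spectral density feature vector is computed for each captured signal and fed to a small neural network. The signals and labels are synthetic assumptions, and the cyclostationary features used in the paper are omitted; only the Welch PSD stage is shown.

```python
import numpy as np
from scipy.signal import welch
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
fs = 1e4
signals = rng.normal(size=(200, 4096))   # placeholder captured RF signals
labels = rng.integers(0, 2, size=200)    # placeholder: anomalous cochannel signal present or not

# PSD feature vector per signal (Welch estimate)
features = np.array([welch(s, fs=fs, nperseg=256)[1] for s in signals])
X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.25, random_state=1)

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```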
Procedia PDF Downloads 107
27152 Advances in Food Processing Using Extrusion Technology
Authors: Javeed Akhtar, R. K. Pandey, Z. R. Azaz Ahmad Azad
Abstract:
Extruded foods are produced from a wide range of food materials using single and twin extruders. Extrusion cooking is a useful and economical tool for the processing of novel foods. This high-temperature, short-time processing technology causes chemical and physical changes that alter the nutritional and physical quality of the product. Extrusion processing of food ingredients characteristically depends on interacting process conditions that influence the product qualities. The process parameters are optimized for the extrusion of food material in order to obtain the maximum nutritive value by inactivating anti-nutritional factors. Processing conditions such as moisture content, temperature, and time are controlled to avoid overheating or underheating, which would otherwise result in a product of lower nutritional quality.
Keywords: extrusion processing, single and twin extruder, operating condition of extruders and extruded novel foods, food and agricultural engineering
Procedia PDF Downloads 382
27151 Proposal of a Model Supporting Decision-Making on Information Security Risk Treatment
Authors: Ritsuko Kawasaki, Takeshi Hiromatsu
Abstract:
Management is required to understand all information security risks within an organization and to decide which risks should be treated, at what level, and at what cost. However, such decision-making is not usually easy, because various risk-treatment measures must be selected at suitable application levels. In addition, some measures may have objectives that conflict with each other, which also makes the selection difficult. Therefore, this paper provides a model that supports the selection of measures by applying multi-objective analysis to find an optimal solution. Additionally, a list of measures is provided to make the selection easier and more effective without omitting any measures.
Keywords: information security risk treatment, selection of risk measures, risk acceptance, multi-objective optimization
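A toy sketch of multi-objective screening for risk-treatment measures: each candidate has a cost and a risk-reduction effect, and Pareto-dominated options are filtered out before management picks among the remaining trade-offs. The measures and numbers are invented for illustration; the paper's actual optimization model is not reproduced.

```python
measures = {
    "access control":    {"cost": 40, "risk_reduction": 30},
    "staff training":    {"cost": 15, "risk_reduction": 20},
    "encryption":        {"cost": 30, "risk_reduction": 35},
    "redundant backups": {"cost": 50, "risk_reduction": 25},
}

def dominates(a, b):
    # a dominates b if it is no worse on both objectives and strictly better on at least one
    return (a["cost"] <= b["cost"] and a["risk_reduction"] >= b["risk_reduction"]
            and (a["cost"] < b["cost"] or a["risk_reduction"] > b["risk_reduction"]))

pareto = [name for name, m in measures.items()
          if not any(dominates(other, m) for o, other in measures.items() if o != name)]
print("Pareto-optimal measures:", pareto)   # ['staff training', 'encryption']
```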
Procedia PDF Downloads 379
27150 Foundation of the Information Model for Connected-Cars
Authors: Hae-Won Seo, Yong-Gu Lee
Abstract:
Recent progress in the next generation of automobile technology is geared towards incorporating information technology into cars. Collectively called smart cars, these vehicles bring intelligence to cars that provides comfort, convenience, and safety. One branch of smart cars is the connected-car system. The key concept in connected cars is the sharing of driving information among cars in a decentralized manner, enabling collective intelligence. This paper proposes a foundation for the information model that is necessary to define the driving information for smart cars. Road conditions are modeled through a unique data structure that unambiguously represents the time-variant traffic in the streets. Additionally, the modeled data structure is exemplified in a navigational usage scenario using UML. Optimal driving route searching under dynamically changing road conditions is also discussed using the proposed data structure.
Keywords: connected-car, data modeling, route planning, navigation system
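A sketch of one possible way to represent time-variant road conditions and search a fastest route over them (Dijkstra on arrival times). The road network, the hour-bucketed travel-time table, and the field names are illustrative assumptions, not the data structure defined in the paper.

```python
import heapq

# travel_times[(u, v)] maps a departure hour to the expected travel time in minutes
travel_times = {
    ("A", "B"): {8: 10, 9: 25},
    ("B", "C"): {8: 15, 9: 15},
    ("A", "C"): {8: 40, 9: 30},
}

def edge_cost(u, v, depart_minutes):
    hour = 8 + int(depart_minutes // 60)   # crude mapping of elapsed minutes (from 8:00) to an hour bucket
    return travel_times[(u, v)].get(hour, max(travel_times[(u, v)].values()))

def fastest_route(start, goal):
    queue, best = [(0.0, start, [start])], {}
    while queue:
        t, node, path = heapq.heappop(queue)
        if node == goal:
            return t, path
        if best.get(node, float("inf")) <= t:
            continue
        best[node] = t
        for (u, v) in travel_times:
            if u == node:
                heapq.heappush(queue, (t + edge_cost(u, v, t), v, path + [v]))
    return None

print(fastest_route("A", "C"))   # -> (25.0, ['A', 'B', 'C'])
```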
Procedia PDF Downloads 374
27149 Robust Image Design Based Steganographic System
Authors: Sadiq J. Abou-Loukh, Hanan M. Habbi
Abstract:
This paper presents a steganographic system that hides the transmitted information without arousing suspicion and illustrates how the level of secrecy can be increased by using cryptographic techniques. The proposed system first encrypts the image file with a one-time pad key and then encrypts the message to be hidden, so that encryption is followed by image embedding. A new image file is then created from the original image using the four-triangles operation, and the new image is processed by one of two image processing techniques. The two proposed processing techniques are thresholding and differential predictive coding (DPC). Afterwards, encryption and decryption keys are generated by a functional key generator; each generated key is used only once. The encrypted text is hidden in the places that are not used for image processing and key generation. The system has a high embedding rate (0.1875 characters/pixel) for true-color images (24-bit depth).
Keywords: encryption, thresholding, differential predictive coding, four triangles operation
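A small sketch of two standard building blocks mentioned in this abstract: one-time pad encryption of the secret text (XOR with a single-use random key) and differential predictive coding of a pixel row, where each value is replaced by its difference from the previous one. The sample message and pixel values are invented; the embedding locations, the four-triangles operation, and key management are not shown.

```python
import os
import numpy as np

message = b"secret order details"
key = os.urandom(len(message))                        # one-time pad: random key as long as the message
cipher = bytes(m ^ k for m, k in zip(message, key))
recovered = bytes(c ^ k for c, k in zip(cipher, key))
assert recovered == message

row = np.array([120, 122, 125, 125, 130], dtype=np.int16)   # one row of pixel intensities
dpc = np.concatenate(([row[0]], np.diff(row)))               # DPC residuals: first value, then differences
reconstructed = np.cumsum(dpc)                               # inverse transform recovers the row
assert np.array_equal(reconstructed, row)
print("cipher (hex):", cipher.hex(), "| DPC residuals:", dpc.tolist())
```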
Procedia PDF Downloads 493
27148 Automatic Algorithm for Processing and Analysis of Images from the Comet Assay
Authors: Yeimy L. Quintana, Juan G. Zuluaga, Sandra S. Arango
Abstract:
The comet assay is a method based on electrophoresis that is used to measure DNA damage in cells and has shown important results in the identification of substances posing a potential risk to the human population, such as innumerable physical, chemical, and biological agents. With this technique, it is possible to obtain comet-like images in which the tail corresponds to damaged fragments of the DNA. One of the main problems is that the images have uneven luminosity caused by the fluorescence microscope; they therefore require conditioning, after which the optimal comets in each sample must be identified and, finally, measured to determine the percentage of DNA damage. In this paper, we propose the design and implementation of software using MATLAB's Image Processing Toolbox that automates this image processing. The software selects the optimal comets and measures the parameters necessary to quantify the damage.
Keywords: artificial vision, comet assay, DNA damage, image processing
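An analogous Python sketch of the processing steps described here (the paper's software is written with MATLAB's Image Processing Toolbox): correct the uneven illumination, threshold the comets, and measure per-comet regions. The synthetic image is an assumption standing in for a fluorescence micrograph.

```python
import numpy as np
from skimage import filters, measure, morphology

rng = np.random.default_rng(2)
image = rng.normal(0.1, 0.02, (256, 256))
image[100:140, 60:180] += 0.6                       # a bright comet-like blob with a "tail"

background = filters.gaussian(image, sigma=50)       # estimate the uneven luminosity
corrected = image - background                       # flatten the illumination
mask = corrected > filters.threshold_otsu(corrected)
mask = morphology.remove_small_objects(mask, min_size=50)

for region in measure.regionprops(measure.label(mask)):
    length = region.bbox[3] - region.bbox[1]         # crude extent along the electrophoresis axis
    print(f"comet area={region.area}, length={length}")
```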
Procedia PDF Downloads 310
27147 Rheological Modeling for Shape-Memory Thermoplastic Polymers
Authors: H. Hosseini, B. V. Berdyshev, I. Iskopintsev
Abstract:
This paper presents a rheological model for producing shape-memory thermoplastic polymers. Shape-memory occurs as a result of internal rearrangement of the structural elements of a polymer. A non-linear viscoelastic model was developed that allows qualitative and quantitative prediction of the stress-strain behavior of shape-memory polymers during heating. This research was done to develop a technique to determine the maximum possible change in size of heat-shrinkable products during heating. The rheological model used in this work was particularly suitable for defining process parameters and constructive parameters of the processing equipment.
Keywords: elastic deformation, heating, shape-memory polymers, stress-strain behavior, viscoelastic model
Procedia PDF Downloads 321
27146 A Newspapers Expectations Indicator from Web Scraping
Authors: Pilar Rey del Castillo
Abstract:
This document describes the construction of an average indicator of the general sentiment about the future expressed in newspapers in Spain. The raw data are collected by scraping the Digital Periodical and Newspaper Library website. Basic natural language processing tools are then applied to the collected information to evaluate the sentiment strength of each word in the texts using a polarized dictionary. The last step consists of summarizing these sentiments to produce daily indices. The results are a first insight into the applicability of these techniques for producing periodic sentiment indicators.
Keywords: natural language processing, periodic indicator, sentiment analysis, web scraping
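A minimal sketch of the last two steps: score each word of a scraped text against a polarized dictionary and aggregate the scores into a daily index. The dictionary entries and articles are invented English examples, not the Spanish-language resources used in the study.

```python
import pandas as pd

polarity = {"growth": 1, "recovery": 1, "crisis": -1, "unemployment": -1}

articles = pd.DataFrame({
    "date": ["2023-05-01", "2023-05-01", "2023-05-02"],
    "text": ["signs of recovery and growth", "unemployment crisis deepens", "growth expected next year"],
})

def score(text):
    words = text.lower().split()
    return sum(polarity.get(w, 0) for w in words) / len(words)

articles["sentiment"] = articles["text"].apply(score)
daily_index = articles.groupby("date")["sentiment"].mean()   # one sentiment value per day
print(daily_index)
```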
Procedia PDF Downloads 132
27145 Assessment Of Factors Affecting Sustainability of Rice (Oryza sativa) Processing and Marketing in Ogun State, Nigeria
Authors: A. M. Omoare, O. O. Sofowora, W. O. Oyediran
Abstract:
The study was carried out to assess the factors affecting the sustainability of rice processing and marketing in Ogun State, Nigeria. A multi-stage sampling technique was used to select one hundred and twenty (120) respondents for the study. Descriptive statistics were used to address the objectives, while hypotheses were analyzed with the Pearson product-moment correlation. The results showed that most (85%) of the respondents were less than 50 years old and had been in the rice business for more than 6 years. The majority (66.67%) of the respondents obtained their capital from cooperative societies. All (100%) of the respondents relied on rice for household food security and as a source of income. However, efficient rice processing and marketing were affected by inadequate manpower capacity development and inputs. There was a positive and significant relationship between socio-economic characteristics and processing techniques (p < 0.05). It is hereby recommended that extension service providers introduce improved rice processing systems to the rice millers and traders in the study area.
Keywords: sustainability, rice processing, marketing, constraints, millers and traders
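A tiny sketch of the hypothesis test mentioned above: a Pearson product-moment correlation between a socio-economic score and a processing-technique score. The data are synthetic placeholders, not the survey responses.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
socio_economic = rng.normal(size=120)                          # placeholder composite scores
processing = 0.5 * socio_economic + rng.normal(scale=0.8, size=120)

r, p_value = pearsonr(socio_economic, processing)
print(f"r = {r:.2f}, p = {p_value:.4f}")                        # significance judged against p < 0.05
```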
Procedia PDF Downloads 392
27144 Customer Data Analysis Model Using Business Intelligence Tools in Telecommunication Companies
Authors: Monica Lia
Abstract:
This article presents a customer data analysis model using business intelligence tools for data modelling, transformation, visualization, and dynamic report building. The analysis of an organization's customers is based on information from the organization's transactional systems. The paper shows how to develop the data model starting from the data that companies hold in their own operational systems. The owned data can be transformed into useful information about customers using business intelligence tools. In a mature market, knowing the information inside the data and making forecasts for strategic decisions become more important. Business intelligence tools are used in business organizations as support for decision-making.
Keywords: customer analysis, business intelligence, data warehouse, data mining, decisions, self-service reports, interactive visual analysis, dynamic dashboards, use cases diagram, process modelling, logical data model, data mart, ETL, star schema, OLAP, data universes
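A sketch of the modelling idea behind the article: transform raw transactional records into a star-schema shape (a fact table plus a dimension table) that a BI tool can then query for dynamic reports. Table and column names are illustrative assumptions, not the article's actual model.

```python
import pandas as pd

transactions = pd.DataFrame({
    "customer": ["Ana", "Ana", "Bogdan"],
    "city": ["Bucharest", "Bucharest", "Cluj"],
    "service": ["mobile", "internet", "mobile"],
    "amount": [10.0, 25.0, 12.5],
    "date": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-01-07"]),
})

# dimension table: one row per customer, with a surrogate key
dim_customer = transactions[["customer", "city"]].drop_duplicates().reset_index(drop=True)
dim_customer["customer_id"] = dim_customer.index

# fact table: measures keyed to the dimension
fact_sales = transactions.merge(dim_customer, on=["customer", "city"])[
    ["customer_id", "service", "date", "amount"]
]
print(fact_sales.groupby("service")["amount"].sum())   # the kind of aggregate a dynamic report would show
```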
Procedia PDF Downloads 430
27143 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data
Authors: S. Nickolas, Shobha K.
Abstract:
The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or other optimized techniques, and hot-deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper, we propose a novel artificial neural network-based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering using competitive learning and a self-stabilizing mechanism in a dynamic environment without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing
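A simplified stand-in for the idea of cluster-based imputation: records are clustered on their observed attributes and each missing value is filled with the mean of its cluster. KMeans is used here purely for illustration; it is not the ART2 network proposed in the paper, and the data are synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)
data = rng.normal(size=(100, 3))
data[rng.random(data.shape) < 0.1] = np.nan            # introduce ~10% missing values

col_means = np.nanmean(data, axis=0)
filled = np.where(np.isnan(data), col_means, data)     # rough column-mean fill so clustering can run
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(filled)

imputed = data.copy()
for c in range(4):
    rows = clusters == c
    cluster_means = np.nanmean(data[rows], axis=0)     # per-cluster attribute means
    idx = np.where(np.isnan(data) & rows[:, None])
    imputed[idx] = cluster_means[idx[1]]               # fill each gap with its cluster's mean
print("remaining NaNs:", np.isnan(imputed).sum())
```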
Procedia PDF Downloads 274
27142 Neural Network Mechanisms Underlying the Combination Sensitivity Property in the HVC of Songbirds
Authors: Zeina Merabi, Arij Dao
Abstract:
The temporal order of information processing in the brain is an important code in many acoustic signals, including speech, music, and animal vocalizations. Despite its significance, surprisingly little is known about its underlying cellular mechanisms and network manifestations. In the songbird telencephalic nucleus HVC, a subset of neurons shows temporal combination sensitivity (TCS). These neurons show high temporal specificity, responding differently to distinct patterns of spectral elements and their combinations. HVC neuron types include basal-ganglia-projecting HVCX, forebrain-projecting HVCRA, and interneurons (HVCINT), each exhibiting distinct cellular, electrophysiological, and functional properties. In this work, we develop conductance-based neural network models connecting the different classes of HVC neurons via different wiring scenarios, aiming to explore possible neural mechanisms that orchestrate the combination sensitivity exhibited by HVCX neurons, as well as to replicate the in vivo firing patterns observed when TCS neurons are presented with various auditory stimuli. The ionic and synaptic currents for each class of neurons represented in our networks are based on pharmacological studies, rendering the networks biologically plausible. We present, for the first time, several realistic scenarios in which the different types of HVC neurons can interact to produce this behavior. The different networks highlight neural mechanisms that could potentially help to explain some aspects of combination sensitivity, including 1) interplay between inhibitory interneurons' activity and the post-inhibitory firing of HVCX neurons enabled by T-type Ca2+ and H currents, 2) temporal summation at the TCS site of time- and frequency-dependent synaptic inputs carrying opposing signals, and 3) reciprocal inhibitory and excitatory loops as a potent mechanism to encode information over many milliseconds. The result is a plausible network model characterizing auditory processing in HVC. Our next step is to test the predictions of the model.
Keywords: combination sensitivity, songbirds, neural networks, spatiotemporal integration
Procedia PDF Downloads 65
27141 Emotion Processing Differences Between People
Authors: Elif Unveren, Ozlem Bozkurt
Abstract:
Emotion processing happens when someone has a negative, stressful experience and gets over it in time, and it is a different experience for every person. Emotion processing can be characterised by intensity, awareness, coordination, speed, accuracy, and response, and it may vary depending on people's age, sex, and condition. Emotion processing shows different activation patterns in different brain regions. Activation is significantly higher in the right frontal areas. The highest activation happens in extended frontotemporal areas during the processing of happiness, sadness, and disgust. Those emotions also show widely distributed differences and are produced earlier than anger and fear. In different situations, the listed variables may be more or less important. Borderline personality disorder (BPD) is a condition that creates an unstable personality, sudden mood swings, and unpredictable actions. According to a study conducted with healthy people and people with BPD, there were significant differences in some categories of emotion processing, such as intensity, awareness, and accuracy. According to another study of emotion processing differences during puberty, conducted only with females between the ages of 11 and 17, it was observed that at different ages and hormone levels, different parts of the brain are used to perform a given task. Also, in a different study of children between the ages of 4 and 15, it was observed that older children processed emotions more intensely and expressed them to a greater extent; there was a significant increase regarding fear and disgust. To sum up, processing negative experiences is a unique experience for everybody, for many different reasons.
Keywords: age, sex, conditions, brain regions, emotion processing
Procedia PDF Downloads 85
27140 Possible Risks for Online Orders in the Furniture Industry - Customer and Entrepreneur Perspective
Authors: Justyna Żywiołek, Marek Matulewski
Abstract:
Data is information processed by enterprises, for primary and secondary purposes, as part of their processes. Thanks to this processing, the sales process takes place; in the case of the surveyed companies, sales take place online. However, this indirect form of contact with the customer causes many problems for both customers and furniture manufacturers. The article presents solutions to problems related to the analysis of data and information in the order fulfillment process and in orders sent to post-warranty service. The article also presents an analysis of threats to the security of this information, both for customers and for the enterprise.
Keywords: ordering furniture online, information security, furniture industry, enterprise security, risk analysis
Procedia PDF Downloads 48
27139 AgriFood Model in Ankara Regional Innovation Strategy
Authors: Coskun Serefoglu
Abstract:
The study aims to analyse how a traditional sector such as agri-food can be mobilized through regional innovation strategies. A principal component analysis, together with qualitative methods such as in-depth interviews, focus groups, and surveys, was employed to identify the priority sectors. An agri-food model was developed that includes both a linear model and an interactive model. The model consists of two main components: technological integration and agricultural extension, the latter based on the U.S. land-grant university approach, which is not a common practice in Turkey.
Keywords: regional innovation strategy, interactive model, agri-food sector, local development, planning, regional development
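A small sketch of the quantitative step mentioned above: a principal component analysis over sector-level indicators to help rank priority sectors. The sectors and indicator values are invented placeholders, not the Ankara data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

sectors = ["agri-food", "machinery", "tourism", "ICT"]
indicators = np.array([        # hypothetical columns: employment, exports, firm count, R&D intensity
    [8.0, 3.5, 420, 0.4],
    [5.5, 6.0, 210, 1.1],
    [6.5, 2.0, 300, 0.2],
    [2.0, 4.0,  90, 2.5],
])

scores = PCA(n_components=1).fit_transform(StandardScaler().fit_transform(indicators)).ravel()
for sector, score in sorted(zip(sectors, scores), key=lambda s: -s[1]):
    print(f"{sector}: {score:.2f}")   # first-component score as a crude priority ranking
```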
Procedia PDF Downloads 149
27138 A Hebbian Neural Network Model of the Stroop Effect
Authors: Vadim Kulikov
Abstract:
The classical Stroop effect is the phenomenon that it takes more time to name the ink color of a printed word if the word denotes a conflicting color than if it denotes the same color. Over the last 80 years, there have been many variations of the experiment revealing various mechanisms behind semantic, attentional, behavioral, and perceptual processing. The Stroop task is known to exhibit asymmetry. Reading the words out loud is hardly dependent on the ink color, but naming the ink color is significantly influenced by incongruent words. This asymmetry is reversed if, instead of naming the color, one has to point at a corresponding color patch. Other debated aspects are the notion of automaticity and how much of the effect is due to semantic interference and how much to response-stage interference. Is automaticity a continuous or an all-or-none phenomenon? There are many models and theories in the literature tackling these questions, which will be discussed in the presentation. None of them, however, seems to capture all the findings at once. A computational model is proposed which is based on the philosophical idea, developed by the author, that the mind operates as a collection of different information processing modalities, such as different sensory and descriptive modalities, which produce emergent phenomena through mutual interaction and coherence. This is the framework theory, where 'framework' attempts to generalize the concepts of modality, perspective, and 'point of view'. The architecture of this computational model consists of blocks of neurons, each block corresponding to one framework. In the simplest case there are four: visual color processing, text reading, speech production, and attention selection modalities. In experiments where button pressing or pointing is required, a corresponding block is added. In the beginning, the weights of the neural connections are mostly set to zero. The network is trained using Hebbian learning to establish connections (corresponding to 'coherence' in framework theory) between these different modalities. The amount of data fed into the network is supposed to mimic the amount of practice a human encounters; in particular, it is assumed that converting written text into spoken words is a more practiced skill than converting visually perceived colors into spoken color names. After training, the network performs the Stroop task. The reaction times are measured in a canonical way, as these are continuous-time recurrent neural networks (CTRNNs). The above-described aspects of the Stroop phenomenon, along with many others, are replicated. The model is similar to some existing connectionist models but, as will be discussed in the presentation, has many advantages: it predicts more data, and the architecture is simpler and biologically more plausible.
Keywords: connectionism, Hebbian learning, artificial neural networks, philosophy of mind, Stroop
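A minimal illustration of the Hebbian update used to wire modality blocks together: connections strengthen when a pre-synaptic and post-synaptic unit are active at the same time. The two "modalities" and their activation patterns are toy placeholders, not the CTRNN architecture of the model.

```python
import numpy as np

rng = np.random.default_rng(5)
eta = 0.1                                          # learning rate
W = np.zeros((4, 4))                               # weights from a text-reading block to a speech block

for _ in range(200):
    word = rng.integers(4)
    text_units = np.eye(4)[word]                   # one unit per written color word
    speech_units = np.eye(4)[word]                 # the matching spoken color name co-occurs
    W += eta * np.outer(speech_units, text_units)  # Hebbian rule: dW = eta * post * pre

print(np.round(W, 1))                              # strong diagonal: co-active pairs became "coherent"
```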
Procedia PDF Downloads 264
27137 Physical Activity and Cognitive Functioning Relationship in Children
Authors: Comfort Mokgothu
Abstract:
This study investigated the relation between information processing and fitness level in active (fit) and sedentary (unfit) children drawn from rural and urban areas in Botswana. It was hypothesized that fit children would display faster simple reaction times (SRT), choice reaction times (CRT), and movement times (SMT). Sixty third-grade children (7.0–9.0 years) were initially selected and, based upon fitness testing, 45 participated in the study (15 each of fit urban, unfit urban, and fit rural). All children completed anthropometric measures, skinfold testing, and submaximal cycle ergometer testing. The cognitive testing included SRT, CRT, SMT, choice movement time (CMT), and memory sequence length. Results indicated that the rural fit group exhibited faster SMT than the urban fit and unfit groups. For CRT, both fit groups were faster than the unfit group. Collectively, the study shows that the relationship between physical fitness and cognitive function observed amongst the elderly can tentatively be extended to the pediatric population. Physical fitness could be a factor in the speed at which we process information, including decision making, even in children.
Keywords: decision making, fitness, information processing, reaction time, cognition, movement time
Procedia PDF Downloads 145
27136 Quantification of the Variables of the Information Model for the Use of School Terminology from 1884 to 2014 in Dalmatia
Authors: Vinko Vidučić, Tanja Brešan Ančić, Marijana Tomelić Ćurlin
Abstract:
Prior to quantifying the variables of the information model for using school terminology in Croatia's region of Dalmatia from 1884 to 2014, the most relevant model variables had to be determined: historical circumstances, standard of living, education system, linguistic situation, and media. The research findings show that there was no significant transfer of the 1884 school terms into 1949 usage; likewise, the 1949 school terms were not widely used in 2014. On the other hand, the research revealed that the meaning of school terms changed over the decades. The quantification of the variables will serve as the groundwork for creating an information model for using school terminology in Dalmatia from 1884 to 2014 and for defining direct growth rates in further research.
Keywords: education system, historical circumstances, linguistic situation, media, school terminology, standard of living
Procedia PDF Downloads 215
27135 Designing a Model for Preparing Reports on the Automatic Earned Value Management Progress by the Integration of Primavera P6, SQL Database, and Power BI: A Case Study of a Six-Storey Concrete Building in Mashhad, Iran
Authors: Hamed Zolfaghari, Mojtaba Kord
Abstract:
Project planners and controllers are frequently faced with the challenge of inadequate software for the preparation of automatic project progress reports based on actual project information updates. They usually build dashboards in Microsoft Excel, which is local and not accessible online. Another shortcoming is that Excel is not linked to planning software such as Microsoft Project, which itself lacks the database required for data storage. This study aimed to propose a model for preparing automatic online project progress reports based on actual project information updates by integrating Primavera P6, an SQL database, and Power BI for a construction project. The designed model is applicable to project planners and controllers, enabling them to prepare project reports automatically and immediately after updating the project schedule with actual information. To develop the model, the data were entered into P6, and the information was stored in the SQL database. The proposed model can prepare a wide range of reports, such as earned value management, HR, financial, physical, and risk reports, automatically in the Power BI application. Furthermore, the reports can be published and shared online.
Keywords: Primavera P6, SQL, Power BI, EVM, integration management
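A sketch of the earned value management (EVM) figures such a report would compute from the schedule data stored in the database. The project values are invented; the formulas (SV, CV, SPI, CPI) are the standard EVM definitions, not the paper's specific report logic.

```python
planned_value = 120_000.0   # PV: budgeted cost of work scheduled to date
earned_value = 100_000.0    # EV: budgeted cost of work actually performed
actual_cost = 110_000.0     # AC: actual cost of work performed

schedule_variance = earned_value - planned_value   # SV = EV - PV
cost_variance = earned_value - actual_cost         # CV = EV - AC
spi = earned_value / planned_value                 # SPI = EV / PV (behind schedule if < 1)
cpi = earned_value / actual_cost                   # CPI = EV / AC (over budget if < 1)

print(f"SV={schedule_variance:,.0f}  CV={cost_variance:,.0f}  SPI={spi:.2f}  CPI={cpi:.2f}")
```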
Procedia PDF Downloads 108
27134 A Mixed Approach to Assess Information System Risk, Operational Risk, and Congolese Microfinance Institutions Performance
Authors: Alfred Kamate Siviri, Angelus Mafikiri Tsongo, Jean Robert Kala Kamdjoug
Abstract:
Digitalization and well-organized information systems have been identified as relevant measures to mitigate operational risks within organizations. Unfortunately, information systems come with new threats that can cause severe damage and rapid organizational lockout. This study aims to measure perceived information system risks and their effects on operational risks within microfinance institutions in the D.R. Congo. The factors influencing operational risk are also identified, and the link between operational risk, other risks, and performance is assessed. The study proposes a research model drawn from a combination of the resource-based view, dynamic capabilities, agency theory, the Information System Security Model, and social theories of risk. We therefore suggest adopting mixed-methods research with the aim of adding to the existing literature on perceived operational risk assessment and its link with other risks and performance, with a focus on IT risk.
Keywords: Democratic Republic of Congo, information system risk, microfinance performance, operational risk
Procedia PDF Downloads 224
27133 SVM-RBN Model with Attentive Feature Culling Method for Early Detection of Fruit Plant Diseases
Authors: Piyush Sharma, Devi Prasad Sharma, Sulabh Bansal
Abstract:
Diseases are fairly common in fruits and vegetables because of changing climatic and environmental circumstances. Crop diseases, which are frequently difficult to control, interfere with the growth and output of the crops. Accurate disease detection and timely disease control measures are required to guarantee high production standards and good quality. In India, apples are a common crop that may be afflicted by a variety of diseases on the fruit, stem, and leaves. Fungi, bacteria, and viruses trigger the early symptoms of leaf diseases. In order to assist farmers in taking the appropriate action, it is important to develop an automated system that can detect the type of illness. Using machine learning-based image processing, this research proposes a system that can automatically identify diseases in apple fruit and apple plants. To this end, the research utilizes a hybrid SVM-RBN model. As a consequence, the model produces results that are more effective in terms of accuracy, precision, recall, and F1 score, with respective values of 96%, 99%, 94%, and 93%.
Keywords: fruit plant disease, crop disease, machine learning, image processing, SVM-RBN
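An illustrative sketch of an SVM with an RBF kernel evaluated with the four metrics reported above. The "image features" and labels are synthetic placeholders; the attentive feature-culling step and the actual SVM-RBN hybrid are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

rng = np.random.default_rng(6)
features = rng.normal(size=(400, 64))                # placeholder leaf/fruit image features
labels = (features[:, 0] + 0.3 * rng.normal(size=400) > 0).astype(int)  # healthy vs diseased

X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.25, random_state=6)
pred = SVC(kernel="rbf").fit(X_tr, y_tr).predict(X_te)

print("accuracy :", accuracy_score(y_te, pred))
print("precision:", precision_score(y_te, pred))
print("recall   :", recall_score(y_te, pred))
print("F1 score :", f1_score(y_te, pred))
```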
Procedia PDF Downloads 64
27132 An Analysis of the Temporal Aspects of Visual Attention Processing Using Rapid Series Visual Processing (RSVP) Data
Authors: Shreya Borthakur, Aastha Vartak
Abstract:
This electroencephalogram (EEG) project on the Rapid Visual Serial Processing (RSVP) paradigm explores the temporal dynamics of visual attention processing in response to rapidly presented visual stimuli. The study builds upon previous research that used real-world images in RSVP tasks to understand the emergence of object representations in the human brain. The objectives of the research include investigating the differences in accuracy and reaction times between 5 Hz and 20 Hz presentation rates, as well as examining the prominent brain waves, particularly alpha and beta waves, associated with the attention task. The pre-processing and data analysis involve filtering the EEG data, creating epochs for target stimuli, and conducting statistical tests using MATLAB, the EEGLAB and Chronux toolboxes, and R. The results support the hypotheses, revealing higher accuracy at the slower presentation rate, faster reaction times for less complex targets, and the involvement of alpha and beta waves in attention and cognitive processing. This research sheds light on how short-term memory and cognitive control affect visual processing and could have practical implications in fields like education.
Keywords: RSVP, attention, visual processing, attentional blink, EEG
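An analogous Python sketch of the pre-processing steps described here (the study itself uses MATLAB, EEGLAB, Chronux, and R): band-pass filter a channel into the alpha band and cut epochs around target-stimulus onsets. The signal, sampling rate, and event times are synthetic assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250                                             # assumed sampling rate in Hz
t = np.arange(0, 60, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(7).normal(size=t.size)

b, a = butter(4, [8, 13], btype="bandpass", fs=fs)   # alpha band (8-13 Hz)
alpha = filtfilt(b, a, eeg)

events = np.array([5.0, 12.0, 20.0])                 # target onsets in seconds
epochs = np.stack([alpha[int((e - 0.2) * fs): int((e + 0.8) * fs)] for e in events])
print("epochs shape:", epochs.shape)                 # (n_events, samples per epoch)
```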
Procedia PDF Downloads 69
27131 Relation between Sensory Processing Patterns and Working Memory in Autistic Children
Authors: Abbas Nesayan
Abstract:
Background: In recent years, autism has received growing attention in both the public and research arenas. Autistic children show dysfunction in communication and socialization as well as repetitive and stereotyped behaviors. In addition, they clinically suffer from difficulties with attention, challenges with familiar behaviors, and sensory processing problems. Several variables are linked to sensory processing problems in autism; one of these variables is working memory. Working memory is part of executive function, which provides the ability necessary to complete multi-stage tasks. Method: This study used a correlational research method. After the entry criteria were determined, 50 children were selected according to a purposive sampling method. Dunn's Sensory Profile School Companion was used to assess sensory processing patterns, and the Behavior Rating Inventory of Executive Function (BRIEF) was used to assess working memory. The Pearson correlation coefficient and linear regression were used for data analysis. Results: The results showed a significant relationship between sensory processing patterns (low registration, sensory seeking, sensory sensitivity, and sensory avoiding) and working memory in autistic children. Conclusion: According to the findings, there is a significant relationship between sensory processing patterns and working memory, so interventions based on sensory processing could be used to improve working memory.
Keywords: sensory processing patterns, working memory, autism, autistic children
Procedia PDF Downloads 223
27130 Enhancing Word Meaning Retrieval Using FastText and Natural Language Processing Techniques
Authors: Sankalp Devanand, Prateek Agasimani, Shamith V. S., Rohith Neeraje
Abstract:
Machine translation has witnessed significant advancements in recent years, but the translation of languages with distinct linguistic characteristics, such as English and Sanskrit, remains a challenging task. This research presents the development of a dedicated English-to-Sanskrit machine translation model, aiming to bridge the linguistic and cultural gap between these two languages. Using a variety of natural language processing (NLP) approaches, including FastText embeddings, this research proposes a thorough method to improve word meaning retrieval. Data preparation, part-of-speech tagging, dictionary searches, and transliteration are all included in the methodology. The study also addresses the implementation of an interpreter pattern and uses a word similarity task to assess the quality of word embeddings. The experimental outcomes show how the suggested approach may be used to enhance word meaning retrieval tasks with greater efficacy, accuracy, and adaptability. Evaluation of the model's performance is conducted through rigorous testing, comparing its output against existing machine translation systems. The assessment includes quantitative metrics such as BLEU scores, METEOR scores, Jaccard Similarity, etc.
Keywords: machine translation, English to Sanskrit, natural language processing, word meaning retrieval, fastText embeddings
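A tiny sketch of the word-similarity check mentioned above: cosine similarity between embedding vectors. The 4-dimensional vectors are made-up placeholders standing in for FastText embeddings; real vectors would simply be loaded into the same kind of lookup.

```python
import numpy as np

embeddings = {
    "king":  np.array([0.8, 0.1, 0.6, 0.2]),
    "queen": np.array([0.7, 0.2, 0.6, 0.3]),
    "river": np.array([0.1, 0.9, 0.0, 0.4]),
}

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

for w in ("queen", "river"):
    print(f"similarity(king, {w}) = {cosine(embeddings['king'], embeddings[w]):.2f}")
```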
Procedia PDF Downloads 44
27129 A Review on Cloud Computing and Internet of Things
Authors: Sahar S. Tabrizi, Dogan Ibrahim
Abstract:
Cloud Computing is a convenient model for on-demand network access to shared pools of configurable virtual computing resources, such as servers, networks, storage devices, applications, etc. The cloud serves as an environment in which companies and organizations can use infrastructure resources without making any purchases, accessing such resources wherever and whenever they need them. Cloud Computing is useful for overcoming a number of problems in various Information Technology (IT) domains such as Geographical Information Systems (GIS), Scientific Research, e-Governance Systems, Decision Support Systems, ERP, Web Application Development, Mobile Technology, etc. Companies can use Cloud Computing services to store large amounts of data that can be accessed from anywhere on Earth at any time. Such services are rented by client companies, where the actual rent depends upon the amount of data stored in the cloud and the amount of processing power used in a given time period. The resources offered by the cloud service companies are flexible in the sense that user companies can increase or decrease their storage or processing power requirements at any time, thus minimizing the overall rental cost of the service they receive. In addition, Cloud Computing service providers offer fast processors and application software that can be shared by their clients. This is especially important for small companies with limited budgets that cannot afford to purchase their own expensive hardware and software. This paper is an overview of Cloud Computing, covering its types, principles, advantages, and disadvantages. In addition, the paper gives some example engineering applications of Cloud Computing and makes suggestions for possible future applications in the field of engineering.
Keywords: cloud computing, cloud systems, cloud services, IaaS, PaaS, SaaS
Procedia PDF Downloads 233
27128 How Western Donors Allocate Official Development Assistance: New Evidence From a Natural Language Processing Approach
Authors: Daniel Benson, Yundan Gong, Hannah Kirk
Abstract:
Advancements in natural language processing techniques have increased data processing speeds and reduced the need for the cumbersome, manual data processing that is often required when processing data from multilateral organizations for specific purposes. Using named entity recognition (NER) modeling and the Organisation for Economic Co-operation and Development (OECD) Creditor Reporting System database, we present the first geotagged dataset of OECD donor Official Development Assistance (ODA) projects on a global, subnational basis. The resulting data contain 52,086 ODA projects geocoded to subnational locations across 115 countries, worth a combined $87.9bn. This represents the first global OECD donor ODA project database with geocoded projects. We use these new data to revisit old questions of how 'well' donors allocate ODA to the developing world. This understanding is imperative for policymakers seeking to improve ODA effectiveness.
Keywords: international aid, geocoding, subnational data, natural language processing, machine learning
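A sketch of the core NER step: extract place-name entities from a project description so they can later be passed to a geocoder and mapped to subnational locations. The model name and example text are illustrative assumptions; the study's own NER model and the CRS records are not reproduced.

```python
import spacy

nlp = spacy.load("en_core_web_sm")                   # assumes this small English model is installed
text = "Rehabilitation of rural roads in Mwanza Region and Dodoma, Tanzania."

doc = nlp(text)
places = [ent.text for ent in doc.ents if ent.label_ in ("GPE", "LOC")]
print(places)                                        # candidate locations to send to a geocoder
```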
Procedia PDF Downloads 78
27127 Comparison of the Logistic and the Gompertz Growth Functions Considering a Periodic Perturbation in the Model Parameters
Authors: Avan Al-Saffar, Eun-Jin Kim
Abstract:
Both the logistic growth model and the Gompertz growth model are used to describe growth processes. Both models, driven by perturbations in different cases, are investigated using information theory as a useful measure of sustainability and variability. Specifically, we study the effect of different oscillatory modulations of the system's parameters on the evolution of the system and its probability density function (PDF). We show that the influence of the initial conditions is maintained for a long time. We offer a Fisher information analysis under positive and/or negative feedback and explain its implications for the sustainability of population dynamics. We also display a finite-amplitude solution due to a purely fluctuating growth rate, whereas periodic fluctuations in negative feedback can break down the system's self-regulation, leading to an exponentially growing solution. In the cases tested, the Gompertz and logistic systems show similar behaviour in terms of information and sustainability, although they develop differently in time.
Keywords: dynamical systems, Fisher information, probability density function (PDF), sustainability
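A sketch of the two growth models with a periodically modulated growth rate, integrated side by side. The parameter values and modulation depth are arbitrary choices for illustration, not those analysed in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

K, r0, eps, omega = 1.0, 1.0, 0.3, 2.0         # carrying capacity, base rate, modulation depth, frequency

def r(t):
    return r0 * (1 + eps * np.sin(omega * t))  # periodic perturbation of the growth rate

def logistic(t, x):
    return r(t) * x * (1 - x / K)              # dx/dt = r(t) x (1 - x/K)

def gompertz(t, x):
    return r(t) * x * np.log(K / x)            # dx/dt = r(t) x ln(K/x)

t_eval = np.linspace(0, 10, 200)
sol_log = solve_ivp(logistic, (0, 10), [0.01], t_eval=t_eval)
sol_gom = solve_ivp(gompertz, (0, 10), [0.01], t_eval=t_eval)
print("final values -> logistic:", sol_log.y[0, -1], "Gompertz:", sol_gom.y[0, -1])
```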
Procedia PDF Downloads 431