Search results for: information processing model
26662 Model-Based Field Extraction from Different Class of Administrative Documents
Authors: Jinen Daghrir, Anis Kricha, Karim Kalti
Abstract:
The amount of incoming administrative documents is massive, and manually processing these documents is a costly task, especially on the timescale. This problem has driven a substantial amount of research and development on automatically extracting fields from administrative documents, in order to reduce costs and increase citizen satisfaction with administrations. In this matter, we introduce an administrative document understanding system. Given a document in which a user selects the fields to be retrieved for a document class, a document model is built automatically. A document model is represented by an attributed relational graph (ARG), where nodes represent fields to extract and edges represent the relations between them. Both vertices and edges carry feature vectors. When another document arrives at the system, its layout objects are extracted and an ARG is generated. Field extraction is then translated into a problem of matching two ARGs, which relies mainly on comparing the spatial relationships between layout objects. Experimental results yield accuracy rates from 75% to 100%, tested on eight document classes. Our proposed method performs well given that the document model is constructed from only a single document.
Keywords: administrative document understanding, logical labelling, logical layout analysis, fields extraction from administrative documents
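As a rough sketch of the matching step (Python; the centroid-offset similarity, the names, and the brute-force search are illustrative assumptions, not the authors' implementation), ARG-based field extraction might look like this:

    import itertools

    def edge_vector(graph, a, b):
        # Spatial relation between two layout objects, here the centroid offset (dx, dy).
        (xa, ya), (xb, yb) = graph[a], graph[b]
        return (xb - xa, yb - ya)

    def match_cost(model, doc, assignment):
        # Sum of squared differences between corresponding edge vectors.
        cost = 0.0
        for (a, b) in itertools.combinations(model, 2):
            va = edge_vector(model, a, b)
            vb = edge_vector(doc, assignment[a], assignment[b])
            cost += (va[0] - vb[0]) ** 2 + (va[1] - vb[1]) ** 2
        return cost

    def best_assignment(model, doc):
        # Brute-force search over injective mappings; fine for a handful of fields.
        best, best_cost = None, float("inf")
        for perm in itertools.permutations(doc, len(model)):
            assignment = dict(zip(model, perm))
            cost = match_cost(model, doc, assignment)
            if cost < best_cost:
                best, best_cost = assignment, cost
        return best

    # model: field name -> centroid of the field the user selected in the reference document
    model = {"name": (100, 50), "date": (400, 50), "amount": (400, 300)}
    # doc: layout-object id -> centroid extracted from the incoming document
    doc = {"obj1": (110, 60), "obj2": (410, 55), "obj3": (405, 310), "obj4": (50, 400)}
    print(best_assignment(model, doc))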
Procedia PDF Downloads 210
26661 Consortium Blockchain-based Model for Data Management Applications in the Healthcare Sector
Authors: Teo Hao Jing, Shane Ho Ken Wae, Lee Jin Yu, Burra Venkata Durga Kumar
Abstract:
Current distributed healthcare systems face the challenge of interoperability of health data. Storing electronic health records (EHR) in local databases causes them to be fragmented, a problem aggravated as patients visit multiple healthcare providers over their lifetime. Existing solutions are unable to solve this issue and have burdened healthcare specialists and patients alike. Blockchain technology was found to increase the interoperability of health data by implementing digital access rules, enabling a uniform patient identity, and providing data aggregation. Consortium blockchains were found to have high read throughput, to be more trustworthy and more secure against external disruptions, and to accommodate transactions without fees. Therefore, this paper proposes a blockchain-based model for data management applications. In this model, a consortium blockchain is implemented using delegated proof of stake (DPoS) as its consensus mechanism. This blockchain allows collaboration between users from different organizations such as hospitals and medical bureaus. Patients are the owners of their information, and users from other parties require authorization from the patient to view it. Hospitals upload the hash value of patients’ generated data to the blockchain, whereas the encrypted information is stored in distributed cloud storage.
Keywords: blockchain technology, data management applications, healthcare, interoperability, delegated proof of stake
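A minimal sketch of the hash-on-chain, data-off-chain pattern described above (Python; the ledger class, method names, and authorization scheme are hypothetical stand-ins, not the paper's DPoS implementation):

    import hashlib, json, time

    class ConsortiumLedger:
        """Mock ledger standing in for the DPoS consortium blockchain."""
        def __init__(self):
            self.blocks = []     # append-only list of hash records
            self.grants = set()  # (patient_id, requester_id) access grants

        def upload_hash(self, hospital_id, patient_id, record_bytes):
            digest = hashlib.sha256(record_bytes).hexdigest()
            self.blocks.append({"t": time.time(), "hospital": hospital_id,
                                "patient": patient_id, "hash": digest})
            return digest

        def authorize(self, patient_id, requester_id):
            # The patient, as owner of the data, grants read access to another party.
            self.grants.add((patient_id, requester_id))

        def verify(self, requester_id, patient_id, record_bytes):
            if (patient_id, requester_id) not in self.grants:
                raise PermissionError("patient has not authorized this requester")
            digest = hashlib.sha256(record_bytes).hexdigest()
            return any(b["patient"] == patient_id and b["hash"] == digest
                       for b in self.blocks)

    ledger = ConsortiumLedger()
    ehr = json.dumps({"patient": "p1", "result": "HbA1c 5.4%"}).encode()
    ledger.upload_hash("hospital_A", "p1", ehr)  # encrypted EHR itself goes to cloud storage
    ledger.authorize("p1", "bureau_B")
    print(ledger.verify("bureau_B", "p1", ehr))  # True: integrity plus authorization check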
Procedia PDF Downloads 136
26660 Information Needs and Information Usage of the Older Person Club’s Members in Bangkok
Authors: Siriporn Poolsuwan
Abstract:
This research explores the information needs, information usage, and problems of information usage of older person club members in Dusit District, Bangkok. There are 12 clubs and 746 club members in this district, and the results are intended to inform services for older persons there. Data were gathered from 252 club members using questionnaires. A quantitative approach was applied, using percentages, means, and standard deviations. The results are as follows: (1) the older people need information for entertainment, occupation, and academic purposes in the fields of short stories, computer work, and religion and morality; (2) the participants use information from various sources; (3) the main problem of information usage is language skill, owing to literacy problems among the older people.
Keywords: information behavior, older person, information seeking, knowledge discovery and data mining
Procedia PDF Downloads 268
26659 Multi-source Question Answering Framework Using Transformers for Attribute Extraction
Authors: Prashanth Pillai, Purnaprajna Mangsuli
Abstract:
Oil exploration and production companies invest considerable time and effort to extract essential well attributes (such as well status, surface and target coordinates, wellbore depths, and event timelines) from unstructured data sources like technical reports, which are often non-standardized, multimodal, and highly domain-specific by nature. It is also important to consider context when extracting attribute values from reports that contain information on multiple wells/wellbores. Moreover, semantically similar information may be depicted in different data syntax representations across multiple pages and document sources. We propose a hierarchical multi-source fact extraction workflow based on a deep learning framework to extract essential well attributes at scale. An information retrieval module based on the transformer architecture ranks relevant pages in a document source using page image embeddings and semantic text embeddings. A question answering framework utilizing the LayoutLM transformer then extracts attribute-value pairs, incorporating text semantics and layout information from the top relevant pages in a document. To better handle context in multi-well reports, we incorporate a dynamic query generation module to resolve ambiguities. The extracted attribute information from various pages and documents is standardized to a common representation using a parser module to facilitate information comparison and aggregation. Finally, we use a probabilistic approach to fuse information extracted from multiple sources into a coherent well record. The applicability of the proposed approach and its performance were studied on several real-life well technical reports.
Keywords: natural language processing, deep learning, transformers, information retrieval
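The retrieval stage can be pictured as cosine-similarity ranking over page embeddings (Python sketch; the random vectors stand in for the fused text+image page embeddings the paper's transformer module would produce, and the LayoutLM question-answering step is omitted):

    import numpy as np

    def rank_pages(query_vec, page_vecs, top_k=3):
        # Cosine similarity between the query embedding and each page embedding.
        q = query_vec / np.linalg.norm(query_vec)
        p = page_vecs / np.linalg.norm(page_vecs, axis=1, keepdims=True)
        scores = p @ q
        order = np.argsort(-scores)[:top_k]
        return [(int(i), float(scores[i])) for i in order]

    rng = np.random.default_rng(0)
    page_embeddings = rng.normal(size=(20, 384))  # placeholder embeddings for 20 report pages
    query_embedding = page_embeddings[7] + 0.1 * rng.normal(size=384)  # query resembling page 7
    print(rank_pages(query_embedding, page_embeddings))  # page 7 should rank first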
Procedia PDF Downloads 192
26658 Improving Grade Control Turnaround Times with In-Pit Hyperspectral Assaying
Authors: Gary Pattemore, Michael Edgar, Andrew Job, Marina Auad, Kathryn Job
Abstract:
As critical commodities become scarcer, significant time and resources have been spent on better understanding complicated ore bodies to extract their full potential. These challenging ore bodies present several pain points for geologists and engineers; poor handling of these issues flows downstream to the processing plant, affecting throughput rates and recovery. Many open-cut mines utilise blast hole drilling to extract additional information to feed back into the modelling process. This method requires samples to be collected during or after blast hole drilling; samples are then sent for assay, with turnaround times varying from 1 to 12 days. The method is time-consuming and costly, requires human exposure on the bench, and collects elemental data only. To address this challenge, research has been undertaken to utilise hyperspectral imaging across a broad spectrum to scan samples and collars, or to take down-hole measurements, for mineral and moisture content and grade abundances. Automating this process with unmanned vehicles and on-board processing reduces human in-pit exposure to ensure ongoing safety, and on-board processing allows data to be integrated into modelling workflows immediately. The preliminary results demonstrate numerous direct and indirect benefits from this new technology, including rapid and accurate grade estimates, moisture content, and mineralogy. These benefits allow faster geological model updates, better-informed mine scheduling, and improved downstream blending and processing practices. The paper presents recommendations for implementing the technology in open-cut mining environments.
Keywords: grade control, hyperspectral scanning, artificial intelligence, autonomous mining, machine learning
Procedia PDF Downloads 111
26657 The Use of TRIZ to Map the Evolutive Pattern of Products
Authors: Fernando C. Labouriau, Ricardo M. Naveiro
Abstract:
This paper presents a model for mapping the evolutive pattern of products in order to generate new ideas, perceive emerging technologies, and manage product portfolios in new product development (NPD). According to the proposed model, information extracted from the patent system is filtered and analyzed with TRIZ tools to produce the input information for the NPD process. The authors acknowledge that the NPD process is well integrated within enterprises' strategic business planning and that new products are vital in today's competitive market. On the other hand, the proactive use of patent information has been observed in several methodologies for selecting projects, mapping technological change, and generating product concepts. One of these methodologies is TRIZ, a theory created to foster innovation and improve product design, which provided the analytical framework for the model. Initially, an introduction to TRIZ is presented, focused mainly on the patterns of evolution of technical systems and their strategic uses; the description is brief and deliberately non-comprehensive, as the theory has several other tools widely employed in technical and business applications. Then, the model for mapping the product evolutive pattern is introduced, with its three basic pillars, namely patent information, TRIZ, and NPD, along with the methodology for implementation. Following this, a case study of a Brazilian bicycle manufacturer is presented, in which a product's evolutive pattern is mapped by decomposing and analyzing one of its assemblies along ten evolution lines in order to envision opportunities for further product development. Some of these lines are illustrated in more detail to evaluate the features of the product against the TRIZ concepts, comparing them with state-of-the-art patents to validate the product’s evolutionary potential. As a result, the case study provided several opportunities for a product improvement development program in different project categories, identifying technical and business impacts as well as indicating the lines of evolution that can most benefit from each opportunity.
Keywords: product development, patents, product strategy, systems evolution
Procedia PDF Downloads 500
26656 Model Driven Architecture Methodologies: A Review
Authors: Arslan Murtaza
Abstract:
Model Driven Architecture (MDA) is a technique presented by the OMG (Object Management Group) for software development, in which different models are proposed and then converted into code. The main idea is to specify the task using a PIM (Platform Independent Model), transform it into a PSM (Platform Specific Model), and then convert it into code. This review paper describes some challenges and issues faced in MDA, the types and transformations of models (e.g., CIM, PIM, and PSM), and an evaluation of MDA-based methodologies.
Keywords: OMG, model driven architecture (MDA), computation independent model (CIM), platform independent model (PIM), platform specific model (PSM), MDA-based methodologies
Procedia PDF Downloads 456
26655 Cirrhosis Mortality Prediction as Classification using Frequent Subgraph Mining
Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride
Abstract:
In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models were applied to predict one-year mortality, analyzing a comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) was used, and the Model for End-Stage Liver Disease (MELD) prediction of mortality served as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM together with ensemble learning further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients, and it builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning
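The baseline-versus-ensemble AUC comparison at the core of this evaluation can be sketched as follows (Python/scikit-learn; the data are synthetic stand-ins for the EHR features, the logistic baseline is only a proxy for a MELD-like score, and the FSM-derived features are omitted):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic, imbalanced stand-in for the cohort's feature space.
    X, y = make_classification(n_samples=2322, n_features=40, weights=[0.85], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    baseline = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)  # MELD-like scalar-score proxy
    ensemble = RandomForestClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

    for name, model in [("baseline", baseline), ("ensemble", ensemble)]:
        auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
        print(f"{name}: AUC = {auc:.3f}")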
Procedia PDF Downloads 132
26654 Using Optical Character Recognition to Manage the Unstructured Disaster Data into Smart Disaster Management System
Authors: Dong Seop Lee, Byung Sik Kim
Abstract:
In the 4th Industrial Revolution, various intelligent technologies have been developed in many fields, and these artificial intelligence technologies are applied in various services, including disaster management. Disaster information management does not just support disaster work; it is also the foundation of smart disaster management, and it draws on historical disaster information using artificial intelligence technology. Disaster information is one of the important elements of the entire disaster cycle. Disaster information management refers to the act of managing and processing electronic data about the disaster cycle, from occurrence to progress, response, and planning. However, information about status control, response, and recovery from natural and social disaster events is mainly managed in structured and unstructured reports that exist as handouts or hard copies. Such unstructured data are often lost or destroyed due to inefficient management, so it is necessary to manage unstructured data for disaster information. In this paper, an Optical Character Recognition (OCR) approach is used to convert handouts, hard copies, images, and reports, printed or generated by scanners, into electronic documents. The converted disaster data are then organized into the disaster code system as disaster information and stored in the disaster database system. Gathering and creating disaster information from unstructured data via OCR is an important element of smart disaster management. In this work, a character recognition rate of over 90% was achieved for Korean characters using an upgraded OCR. The recognition rate depends on the font, size, and special symbols of the characters; we improved it through a machine learning algorithm. The converted structured data are managed in a standardized disaster information form connected with the disaster code system, which ensures that structured information can be stored and retrieved across the entire disaster cycle, including historical disaster progress, damages, response, and recovery. The expected outcome of this research is applicability to smart disaster management and decision making by combining artificial intelligence technologies and historical big data.
Keywords: disaster information management, unstructured data, optical character recognition, machine learning
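The scan-then-code pipeline could look like the following sketch (Python with pytesseract, which requires the Tesseract binary plus its Korean language pack; the paper's own upgraded OCR engine and its full code table are not specified, so the engine choice and keyword-to-code mapping here are illustrative):

    import pytesseract
    from PIL import Image

    def scan_report(path, disaster_code_table):
        # OCR a scanned disaster report (Korean + English), then map recognised
        # keywords to the disaster code system before storing in the database.
        text = pytesseract.image_to_string(Image.open(path), lang="kor+eng")
        record = {"raw_text": text, "codes": []}
        for keyword, code in disaster_code_table.items():
            if keyword in text:
                record["codes"].append(code)
        return record

    # Hypothetical fragment of a code table: flood, typhoon, recovery phase.
    codes = {"홍수": "NAT-FLOOD", "태풍": "NAT-TYPHOON", "복구": "PHASE-RECOVERY"}
    print(scan_report("report_scan.png", codes))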
Procedia PDF Downloads 127
26653 Solar Power Forecasting for the Bidding Zones of the Italian Electricity Market with an Analog Ensemble Approach
Authors: Elena Collino, Dario A. Ronzio, Goffredo Decimi, Maurizio Riva
Abstract:
The rapid increase of renewable energy in Italy is led by wind and solar installations. The 2017 Italian energy strategy foresees further development of these sustainable technologies, especially solar. This has resulted in new opportunities, challenges, and problems to deal with. The growth of renewables makes it possible to meet European requirements on energy and environmental policy, but these sources are difficult to manage because they are intermittent and non-programmable. Operationally, these characteristics can lead to instability in the voltage profile and increasing uncertainty in energy reserve scheduling. The increasing renewable production must be considered with ever more attention, especially by the Transmission System Operator (TSO). The TSO, in fact, provides daily orders on energy dispatch, once the market outcome has been determined, over extended areas defined mainly on the basis of power transmission limitations. In Italy, six market zones are defined: Northern Italy, Central-Northern Italy, Central-Southern Italy, Southern Italy, Sardinia, and Sicily. Accurate hourly day-ahead renewable power forecasting over these extended areas improves both dispatching and reserve management. In this study, an operational forecasting tool for the hourly solar output of the six Italian market zones is presented and its performance analysed. The implementation relies on a numerical weather prediction model coupled with statistical post-processing to derive the power forecast from the meteorological projection. The weather forecast is obtained from the limited-area model RAMS over the Italian territory, initialized with IFS-ECMWF boundary conditions. The post-processing calculates the solar power production with the Analog Ensemble (AN) technique. This statistical approach forecasts production using a probability distribution of the measured production registered in the past, on occasions when the weather scenario looked very similar to the forecasted one. Similarity is evaluated for the components of solar radiation: global (GHI), diffuse (DIF), and direct normal (DNI) irradiation, together with the corresponding azimuth and zenith solar angles, since these are the main factors affecting solar production. Considering that AN performance is strictly related to the length and quality of the historical data, a training period of more than one year has been used. The training set consists of historical Numerical Weather Prediction (NWP) forecasts at 12 UTC for the GHI, DIF, and DNI variables over the Italian territory, together with the corresponding hourly measured production for each of the six zones. The AN technique makes it possible to estimate the aggregate solar production in an area without information about the technological characteristics of all the solar parks in that area; besides, this information is often only partially available. Every day, the hourly solar power forecast for the six Italian market zones is made publicly available through a website.
Keywords: analog ensemble, electricity market, PV forecast, solar energy
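The analog ensemble idea reduces, for a single forecast hour, to a nearest-neighbour search over past NWP forecasts (Python sketch under simplifying assumptions: a plain standardized Euclidean distance over the five predictors, synthetic data, and none of the operational tool's tuning):

    import numpy as np

    def analog_ensemble(forecast, hist_forecasts, hist_production, k=25):
        """Predict solar production from the k most similar past weather forecasts.

        forecast:        vector of today's NWP predictors (GHI, DIF, DNI, azimuth, zenith)
        hist_forecasts:  (n_days, n_predictors) array of past NWP forecasts for this hour
        hist_production: (n_days,) measured zonal production on those days
        """
        # Standardise predictors so GHI does not dominate the distance metric.
        mu, sigma = hist_forecasts.mean(axis=0), hist_forecasts.std(axis=0)
        d = np.linalg.norm((hist_forecasts - forecast) / sigma, axis=1)
        analogs = np.argsort(d)[:k]
        obs = hist_production[analogs]
        # Return the empirical distribution of production under similar weather.
        return {"mean": obs.mean(),
                "p10": np.percentile(obs, 10),
                "p90": np.percentile(obs, 90)}

    rng = np.random.default_rng(1)
    hist_f = rng.uniform(size=(400, 5))                        # > 1 year of training forecasts
    hist_p = 1000 * hist_f[:, 0] + 50 * rng.normal(size=400)   # production driven mostly by GHI
    print(analog_ensemble(hist_f[0] + 0.01, hist_f, hist_p))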
Procedia PDF Downloads 156
26652 Characterization of 3D-MRP for Analyzing of Brain Balancing Index (BBI) Pattern
Authors: N. Fuad, M. N. Taib, R. Jailani, M. E. Marwan
Abstract:
This paper discusses the power spectral density (PSD) characteristics extracted from three-dimensional (3D) electroencephalogram (EEG) models. The EEG signals were recorded from 150 healthy subjects. Development of the 3D EEG models involves pre-processing of the raw EEG signals and construction of spectrogram images, after which the maximum PSD values are extracted as features from the model. These features are analysed using the mean relative power (MRP) and different mean relative power (DMRP) techniques to observe the pattern among different brain balancing indexes. The results show that, by implementing these techniques, the pattern of brain balancing indexes can be clearly observed; some patterns appear between index 1 and index 5 for the left frontal (LF) and right frontal (RF) regions.
Keywords: power spectral density, 3D EEG model, brain balancing, mean relative power, different mean relative power
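The generic relative-power computation behind MRP-style features can be sketched as follows (Python/SciPy; this is a standard band-relative PSD calculation on synthetic signals, not the paper's exact 3D-model feature definition, whose MRP/DMRP formulas over maximum PSD values may differ):

    import numpy as np
    from scipy.signal import welch

    def mean_relative_power(eeg, fs=256, band=(8.0, 13.0)):
        # PSD via Welch's method, then power in `band` relative to total power.
        freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
        in_band = (freqs >= band[0]) & (freqs <= band[1])
        return np.trapz(psd[in_band], freqs[in_band]) / np.trapz(psd, freqs)

    rng = np.random.default_rng(0)
    t = np.arange(0, 30, 1 / 256)
    lf = np.sin(2 * np.pi * 10 * t) + rng.normal(size=t.size)  # left-frontal: 10 Hz alpha + noise
    rf = rng.normal(size=t.size)                               # right-frontal: noise only
    mrp_lf, mrp_rf = mean_relative_power(lf), mean_relative_power(rf)
    print(f"MRP(LF)={mrp_lf:.3f}  MRP(RF)={mrp_rf:.3f}  DMRP={mrp_lf - mrp_rf:.3f}")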
Procedia PDF Downloads 473
26651 Recent Advances in Data Warehouse
Authors: Fahad Hanash Alzahrani
Abstract:
This paper describes recent advances in a quickly developing area of data storage and processing based on data warehouses and data mining techniques, which involve software, hardware, data mining algorithms, and visualisation techniques with common features across the specific problems and tasks of their implementation.
Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing
Procedia PDF Downloads 402
26650 Impact of Task Technology Fit on User Effectiveness, Efficiency and Creativity in Iranian Pharmaceutical Organizations
Authors: Milad Keshvardoost, Amir Khanlari, Nader Khalesi
Abstract:
Background: Any firm in the pharmaceutical industry requires efficient and effective management information systems (MIS) to support managerial functions. Purpose: The aim of this study is to investigate the impact of task-technology fit (TTF) on user effectiveness, efficiency, and creativity in Iranian pharmaceutical companies. Methodology: 345 reliable and validated Likert-scale questionnaires were distributed, via cluster sampling, to information system users of eight leading Iranian pharmaceutical companies. The proposed model relates task-technology fit to user performance, defined in terms of efficiency, effectiveness, and creativity, through the mediating effects of perceived usefulness and perceived ease of use. Results: The study confirmed that TTF, defined in terms of adequacy and compatibility, has a positive impact on user performance. Conclusion: We concluded that for pharmaceutical users of IS, a system built on precise and intense observation of users' demands can facilitate the design of an exclusive IS framework.
Keywords: information systems, user performance, pharmaceuticals, task technology fit
Procedia PDF Downloads 169
26649 The Consumer Responses toward the Offensive Product Advertising
Authors: Chin Tangtarntana
Abstract:
The main purpose of this study was to investigate the effects of animation in offensive product advertising. An experiment was conducted to collect consumer responses toward animated and static ads for offensive and non-offensive products, with questionnaires distributed to the target respondents; according to statistics from the Innovative Internet Research Center, Thailand, the majority of internet users are 18-44 years old. The results revealed an interaction between ad design and offensive product. Specifically, when used in offensive product advertisements, animated ads were not effective for capturing consumer attention, but yielded a positive response in terms of attitude toward the product. The findings support the information processing model as accurate in predicting consumer cognitive responses toward cartoon ads, whereas the uses and gratifications (U&G), arousal, and distinctiveness theories are more accurate in predicting consumer affective responses. In practice, these findings can guide ad designers and marketers toward designs suitable for offensive products.
Keywords: animation, banner ad design, consumer responses, offensive product advertising, stock exchange of Thailand
Procedia PDF Downloads 266
26648 Client Hacked Server
Authors: Bagul Abhijeet
Abstract:
Background: The client-server model is the backbone of today’s internet communication, in which a normal user cannot control a particular website or server; yet, by using the same processing model, one can gain unauthorized access to a particular server. In this paper, we discuss an application scenario of hacking a simple website or server through unauthorized access to the server database. This application autonomously takes direct access to a simple website or server and retrieves all the essential information maintained by the administrator. In this system, the IP address of the server is given as input in order to retrieve the user-id and password of the server. This breaks the administrative security of the server and acquires control of the server database, whereas a virus helps to escape server security by crashing the whole server. Objective: To control malicious attacks, protect government websites, and uncover illegal hacker activity. Results: After implementing different hacking as well as non-hacking techniques, this system hacks simple websites with normal security credentials. It provides access to the server database and allows the attacker to perform database operations from a client machine. Experiments with this application on different servers provided satisfactory results as required. Conclusion: In this paper, we have presented a view of hacking a server using both hacking and non-hacking methods. These algorithms and methods provide an efficient way to access a server database; breaking network security in this way allows new and better security frameworks to be introduced. The term 'hacking' should not be considered only in terms of illegal activity; it can also be used to strengthen our global network.
Keywords: hacking, vulnerabilities, dummy request, virus, server monitoring
Procedia PDF Downloads 250
26647 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of datasets so large and complex that they become difficult to process using database management tools. Operations such as search, analysis, and visualization are performed on big data using data mining, the process of extracting patterns or knowledge from large data sets. Over time, however, the results of data mining applications become stale and obsolete. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grain computation states. To assess the mining results, we evaluate i2MapReduce using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.
Keywords: big data, map reduce, incremental processing, iterative computation
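The key-value-level reuse idea can be illustrated with a toy, single-machine word count (Python; i2MapReduce itself is a distributed MapReduce extension, so this dict-based sketch only shows the principle of re-mapping changed records while reusing preserved per-record state):

    def incremental_wordcount(records, cache):
        """Key-value-level incremental re-computation, in the spirit of i2MapReduce.

        records: {record_id: text} for the *current* snapshot of the data
        cache:   {record_id: {"text": ..., "counts": ...}} preserved from the previous run
        """
        counts = {}
        for rid, text in records.items():
            if rid in cache and cache[rid]["text"] == text:
                partial = cache[rid]["counts"]       # reuse preserved fine-grain state
            else:
                partial = {}
                for w in text.split():               # re-map only changed records
                    partial[w] = partial.get(w, 0) + 1
                cache[rid] = {"text": text, "counts": partial}
            for w, c in partial.items():             # reduce step
                counts[w] = counts.get(w, 0) + c
        return counts

    cache = {}
    print(incremental_wordcount({"r1": "big data", "r2": "map reduce"}, cache))
    print(incremental_wordcount({"r1": "big data", "r2": "map reduce reduce"}, cache))  # only r2 re-mapped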
Procedia PDF Downloads 349
26646 Data Presentation of Lane-Changing Events Trajectories Using HighD Dataset
Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba
Abstract:
We present a descriptive analysis of lane-changing event data on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which provides microscopic trajectories on highways. This paper describes and analyses the roles of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. The programming language R was used to process these data. We analyse the involvement and relationships of the variables characterising the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and the manoeuvre it undertook (overtaking or falling back).
Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process
Procedia PDF Downloads 258
26645 Green Chemical Processing in the Teaching Laboratory: A Convenient Solvent Free Microwave Extraction of Natural Products
Authors: Mohamed Amine Ferhat, Mohamed Nadjib Bouhatem, Farid Chemat
Abstract:
One of the principal aims of sustainable and green processing development remains the dissemination and teaching of green chemistry in both developed and developing nations. This paper describes one attempt to show that "north-south" collaborations yield innovative sustainable and green technologies with major benefits for both parties. We present early results from a solvent-free microwave extraction (SFME) of essential oils using fresh orange peel, a byproduct of orange juice production. SFME is performed at atmospheric pressure without adding any solvent or water; it increases essential oil yield and eliminates wastewater treatment. The procedure is appropriate for the teaching laboratory: it allows students to learn extraction, chromatographic, and spectroscopic analysis skills, exposes them to a dramatic visual example of rapid, sustainable, and green extraction of essential oil, and introduces them to commercially successful sustainable and green chemical processing with microwave energy.
Keywords: essential oil, extraction, green processing, microwave
Procedia PDF Downloads 541
26644 An Empirical Investigation of Mobile Banking Services Adoption in Pakistan
Authors: Aijaz A. Shaikh, Richard Glavee-Geo, Heikki Karjaluoto
Abstract:
Adoption of Information Systems (IS) is receiving increasing attention, and its implications have been closely monitored and studied by the IS management community, industry, and professional gatekeepers. Building on previous research on technology adoption, this paper develops and validates an integrated model of the adoption of mobile banking. The model originates from the Technology Acceptance Model (TAM) and the Theory of Planned Behaviour (TPB). The paper offers a preliminary scrutiny of the antecedents of mobile banking service adoption in the context of a developing country, with data collected from Pakistan. The findings show that an integrated TAM and TPB model largely explains the adoption intention of mobile banking, and that perceived behavioural control and its antecedents play a significant role in predicting adoption. Theoretical and managerial implications of the findings are presented and discussed.
Keywords: developing country, mobile banking service adoption, technology acceptance model, theory of planned behavior
Procedia PDF Downloads 416
26643 Corpus-Based Neural Machine Translation: Empirical Study Multilingual Corpus for Machine Translation of Opaque Idioms - Cloud AutoML Platform
Authors: Khadija Refouh
Abstract:
Culture-bound expressions have been a bottleneck for natural language processing (NLP) and comprehension, especially in the case of machine translation (MT). In the last decade, the field of machine translation has greatly advanced: neural machine translation (NMT) has recently achieved considerable improvements in translation quality, outperforming previous traditional translation systems in many language pairs. NMT applies artificial intelligence (AI) and deep neural networks to language processing. Despite this development, serious challenges remain when NMT translates culture-bound expressions, especially for low-resource language pairs such as Arabic-English and Arabic-French, which is not the case with well-established language pairs such as English-French. Machine translation of opaque idioms from English into French is likely to be more accurate than from English into Arabic. For example, Google Translate rendered the sentence “What a bad weather! It rains cats and dogs.” as “يا له من طقس سيء! تمطر القطط والكلاب” in the target language Arabic, an inaccurate literal translation, whereas its translation of the same sentence into the target language French was “Quel mauvais temps! Il pleut des cordes.”, using the accurate corresponding French idiom. This paper aims to perform NMT experiments towards better translation of opaque idioms using a high-quality, clean multilingual corpus collected analytically from human-generated idiom translations. AutoML Translation, a Google neural machine translation platform, is used as a custom translation model to improve the translation of opaque idioms. The automatic evaluation of the custom model is compared to Google NMT using the Bilingual Evaluation Understudy (BLEU) score, an algorithm for evaluating the quality of text machine-translated from one natural language to another. Human evaluation is integrated to test the reliability of the BLEU score. The researcher will examine syntactic, lexical, and semantic features using Halliday's functional theory.
Keywords: multilingual corpora, natural language processing (NLP), neural machine translation (NMT), opaque idioms
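The BLEU comparison between two systems can be reproduced in miniature as follows (Python/NLTK; the tokenised French sentences are illustrative examples, not the study's corpus, and corpus-level scores over one segment are shown only to demonstrate the mechanics):

    from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

    # One reference and two candidate translations of an opaque idiom (tokenised).
    reference = "quel mauvais temps il pleut des cordes".split()
    custom_hyp = "quel mauvais temps il pleut des cordes".split()
    literal_hyp = "quel mauvais temps il pleut des chats et des chiens".split()

    smooth = SmoothingFunction().method1  # avoids zero scores on short segments
    for name, hyp in [("custom model", custom_hyp), ("literal MT", literal_hyp)]:
        score = corpus_bleu([[reference]], [hyp], smoothing_function=smooth)
        print(f"{name}: BLEU = {score:.3f}")  # idiomatic output scores 1.0; literal one lower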
Procedia PDF Downloads 148
26642 Psychometric Properties of the Sensory Processing Measure Preschool-Home among Children with Autism in Saudi Arabia
Authors: Shahad Alkhalifah, Jonh Wright
Abstract:
Autism spectrum disorder (ASD) is a pervasive developmental disorder associated, for 42% to 88% of people with ASD, with sensory processing disorders (SPD). SPD impact daily functioning, and it is therefore essential to be able to diagnose them accurately. Currently, however, there is no assessment tool available for the Saudi Arabian (SA) population that covers a wide enough age range. This study therefore aimed to assess the psychometric properties of the Sensory Processing Measure Preschool-Home Form (SPM-P) when used in English with English-speaking Saudi participants, an approach chosen because of time limitations and the urgency of providing practitioners with appropriate tools. Using a convenience sampling approach, a group of caregivers of typically developing (TD) children and a group of caregivers of children with ASD were recruited (N = 40 and N = 16, respectively), and they completed the SPM-P Home Form. Participants were also invited to complete it again after two weeks for test-retest reliability; nine and five, respectively, agreed. Reliability analyses suggested some issues with a few items when used in the Saudi culture, and, along with interscale correlations, highlighted concerns with the factor structure. However, the SPM-P Home was also found to have good criterion-based validity, and it is therefore suggested that it can be used until a tool is developed through translation and cultural adaptation. It is also suggested that the current factor structure of the SPM-P Home be reassessed using a large sample.
Keywords: autism, sensory, assessment, reliability, sensory processing dysfunction, preschool, validity
Procedia PDF Downloads 229
26641 Ecological Risk Assessment of Informal E-Waste Processing in Alaba International Market, Lagos, Nigeria
Authors: A. A. Adebayo, O. Osibanjo
Abstract:
Informal electronic waste (e-waste) processing is a crude method of recycling that is on the increase in Nigeria. The release of hazardous substances such as heavy metals (HMs) into the environment during informal e-waste processing has been a major concern; however, there is insufficient information on environmental contamination from e-waste recycling and the associated ecological risk in Alaba International Market, a major electronics market in Lagos, Nigeria. The aims of this study were to determine the levels of HMs in soil resulting from e-waste recycling and to assess the associated ecological risks in Alaba International Market. Soil samples (334) were randomly collected seasonally for three years from fourteen selected e-waste activity points and two control sites. The samples were digested using standard methods, and HMs were analysed by inductively coupled plasma optical emission spectrometry. Ecological risk was estimated using the Ecological Risk index (ER), the Potential Ecological Risk index (RI), the index of geoaccumulation (Igeo), the contamination factor (Cf), and the degree of contamination (Cdeg). The concentration ranges of HMs (mg/kg) in soil were: 16.7-11200.0 (Pb); 14.3-22600.0 (Cu); 1.90-6280.0 (Ni); 39.5-4570.0 (Zn); 0.79-12300.0 (Sn); 0.02-138.0 (Cd); 12.7-1710.0 (Ba); 0.18-131.0 (Cr); and 0.07-28.0 (V), while As was below the detection limit. Concentration ranges in control soils were 1.36-9.70 (Pb), 2.06-7.60 (Cu), 1.25-5.11 (Ni), 3.62-15.9 (Zn), BDL-0.56 (Sn), BDL-0.01 (Cd), 14.6-47.6 (Ba), 0.21-12.2 (Cr), and 0.22-22.2 (V). The trend in the ecological risk index was Cu > Pb > Ni > Zn > Cr > Cd > Ba > V. The potential ecological risk index with respect to informal e-waste activities ranked: burning > dismantling > disposal > stockpiling. The geoaccumulation indices revealed that soils were extremely polluted with Cd, Cu, Pb, Zn, and Ni. The contamination factor indicated that 93% of the studied areas have very high contamination status for Pb, Cu, Ba, Sn, and Co, while Cr and Cd were of moderately contaminated status. The degree of contamination decreased in the order Sn > Cu > Pb >> Zn > Ba > Co > Ni > V > Cr > Cd. Heavy metal contamination of the Alaba International Market environment resulting from informal e-waste processing was established. Proper management of e-waste and remediation of the market environment are recommended to minimize the ecological risks.
Keywords: Alaba international market, ecological risk, electronic waste, heavy metal contamination
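The abstract names five indices without giving their formulas. Assuming the study follows the conventional definitions (Müller's geoaccumulation index; Hakanson's contamination and risk indices), they read:

    I_{geo} = \log_{2}\!\left(\frac{C_n}{1.5\,B_n}\right), \qquad
    C_f = \frac{C_n}{B_n}, \qquad
    C_{deg} = \sum_{i} C_f^{(i)}

    E_r = T_r \cdot C_f, \qquad
    RI = \sum_{i} E_r^{(i)}

where C_n is the measured concentration of metal n in soil, B_n its geochemical background (here, the control-site) value, T_r its toxic-response factor, and the sums run over the metals considered.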
Procedia PDF Downloads 196
26640 The Influence of the Concentration and Temperature on the Rheological Behavior of Carbonyl-Methylcellulose
Authors: Mohamed Rabhi, Kouider Halim Benrahou
Abstract:
The rheological properties of carbonyl-methylcellulose (CMC) solutions at different concentrations (25000, 50000, 60000, 80000, and 100000 ppm) and different temperatures were studied. We found that the rheological behavior of all CMC solutions is pseudo-plastic, following the Ostwald-de Waele model. The objective of this work is the modeling of CMC flow with the Cross model, which gives the variation of viscosity with shear rate and allowed us to fit the rheological characteristics of the CMC solutions more precisely. A comparison between the Cross model and the Ostwald model was made; the Cross model fitting parameters were determined by numerical simulation to bring the experimental curve and the curves given by the two models into agreement. Our study has shown that the Cross model describes the flow of CMC well at low concentrations.
Keywords: CMC, rheological modeling, Ostwald model, Cross model, viscosity
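For reference, the two named models have the following standard forms (written as commonly defined; the paper's fitted parameter values are not reproduced here):

    \tau = K\,\dot{\gamma}^{\,n}
    \quad\Longleftrightarrow\quad
    \eta_{app} = K\,\dot{\gamma}^{\,n-1}
    \qquad \text{(Ostwald-de Waele, with } n < 1 \text{ for pseudo-plastic flow)}

    \eta(\dot{\gamma}) = \eta_{\infty} + \frac{\eta_{0} - \eta_{\infty}}{1 + (\lambda\,\dot{\gamma})^{m}}
    \qquad \text{(Cross)}

where K is the consistency index, n the flow behavior index, eta_0 and eta_infinity the zero-shear and infinite-shear viscosities, lambda a relaxation time, and m the Cross rate constant.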
Procedia PDF Downloads 402
26639 Improving the Statistics Nature in Research Information System
Authors: Rajbir Cheema
Abstract:
Introducing an integrated research information system will provide scientific institutions with the necessary information on research activities and research results in assured quality. Since data collection in different research information systems can suffer from duplication, missing values, incorrect formatting, inconsistencies, etc., which can have a wide range of negative effects on data quality, the subject of data quality should be treated more rigorously. This paper examines the data quality problems in research information systems and presents new techniques that enable organizations to improve the quality of their research information.
Keywords: research information systems (RIS), research information, heterogeneous sources, data quality, data cleansing, science system, standardization
Procedia PDF Downloads 155
26638 3D Model of Rain-Wind Induced Vibration of Inclined Cable
Authors: Viet-Hung Truong, Seung-Eock Kim
Abstract:
Rain-wind induced vibration of inclined cables is a special aerodynamic phenomenon that is easily influenced by many factors, especially the distribution of rivulets and the wind velocity. This paper proposes a new 3D model of an inclined cable based on a single degree-of-freedom model. Aerodynamic forces are first established and verified against existing results from a 2D model. The 3D model of the inclined cable is then developed and applied to assess the effects of the wind velocity distribution and the continuity of rivulets on the cable. Finally, an inclined cable model with small sag is investigated.
Keywords: 3D model, rain-wind induced vibration, rivulet, analytical model
Procedia PDF Downloads 487
26637 Implementation of Model Reference Adaptive Control in Tuning of Controller Gains for Following-Vehicle System with Fixed Time Headway
Authors: Fatemeh Behbahani, Rubiyah Yusof
Abstract:
To avoid collisions between following vehicles and the vehicles in front, it is vital to keep an appropriate, safe spacing between the vehicles at all speeds; the following vehicle therefore needs exact information on the speed of, and spacing to, the preceding vehicle. This project simulates the tuning of controller gain for a vehicle-following system through the selected control strategy, spacing control policy, and fixed-time headway policy. In addition, the paper simulates and designs an adaptive gain controller for a road vehicle-following system which uses information on the spacing, velocity, and acceleration of the preceding vehicle in the proposed one-vehicle look-ahead strategy. The mathematical model is implemented using Kirchhoff's and Newton's laws, and its stability is simulated. A trial-and-error method was used to obtain a suitable value of the controller gain; the adaptive controller system, however, was able to optimize the gain value automatically. Model Reference Adaptive Control (MRAC) is designed and utilized, based first on the gradient approach and second on the Lyapunov approach, which accounts for stability. The gradient approach was found to yield the better gain value in the controller system with fixed-time headway.
Keywords: one-vehicle look-ahead, model reference adaptive, stability, tuning gain controller, MRAC
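The gradient approach named above is classically implemented as the MIT rule. A textbook first-order sketch (Python; this is not the authors' vehicle-following dynamics, only a minimal illustration in which the plant shares the model's pole so only the gain must be adapted):

    import numpy as np

    # Plant y' = -a*y + b*u tracking a reference model ym' = -am*ym + bm*r,
    # with feedforward gain theta adapted by the gradient (MIT-rule) law
    # dtheta/dt = -gamma * e * ym, where e = y - ym.
    a, b, am, bm = 2.0, 0.5, 2.0, 2.0   # a = am: only the gain mismatch remains
    gamma, dt = 5.0, 0.001
    y, ym, theta = 0.0, 0.0, 0.0

    for k in range(int(30 / dt)):
        r = 1.0 if (k * dt) % 10 < 5 else -1.0  # square-wave setpoint
        u = theta * r                           # adaptive controller
        e = y - ym                              # model-following error
        y += dt * (-a * y + b * u)              # plant (Euler integration)
        ym += dt * (-am * ym + bm * r)          # reference model
        theta += dt * (-gamma * e * ym)         # MIT-rule gain update

    print(f"adapted gain theta = {theta:.3f} (ideal bm/b = {bm / b:.1f})")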
Procedia PDF Downloads 236
26636 Portable and Parallel Accelerated Development Method for Field-Programmable Gate Array (FPGA)-Central Processing Unit (CPU)- Graphics Processing Unit (GPU) Heterogeneous Computing
Authors: Nan Hu, Chao Wang, Xi Li, Xuehai Zhou
Abstract:
The field-programmable gate array (FPGA) has been widely adopted in the high-performance computing domain. In recent years, embedded systems-on-a-chip (SoC) have come to contain coarse-granularity multi-core CPUs (central processing units) and mobile GPUs (graphics processing units) that can be used as general-purpose accelerators. The motivation is that algorithms with various parallel characteristics can be efficiently mapped to a heterogeneous architecture coupling these three processors. The CPU and GPU offload part of the computationally intensive tasks from the FPGA to reduce resource consumption and lower the overall cost of the system. However, in common present-day scenarios, applications utilize only one type of accelerator, because the development approaches supporting collaboration among the heterogeneous processors face challenges. A systematic approach is therefore needed that offers write-once-run-anywhere portability and high execution performance for the modules mapped to the various architectures, and that facilitates design-space exploration. In this paper, a servant-execution-flow model is proposed to abstract the cooperation of the heterogeneous processors, supporting task partition, communication, and synchronization. At its first run, the intermediate language, represented by a data flow diagram, can generate the executable code of the target processor or be converted into high-level programming languages. Instantiation parameters efficiently control the relationship between the modules and the computational units, including the mapping of two hierarchical processing units and the adjustment of data-level parallelism. An embedded system for a three-dimensional waveform oscilloscope is selected as a case study, and the performance of algorithms such as contrast stretching is analysed with implementations on various combinations of these processors. The experimental results show that the heterogeneous computing system, using less than 35% of the resources, achieves performance similar to the pure-FPGA solution with approximately the same energy efficiency.
Keywords: FPGA-CPU-GPU collaboration, design space exploration, heterogeneous computing, intermediate language, parameterized instantiation
Procedia PDF Downloads 116
26635 Mathematical Modeling of the Operating Process and a Method to Determine the Design Parameters in an Electromagnetic Hammer Using Solenoid Electromagnets
Authors: Song Hyok Choe
Abstract:
This study presents a method to determine the optimum design parameters based on a mathematical model of the operating process in a manual electromagnetic hammer using solenoid electromagnets. The operating process of the electromagnetic hammer depends on the circuit scheme of the power controller. Mathematical modeling of the operating process was carried out by considering the energy transfer process in the forward and reverse windings and the electromagnetic force acting on the impact and brake pistons. Using the developed mathematical model, the initial design data of the manual electromagnetic hammer proposed in this paper were encoded and analyzed in Matlab. In addition, a measurement experiment was carried out with a measuring device to check the accuracy of the developed mathematical model. The relative errors of the analytical results for the measured stroke distance of the impact piston, the peak forward-stroke current, and the peak reverse-stroke current were −4.65%, 9.08%, and 9.35%, respectively. These results show that the mathematical model of the operating process of an electromagnetic hammer is reasonably accurate and can be used to determine the design parameters of the hammer. The design parameters providing the required impact energy in the manual electromagnetic hammer were therefore determined using the developed mathematical model. The proposed method will be used for the further design and development of various types of percussion rock drills.
Keywords: solenoid electromagnet, electromagnetic hammer, stone processing, mathematical modeling
Procedia PDF Downloads 44
26634 Food Processing Technology and Packaging: A Case Study of Indian Cashew-Nut Industry
Authors: Parashram Jakappa Patil
Abstract:
India is the global leader in the world cashew business, and the cashew-nut industry is one of the most important food processing industries in the world. India is the largest producer, processor, exporter, and importer of cashew in the world, supplying cashew to the rest of the world and meeting world demand. India has tremendous potential for cashew production and export, and every year it earns more than 2,000 crores of rupees through the cashew trade. The cashew industry is one of the important small-scale industries in the country, playing a significant role in rural development: it generates more than 400,000 jobs in remote areas, 95% of cashew workers are women, it provides income to poor cashew farmers, the majority of cashew processing units are small and cottage-scale, it helps to stem the migration of young farmers in search of employment, it encourages rural entrepreneurship development, and it contributes to environmental protection. Hence the cashew business is a very important agribusiness in India with the potential to support inclusive development. The World Bank and IMF have recognized the cashew-nut industry as an important tool for poverty eradication at the global level, which shows the importance of the cashew business and its strong presence in India. In spite of this huge potential, the cashew processing industry faces various problems, such as weak infrastructure, shortage and collection of raw cashew, limited availability of finance, unavailability of warehouses, marketing of cashew kernels, lack of technical knowledge, and especially processing technology and packaging of finished products. The industry has great prospects, including scope for more cashew cultivation and production, employment generation, formation of cashew processing units, alcohol production from cashew apple, shell oil production, rural development, poverty alleviation, development of socially and economically backward classes, and environmental protection. The industry has domestic as well as foreign markets, and India has tremendous potential in this regard. Cashew is a poor man's crop but a rich man's food; it is a source of income and livelihood for poor farmers, and the cashew-nut industry can play a very important role in the development of hilly regions. The objectives of this paper are to identify the problems of cashew processing and the use of processing technology, the problems of cashew kernel packaging, the evolution of cashew processing technology over the years and its impact on the final product, and the impact of good processing and appropriate packaging technology on the international trade of cashew-nut. The most important problems of the cashew processing industry are processing and packaging. Bad processing greatly reduces the quality of cashew kernels, especially through breakage: broken kernels fetch a much lower market price than whole kernels and are not eligible for export. On the other hand, without good packaging, cashew kernels absorb moisture, which destroys their taste. The international trade of cashew-nut thus depends on two things: processing and packaging. This study has strong relevance because the cashew-nut industry is labour-oriented; processing technology has not played an important role, since 95% of the processing work is manual, so processing has depended on the physical performance of workers, making a large workforce indispensable. Many cashew processing units have closed because they cannot find a sufficient workforce. However, with advancements in technology, this picture is slowly changing and the processing work is improving. It is therefore interesting to explore all these aspects in the context of processing and packaging in the cashew business.
Keywords: cashew, processing technology, packaging, international trade, change
Procedia PDF Downloads 420
26633 Cicadas: A Clinician-assisted, Closed-loop Technology, Mobile App for Adolescents with Autism Spectrum Disorders
Authors: Bruno Biagianti, Angela Tseng, Kathy Wannaviroj, Allison Corlett, Megan DuBois, Kyu Lee, Suma Jacob
Abstract:
Background: ASD is characterized by pervasive sensory processing abnormalities (SPA) and social cognitive deficits that persist throughout the course of the illness and have been linked to functional abnormalities in the specific neural systems that underlie the perception, processing, and representation of sensory information. SPA and social cognitive deficits are associated with difficulties in interpersonal relationships, poor development of social skills, reduced social interactions, and lower academic performance. Importantly, they can hamper the effects of established evidence-based psychological treatments, including PEERS (Program for the Education and Enrichment of Relationship Skills), a parent/caregiver-assisted, 16-week social skills intervention, which nonetheless requires a functional brain capable of assimilating and retaining information and skills. As a matter of fact, some adolescents benefit from PEERS more than others, calling for strategies to increase treatment response rates. Objective: We will present interim data on CICADAS (Care Improving Cognition for ADolescents on the Autism Spectrum), a clinician-assisted, closed-loop technology mobile application for adolescents with ASD. Via ten mobile assessments, CICADAS captures data on sensory processing abnormalities and associated cognitive deficits. These data populate a machine learning algorithm that tailors the delivery of ten neuroplasticity-based social cognitive training (NB-SCT) exercises targeting sensory processing abnormalities. Methods: In collaboration with the Autism Spectrum and Neurodevelopmental Disorders Clinic at the University of Minnesota, we conducted a fully remote, three-arm, randomized crossover trial with adolescents with ASD to document the acceptability of CICADAS and evaluate its potential as a stand-alone treatment or as a treatment enhancer of PEERS. Twenty-four adolescents with ASD (ages 11-18) were initially randomized to 16 weeks of PEERS + CICADAS (Arm A) vs. 16 weeks of PEERS + computer games (Arm B) vs. 16 weeks of CICADAS alone (Arm C). After 16 weeks, the full battery of assessments was remotely administered. Results: We evaluated the acceptability of CICADAS by examining adherence rates, engagement patterns, and exit survey data. We found that: 1) CICADAS is able to serve as a treatment enhancer for PEERS, inducing greater improvements in sensory processing, cognition, symptom reduction, social skills and behaviors, and quality of life compared to computer games; 2) the concurrent delivery of PEERS and CICADAS induces greater improvements in study outcomes compared to CICADAS alone. Conclusion: While preliminary, our results indicate that the individualized assessment and treatment approach designed into CICADAS is effective in inducing adaptive long-term learning about social-emotional events. CICADAS-induced enhancement of processing and cognition facilitates the application of PEERS skills in the environment of adolescents with ASD, thus improving their real-world functioning.
Keywords: ASD, social skills, cognitive training, mobile app
Procedia PDF Downloads 212