Search results for: accuracy assessment.
8012 A Trend Based Forecasting Framework of the ATA Method and Its Performance on the M3-Competition Data
Authors: H. Taylan Selamlar, I. Yavuz, G. Yapar
Abstract:
It is difficult to make predictions, especially about the future, and making accurate predictions is not always easy. However, better predictions remain the foundation of all science; therefore, the development of accurate, robust and reliable forecasting methods is very important. Numerous forecasting methods have been proposed and studied in the literature. There are still two dominant major forecasting methods, Box-Jenkins ARIMA and Exponential Smoothing (ES), and new methods are still derived from or inspired by them. After more than 50 years of widespread use, exponential smoothing is still one of the most practically relevant forecasting methods available due to its simplicity, robustness and accuracy as an automatic forecasting procedure, demonstrated especially in the famous M-Competitions. Despite its success and widespread use in many areas, ES models have some shortcomings that negatively affect the accuracy of forecasts. Therefore, a new forecasting method, called the ATA method, is proposed in this study to cope with these shortcomings. This new method is obtained from traditional ES models by modifying the smoothing parameters; both methods therefore have similar structural forms, and ATA can easily be adapted to all of the individual ES models. ATA nevertheless has many advantages due to its innovative new weighting scheme. In this paper, the focus is on modeling the trend component and handling seasonality patterns by utilizing classical decomposition. Therefore, the ATA method is expanded to higher order ES methods for additive, multiplicative, additive damped and multiplicative damped trend components. The proposed models are called ATA trended models, and their predictive performances are compared to their counterpart ES models on the M3 competition data set, since it is still the most recent and comprehensive time-series data collection available.
It is shown that the models outperform their counterparts in almost all settings, and when model selection is carried out amongst these trended models, ATA outperforms all of the competitors in the M3-competition for both short-term and long-term forecasting horizons when the models' forecasting accuracies are compared based on popular error metrics.
Keywords: accuracy, exponential smoothing, forecasting, initial value
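The trended exponential-smoothing recursion that ATA modifies can be sketched in a few lines. This is a generic Holt additive-trend smoother for illustration only, not the authors' ATA implementation (per the ATA literature, ATA replaces the fixed smoothing constants below with time-varying weights such as p/t):

```python
def holt_forecast(series, alpha, beta, horizon):
    # Additive (Holt) trend exponential smoothing: level + trend recursion.
    level, trend = series[0], series[1] - series[0]
    for x in series[1:]:
        prev_level = level
        level = alpha * x + (1 - alpha) * (level + trend)
        trend = beta * (level - prev_level) + (1 - beta) * trend
    # h-step-ahead forecasts extrapolate the final level along the trend.
    return [level + (h + 1) * trend for h in range(horizon)]
```

A perfectly linear series is tracked exactly for any smoothing constants, which is a quick sanity check of the recursion.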
Procedia PDF Downloads 176
8011 Implicit Responses for Assessment of Autism Based on Natural Behaviors Obtained Inside Immersive Virtual Environment
Authors: E. Olmos-Raya, A. Cascales Martínez, N. Minto de Sousa, M. Alcañiz Raya
Abstract:
The late detection and the subjectivity of the assessment of Autism Spectrum Disorder (ASD) impose a difficulty on children's clinical and family environments. The results shown in this paper are part of a research project on the assessment and training of social skills in children with ASD, whose overall goal is the use of virtual environments together with physiological measures in order to find a new model of objective ASD assessment based on measures of implicit brain processes. In particular, this work tries to contribute by studying the differences and changes in the Skin Conductance Response (SCR) and Eye Tracking (ET) between a typical development group (TD group) and an ASD group after several combined stimuli using a low-cost Immersive Virtual Environment (IVE). Subjects were exposed to a virtual environment that showed natural scenes stimulating the visual, auditory and olfactory perceptual systems. While immersed in the IVE, subjects showed natural behaviors as SCR and ET were measured. This study compared measures of subjects diagnosed with ASD (N = 18) with a control group of subjects with typical development (N = 10) when exposed to three different conditions: only visual (V), visual and auditory (VA), and visual, auditory and olfactory (VAO) stimulation. SCR and ET measures were also correlated with the Autism Diagnostic Observation Schedule (ADOS) test. SCR measures showed significant differences across the experimental conditions between groups. The ASD group presented a higher level of SCR, while we did not find significant differences between groups regarding DF. We found highly significant correlations among all the experimental conditions in SCR measures and the imagination and symbolic thinking subscale of the ADOS test.
Regarding the correlation between ET measures and the ADOS test, the results showed a significant relationship between the VA condition and communication scores.
Keywords: autism, electrodermal activity, eye tracking, immersive virtual environment, virtual reality
Procedia PDF Downloads 138
8010 A Supervised Approach for Word Sense Disambiguation Based on Arabic Diacritics
Authors: Alaa Alrakaf, Sk. Md. Mizanur Rahman
Abstract:
Over the last two decades, Arabic natural language processing (ANLP) has become increasingly important. One of the key issues related to ANLP is ambiguity. In the Arabic language, different pronunciations of one word may have different meanings. Furthermore, ambiguity also has an impact on the effectiveness and efficiency of Machine Translation (MT). The issue of ambiguity has limited the usefulness and accuracy of translation from Arabic to English. The lack of Arabic resources makes the ambiguity problem more complicated. Additionally, the orthographic level of representation cannot specify the exact meaning of the word. This paper looked at the diacritics of the Arabic language and used them to disambiguate words. The proposed approach to word sense disambiguation used a diacritizer application to diacritize Arabic text and then found the most accurate sense of an ambiguous word using a Naïve Bayes classifier. Our experimental study shows that using Arabic diacritics with a Naïve Bayes classifier enhances the accuracy of choosing the appropriate sense by 23% and also decreases the ambiguity in machine translation.
Keywords: Arabic natural language processing, machine learning, machine translation, Naïve Bayes classifier, word sense disambiguation
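The classifier component can be sketched as follows: a generic Naïve Bayes sense classifier over context words with Laplace smoothing. The toy English tokens stand in for diacritized Arabic text; the authors' actual features and diacritizer are not reproduced here:

```python
from collections import Counter, defaultdict
import math

def train(examples):
    # examples: list of (context_words, sense) pairs.
    sense_counts, word_counts, vocab = Counter(), defaultdict(Counter), set()
    for words, sense in examples:
        sense_counts[sense] += 1
        for w in words:
            word_counts[sense][w] += 1
            vocab.add(w)
    return sense_counts, word_counts, vocab

def disambiguate(context, sense_counts, word_counts, vocab):
    # Pick the sense maximizing log P(sense) + sum log P(word | sense).
    total = sum(sense_counts.values())
    best, best_lp = None, float("-inf")
    for sense, n in sense_counts.items():
        lp = math.log(n / total)
        denom = sum(word_counts[sense].values()) + len(vocab)  # Laplace smoothing
        for w in context:
            lp += math.log((word_counts[sense][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = sense, lp
    return best
```

With a handful of training contexts per sense, unseen contexts are assigned the sense whose training words they share.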
Procedia PDF Downloads 355
8009 Multi-Atlas Segmentation Based on Dynamic Energy Model: Application to Brain MR Images
Authors: Jie Huo, Jonathan Wu
Abstract:
Segmentation of anatomical structures in medical images is essential for scientific inquiry into the complex relationships between biological structure and clinical diagnosis, treatment and assessment. As a method of incorporating prior knowledge and the anatomical structure similarity between a target image and atlases, multi-atlas segmentation has been successfully applied in segmenting a variety of medical images, including brain, cardiac, and abdominal images. The basic idea of multi-atlas segmentation is to transfer the labels in the atlases to the coordinates of the target image by matching each target patch to the atlas patches in its neighborhood. However, this technique is limited by the pairwise registration between the target image and the atlases. In this paper, a novel multi-atlas segmentation approach is proposed by introducing a dynamic energy model. First, the target is mapped to each atlas image by minimizing the dynamic energy function; then the segmentation of the target image is generated by weighted fusion based on the energy. The method is tested on the MICCAI 2012 Multi-Atlas Labeling Challenge dataset, which includes 20 target images and 15 atlas images. The paper also analyzes the influence of different parameters of the dynamic energy model on the segmentation accuracy and measures the Dice coefficient obtained using different feature terms with the energy model. The highest mean Dice coefficient obtained with the proposed method is 0.861, which is competitive with recently published methods.
Keywords: brain MRI segmentation, dynamic energy model, multi-atlas segmentation, energy minimization
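The weighted-fusion step can be illustrated with a minimal sketch: each atlas votes for its label with a weight that decays with its (pre-computed) energy, and segmentation quality is scored with the Dice coefficient. The exponential weighting and the bandwidth h are illustrative assumptions, not the paper's exact energy model:

```python
import math

def fuse_labels(atlas_labels, energies, h=1.0):
    # Each atlas votes for its label; lower energy -> larger weight exp(-E/h).
    votes = {}
    for label, e in zip(atlas_labels, energies):
        votes[label] = votes.get(label, 0.0) + math.exp(-e / h)
    return max(votes, key=votes.get)

def dice(a, b):
    # Dice overlap between two sets of segmented voxel indices.
    return 2 * len(a & b) / (len(a) + len(b))
```

Note how a single low-energy (well-matched) atlas can outvote several poorly matched ones, which is the point of energy-based weighting over plain majority voting.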
Procedia PDF Downloads 333
8008 Resistivity Tomography Optimization Based on Parallel Electrode Linear Back Projection Algorithm
Authors: Yiwei Huang, Chunyu Zhao, Jingjing Ding
Abstract:
Electrical Resistivity Tomography has been widely used in medicine and geology, for example in imaging lung impedance and analyzing soil impedance. Linear Back Projection is the core algorithm of Electrical Resistivity Tomography, but traditional Linear Back Projection cannot make full use of the information in the electric field. In this paper, an imaging method of Parallel Electrode Linear Back Projection for Electrical Resistivity Tomography is proposed. By changing the connection mode of the electrodes, it generates an electric field distribution that is not linearly related to that of traditional Linear Back Projection, captures new information, and improves the imaging accuracy without increasing the number of electrodes. The simulation results show that the accuracy of the image obtained by the inverse operation of the Parallel Electrode Linear Back Projection can be improved by about 20%.
Keywords: electrical resistivity tomography, finite element simulation, image optimization, parallel electrode linear back projection
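The core back-projection step can be sketched as follows: each pixel accumulates the boundary-measurement changes weighted by a sensitivity map, then normalizes. The tiny hand-made sensitivity matrix in the test is purely illustrative; real systems derive it from a finite element model, and the parallel-electrode variant changes how that field (and hence the matrix) is generated:

```python
def linear_back_projection(sensitivity, dv):
    # sensitivity[m][p]: sensitivity of boundary measurement m to pixel p;
    # dv[m]: change in measurement m relative to the reference field.
    # Each pixel gets the normalized, sensitivity-weighted sum of changes.
    n_meas, n_pix = len(sensitivity), len(sensitivity[0])
    image = []
    for p in range(n_pix):
        num = sum(sensitivity[m][p] * dv[m] for m in range(n_meas))
        den = sum(sensitivity[m][p] for m in range(n_meas))
        image.append(num / den)
    return image
```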
Procedia PDF Downloads 151
8007 SC-LSH: An Efficient Indexing Method for Approximate Similarity Search in High Dimensional Space
Authors: Sanaa Chafik, Imane Daoudi, Mounim A. El Yacoubi, Hamid El Ouardi
Abstract:
Locality Sensitive Hashing (LSH) is one of the most promising techniques for solving the nearest neighbour search problem in high dimensional space. Euclidean LSH is the most popular variation of LSH and has been successfully applied in many multimedia applications. However, Euclidean LSH presents limitations that affect structure and query performance. Its main limitation is large memory consumption: in order to achieve good accuracy, a large number of hash tables is required. In this paper, we propose a new hashing algorithm to overcome the storage space problem and improve query time, while keeping accuracy similar to that achieved by the original Euclidean LSH. Experimental results on a real large-scale dataset show that the proposed approach achieves good performance and consumes less memory than Euclidean LSH.
Keywords: approximate nearest neighbor search, content based image retrieval (CBIR), curse of dimensionality, locality sensitive hashing, multidimensional indexing, scalability
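The Euclidean (p-stable) LSH bucket function underlying such schemes is h(v) = floor((a·v + b) / w). A minimal sketch with a fixed projection vector; real indexes draw a from a Gaussian, draw b uniformly from [0, w), and concatenate many such functions per table, which is exactly why so many tables (and so much memory) are needed:

```python
import math

def e2lsh_bucket(v, a, b, w):
    # p-stable LSH for Euclidean distance: project v onto a, shift by b,
    # then quantize into buckets of width w. Nearby points (along a)
    # fall into the same bucket with high probability.
    return math.floor((sum(ai * vi for ai, vi in zip(a, v)) + b) / w)
```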
Procedia PDF Downloads 320
8006 Impact of Marine Hydrodynamics and Coastal Morphology on Changes in Mangrove Forests (Case Study: West of Strait of Hormuz, Iran)
Authors: Fatemeh Parhizkar, Mojtaba Yamani, Abdolla Behboodi, Masoomeh Hashemi
Abstract:
Mangrove forests are natural and valuable gifts that exist in some parts of the world, including Iran. Regarding the threats faced by these forests and their declining area all over the world, as well as in Iran, it is very necessary to manage and monitor them. The current study aimed to investigate the changes in mangrove forests and the relationship between these changes and marine hydrodynamics and coastal morphology in the area between Qeshm Island and the west coast of Hormozgan province (i.e., the coastline between the Mehran River and Bandar-e Pol port) over a 49-year period. After preprocessing and classifying satellite images using the SVM, MLC, and ANN classifiers and evaluating the accuracy of the maps, the SVM approach, with the highest accuracy (a Kappa coefficient of 0.97 and overall accuracy of 98%), was selected for preparing the classification map of all images. The results indicate that from 1972 to 1987 the area of these forests experienced a declining trend, and in the following years their expansion began. These forests include the mangrove forests of the Khurkhuran wetland, the Muriz Deraz Estuary, the Haft Baram Estuary, the mangrove forest south of Laft Port, and the mangrove forests between the Tabl Pier, Maleki Village, and Gevarzin Village. The marine hydrodynamic and geomorphological characteristics of the region, such as the average intertidal zone, sediment data, the freshwater inlet of the Mehran River, wave stability and calmness, and topography and slope, as well as mangrove conservation projects, make the further expansion of mangrove forests in this area possible. By providing significant and up-to-date information on the development and decline of mangrove forests in different parts of the coast, this study can significantly contribute to measures for the conservation and restoration of mangrove forests.
Keywords: mangrove forests, marine hydrodynamics, coastal morphology, west of Strait of Hormuz, Iran
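The reported accuracy figures can be reproduced from a classification confusion matrix. A minimal sketch of overall accuracy and the Kappa coefficient (the matrix values in the test are illustrative, not the study's):

```python
def accuracy_metrics(cm):
    # cm: square confusion matrix, cm[i][j] = samples of true class i
    # assigned to class j by the classifier.
    n, total = len(cm), sum(sum(row) for row in cm)
    observed = sum(cm[i][i] for i in range(n)) / total           # overall accuracy
    # Chance agreement: product of marginal row and column proportions.
    expected = sum(sum(cm[i]) * sum(r[i] for r in cm) for i in range(n)) / total ** 2
    kappa = (observed - expected) / (1 - expected)
    return observed, kappa
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside overall accuracy for map comparison.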
Procedia PDF Downloads 94
8005 Julia-Based Computational Tool for Composite System Reliability Assessment
Authors: Josif Figueroa, Kush Bubbar, Greg Young-Morris
Abstract:
The reliability evaluation of composite generation and bulk transmission systems is crucial for ensuring a reliable supply of electrical energy to significant system load points. However, evaluating adequacy indices using probabilistic methods like sequential Monte Carlo simulation can be computationally expensive. Despite this, it is necessary when time-varying and interdependent resources, such as renewables and energy storage systems, are involved. Recent advances in solving power network optimization problems and in parallel computing have improved runtime performance while maintaining solution accuracy. This work introduces CompositeSystems, an open-source composite system reliability evaluation tool developed in Julia™, to address the current deficiencies of commercial and non-commercial tools. It presents the tool's design, validation, and effectiveness, which includes analyzing two different formulations of the Optimal Power Flow problem. The simulations demonstrate excellent agreement with existing published studies while improving replicability and reproducibility. Overall, the proposed tool can provide valuable insights into the performance of transmission systems, making it an important addition to the existing toolbox for power system planning.
Keywords: open-source software, composite system reliability, optimization methods, Monte Carlo methods, optimal power flow
Procedia PDF Downloads 72
8004 Modeling of Geotechnical Data Using GIS and Matlab for Eastern Ahmedabad City, Gujarat
Authors: Rahul Patel, S. P. Dave, M. V Shah
Abstract:
Ahmedabad is a rapidly growing city in western India that is experiencing significant urbanization and industrialization. With projections indicating that it will become a metropolitan city in the near future, various construction activities are taking place, making soil testing a crucial requirement before construction can commence. To achieve this, construction companies and contractors need to conduct soil testing periodically. This study focuses on the process of creating a digitally formatted spatial database that integrates geotechnical data with a Geographic Information System (GIS). Building a comprehensive geotechnical geo-database involves three essential steps. Firstly, borehole data is collected from reputable sources. Secondly, the accuracy and redundancy of the data are verified. Finally, the geotechnical information is standardized and organized for integration into the database. Once the geo-database is complete, it is integrated with GIS. This integration allows users to visualize, analyze, and interpret geotechnical information spatially. Using a Topographic to Raster interpolation process in GIS, estimated values are assigned to all locations based on sampled geotechnical data values. The study area was contoured for SPT N-values, soil classification, Φ-values, and bearing capacity (T/m2). Various interpolation techniques were cross-validated to ensure information accuracy. The GIS map generated by this study enables the calculation of SPT N-values, Φ-values, and bearing capacities for different footing widths at various depths. This approach highlights the potential of GIS in providing an efficient solution to complex phenomena that would otherwise be tedious to achieve through other means. Not only does GIS offer greater accuracy, but it also generates valuable information that can be used as input for correlation analysis. Furthermore, this system serves as a decision support tool for geotechnical engineers.
The information generated by this study can be utilized by engineers to make informed decisions during construction activities; for instance, they can use the data to optimize foundation designs and improve site selection. In conclusion, the rapid growth of Ahmedabad requires extensive construction activity and hence soil testing; the GIS-integrated geotechnical geo-database developed here, built from verified borehole data, offers an efficient and accurate solution, provides input for correlation analysis, and serves as a decision support tool for geotechnical engineers.
Keywords: ArcGIS, borehole data, geographic information system (GIS), geo-database, interpolation, SPT N-value, soil classification, φ-value, bearing capacity
Procedia PDF Downloads 66
8003 Energy Consumption Forecast Procedure for an Industrial Facility
Authors: Tatyana Aleksandrovna Barbasova, Lev Sergeevich Kazarinov, Olga Valerevna Kolesnikova, Aleksandra Aleksandrovna Filimonova
Abstract:
We consider forecasting of energy consumption both by individual production areas of a large industrial facility and by the facility itself. For the production areas, the forecast is made based on empirical dependencies between specific energy consumption and production output. For the facility itself, the task of minimizing the energy consumption forecasting error is addressed by adjusting the facility's actual energy consumption values, evaluated with the metering device, against the total design energy consumption of the facility's separate production areas. The suggested procedure of optimal energy consumption forecasting was tested on actual data of core product output and energy consumption from a group of workshops and power plants of a large iron and steel facility. Test results show that this procedure gives a mean energy consumption forecasting error for winter 2014 of 0.11% for the group of workshops and 0.137% for the power plants.
Keywords: energy consumption, energy consumption forecasting error, energy efficiency, forecasting accuracy, forecasting
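The two-level idea can be sketched as: each area's forecast is its empirical specific consumption times planned output, and the facility total is adjusted by a correction factor derived from metered actuals. The linear per-area form and the single correction factor are simplifying assumptions, not the paper's full procedure:

```python
def forecast_facility(specific_consumption, planned_output, correction=1.0):
    # Per-area forecast: empirical specific consumption (energy per unit
    # of output) times the planned output of that area.
    per_area = {name: specific_consumption[name] * planned_output[name]
                for name in specific_consumption}
    # Facility-level forecast: sum of area forecasts, adjusted by a factor
    # derived from comparing metered totals with design totals in the past.
    return per_area, correction * sum(per_area.values())
```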
Procedia PDF Downloads 443
8002 Identifying Knowledge Gaps in Incorporating Toxicity of Particulate Matter Constituents for Developing Regulatory Limits on Particulate Matter
Authors: Ananya Das, Arun Kumar, Gazala Habib, Vivekanandan Perumal
Abstract:
Regulatory bodies have proposed limits on Particulate Matter (PM) concentration in air; however, they do not explicitly incorporate the toxic effects of the constituents of PM when developing regulatory limits. This study aimed to provide a structured approach to incorporating the toxic effects of components in developing regulatory limits on PM. A four-step human health risk assessment framework was used, consisting of: (1) hazard identification (parameters: PM and its constituents and their associated toxic effects on health); (2) exposure assessment (parameters: concentrations of PM and constituents, information on the size and shape of PM, and the fate and transport of PM and constituents in the respiratory system); (3) dose-response assessment (parameters: reference dose or target toxicity dose of PM and its constituents); and (4) risk estimation (metric: hazard quotient and/or lifetime incremental risk of cancer, as applicable). The parameters required at each step were then obtained from the literature. Using this information, an attempt was made to determine limits on PM using component-specific information. An example calculation was conducted for exposures to PM2.5 and its metal constituents in the Indian ambient environment to determine limiting PM values. The identified data gaps were: (1) concentrations of PM and its constituents and their relationship with sampling regions, and (2) the relationship of the toxicity of PM with its components.
Keywords: air, component-specific toxicity, human health risks, particulate matter
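The risk-estimation step (step 4) commonly uses the hazard quotient: the ratio of average daily dose to the reference dose, with HQ > 1 flagging potential non-cancer concern. A minimal sketch with illustrative (not study-specific) exposure factors:

```python
def hazard_quotient(conc_mg_m3, inhalation_m3_day, body_weight_kg, rfd_mg_kg_day):
    # Average daily dose (mg/kg-day) divided by the reference dose.
    # Simplified: exposure frequency/duration and averaging time omitted.
    average_daily_dose = conc_mg_m3 * inhalation_m3_day / body_weight_kg
    return average_daily_dose / rfd_mg_kg_day
```

Run per constituent with its own reference dose, this is how component-specific toxicity enters the limit calculation.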
Procedia PDF Downloads 309
8001 Reliability Evaluation of a Payment Model in Mobile E-Commerce Using Colored Petri Net
Authors: Abdolghader Pourali, Mohammad V. Malakooti, Muhammad Hussein Yektaie
Abstract:
A mobile payment system in mobile e-commerce must generally have high security so that users can trust it for business deals, sales, paying financial transactions, etc. However, an architecture or payment model in e-commerce only shows the way of interaction and collaboration among users and merchants and does not present any evaluation of the effectiveness of, and confidence in, financial transactions to stakeholders. In this paper, we present a detailed assessment of the reliability of a mobile payment model in mobile e-commerce using formal models and colored Petri nets. Finally, we demonstrate that the reliability of this system has a high value (case study: a secure payment model in mobile commerce).
Keywords: reliability, colored Petri net, assessment, payment models, m-commerce
Procedia PDF Downloads 535
8000 Acute Hepatotoxicity of Nano and Micro-Sized Iron Particles in Adult Albino Rats
Authors: Ghada Hasabo, Mahmoud Saber Elbasiouny, Mervat Abdelsalam, Sherin Ghaleb, Niveen Eldessouky
Abstract:
In the near future, nanotechnology is envisaged for large-scale use. Hence, health and safety issues of nanoparticles should be promptly addressed. In the present study, acute hepatotoxicity due to a high single oral dose of nano-sized and micro-sized iron particles was assessed. Normal daily activities, biochemical alterations, blood coagulation, and histopathological changes in Wistar rats were the aspects of the toxicological assessment. This work found that significant alterations in biochemical markers (serum iron level, liver enzymes, albumin, and bilirubin levels), blood coagulation (PT, PC, INR), and histopathological changes occurred more prominently in the nano iron particle treated group.
Keywords: nanobiotechnology, nanosystems, nanomaterials, nanotechnology
Procedia PDF Downloads 502
7999 Airport Investment Risk Assessment under Uncertainty
Authors: Elena M. Capitanul, Carlos A. Nunes Cosenza, Walid El Moudani, Felix Mora Camino
Abstract:
The construction of a new airport or the extension of an existing one requires massive investments, and public-private partnerships have often been considered in order to make such projects feasible. One characteristic of these projects is uncertainty with respect to financial and environmental impacts in the medium to long term. Another is the multistage nature of these types of projects. While many airport development projects have been a success, some others have turned into a nightmare for their promoters. This communication puts forward a new approach to airport investment risk assessment. The approach explicitly takes into account the degree of uncertainty in activity level prediction and proposes milestones for the different stages of the project to minimize risk. Uncertainty is represented through fuzzy dual theory, and risk management is performed using dynamic programming. An illustration of the proposed approach is provided.
Keywords: airports, fuzzy logic, risk, uncertainty
Procedia PDF Downloads 413
7998 Translation Quality Assessment in Fansubbed English-Chinese Swearwords: A Corpus-Based Study of the Big Bang Theory
Authors: Qihang Jiang
Abstract:
Fansubbing, a blend of fan and subtitling, is one of the main branches of Audiovisual Translation (AVT) and has kindled more and more researcher interest in the AVT field in recent decades. In particular, the quality of such non-professional translation seems questionable due to the non-transparent qualifications of subtitlers in a huge community network. This paper attempts to figure out how YYeTs, aka 'ZiMuZu', the largest fansubbing group in China, translates swearwords from English to Chinese for its fans of the prevalent American sitcom The Big Bang Theory, taking cultural, social and political elements into account in the context of China. By building a bilingual corpus containing both the source and target texts, this paper found that most of the original swearwords were translated in a toned-down manner, probably due to Chinese audiences' cultural and social network features as well as strict government censorship. Additionally, House (2015)'s newly revised model of Translation Quality Assessment (TQA) was applied and examined. Results revealed that most of the subtitled swearwords achieved their pragmatic functions and exerted a communicative effect on audiences. In conclusion, this paper enriches the empirical research concerning House's new TQA model, gives a full picture of the subtitling of swearwords in the AVT field and provides a practical guide for practitioners in their subtitling careers.
Keywords: corpus-based approach, fansubbing, pragmatic functions, swearwords, translation quality assessment
Procedia PDF Downloads 141
7997 Assessing Soft Skills in Accounting Programmes: Insights from South African University Lecturers
Authors: Dolly Nyaguthii Wanjau
Abstract:
This study contributes to our understanding of how lecturers assess soft skills in accounting programmes, with the intention of producing graduates who are better prepared for the world of work. Insights were obtained through semi-structured interviews at twelve South African universities that offer chartered accountant training and are accredited by SAICA. It was found that lecturers assessed soft skills using traditional assessment methods such as tests, assignments, and examinations. However, there were missed opportunities to embrace ICT tools in the assessment process, which could be attributed to a lack of resources within the participating universities. Given the increasing use of digital tools for business activities, it is important that ICT tools be embraced as an inseparable part of soft skills, because employers are increasingly looking for accounting graduates with digital skills.
Keywords: accounting, assessment, ICT skills, SAICA, soft skills
Procedia PDF Downloads 126
7996 Modeling Stream Flow with Prediction Uncertainty by Using SWAT Hydrologic and RBNN Neural Network Models for Agricultural Watershed in India
Authors: Ajai Singh
Abstract:
Simulation of hydrological processes at the watershed outlet through a modelling approach is essential for proper planning and implementation of appropriate soil conservation measures in the Damodar Barakar catchment, Hazaribagh, India, where soil erosion is a dominant problem. This study quantifies the parametric uncertainty involved in simulation of stream flow using the Soil and Water Assessment Tool (SWAT), a watershed-scale model, and a Radial Basis Neural Network (RBNN), an artificial neural network model. Both models were calibrated and validated against measured stream flow, and the uncertainty in the SWAT model output was quantified using the Sequential Uncertainty Fitting algorithm (SUFI-2). Though both models predicted satisfactorily, the RBNN model performed better than SWAT, with R² and NSE values of 0.92 and 0.92 during training and 0.71 and 0.70 during the validation period, respectively. Comparison of the results of the two models also indicates a wider prediction interval for the results of the SWAT model. The P-factor values for each model show that the percentage of observed stream flow values bracketed by the 95PPU in the RBNN model, 91%, is higher than the P-factor in SWAT, 87%. In other words, the RBNN model estimates the stream flow values more accurately and with less uncertainty. It could be stated that the RBNN model, based on simple inputs, could be used for estimation of monthly stream flow, filling in missing data, and testing the accuracy and performance of other models.
Keywords: SWAT, RBNN, SUFI-2, bootstrap technique, stream flow, simulation
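The two figures of merit used above are easy to compute directly. A minimal sketch of the Nash-Sutcliffe efficiency (NSE) and the P-factor (fraction of observations bracketed by the 95PPU band):

```python
def nse(observed, simulated):
    # Nash-Sutcliffe efficiency: 1 is a perfect fit; 0 means the model is
    # no better than predicting the observed mean.
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1 - err / var

def p_factor(observed, lower, upper):
    # Fraction of observations falling inside the [lower, upper] band.
    inside = sum(1 for o, lo, hi in zip(observed, lower, upper) if lo <= o <= hi)
    return inside / len(observed)
```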
Procedia PDF Downloads 368
7995 The Effects of Weather Events and Land Use Change on Urban Ecosystems: From Risk to Resilience
Authors: Szu-Hua Wang
Abstract:
Urban ecosystems, as complex coupled human-environment systems, contain abundant natural resources for breeding natural assets and, at the same time, attract urban assets and consume natural resources as a result of urban development. Land use change factually illustrates the interaction between human activities and the environment. However, the IPCC (2014) announced that land use change and urbanization due to human activities are major causes of climate change, leading to serious impacts on urban ecosystem resilience and risk. For this reason, risk assessment and resilience analysis are the keys to responding to the effects of climate change on urban ecosystems. Urban spatial planning can guide urban development through land use planning, transportation planning, and environmental planning, and it affects land use allocation and human activities by locating major constructions while protecting important national land resources. Urban spatial planning can aggravate climate change and, on the other hand, mitigate and adapt to it. Research on the effects of spatial planning on land use change and climate change is currently an intensely studied issue. Therefore, this research focuses on developing frameworks for risk assessment and resilience analysis from the ecosystem perspective, based on typhoon precipitation in the Taipei area. An integrated method of risk assessment and resilience analysis is also addressed for application in spatial planning practice and sustainable development.
Keywords: ecosystem, land use change, risk analysis, resilience
Procedia PDF Downloads 415
7994 Basic Modal Displacements (BMD) for Optimizing the Buildings Subjected to Earthquakes
Authors: Seyed Sadegh Naseralavi, Mohsen Khatibinia
Abstract:
In structural optimization through meta-heuristic algorithms, structures are analyzed many times. For this reason, performing the analyses in a time-saving way is precious. This point is even more important in time-history analyses, which take much time. To this aim, peak-picking methods, also known as spectrum analyses, are generally utilized. However, such methods do not have the required accuracy, whether done by the square root of the sum of squares (SRSS) or the complete quadratic combination (CQC) rule. This paper presents an efficient technique for evaluating the dynamic responses during the optimization process with high speed and accuracy. In the method, an initial design is first obtained by using a static equivalent of the earthquake. Then, the displacements in the modal coordinates are computed. These displacements are herein called basic modal displacements (BMD). For each new design of the structure, the responses can be derived by suitably scaling each of the BMDs along time and amplitude and superposing them using the corresponding modal matrices. To illustrate the efficiency of the method, an optimization problem is studied. The results show that the proposed approach is a suitable replacement for the conventional time-history and spectrum analyses in such problems.
Keywords: basic modal displacements, earthquake, optimization, spectrum
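The superposition step can be sketched as: each new design's response is a sum of stored basic modal displacement histories, scaled in amplitude and combined through the mode shapes. The uniform per-mode scale factor below is an illustrative stand-in for the paper's time-and-amplitude scaling:

```python
def superpose(mode_shapes, basic_modal_disp, scales):
    # mode_shapes[j][i]: mode i evaluated at degree of freedom j;
    # basic_modal_disp[i][t]: stored modal displacement history of mode i;
    # scales[i]: amplitude scale for mode i in the new design.
    # Returns u[j][t] = sum_i scales[i] * phi[j][i] * q[i][t].
    n_dof, n_steps = len(mode_shapes), len(basic_modal_disp[0])
    return [[sum(scales[i] * mode_shapes[j][i] * basic_modal_disp[i][t]
                 for i in range(len(scales)))
             for t in range(n_steps)]
            for j in range(n_dof)]
```

Reusing the stored histories this way replaces a full time-history integration per candidate design with a cheap weighted sum.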
Procedia PDF Downloads 358
7993 Anaesthetic Management of a Huge Oropharyngeal Mass
Authors: Vasudha Govil, Suresh Singhal
Abstract:
Introduction: Patients with oropharyngeal masses pose a challenge for the anaesthetist in terms of ventilation and tracheal intubation. Preoperative assessment and preparation therefore become an integral part of managing such anticipated difficult-airway cases. Case report: A 45-year-old female presented with a growth in the oropharynx causing dysphagia and hoarseness of voice. Clinical examination and investigations predicted a difficult airway. It was managed with fibreoptic nasotracheal intubation, with a successful perioperative outcome. Tracheostomy was kept as plan B in case of a 'cannot ventilate, cannot intubate' (CVCI) situation. Conclusion: Careful preoperative examination and assessment are required to prepare for a difficult airway. Fibreoptic bronchoscope-guided nasotracheal intubation in a spontaneously breathing patient is a safe and successful airway management technique in difficult-airway cases.
Keywords: airway, difficult, mass, oropharyngeal
Procedia PDF Downloads 191
7992 Assessment of the Groundwater Agricultural Pollution Risk: Case of the Semi-Arid Region (Batna-East Algeria)
Authors: Dib Imane, Chettah Wahid, Khedidja Abdelhamid
Abstract:
The plain of Gadaïne - Ain Yaghout, located in the wilaya of Batna (Eastern Algeria), experiences intensive human activity, particularly agricultural practices accompanied by an increasing use of chemical fertilizers and manure. These activities degrade the quality of water resources. In order to protect the groundwater quality of this plain and formulate effective strategies to mitigate or avoid contamination of the groundwater, a risk assessment using the European method known as COST Action 620 was applied to the Mio-Plio-Quaternary aquifer of this plain. Risk assessment requires identifying existing hazards and their potential impact on groundwater using a system of evaluation and weighting. It also requires integrating the hydrogeological factors that influence the movement of contaminants, by means of intrinsic groundwater vulnerability maps produced according to the modified DRASTIC method. The overall hazard on the plain ranges from very low to high. Farms containing stables, houses detached from the public sewer system, and sometimes manure piles were assigned a weighting factor expressing the highest degree of harmfulness, creating a medium to high hazard index. Large areas of agricultural practice and grazing are characterized by low and very low hazard, respectively. The risks present at the study site range from medium to very high intensity; these classes represent 3%, 49%, and 0.2% of the surface of the plain, respectively. Cultivated land and farms present high and very high levels of risk, respectively. In addition, apart from the salt mine, which presents a very high level of risk, the gas stations, the cemeteries, and the railway line represent a high level of risk.
Keywords: semi-arid, quality of water resources, risk assessment, vulnerability, contaminants
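The vulnerability side of such an assessment rests on a weighted index. A minimal sketch of a DRASTIC-style computation is shown below; the weights are the standard DRASTIC weights, but the cell ratings are purely illustrative, not values from the study:

```python
# DRASTIC-style intrinsic vulnerability index: each of the seven
# hydrogeological parameters gets a rating (1-10) multiplied by the
# standard DRASTIC weight, and the products are summed per map cell.
WEIGHTS = {
    "depth_to_water": 5, "net_recharge": 4, "aquifer_media": 3,
    "soil_media": 2, "topography": 1, "impact_vadose": 5,
    "conductivity": 3,
}

def drastic_index(ratings):
    """Weighted sum of parameter ratings for one map cell."""
    return sum(WEIGHTS[p] * r for p, r in ratings.items())

# illustrative ratings for a single cell (not measured values)
cell = {"depth_to_water": 7, "net_recharge": 6, "aquifer_media": 5,
        "soil_media": 4, "topography": 9, "impact_vadose": 6,
        "conductivity": 3}
print(drastic_index(cell))   # 130
```

In a COST Action 620-style workflow, an index like this per cell is then combined with the hazard weighting to classify risk intensity.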
Procedia PDF Downloads 48
7991 IoT Based Monitoring Temperature and Humidity
Authors: Jay P. Sipani, Riki H. Patel, Trushit Upadhyaya
Abstract:
Today there is a demand to monitor environmental factors in almost all research institutes and industries, and even for domestic use. Analog data measurement requires manual effort to note readings, with the possibility of human error, and such systems fail to provide and store precise parameter values with high accuracy. Analog systems also lack storage/memory. Therefore, there is a requirement for a smart system that is fully automated, accurate, and capable of monitoring all the environmental parameters with the utmost possible accuracy; besides, it should be cost-effective as well as portable. This paper presents wireless sensor (WS) data communication using a DHT11 sensor, an Arduino, a SIM900A GSM module, a mobile device, and a liquid crystal display (LCD). The experimental setup includes a heating arrangement for the DHT11 and transmission of its data using the Arduino and the SIM900A GSM shield. The mobile device receives the data via the Arduino and GSM shield, and the data are also displayed on the LCD. The heating arrangement is used to heat and cool the temperature sensor to study its characteristics.
Keywords: wireless communication, Arduino, DHT11, LCD, SIM900A GSM module, mobile phone SMS
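The read-check-report loop can be sketched in Python (the actual setup runs on an Arduino; the frame layout follows the DHT11's five-byte format, while the function names and the alert threshold are assumptions for illustration):

```python
def parse_dht11(frame):
    """The DHT11 sends 5 bytes: humidity int/dec, temperature int/dec,
    and a checksum equal to the low byte of the sum of the first four."""
    hum_i, hum_d, tmp_i, tmp_d, checksum = frame
    if (hum_i + hum_d + tmp_i + tmp_d) & 0xFF != checksum:
        raise ValueError("checksum mismatch")
    return hum_i + hum_d / 10.0, tmp_i + tmp_d / 10.0

def alert_message(humidity, temperature, t_limit=40.0):
    """Compose the SMS text only when the temperature exceeds the limit."""
    if temperature > t_limit:
        return f"ALERT: T={temperature:.1f}C H={humidity:.1f}%"
    return None

h, t = parse_dht11([55, 0, 31, 5, 91])   # valid frame: 55.0 %RH, 31.5 C
print(h, t, alert_message(h, t))
```

On the Arduino side the same logic would run in C, with the alert string handed to the SIM900A shield as an SMS.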
Procedia PDF Downloads 280
7990 Evaluation Methods for Question Decomposition Formalism
Authors: Aviv Yaniv, Ron Ben Arosh, Nadav Gasner, Michael Konviser, Arbel Yaniv
Abstract:
This paper introduces two methods for evaluating Question Decomposition Meaning Representation (QDMR) as predicted by a sequence-to-sequence model and the CopyNet parser for natural language question processing, motivated by the fact that previous evaluation metrics for this task do not take into account some characteristics of the representation, such as its partial-ordering structure. To this end, several heuristics for extracting such partial dependencies are formulated, followed by the proposed evaluation methods, denoted Proportional Graph Matcher (PGM) and Conversion to Normal String Representation (Nor-Str), designed to better capture the accuracy of QDMR predictions. Experiments are conducted to demonstrate the efficacy of the proposed evaluation methods and to show the added value of one of them, Nor-Str, for better distinguishing between high- and low-quality QDMR predicted by models such as CopyNet. This work represents an important step toward better evaluation methods for QDMR predictions, which will be critical for improving the accuracy and reliability of natural language question-answering systems.
Keywords: NLP, question answering, question decomposition meaning representation, QDMR evaluation metrics
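One way a partial-order-aware score can work is to compare dependency edges extracted from the decomposition steps. The sketch below is an illustrative simplification of that idea, not the paper's exact PGM definition; the `#i` back-reference convention follows the usual QDMR step notation:

```python
def dependency_edges(steps):
    """Extract (i, j) edges where step i references step j via a '#j' token."""
    edges = set()
    for i, step in enumerate(steps, start=1):
        for tok in step.split():
            if tok.startswith("#") and tok[1:].isdigit():
                edges.add((i, int(tok[1:])))
    return edges

def proportional_match(gold_steps, pred_steps):
    """Jaccard-style overlap of partial-order dependency edges."""
    g, p = dependency_edges(gold_steps), dependency_edges(pred_steps)
    if not g and not p:
        return 1.0
    return len(g & p) / len(g | p)

gold = ["return papers", "return #1 from 2020", "return number of #2"]
pred = ["return papers", "return #1 from 2020", "return count of #1"]
print(proportional_match(gold, pred))   # shares 1 of 3 distinct edges
```

A score like this rewards a prediction for recovering the gold ordering structure even when the surface strings of the steps differ.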
Procedia PDF Downloads 76
7989 Risk and Uncertainty in Aviation: A Thorough Analysis of System Vulnerabilities
Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu
Abstract:
Hazard assessment and risk quantification are key components for estimating the impact of existing regulations. But since regulatory compliance cannot cover all risks in aviation, the authors point out that by studying causal factors and eliminating uncertainty, an accurate analysis can be outlined. The research begins by delimiting notions, since long-standing confusion over these terms has been reflected in less rigorous analyses. Throughout this paper, it is emphasized that variation in human performance and organizational factors represents the biggest threat from an operational perspective. Therefore, the advanced risk assessment methods analyzed by the authors aim to understand vulnerabilities of the system arising from its nonlinear behavior. Ultimately, the mathematical modeling of existing hazards and risks with uncertainty eliminated implies establishing an optimal solution (i.e., risk minimization).
Keywords: control, human factor, optimization, risk management, uncertainty
Procedia PDF Downloads 248
7988 A Model of Empowerment Evaluation of Knowledge Management in Private Banks Using Fuzzy Inference System
Authors: Nazanin Pilevari, Kamyar Mahmoodi
Abstract:
The purpose of this research is to provide a model based on a fuzzy inference system for evaluating the empowerment of knowledge management. The first prototype was developed from a study of the literature. In the next step, experts were provided with this model and, after consensus-based revision using the fuzzy Delphi technique, the components and indices of the research model were finalized. Culture, structure, IT, and leadership were considered as the dimensions of empowerment. Then, in order to collect and extract data for the fuzzy inference system based on knowledge and experience, the experts were interviewed. The values obtained from the designed fuzzy inference system made a review and assessment of the organization's knowledge management empowerment possible. After designing and validating the system to measure the indices (the inputs to the fuzzy inference system) at AYANDEH Bank, a questionnaire was used. For this bank, the system output indicates that knowledge management empowerment in culture, organizational structure, and leadership is at a moderate level, while information technology empowerment is relatively high. Based on these results, the overall status of knowledge management empowerment in AYANDEH Bank was moderate. Finally, some suggestions for improving the bank's current situation were provided. In light of prior research, the use of a fuzzy inference system as a powerful tool for assessing knowledge management and knowledge management empowerment, particularly in the banking sector, is the innovation of this research.
Keywords: knowledge management, knowledge management empowerment, fuzzy inference system, fuzzy Delphi
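The core of such a system can be sketched with triangular membership functions and a weighted-average defuzzification; the membership bounds, rule consequents, and the 0-10 scale below are assumptions for illustration, not the study's calibrated system:

```python
def tri(x, a, b, c):
    """Triangular membership function: rises from a, peaks at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def empowerment_level(score):
    """Map a crisp 0-10 input to a weighted-average (Sugeno-style) output."""
    mu = {"low": tri(score, -5.0, 0.0, 5.0),      # fuzzification
          "medium": tri(score, 0.0, 5.0, 10.0),
          "high": tri(score, 5.0, 10.0, 15.0)}
    out = {"low": 2.0, "medium": 5.0, "high": 8.0}  # crisp rule consequents
    den = sum(mu.values())
    return sum(mu[k] * out[k] for k in mu) / den if den else 0.0

print(empowerment_level(7.5))   # halfway between "medium" and "high"
```

A full Mamdani system with rule bases per dimension (culture, structure, IT, leadership) follows the same fuzzify-infer-defuzzify pattern.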
Procedia PDF Downloads 358
7987 Effective Teaching without Digital Enhancement
Authors: D. A. Carnegie
Abstract:
Whilst there is a movement towards increased digital augmentation to facilitate effective tertiary learning, this must come with an awareness of the limitations of such an approach. Learning is best achieved in an environment that includes learning peers, where difficulties can be shared and learning enabled. Policy that advocates digital technology in place of a physical classroom is dangerous and is often driven by financial rather than pedagogical concerns. In this paper, a mostly digital-less form of teaching is presented, one that has proven to be extremely effective. Anecdotal evidence suggests that students prefer the old overhead transparencies to PowerPoint presentations. Varied and reinforcing assessment, facilitation of effective note-taking, and active engagement with students are at the core of a good tertiary education experience. Digital techniques can augment and complement these core personal teaching requirements, but not replace them.
Keywords: engineering education, active classroom engagement, effective note taking, reinforcing assessment
Procedia PDF Downloads 350
7986 Performance Assessment of GSO Satellites before and after Enhancing the Pointing Effect
Authors: Amr Emam, Joseph Victor, Mohamed Abd Elghany
Abstract:
The paper presents the effect of orbit inclination on the pointing error of the satellite antenna and, consequently, on its footprint on Earth for a typical Ku-band payload system. The performance assessment is examined both theoretically and by means of practical measurements, also taking into account all additional sources of pointing error, such as east-west station keeping, orbit eccentricity, and actual attitude control performance. The implementation and computation of sinusoidal biases in satellite roll and pitch, used to compensate the pointing error of the satellite antenna coverage, is studied and evaluated before and after the pointing corrections are performed. A method for evaluating the performance of the implemented biases is introduced, based on measuring the satellite received level from a tracking 11 m antenna and a fixed 4.8 m transmitting antenna before and after implementation of the pointing corrections.
Keywords: satellite, inclined orbit, pointing errors, coverage optimization
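The idea behind the sinusoidal bias can be sketched numerically; the pure-sine error model, amplitude, and phase below are illustrative assumptions, not the measured values for the satellites in the study:

```python
import math

def pointing_error_deg(t_hours, inclination_deg=0.1):
    """Simplified daily north-south pointing error of a slightly
    inclined geosynchronous orbit, modeled as one sine cycle per day."""
    return inclination_deg * math.sin(2 * math.pi * t_hours / 24.0)

def roll_bias_deg(t_hours, amplitude_deg=0.1, phase_hours=12.0):
    """Sinusoidal roll bias; a half-day phase shift flips the sign of
    the sine, so the bias counteracts the modeled error."""
    return amplitude_deg * math.sin(2 * math.pi * (t_hours + phase_hours) / 24.0)

residual = [pointing_error_deg(t) + roll_bias_deg(t) for t in range(24)]
print(max(abs(r) for r in residual))   # near zero after compensation
```

In practice the amplitude and phase of the bias would be tuned against measured received levels, as the paper does with the 11 m tracking antenna.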
Procedia PDF Downloads 401
7985 Hybrid Anomaly Detection Using Decision Tree and Support Vector Machine
Authors: Elham Serkani, Hossein Gharaee Garakani, Naser Mohammadzadeh, Elaheh Vaezpour
Abstract:
Intrusion detection systems (IDS) are among the main components of network security. These systems analyze network events to detect intrusions. An IDS is designed by training on normal traffic data or attacks, and machine learning methods are among the best ways to design IDSs. In the method presented in this article, the pruning algorithm of the C5.0 decision tree is used to reduce the features of the traffic data, and the IDS is trained with the least-squares support vector machine (LS-SVM) algorithm. The remaining features are then ranked according to the predictor importance criterion, and the least important features are eliminated in that order. The features remaining at this stage, which yield the highest accuracy in the LS-SVM, are selected as the final features. Compared to other similar works that have examined selected features in a least-squares support vector machine model, the features obtained are better in accuracy, true positive rate, and false positive rate. The results are tested on the UNSW-NB15 dataset.
Keywords: decision tree, feature selection, intrusion detection system, support vector machine
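A numpy-only analogue of the two-stage pipeline can be sketched as follows; correlation-based ranking and a least-squares linear classifier stand in for the paper's C5.0 pruning and LS-SVM, on synthetic data rather than UNSW-NB15:

```python
import numpy as np

# Synthetic traffic-like data: 5 features, of which only features 0 and 2
# actually determine the (normal vs attack) label.
rng = np.random.default_rng(0)
n = 400
X = rng.normal(size=(n, 5))
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(float)

# Stage 1: rank features by a simple importance score (|correlation|).
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(5)])
order = np.argsort(corr)[::-1]          # most important first

# Stage 2: train a least-squares linear classifier on a feature subset.
def ls_accuracy(cols):
    A = np.column_stack([X[:, cols], np.ones(n)])
    w, *_ = np.linalg.lstsq(A, 2 * y - 1, rcond=None)  # regress to +/-1 targets
    return float(np.mean((A @ w > 0) == (y > 0.5)))

print(list(order[:2]), ls_accuracy(list(order[:2])))  # top-2 features suffice
```

The point of the sketch is the shape of the pipeline: cheap importance ranking prunes the feature set before the more expensive classifier is trained on the survivors.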
Procedia PDF Downloads 262
7984 Machine Learning Techniques for Estimating Ground Motion Parameters
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site conditions. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudo-spectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and in any subsequent risk assessment of different types of structures. Typically, linear regression-based models with pre-defined equations and coefficients are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as artificial neural networks, random forests, and support vector machines. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing these as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitudes 3 to 5.8, recorded over a hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The choice of this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states.
The accuracy of the models in predicting intensity measures, the generalization capability of the models for future data, and the usability of the models are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates than the conventional linear regression-based method, and in particular, the random forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available.
Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine
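The conventional baseline the study compares against can be sketched as a model that is linear in its coefficients and fitted by least squares; the functional form and coefficient values below are illustrative, not those fitted to the Oklahoma/Kansas/Texas database:

```python
import numpy as np

# Synthetic records from an assumed ground-motion model of the usual form
#   ln(PGA) = b0 + b1*M + b2*ln(R) + eps,
# where M is magnitude and R is hypocentral distance in km.
rng = np.random.default_rng(1)
M = rng.uniform(3.0, 5.8, 500)
R = rng.uniform(4.0, 500.0, 500)
b_true = np.array([-2.0, 1.1, -1.3])           # illustrative coefficients
ln_pga = (b_true[0] + b_true[1] * M + b_true[2] * np.log(R)
          + rng.normal(0.0, 0.3, 500))         # aleatory scatter

# Conventional fit: ordinary least squares on the pre-defined equation.
A = np.column_stack([np.ones_like(M), M, np.log(R)])
b_hat, *_ = np.linalg.lstsq(A, ln_pga, rcond=None)
print(np.round(b_hat, 1))                      # close to b_true
```

The machine learning alternatives in the study drop the pre-defined equation entirely and learn magnitude scaling and distance attenuation directly from the records.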
Procedia PDF Downloads 121
7983 Project Progress Prediction in Software Development Integrating Time Prediction Algorithms and Large Language Modeling
Authors: Dong Wu, Michael Grenn
Abstract:
Managing software projects effectively is crucial for meeting deadlines, ensuring quality, and managing resources well. Traditional methods often struggle to predict project timelines accurately due to uncertain schedules and complex data. This study addresses these challenges by combining time prediction algorithms with Large Language Models (LLMs). It makes use of real-world software project data to construct and validate a model. The model takes detailed project progress data, such as task completion dynamics, team interaction, and development metrics, as its input and outputs predictions of project timelines. To evaluate the effectiveness of this model, a comprehensive methodology is employed, involving simulations and practical applications in a variety of real-world software project scenarios. This multifaceted evaluation strategy is designed to validate the model's role in enhancing forecast accuracy and elevating overall management efficiency, particularly in complex software project environments. The results indicate that integrating time prediction algorithms with LLMs has the potential to optimize software project progress management, and the quantitative results suggest the effectiveness of the method in practical applications. In conclusion, this study demonstrates that integrating time prediction algorithms with LLMs can significantly improve the predictive accuracy and efficiency of software project management. This offers an advanced project management tool for the industry, with the potential to improve operational efficiency, optimize resource allocation, and ensure timely project completion.
Keywords: software project management, time prediction algorithms, large language models (LLMs), forecast accuracy, project progress prediction
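The numeric half of such a pipeline can be sketched as a rate-based extrapolation with a slot for an LLM-derived adjustment; the function name, the linear-rate model, and the multiplier interface are all assumptions, and the LLM step itself is mocked as a plain number:

```python
def forecast_total_days(completion_history, llm_risk_multiplier=1.0):
    """completion_history[i]: fraction complete observed at end of day i+1.
    Extrapolates the remaining duration at the average observed rate,
    scaled by a risk multiplier that an LLM reading textual progress
    notes could supply (mocked here as a constant)."""
    days_elapsed = len(completion_history)
    rate = completion_history[-1] / days_elapsed       # average progress/day
    remaining_days = (1.0 - completion_history[-1]) / rate
    return days_elapsed + remaining_days * llm_risk_multiplier

history = [0.1, 0.2, 0.3, 0.4]                         # 40% done after 4 days
print(forecast_total_days(history))                    # ~10 days at current pace
print(forecast_total_days(history, llm_risk_multiplier=1.5))  # pessimistic signal
```

The integration point is the multiplier: the time prediction algorithm supplies the baseline, while the LLM turns unstructured progress notes into a risk adjustment on the remaining work.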
Procedia PDF Downloads 75