Search results for: data validation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7718

1148 Assessing Innovation Activity in Mexico and South Korea: An Econometric Approach

Authors: Mario Gómez, Won Ho Kim, Ángel Licona, José Carlos Rodríguez

Abstract:

This article analyzes innovation activity in Mexico and South Korea. It develops an econometric model to test for structural breaks in the number of patent applications filed by residents and nonresidents in these countries during the period 1965 to 2012. These breaks may indicate that firms' innovative capabilities changed as a result of the different science, technology and innovation (STI) policies implemented in Mexico and South Korea. Two features distinguish this research from the authors' earlier work. First, its theoretical framework is the debate between the assimilation and accumulation views of growth, which suggests that trade liberalization should be accompanied by an adequate STI policy to boost competitiveness among indigenous firms. Second, the analysis stresses the importance of key actors (e.g., governments) in successfully developing innovation capabilities among indigenous firms. The question guiding this research is therefore how STI policies in Mexico and South Korea contributed to developing firms' innovation capabilities over recent decades. The results suggest that STI policy in South Korea was better suited to helping innovative firms compete in markets. The data used in this research were released by the World Intellectual Property Organization (WIPO).
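
As a concrete illustration of the structural-break idea, the sketch below runs a Chow-type test on a synthetic annual patent-count series. The article does not state which break test it uses, and the series, the candidate break year and the log-linear specification here are assumptions for illustration only.

```python
# Hypothetical sketch: Chow-type test for a structural break in annual
# patent-application counts (synthetic data, not WIPO figures).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
years = np.arange(1965, 2013)
trend = 1000 + 50 * (years - 1965)
counts = np.log(trend + np.where(years >= 1990, 40 * (years - 1990), 0)) \
         + rng.normal(0, 0.02, years.size)

def rss(y, X):
    return sm.OLS(y, X).fit().ssr          # residual sum of squares of a linear fit

X = sm.add_constant((years - 1965).astype(float))
k = X.shape[1]                             # parameters per regime (intercept + trend)
break_year = 1990                          # candidate break point (assumed)
pre, post = years < break_year, years >= break_year

rss_pooled = rss(counts, X)
rss_split = rss(counts[pre], X[pre]) + rss(counts[post], X[post])
n = len(years)
F = ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))
print(f"Chow F-statistic for a break in {break_year}: {F:.2f}")
```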

Keywords: Econometric methods, innovation, Mexico, South Korea, STI Policy.

1147 Complex Condition Monitoring System of Aircraft Gas Turbine Engine

Authors: A. M. Pashayev, D. D. Askerov, C. Ardil, R. A. Sadiqov, P. S. Abdullayev

Abstract:

Research shows that the application of probability-statistical methods is unfounded at the early stages of diagnosing the technical condition of an aviation Gas Turbine Engine (GTE), when the flight information is fuzzy, limited and uncertain. Hence, the efficiency of applying Soft Computing technology, using Fuzzy Logic and Neural Network methods, at these diagnosing stages is considered. For this purpose, fuzzy multiple linear and non-linear models (fuzzy regression equations) obtained from statistical fuzzy data are trained with high accuracy. To build a more adequate model of the GTE technical condition, the dynamics of changes in the skewness and kurtosis coefficients are analysed. Analysis of these changes shows that the distributions of the GTE work and output parameters of the multiple linear and non-linear generalised models can be identified in the presence of measurement noise using a new recursive Least Squares Method (LSM). The developed GTE condition monitoring system provides stage-by-stage estimation of the engine technical condition. As an application of the technique, the technical condition of a newly operating aviation engine was estimated.

Keywords: aviation gas turbine engine, technical condition, fuzzy logic, neural networks, fuzzy statistics

1146 An Improved Algorithm for Channel Estimation of OFDM Systems Based on Pilot Signals

Authors: Ahmed N. H. Alnuaimy, Mahamod Ismail, Mohd. A. M. Ali, Kasmiran Jumari, Ayman A. El-Saleh

Abstract:

This paper presents a new algorithm for channel estimation in OFDM systems based on pilot signals, intended for the new generation of high data rate communication systems. In orthogonal frequency division multiplexing (OFDM) systems over fast-varying fading channels, channel estimation and tracking are generally carried out by transmitting known pilot symbols at given positions in the frequency-time grid. In this paper, we derive an improved algorithm based on calculating the mean and the variance of adjacent pilot signals for a specific distribution of the pilots in the OFDM frequency-time grid, and then computing the entire set of unknown channel coefficients from the mean and variance equations. Simulation results show that the performance of the OFDM system improves as the channel length increases, since the accuracy of the estimated channel increases with this low-complexity algorithm. In addition, the number of pilot signals that must be inserted in the OFDM signal is reduced, which increases the throughput of the OFDM system compared with other pilot distributions such as comb-type and block-type channel estimation.
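
To make the pilot-aided setup concrete, the following is a minimal sketch of least-squares channel estimation at pilot subcarriers with interpolation between adjacent pilot estimates, a common baseline; the paper's specific mean/variance rule is not fully specified in the abstract, and the subcarrier count, pilot spacing and channel below are assumptions.

```python
# Hypothetical sketch: pilot-aided channel estimation over one OFDM symbol.
import numpy as np

N, spacing = 64, 8                      # subcarriers and pilot spacing (assumed)
rng = np.random.default_rng(0)
h = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H = np.fft.fft(h, N)                    # true channel frequency response

pilot_idx = np.arange(0, N, spacing)
tx = np.ones(N, dtype=complex)          # unit-amplitude pilots/data for simplicity
noise = 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
rx = H * tx + noise

H_pilot = rx[pilot_idx] / tx[pilot_idx]                  # LS estimates at pilot positions
H_hat = np.interp(np.arange(N), pilot_idx, H_pilot.real) \
        + 1j * np.interp(np.arange(N), pilot_idx, H_pilot.imag)

print("mean absolute estimation error:", round(float(np.mean(np.abs(H_hat - H))), 4))
```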

Keywords: Channel estimation, orthogonal frequency division multiplexing (OFDM), comb type channel estimation, block type channel estimation.

1145 An Adaptive Dimensionality Reduction Approach for Hyperspectral Imagery Semantic Interpretation

Authors: Akrem Sellami, Imed Riadh Farah, Basel Solaiman

Abstract:

With the development of HyperSpectral Imagery (HSI) technology, the spectral resolution of HSI has become denser, resulting in a large number of spectral bands, high correlation between neighboring bands, and high data redundancy. Semantic interpretation is therefore a challenging task for HSI analysis due to the high dimensionality and the high correlation of the different spectral bands. This work presents a dimensionality reduction approach that overcomes these issues and improves the semantic interpretation of HSI. In order to preserve the spatial information, the Tensor Locality Preserving Projection (TLPP) is first applied to transform the original HSI. In the second step, knowledge is extracted based on the adjacency graph to describe the different pixels. Based on the transformation matrix obtained from TLPP, a weighted matrix is constructed to rank the different spectral bands according to their contribution score, and the relevant bands are adaptively selected from this weighted matrix. The performance of the presented approach has been validated in several experiments, and the obtained results demonstrate its efficiency compared to various existing dimensionality reduction techniques. According to the experimental results, the approach can adaptively select the relevant spectral bands, improving the semantic interpretation of HSI.
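
A minimal sketch of the band-ranking step described above is given below: each band is scored by its contribution to a learned projection matrix and the top-scoring bands are kept. PCA loadings stand in for TLPP (which has no off-the-shelf implementation), and the hyperspectral cube and band counts are synthetic assumptions.

```python
# Hypothetical sketch of band ranking from a learned projection matrix.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
cube = rng.random((40, 40, 120))             # toy HSI cube: rows x cols x bands
X = cube.reshape(-1, cube.shape[-1])         # pixels x bands

pca = PCA(n_components=10).fit(X)
W = pca.components_.T                        # bands x components ("transformation matrix")
scores = (W ** 2).sum(axis=1)                # per-band contribution score
top_k = 30
selected = np.argsort(scores)[::-1][:top_k]  # adaptively keep the highest-scoring bands
print("selected bands (first 10):", np.sort(selected)[:10])
```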

Keywords: Band selection, dimensionality reduction, feature extraction, hyperspectral imagery, semantic interpretation.

1144 Challenges for Interface Designers in Designing Sensor Dashboards in the Context of Industry 4.0

Authors: Naveen Kumar, Shyambihari Prajapati

Abstract:

Industry 4.0 is the fourth industrial revolution, which focuses on machine-to-machine, human-to-machine and human-to-human interconnectivity via the Internet of Things (IoT). Industry 4.0 technologies facilitate communication between humans and machines through IoT and form Cyber-Physical Production Systems (CPPS). In a CPPS, sensor data from multiple shop floors are connected through IoT and displayed to the operator on a sensor dashboard. These dashboards present an enormous amount of information, which makes monitoring, controlling and interpretation tasks complex for operators. Designing handheld sensor dashboards for supervision tasks will therefore become a challenge for interface designers. This paper reports on the emerging technologies of Industry 4.0, the increasing information complexity across consecutive industrial revolutions, and the upcoming design challenges for interface designers in the context of Industry 4.0. The authors conclude that the information complexity of sensor dashboard design has increased with consecutive industrial revolutions and that such dashboard designs place a cognitive load on users. Designing these complex dashboard interfaces in the Industry 4.0 context will be a main challenge for interface designers.

Keywords: Industry 4.0, sensor dashboard design, Cyber-physical production system, Interface designer.

1143 An Implementation of Fuzzy Logic Technique for Prediction of the Power Transformer Faults

Authors: Omar M. Elmabrouk., Roaa Y. Taha., Najat M. Ebrahim, Sabbreen A. Mohammed

Abstract:

Power transformers are the most crucial components of the electrical power system and its distribution and transmission grids. They are maintained using a predictive or condition-based maintenance approach, and transformer condition is diagnosed based on Dissolved Gas Analysis (DGA). Five main methods are used to analyze these dissolved gases: the International Electrotechnical Commission (IEC) gas ratio method, the Key Gas method, the Rogers gas ratio method, the Doernenburg method, and the Duval Triangle. Given the importance of transformers, an accurate technique is needed to diagnose, and hence predict, transformer condition; the main objective of such a technique is to avoid transformer faults and thereby maintain the electrical power system and its distribution and transmission grids. In this paper, DGA was performed on data collected from transformer records available at the General Electricity Company of Libya (GECOL) in Benghazi, Libya. The Fuzzy Logic (FL) technique was implemented as a diagnostic approach based on the IEC gas ratio method. The FL technique gave better results and proved suitable as an accurate prediction technique for power transformer faults. The technique should also be of interest to readers and researchers concerned with FL mathematics and power transformers.
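
For readers unfamiliar with the IEC gas ratio method that the fuzzy system builds on, the sketch below computes the three standard ratios and applies a deliberately simplified crisp rule; the paper replaces such crisp thresholds with fuzzy membership functions, and the thresholds and gas concentrations here are illustrative assumptions, not the full IEC 60599 code table.

```python
# Hypothetical sketch: IEC-style gas ratios with a simplified crisp rule.
def iec_ratios(h2, ch4, c2h6, c2h4, c2h2):
    """Gas concentrations in ppm -> (C2H2/C2H4, CH4/H2, C2H4/C2H6)."""
    eps = 1e-9  # avoid division by zero
    return c2h2 / (c2h4 + eps), ch4 / (h2 + eps), c2h4 / (c2h6 + eps)

def rough_diagnosis(r1, r2, r3):
    # Illustrative thresholds only -- not the full IEC code table,
    # and not the paper's fuzzy membership functions.
    if r1 > 1.0:
        return "possible arcing (discharge)"
    if r3 > 1.0:
        return "possible thermal fault"
    if r2 < 0.1:
        return "possible partial discharge"
    return "normal / indeterminate"

r1, r2, r3 = iec_ratios(h2=120, ch4=30, c2h6=20, c2h4=50, c2h2=10)
print(rough_diagnosis(r1, r2, r3))
```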

Keywords: Fuzzy logic, dissolved gas-in-oil analysis, DGA, prediction, power transformer.

1142 A Dynamic Composition of an Adaptive Course

Authors: S. Chiali, Z.Eberrichi, M.Malki

Abstract:

The number of frameworks designed for e-learning is constantly increasing. Unfortunately, the creators of learning materials and the educational institutions engaged in e-learning adopt a "proprietary" approach, in which the developed products (courses, activities, exercises, etc.) can be exploited only within the framework where they were created; their use in other learning environments requires costly adaptation in terms of time and effort. Each framework proposes courses whose organization, content, modes of interaction and presentation are the same for all learners, even though learners are heterogeneous and are not interested in the same information, but only in services or documents adapted to their needs. The current trend for e-learning frameworks is interoperability of learning materials. Several standards exist (DCMI (Dublin Core Metadata Initiative) [2], LOM (Learning Object Metadata) [1], SCORM (Shareable Content Object Reference Model) [6][7][8], ARIADNE (Alliance of Remote Instructional Authoring and Distribution Networks for Europe) [9], CanCore (Canadian Core Learning Resource Metadata Application Profiles) [3]), and they all converge on the idea of learning objects. They are also concerned with adapting learning materials to the learner's profile. This article proposes an approach for composing courses adapted to the various profiles (knowledge, preferences, objectives) of learners, based on two ontologies (the domain to teach and an educational ontology) and on learning objects.

Keywords: Adaptive educational hypermedia systems (AEHS), E-learning, Learner's model, Learning objects, Metadata, Ontology.

1141 Investigating Solar Cycles and Media Sentiment Through Advanced NLP Techniques

Authors: Aghamusa Azizov

Abstract:

This study investigates the correlation between solar activity and sentiment in news media coverage, using a large-scale dataset of solar activity since 1750 and over 15 million articles from "The New York Times" dating from 1851 onwards. Employing Pearson's correlation coefficient and multiple Natural Language Processing (NLP) tools—TextBlob, VADER, and DistilBERT—the research examines the extent to which fluctuations in solar phenomena are reflected in the sentiment of historical news narratives. The findings reveal that the correlation between solar activity and media sentiment is generally negligible, suggesting a weak influence of solar patterns on the portrayal of events in news media. Notably, a moderate positive correlation was observed between the sentiments derived from TextBlob and VADER, indicating consistency across NLP tools. The analysis provides insights into the historical impact of solar activity on human affairs and highlights the importance of using multiple analytical methods to understand complex relationships in large datasets. The study contributes to the broader understanding of how extraterrestrial factors may intersect with media-reported events and underlines the intricate nature of interdisciplinary research in the data science and historical domains.
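
The core computation, correlating a solar-activity series with a yearly mean sentiment score via Pearson's r, can be sketched as follows. The sunspot values and headlines are synthetic placeholders for the real solar record and the NYT corpus, and only TextBlob is shown, standing in for the full TextBlob/VADER/DistilBERT pipeline.

```python
# Hypothetical sketch: Pearson correlation between yearly solar activity and
# a yearly mean headline sentiment score (synthetic data).
import numpy as np
from scipy.stats import pearsonr
from textblob import TextBlob

rng = np.random.default_rng(7)
years = np.arange(1900, 1960)
sunspots = 80 + 60 * np.sin(2 * np.pi * (years - 1900) / 11) + rng.normal(0, 10, years.size)

positive = ["Peace talks make good progress", "Harvest is excellent this year"]
negative = ["Flood causes terrible losses", "Economy in a bad slump"]
headlines_by_year = {int(y): list(rng.choice(positive + negative, size=3)) for y in years}

sentiment = np.array([
    np.mean([TextBlob(t).sentiment.polarity for t in headlines_by_year[int(y)]])
    for y in years
])

r, p = pearsonr(sunspots, sentiment)
print(f"Pearson r = {r:.3f}  (p = {p:.3f})")
```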

Keywords: Solar Activity Correlation, Media Sentiment Analysis, Natural Language Processing, NLP, Historical Event Patterns.

1140 Kinetic Theory Based CFD Modeling of Particulate Flows in Horizontal Pipes

Authors: Pandaba Patro, Brundaban Patro

Abstract:

The numerical simulation of fully developed gas–solid flow in a horizontal pipe is performed using the Eulerian–Eulerian approach, also known as two-fluid modeling, in which both phases are treated as continuous, inter-penetrating continua. The solid-phase stresses are modeled using the kinetic theory of granular flow (KTGF). The computed velocity profiles and pressure drop are compared with experimental data. We observe that the convection and diffusion terms in the granular temperature equation cannot be neglected when simulating gas–solid flow along a horizontal pipe. Particle–wall collisions and lift also play an important role in Eulerian modeling. We also investigated the effect of flow parameters such as gas velocity, particle properties and particle loading on the predicted pressure drop for different pipe diameters. Pressure drop increases with gas velocity and particle loading. Gas velocity has the same effect on the predicted pressure drop (proportional to U²) as in single-phase flow. With respect to particle diameter, pressure drop first increases, reaches a peak and then decreases; the peak is a strong function of the pipe bore.

Keywords: CFD, Eulerian modeling, gas solid flow, KTGF.

1139 Incorporating Semantic Similarity Measure in Genetic Algorithm: An Approach for Searching the Gene Ontology Terms

Authors: Razib M. Othman, Safaai Deris, Rosli M. Illias, Hany T. Alashwal, Rohayanti Hassan, FarhanMohamed

Abstract:

The most important property of the Gene Ontology is its terms. These controlled vocabularies are defined to provide consistent descriptions of gene products that are shareable and computationally accessible by humans, software agents, or other machine-readable metadata. Each term is associated with information such as a definition, synonyms, database references, amino acid sequences, and relationships to other terms. This information has made the Gene Ontology widely applied in microarray and proteomic analysis. However, searching the terms is still carried out using a traditional approach based on keyword matching. The weaknesses of this approach are that it ignores semantic relationships between terms and depends heavily on a specialist to find similar terms. Therefore, this study combines a semantic similarity measure and a genetic algorithm to provide a better retrieval process for searching semantically similar terms. The semantic similarity measure is used to compute the strength of similarity between two terms, and the genetic algorithm is employed to perform batch retrievals and to handle the large search space of the Gene Ontology graph. Computational results are presented to show the effectiveness of the proposed algorithm.
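
A minimal sketch of the retrieval idea, a genetic algorithm whose fitness is the mean semantic similarity of a candidate term set to a query term, is shown below. A random symmetric matrix stands in for a real GO similarity measure (e.g., Resnik or Lin), and the population size, operators and term counts are illustrative assumptions rather than the paper's configuration.

```python
# Hypothetical sketch: GA retrieval of the k terms most similar to a query term.
import numpy as np

rng = np.random.default_rng(3)
n_terms, k = 200, 5                         # ontology size and number of terms to retrieve
S = rng.random((n_terms, n_terms))
S = (S + S.T) / 2                           # stand-in symmetric term-term similarity matrix
query = 42                                  # index of the query term

def fitness(individual):
    return S[query, individual].mean()      # mean semantic similarity to the query term

def crossover(a, b):
    cut = rng.integers(1, k)
    return np.concatenate([a[:cut], b[cut:]])

def mutate(ind):
    ind = ind.copy()
    ind[rng.integers(k)] = rng.integers(n_terms)
    return ind

pop = [rng.integers(n_terms, size=k) for _ in range(30)]
for _ in range(100):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]
    children = [mutate(crossover(parents[rng.integers(10)], parents[rng.integers(10)]))
                for _ in range(20)]
    pop = parents + children

best = max(pop, key=fitness)
print("retrieved terms:", np.sort(best), "mean similarity:", round(float(fitness(best)), 3))
```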

Keywords: Gene Ontology, Semantic similarity measure, Genetic algorithm, Ontology search

1138 Feasibility Investigation of Near Infrared Spectrometry for Particle Size Estimation of Nano Structures

Authors: A. Bagheri Garmarudi, M. Khanmohammadi, N. Khoddami, K. Shabani

Abstract:

Determining nanoparticle size is important because particle size exerts a significant effect on various properties of nanomaterials. Accordingly, non-destructive, accurate and rapid techniques for this purpose are of high interest. Conventional techniques for investigating the morphology and grain size of nanoparticles include scanning electron microscopy (SEM), atomic force microscopy (AFM) and X-ray diffractometry (XRD). Vibrational spectroscopy is used to characterize different compounds and has been applied to evaluate average particle size based on the relationship between particle size and near infrared spectra [1,4], but it has never been applied to quantitative morphological analysis of nanomaterials. So far, near-infrared (NIR) spectroscopy, with its ability to rapidly analyze powdered materials with minimal sample preparation, has been suggested for particle size determination of powdered pharmaceuticals. The relationship between particle size and diffuse reflectance (DR) spectra in the near infrared region was used here to develop a method for estimating particle size. A back-propagation artificial neural network (BP-ANN), as a nonlinear model, was applied to estimate average particle size from near infrared diffuse reflectance spectra. Thirty-five nano TiO2 samples with different particle sizes were analyzed by DR-FTNIR spectrometry, and the obtained data were processed by the BP-ANN.
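
The regression step can be sketched as follows with scikit-learn's back-propagation network (MLPRegressor) mapping diffuse-reflectance spectra to mean particle size; the spectra, size range and network settings are synthetic assumptions, not the paper's 35 DR-FTNIR TiO2 measurements.

```python
# Hypothetical sketch: BP-ANN regression from NIR spectra to particle size.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 35, 200
particle_size = rng.uniform(10, 100, n_samples)              # nm (assumed range)
spectra = np.outer(particle_size, np.linspace(0.5, 1.5, n_wavelengths)) ** 0.5 \
          + rng.normal(0, 0.05, (n_samples, n_wavelengths))  # toy size-dependent spectra

X_train, X_test, y_train, y_test = train_test_split(
    spectra, particle_size, test_size=0.3, random_state=1)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20,), max_iter=5000, random_state=1))
model.fit(X_train, y_train)
print("R^2 on held-out samples:", round(model.score(X_test, y_test), 2))
```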

Keywords: near infrared, particle size, chemometrics, neural network, nano structure.

1137 Proposal of Commutation Protocol in Hybrid Sensors and Vehicular Networks for Intelligent Transport Systems

Authors: Taha Bensiradj, Samira Moussaoui

Abstract:

Hybrid Sensors and Vehicular Networks (HSVN) are hybrid networks that combine several generations of ad hoc networks and are used especially in Intelligent Transport Systems (ITS). An HSVN enables collaboration between a Wireless Sensor Network (WSN) deployed along the roadside and the Vehicular Network (VANET). This collaboration consists of messages exchanged between the two networks to inform drivers about the state of the road, provide road safety information, and give further information about road traffic. The collaboration created by an HSVN also allows one network to be used to improve the other. For example, disseminating information between sensors quickly depletes their energy, so vehicles, which have no energy constraint, can be used to disseminate information between sensors. Conversely, to solve the disconnection problem in the VANET, the sensors can be used as gateways that forward messages received from one vehicle to another. However, because of the sensor's short communication range and its low storage and processing capacity, it is difficult to ensure the exchange of road messages between the sensor and a vehicle that may be moving at high speed during the exchange, i.e., during the time the vehicle is within the sensor's communication range. This work proposes a communication protocol between the sensors and the vehicles in an HSVN, whose purpose is to ensure the exchange of road messages within the available exchange time.
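
A small worked example of the "available exchange time" constraint mentioned above: with an assumed sensor radio range, vehicle speed and link bit rate, the contact window and the resulting data budget can be computed as follows (all numbers are illustrative assumptions, not values from the paper).

```python
# Hypothetical worked example: how long a vehicle stays inside a roadside
# sensor's radio range, and how much data can be exchanged in that window.
comm_range_m = 100.0          # sensor radio range (assumed)
vehicle_speed_kmh = 90.0      # vehicle speed (assumed)
bit_rate_kbps = 250.0         # sensor-class link bit rate (assumed)

speed_ms = vehicle_speed_kmh / 3.6
contact_time_s = 2 * comm_range_m / speed_ms   # straight pass through the coverage disc
budget_bytes = contact_time_s * bit_rate_kbps * 1000 / 8

print(f"contact window: {contact_time_s:.1f} s, "
      f"exchange budget: {budget_bytes / 1024:.0f} KiB")
```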

Keywords: HSVN, ITS, VANET, WSN.

1136 Differences in Factors of Distributor Selection for Food and Non-Food OTOP Entrepreneurs in Thailand

Authors: Phutthiwat Waiyawuththanapoom

Abstract:

This study has a single objective: to identify the differences in the factors used to choose a distributor by food and non-food OTOP entrepreneurs in Thailand. In this research, OTOP products are divided into two groups, food and non-food. The sample for the food type was processed fruit and vegetables from Nakorn Pathom province, and the sample for the non-food type was court dolls from Ang Thong province. The research was divided into three parts: a study of the distribution pattern and distributor selection for the food type, a study of the distribution pattern and distributor selection for the non-food type, and a comparison between the two product types to identify the differences in distributor selection factors. The data and information were collected through interviews. The populations in the research were five producers of processed fruit and vegetables from Nakorn Pathom province and five producers of court dolls from Ang Thong province. The significant factors in choosing a distributor for the food type OTOP product are material handling efficiency and on-time delivery, whereas for the non-food type the focus is on the distribution channel and the cost of the distributor.

Keywords: Distributor, OTOP, Food and Non-Food, Selection.

1135 The Conceptual and Procedural Knowledge of Rational Numbers in Primary School Teachers

Authors: R. M. Kashim

Abstract:

This study investigates the conceptual and procedural knowledge of rational numbers among primary school teachers, specifically their level of conceptual knowledge and their level of procedural knowledge of rational numbers. The study was carried out in the Bauchi metropolis in Bauchi State, Nigeria. A Conceptual and Procedural Knowledge Test was used as the instrument for data collection, and 54 mathematics teachers in Bauchi primary schools were involved in the study. The collected data were analyzed using means and standard deviations. The findings revealed that primary school mathematics teachers in the Bauchi metropolis possess a low level of conceptual knowledge of rational numbers and a high level of procedural knowledge of rational numbers. It is therefore recommended that, to be effective, teachers of mathematics must possess a deep understanding of both conceptual and procedural knowledge, so that the most knowledgeable mathematics teachers can deliver highly effective rational number instruction. Teachers should not ignore the conceptual aspect of teaching rational numbers, because when only the procedural aspect is highlighted during instruction, this often leads to rote learning of procedures without understanding their meaning. It is necessary for teachers to learn methods for teaching rational numbers that focus on both conceptual and procedural knowledge.

Keywords: Conceptual knowledge, primary school teachers, procedural knowledge, rational numbers.

1134 A Method for Consensus Building between Teachers and Learners in a Value Co-Creative Learning Service

Authors: Ryota Sugino, Satoshi Mizoguchi, Koji Kimita, Keiichi Muramatsu, Tatsunori Matsui, Yoshiki Shimomura

Abstract:

Improving the added value and productivity of services entails improving both value-in-exchange and value-in-use. Value-in-use is realized by value co-creation, where providers and receivers create value together. In higher education services, value-in-use comes from learners achieving learning outcomes (e.g., knowledge and skills) that are consistent with their learning goals. To enhance the learning outcomes of a learner, it is necessary to enhance and utilize the abilities of the teacher along with those of the learner. To do this, however, the learner and the teacher need to build a consensus about their respective roles: teachers need to provide effective learning content, and learners need to choose appropriate learning strategies for using that content. This makes consensus building an important factor in value co-creation. However, methods for building such a consensus about respective roles are not clearly established, which makes it difficult to reach. In this paper, we propose strategies for consensus building between a teacher and a learner in value co-creation. We focus on teacher-learner co-design and propose an analysis method to clarify the collaborative design process that realizes value co-creation. We then analyze counseling data obtained from a university class; this counseling aimed to build a consensus on value-in-use, learning outcomes, and learning strategies between the teacher and the learner.

Keywords: Consensus building, value co-creation, higher education, learning service.

1133 Two-Stage Launch Vehicle Trajectory Modeling for Low Earth Orbit Applications

Authors: Assem M. F. Sallam, Ah. El-S. Makled

Abstract:

This paper presents a study of the trajectory of a two-stage launch vehicle (LV). The study includes the dynamic responses of the motion parameters as well as the variation of the angles affecting the orientation of the LV. LV dynamic characteristics, including the variation of the state vector with the corresponding altitude and velocity at the separation of the different LV stages, as well as the angle of attack and flight path angle, are also discussed. A flight trajectory study of the drop zone of the first stage and the jettisoning of the fairing is included in the mathematical modeling to study their effect. To increase the accuracy of the LV model, an atmospheric model is used that takes into account the geographical location and the solar flux values corresponding to the date and time of launch; an accurate atmospheric model improves the calculation of the Mach number, which affects the drag force on the LV. The mathematical model is implemented in MATLAB-based software (Simulink). The available experimental data are compared with the results obtained from the theoretical computation model. The comparison shows good agreement, which proves the validity of the developed simulation model; the maximum error observed was generally less than 10%, a result that motivates future work aimed at decreasing this level of error.
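
A heavily simplified, hypothetical counterpart of such a model is sketched below: a one-dimensional two-stage point-mass ascent integrated with SciPy, with an exponential atmosphere for drag and the stage dry mass jettisoned at burnout. The vehicle and atmosphere numbers are invented for illustration; the paper's model is a full trajectory simulation in MATLAB/Simulink with flight-path and attack angles and a date-dependent atmosphere.

```python
# Hypothetical sketch: vertical two-stage point-mass ascent (not the paper's LV data).
import numpy as np
from scipy.integrate import solve_ivp

g0, rho0, Hs = 9.81, 1.225, 8500.0           # gravity, sea-level density, scale height
Cd, A = 0.3, 3.0                             # drag coefficient, reference area (m^2)

# (thrust N, propellant mass flow kg/s, burn time s, dry mass dropped at burnout kg)
stages = [(1.2e6, 400.0, 120.0, 6000.0),
          (3.0e5,  90.0, 180.0, 2000.0)]
m0 = 80000.0                                 # lift-off mass (kg)

def make_rhs(thrust, mdot):
    def rhs(t, y):
        h, v, m = y
        rho = rho0 * np.exp(-max(h, 0.0) / Hs)
        drag = 0.5 * rho * Cd * A * v * abs(v)
        return [v, (thrust - drag) / m - g0, -mdot]
    return rhs

state, t0, log = [0.0, 0.0, m0], 0.0, []
for thrust, mdot, burn, dry in stages:
    sol = solve_ivp(make_rhs(thrust, mdot), (t0, t0 + burn), state, max_step=1.0)
    state = [sol.y[0, -1], sol.y[1, -1], sol.y[2, -1] - dry]   # jettison stage dry mass
    t0 += burn
    log.append((t0, state[0] / 1000.0, state[1]))

for t_end, h_km, v in log:
    print(f"t = {t_end:5.0f} s   altitude = {h_km:7.1f} km   velocity = {v:7.1f} m/s")
```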

Keywords: Launch vehicle modeling, launch vehicle trajectory, mathematical modeling, MATLAB-Simulink.

1132 Effect of High-Heeled Shoes on Gait: A Micro-Electro-Mechanical-Systems Based Approach

Authors: Harun Sumbul, Orhan Ozyurt

Abstract:

The accelerations that shoes generate in the body should be known in order to prevent balance problems and degradation of body shape and to reduce the energy spent while walking. This study investigates the effects of shoe heel height on the human body. The study group consisted of five women (aged 27-32 years) with different characteristics, and five shoes with different heel heights (1, 3.5, 5, 7 and 9 cm) were used. The participants wore the shoes and walked along a 20-meter course. The accelerations created by the shoes were measured along three axes (30,270 accelerometer samples) and analyzed. The results show that while walking with high-heeled shoes, the foot is lifted higher, so more effort is spent and greater loads are placed on the ankles and joints. Since high-heeled shoes cause greater accelerations, women wearing them tend to pay more attention when taking a step. For foot and body health, the shoe heel should therefore be designed to absorb the reaction from the ground; high heels disrupt the structure of the foot and damage body shape. In this respect, this study offers a notable accelerometer-based method for assessing the effect of high-heeled shoes on gait.

Keywords: Acceleration, sensor, gait analysis, high shoe heel, micro-electro-mechanical-systems.

1131 Tagging by Combining Rules-Based Method and Memory-Based Learning

Authors: Tlili-Guiassa Yamina

Abstract:

Many natural language expressions are ambiguous and need to draw on other sources of information to be interpreted. Interpreting the word تعاون as a noun or a verb, for example, depends on the presence of contextual cues, so to interpret words we need to be able to discriminate between different usages. This paper proposes a hybrid of rule-based and machine-learning methods for tagging Arabic words. Because an Arabic word may be composed of a stem plus affixes and clitics (affixes include inflectional markers for tense, gender and number; clitics include some prepositions, conjunctions and others), a small number of rules dominates the performance. Tagging is closely related to the notion of word class used in syntax. The method is based firstly on rules that consider the post-position, the ending of a word, and patterns; anomalies are then corrected by adopting a memory-based learning (MBL) method. Memory-based learning is an efficient way to integrate various sources of information and to handle exceptional data in natural language processing tasks. In the second stage, the exceptional cases of the rules are checked and more information is made available to the learner for treating those cases. To evaluate the proposed method, a number of experiments were run in order to assess the contribution of the various sources of information to learning.
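
The hybrid idea can be sketched as follows: a high-precision rule fires first (here only the definite-article rule "a word starting with ال is tagged as a noun"), and all remaining words fall back to a memory-based (nearest-neighbour) learner over character n-grams. The rule, the tiny training lexicon and the feature choice are illustrative assumptions, not the paper's rule set or MBL configuration.

```python
# Hypothetical sketch: rule-based tagging with a memory-based (k-NN) fallback.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.neighbors import KNeighborsClassifier

train_words = ["يكتب", "تعاون", "الكتاب", "مدرسة", "يذهب", "تعاونوا"]
train_tags  = ["VERB", "NOUN",  "NOUN",   "NOUN",  "VERB", "VERB"]

# memory-based component: store labelled examples, classify by nearest neighbour
vec = CountVectorizer(analyzer="char", ngram_range=(1, 3))
knn = KNeighborsClassifier(n_neighbors=1).fit(vec.fit_transform(train_words), train_tags)

def tag(word):
    if word.startswith("ال"):          # rule component (illustrative single rule)
        return "NOUN"
    return knn.predict(vec.transform([word]))[0]

for w in ["المدرسة", "تعاون", "يدرس"]:
    print(w, "->", tag(w))
```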

Keywords: Arabic language, rule-based methods, exceptions, memory-based learning, tagging.

1130 An Analysis of Innovative Cloud Model as Bridging the Gap between Physical and Virtualized Business Environments: The Customer Perspective

Authors: Asim Majeed, Rehan Bhana, Mak Sharma, Rebecca Goode, Nizam Bolia, Mike, Lloyd-Williams

Abstract:

This study investigates and explores the underlying causes of the customer security concerns that emerged when WHSmith transformed its physical system into a virtualized business model through NetSuite, an essentially fully integrated software platform that supports this transformation. Modern organisations are moving away from traditional business models to cloud-based models, and customers consequently expect a better, more secure and more innovative environment. Security is the vital issue when transforming to virtualized, cloud-based models, yet designers of interactive systems often misunderstand, or even ignore, privacy, thus causing concerns for users. A content analysis approach is used to collect qualitative data from 120 online bloggers, including TRUSTPILOT. The results and findings provide useful new insights into the nature and form of the security concerns of online users after using the WHSmith services offered through its website. The findings have theoretical as well as practical implications for the successful adoption of cloud computing business-to-business models and similar systems.

Keywords: Innovation, virtualization, cloud computing, organizational flexibility

1129 A Neural Network Classifier for Estimation of the Degree of Infestation by Late Blight on Tomato Leaves

Authors: Gizelle K. Vianna, Gabriel V. Cunha, Gustavo S. Oliveira

Abstract:

Foliage diseases can reduce both the quality and quantity of agricultural production. Intelligent detection of plant diseases is an essential research topic, as it may help monitor large crop fields by automatically detecting the symptoms of foliage diseases. This work investigates ways to recognize late blight disease from the analysis of digital tomato images collected directly in the field. A pair of multilayer perceptron neural networks analyzes the digital images, using data from both the RGB and HSL color models, and classifies each image pixel. One neural network is responsible for identifying healthy regions of the tomato leaf, while the other identifies injured regions. The outputs of the two networks are combined to produce the final classification of each pixel, and the pixel classes are used to repaint the original tomato images with a color representation that highlights the injuries on the plant. The new images contain only green, red or black pixels, depending on whether they come from healthy portions of the leaf, injured portions, or the image background, respectively. The system achieved an accuracy of 97% in detecting and estimating the level of damage caused to tomato leaves by late blight.
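
A compact sketch of the two-network pixel classification and repainting stage is given below, using scikit-learn MLPs on synthetic pixels; the RGB/HSL-style features, training colours and network sizes are assumptions standing in for the field images and the paper's trained perceptrons.

```python
# Hypothetical sketch: two MLPs vote per pixel (healthy vs. injured), then repaint.
import colorsys
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def features(rgb):
    h, l, s = colorsys.rgb_to_hls(*rgb)        # HLS as a stand-in for HSL features
    return [*rgb, h, l, s]                     # RGB + hue/lightness/saturation per pixel

# toy training pixels: greenish = healthy, brownish = injured, dark = background
healthy = rng.uniform([0.0, 0.5, 0.0], [0.3, 1.0, 0.3], (200, 3))
injured = rng.uniform([0.4, 0.2, 0.0], [0.8, 0.5, 0.2], (200, 3))
background = rng.uniform(0.0, 0.15, (200, 3))
X = np.array([features(p) for p in np.vstack([healthy, injured, background])])

net_healthy = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=1)
net_injured = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=1)
net_healthy.fit(X, [1] * 200 + [0] * 400)                # healthy vs. rest
net_injured.fit(X, [0] * 200 + [1] * 200 + [0] * 200)    # injured vs. rest

def repaint(pixel_rgb):
    x = [features(pixel_rgb)]
    if net_healthy.predict(x)[0]:
        return (0, 255, 0)                     # green: healthy tissue
    if net_injured.predict(x)[0]:
        return (255, 0, 0)                     # red: late-blight injury
    return (0, 0, 0)                           # black: background

print(repaint((0.1, 0.8, 0.1)), repaint((0.6, 0.3, 0.1)), repaint((0.05, 0.05, 0.05)))
```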

Keywords: Artificial neural networks, digital image processing, pattern recognition.

1128 Towards an Enhanced Stochastic Simulation Model for Risk Analysis in Highway Construction

Authors: Anshu Manik, William G. Buttlar, Kasthurirangan Gopalakrishnan

Abstract:

Over the years, there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS, and the contractor is given considerable freedom in achieving them. The risk for the contractor or the agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful for estimating and analyzing payment risk in ERS systems, and they form an integral part of Illinois's prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids and asphalt content data from ERS projects. The information gained would be crucial in simulating these ERS projects for the estimation and analysis of the payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the present ERS implemented in Illinois.

Keywords: Asphalt pavement, risk analysis, stochastic simulation, QC/QA.

1127 Sustainable Solutions for Enhancing Efficiency, Safety, and Quality of Construction Value Chain Services Integration

Authors: Lo Kar Yin

Abstract:

In view of the increasing speed and quantity of housing supply, building, and civil engineering infrastructure works triggered by the pandemic across the globe, contractors, professional services providers (PSPs), including consultants (e.g., architects, project managers, civil/geotechnical/structural engineers, building services engineers, quantity surveyors/cost managers, etc.) and suppliers, have faced the tremendous challenges of a fierce market and limited manpower and resources under contract price fluctuation and competitive fees and prices. Using qualitative analysis, this paper identifies the information available from industry stakeholders with a view to finding solutions for enhancing the efficiency, safety, and quality of construction value chain services for the sustainable growth of public and private organisations and companies, not limited to checking deliverables and data transfer between multi-disciplinary parties. Technology, contracts, and people are the key requirements for shaping the construction industry. By integrating a modern engineering contract (e.g., NEC) collaborative approach, practical workflows are designed to address loopholes, together with different levels of employment and retention of people and of technology adoption, to achieve the best value for money.

Keywords: Sustainable development, sustainable solutions, contract, construction value chain, Building Information Modelling, BIM integration.

1126 Incorporation Mechanism of Stabilizing Simulated Lead-Laden Sludge in Aluminum-Rich Ceramics

Authors: Xingwen Lu, Kaimin Shih

Abstract:

This study investigated a strategy of blending lead-laden sludge with Al-rich precursors to reduce the release of metals from the stabilized products. Using PbO as the simulated lead-laden sludge and sintering it with γ-Al2O3 at Pb:Al molar ratios of 1:2 and 1:12, PbAl2O4 and PbAl12O19, respectively, were formed as the final products of the sintering process. By firing the PbO + γ-Al2O3 mixtures with different Pb/Al molar ratios at 600 to 1000 °C, the lead transformation was determined from X-ray diffraction (XRD) data. In the Pb/Al molar ratio 1/2 system, the formation of PbAl2O4 is initiated at 700 °C, but effective formation was observed above 750 °C, and an intermediate phase, Pb9Al8O21, was detected in the temperature range 800-900 °C. Different incorporation behavior was observed when sintering PbO with the Al-rich precursor at a Pb/Al molar ratio of 1/12 during the formation of PbAl12O19. The effects of both temperature and time on the formation of the PbAl2O4 and PbAl12O19 phases during sintering were assessed. Finally, a prolonged leaching test modified from the U.S. Environmental Protection Agency's toxicity characteristic leaching procedure (TCLP) was used to evaluate the durability of the PbO, Pb9Al8O21, PbAl2O4 and PbAl12O19 phases. Comparison of the leaching results of the four phases demonstrated the higher intrinsic resistance of PbAl12O19 against acid attack.

Keywords: Sludge, Lead, Stabilization, Leaching behavior

1125 A Bio-Ecological Perspective on Risk Awareness and Factors Associated with Substance Use during Pregnancy in Communities of the Western Cape Province, South Africa

Authors: Mutshinye Manguvhewa, Maria Florence, Mansoo Yu

Abstract:

Substance use among pregnant women is a perennial problem in the Western Cape Province of South Africa, and many factors influence substance use among women of childbearing age. Factors associated with substance use during pregnancy were explored using a qualitative research approach, with a bio-ecological theoretical framework guiding the study. Participants were selected using purposive sampling: those accessed through the Department of Social Development who met the inclusion criteria were interviewed using semi-structured interviews, and participants were referred for psychological intervention during the interview if deemed necessary. Braun and Clarke's six phases of thematic analysis were used to analyse the data. The study adhered to ethical measures for the participants' protection: participants were informed about the study before the interviews began, and the voluntary nature of their participation was explained. The key findings illustrate that social factors, individual factors, and romantic relationships are the major factors contributing to substance use among pregnant women in this sample. Recommendations arising from the study include that stakeholders, rehabilitation centres, the Department of Health, and future researchers ought to act proactively against substance use throughout pregnancy.

Keywords: Bio-ecological factors, pregnancy risk awareness, antenatal care, substance use.

1124 Design, Simulation, and Implementation of a Digital Pulse Oxygen Saturation Measurement System Using the Arduino Microcontroller

Authors: Muhibul Haque Bhuyan, Md. Refat Sarder

Abstract:

If people can monitor their oxygen saturation level intermittently, they can identify problems early and seek a doctor's help. This paper reports the design, simulation, and implementation of a low-cost pulse oxygen saturation measurement device based on a reflective photoplethysmography (PPG) system, with an integrated circuit sensor as the fundamental component of this health status checking device. The physiological parameter measured is the blood oxygen saturation level (SpO2) in the peripheral capillaries. The system was implemented using an Arduino Uno R3 microcontroller together with this sensor integrated circuit (IC). It was designed and simulated in the Proteus environment to check its performance, and the hardware implementation was then carried out. A clip-type optical sensor was used to sense the arterial oxygen saturation level from the fingertip of an individual, and the signal was converted into digital data in the microcontroller through its program instructions. The designed system was tested by measuring the SpO2 level of several people of different ages, from 12 to 57 years, and the same people were also tested using a standard device purchased from the market. The test results were very satisfactory, as the average percentage error was only 1.59%.
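
For context, the arithmetic a pulse-oximeter firmware typically performs on the PPG signals is the "ratio of ratios" calculation sketched below, mapped to SpO2 with a commonly quoted linear approximation (SpO2 ~ 110 - 25*R). The calibration constants and sample amplitudes are assumptions; the paper's device and sensor IC use their own calibration curve.

```python
# Hypothetical sketch of the SpO2 arithmetic on red/infrared PPG channels.
def spo2_estimate(red_ac, red_dc, ir_ac, ir_dc):
    # "ratio of ratios" of pulsatile (AC) to steady (DC) components
    r = (red_ac / red_dc) / (ir_ac / ir_dc)
    # commonly quoted linear calibration; real devices use sensor-specific curves
    return 110.0 - 25.0 * r

# example pulsatile/steady amplitudes from a hypothetical reading
print(round(spo2_estimate(red_ac=0.10, red_dc=1.80, ir_ac=0.17, ir_dc=1.60), 1))
```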

Keywords: Digital pulse oxygen saturation level, oximeter, measurement, design, simulation, implementation, proteus, Arduino Uno microcontroller.

1123 Dam Operation Management Criteria during Floods: Case Study of Dez Dam in Southwest Iran

Authors: Ali Heidari

Abstract:

This paper presents principles for improving flood mitigation operation in multipurpose dams and maximizing reservoir performance during floods, with a focus on the real-time operation of gated spillways. The operation criteria include the safety of the dam during flood management, minimizing the downstream flood risk by decreasing the flood hazard, and fulfilling water supply and the other purposes of dam operation over mid- and long-term horizons. The parameters deemed important include flood inflow, outlet capacity restrictions, downstream flood inundation damages, the economic revenue of dam operation, and environmental and sedimentation restrictions. A simulation model was used to determine the real-time release of the Dez Dam, located on the Dez River in southwest Iran, considering the gate regulation curves for the gated spillway. The results of the simulation model show that it is possible to improve the procedures currently used in the real-time operation of the dam, particularly by using gate regulation curves and the results of an early flood forecasting system. The Dez Dam operation data show that in one of the best flood control records, 17% of the total active volume and flood control pool of the reservoir was not used to decrease the downstream flood hazard, despite the availability of a flood forecasting system.
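
The kind of release logic such a simulation model evaluates can be sketched as a simple hourly mass balance over the flood-control pool, with the release capped by a safe downstream channel capacity and the gate rating; the hydrograph, pool volume and capacities below are illustrative assumptions, not Dez Dam data.

```python
# Hypothetical sketch: hourly flood routing through a flood-control pool.
import numpy as np

dt_h = 1.0                                   # time step (hours)
flood_pool_mcm = 500.0                       # flood-control volume (assumed, million m^3)
channel_cap = 3000.0                         # safe downstream discharge (assumed, m^3/s)
gate_cap = 6000.0                            # spillway gate capacity (assumed, m^3/s)

t = np.arange(0, 120, dt_h)
inflow = 1000.0 + 4500.0 * np.exp(-((t - 36.0) / 12.0) ** 2)   # synthetic flood hydrograph

storage, peak_storage, releases = 0.0, 0.0, []
for q_in in inflow:
    # release what the downstream channel can safely take, drawing the pool down when possible
    release = min(q_in + storage * 1e6 / (dt_h * 3600.0), channel_cap, gate_cap)
    storage += (q_in - release) * dt_h * 3600.0 / 1e6
    if storage > flood_pool_mcm:             # pool full -> emergency spill up to gate capacity
        release = min(gate_cap, release + (storage - flood_pool_mcm) * 1e6 / (dt_h * 3600.0))
        storage = flood_pool_mcm
    peak_storage = max(peak_storage, storage)
    releases.append(release)

print(f"peak inflow {inflow.max():.0f} m3/s -> peak release {max(releases):.0f} m3/s, "
      f"flood pool used {peak_storage:.0f} of {flood_pool_mcm:.0f} MCM")
```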

Keywords: Dam operation, flood control criteria, Dez Dam, Iran.

1122 Didactical and Semiotic Affordance of GeoGebra in a Productive Mathematical Discourse

Authors: I. Benning

Abstract:

Using technology to expand the learning space is critical for a productive mathematical discourse. This is a case study of two teachers who developed and enacted GeoGebra-based mathematics lessons following their engagement in a two-year professional development programme. It explores the didactical and semiotic affordances of GeoGebra in widening the learning space for a productive mathematical discourse. A thematic analysis approach was used for the lesson artefact, lesson observation, and interview data. The results indicated that the construction tools in GeoGebra provided a didactical milieu in which students used them to explore mathematical concepts with little or no support from their teacher. The prompt feedback from GeoGebra motivated students to practice mathematical concepts repeatedly, privately rethinking their solutions before comparing their answers with those of their colleagues. The construction tools enhanced self-discovery, team spirit, and dialogue among students. With regard to the semiotic construct, the tools widened the physical and psychological atmosphere of the classroom by providing animations that served as virtual concrete materials to support the recording, manipulation, and testing of mathematical ideas and the construction and interpretation of geometric objects. These findings advance the discussion on widening the classroom for a productive mathematical discourse within the context of the mathematics curriculum of Ghana and similar sub-Saharan African countries.

Keywords: GeoGebra, theory of didactical situation, semiotic mediation, mathematics laboratory, mathematical discussion.

1121 Study of Heat Transfer in the Poly Ethylene Fluidized Bed Reactor Numerically and Experimentally

Authors: Mahdi Hamzehei

Abstract:

In this research, heat transfer in a polyethylene fluidized bed reactor without reaction was studied experimentally and computationally at different superficial gas velocities. A multifluid Eulerian computational model incorporating the kinetic theory for solid particles was developed and used to simulate the heat-conducting gas–solid flows in a fluidized bed configuration. Momentum exchange coefficients were evaluated using the Syamlal–O'Brien drag functions, and the temperature distributions of the different phases in the reactor were computed. Good agreement was found between the model predictions and the experimentally obtained data for the bed expansion ratio as well as the qualitative gas–solid flow patterns. The simulation and experimental results showed that the gas temperature decreases as the gas moves upward in the reactor, while the solid particle temperature increases. The pressure drop and temperature distribution predicted by the simulations were in good agreement with the experimental measurements at superficial gas velocities higher than the minimum fluidization velocity, and the predicted time-averaged local voidage profiles were in reasonable agreement with the experimental results. The study showed that the computational model is capable of predicting the heat transfer and hydrodynamic behavior of gas–solid fluidized bed flows with reasonable accuracy.

Keywords: Gas-solid flows, fluidized bed, Hydrodynamics, Heat transfer, Turbulence model, CFD

1120 Modeling and Optimization of Abrasive Waterjet Parameters using Regression Analysis

Authors: Farhad Kolahan, A. Hamid Khajavi

Abstract:

Abrasive waterjet is a novel machining process capable of processing a wide range of hard-to-machine materials. This research addresses modeling and optimization of the process parameters for this machining technique. To model the process, a set of experimental data has been used to evaluate the effects of various parameter settings in cutting 6063-T6 aluminum alloy. The process variables considered here are nozzle diameter, jet traverse rate, jet pressure and abrasive flow rate. Depth of cut, as one of the most important output characteristics, has been evaluated for different parameter settings. The Taguchi method and regression modeling are used to establish the relationships between the input and output parameters, and the adequacy of the model is evaluated using the analysis of variance (ANOVA) technique. The pairwise effects of the process parameter settings on the process responses are also shown graphically. The proposed model is then embedded in a simulated annealing algorithm to optimize the process parameters. The optimization is carried out for any desired value of depth of cut, the objective being to determine the proper levels of the process parameters needed to obtain a given depth of cut. Computational results demonstrate that the proposed solution procedure is quite effective in solving such multi-variable problems.
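
A minimal sketch of the optimization stage is shown below: a regression model predicts depth of cut from the four coded process parameters, and simulated annealing searches for a setting whose prediction matches a target depth. The regression coefficients, bounds, target and cooling schedule are invented placeholders, not the paper's fitted model for 6063-T6 aluminium.

```python
# Hypothetical sketch: simulated annealing over a placeholder regression model.
import math
import random

random.seed(0)

def predicted_depth(d_nozzle, traverse, pressure, abrasive):
    # placeholder regression in coded units (invented coefficients)
    return (2.0 + 1.5 * pressure + 0.8 * abrasive - 1.2 * traverse
            + 0.6 * d_nozzle - 0.4 * traverse * pressure)

bounds = [(-1, 1)] * 4                 # coded levels of the four process parameters
target = 3.0                           # desired depth of cut (mm, assumed)

def cost(x):
    return abs(predicted_depth(*x) - target)

x = [random.uniform(lo, hi) for lo, hi in bounds]
T = 1.0
for _ in range(5000):
    cand = [min(max(xi + random.gauss(0, 0.1), lo), hi) for xi, (lo, hi) in zip(x, bounds)]
    # accept improvements always, worse moves with a temperature-dependent probability
    if cost(cand) < cost(x) or random.random() < math.exp(-(cost(cand) - cost(x)) / T):
        x = cand
    T *= 0.999                         # geometric cooling schedule

print("parameter setting:", [round(v, 2) for v in x],
      "predicted depth:", round(predicted_depth(*x), 2))
```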

Keywords: AWJ cutting, Mathematical modeling, Simulated Annealing, Optimization

1119 Approach for Demonstrating Reliability Targets for Rail Transport during Low Mileage Accumulation in the Field: Methodology and Case Study

Authors: Nipun Manirajan, Heeralal Gargama, Sushil Guhe, Manoj Prabhakaran

Abstract:

In the railway industry, train sets are designed based on contractual requirements (the mission profile), where reliability targets are measured in terms of mean distance between failures (MDBF). However, at the beginning of revenue service, trains do not achieve the designed mission profile distance (mileage) within the planned timeframe due to infrastructure constraints, scarcity of commuters or other operational challenges, and therefore do not respect the original design inputs. Since the trains do not run sufficiently and do not accumulate the designed mileage within the specified time, the car builder risks not achieving the contractual MDBF target. This paper proposes a constant-failure-rate-based model to deal with situations where mileage accumulation falls short of the design mission profile; the model provides an appropriate MDBF target to be demonstrated based on the actual accumulated mileage. A case study of rolling stock running in the field is undertaken to analyze the failure data and the MDBF target demonstration during low mileage accumulation. The results of the case study show that, with the proposed method, reliability targets can be demonstrated under low mileage accumulation.
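
One way to make the constant-failure-rate check concrete is sketched below: the demonstrated MDBF and its one-sided lower confidence bound (the standard chi-square formula for an observation period terminated at a given accumulated mileage) are compared against the contractual target. The mileage, failure count, target and confidence level are illustrative assumptions, not the case-study fleet's data.

```python
# Hypothetical sketch: MDBF demonstration at the actually accumulated mileage.
from scipy.stats import chi2

mdbf_target_km = 40000.0          # contractual MDBF target (assumed)
accumulated_km = 600000.0         # fleet mileage accumulated so far (assumed)
failures = 11                     # relevant failures observed in that mileage (assumed)

mdbf_point = accumulated_km / failures
confidence = 0.80
# one-sided lower bound for a mileage-terminated observation period
mdbf_lower = 2 * accumulated_km / chi2.ppf(confidence, 2 * (failures + 1))

print(f"point MDBF        : {mdbf_point:8.0f} km")
print(f"{confidence:.0%} lower bound   : {mdbf_lower:8.0f} km")
print("target met" if mdbf_lower >= mdbf_target_km else "target not yet demonstrated")
```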

Keywords: Mean distance between failures, mileage based reliability, reliability target normalization, rolling stock reliability.
