Search results for: common vector approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18827

18347 Creating Knowledge Networks: Comparative Analysis of Reference Cases

Authors: Sylvia Villarreal, Edna Bravo

Abstract:

Knowledge management focuses on coordinating technologies, people, processes, and structures to generate a competitive advantage. Since networks are perceived as mechanisms for knowledge creation and transfer, this research presents the stages and practices related to the creation of knowledge networks. The methodology started with a literature review adapted from the systematic literature review (SLR). The descriptive analysis includes variables such as approach (conceptual or practical), industry, knowledge management processes, and methodologies (qualitative or quantitative). The content analysis includes the identification of reference cases. These cases were characterized by variables such as scope, creation goal, years, network approach, actors, and creation methodology. A comparative analysis made it possible to determine similarities and differences among the cases documented in the knowledge network scientific literature. It was shown that, despite the need for and impact of knowledge networks in organizations, the initial guidelines for their creation are not documented, so there is no guide of good practices and lessons learned. The reference cases come from industries such as energy, education, creative, automotive, and textile. Their common point is a human-centered approach: they are oriented toward interactions that facilitate the appropriation of knowledge, both explicit and tacit. The stages of every case are analyzed to propose the main elements of success.

Keywords: creation, knowledge management, network, stages

Procedia PDF Downloads 289
18346 Comparison of Crossover Types to Obtain Optimal Queries Using Adaptive Genetic Algorithm

Authors: Wafa’ Alma'Aitah, Khaled Almakadmeh

Abstract:

This study presents an information retrieval system that uses a genetic algorithm to increase retrieval efficiency. Under the vector space model, information retrieval is based on the similarity between the query and the documents; documents with high similarity to the query are judged more relevant and should be retrieved first. Using genetic algorithms, each query is represented by a chromosome; these chromosomes are fed into the genetic operator process of selection, crossover, and mutation until an optimized query chromosome is obtained for document retrieval. Results show that information retrieval with adaptive crossover probability, single-point crossover, and roulette wheel selection gives the highest recall. The proposed approach is verified using 242 proceedings abstracts collected from the Saudi Arabian national conference.
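The loop the abstract describes (selection, crossover, mutation over query chromosomes) can be sketched as follows. The term weights, the fitness function (cosine similarity to a centroid of relevant documents), and all parameter values are illustrative assumptions, not the authors' setup; elitism is added so the best query never regresses.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical setup: each chromosome is a term-weight vector for one query;
# fitness is cosine similarity to the centroid of known-relevant documents.
n_terms, pop_size, generations = 8, 20, 40
relevant_centroid = rng.random(n_terms)
population = rng.random((pop_size, n_terms))

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def fitness(pop):
    return np.array([cosine(ind, relevant_centroid) for ind in pop])

initial_best = fitness(population).max()

for _ in range(generations):
    fit = fitness(population)
    elite = population[fit.argmax()].copy()
    # Roulette-wheel selection: pick parents with probability proportional to fitness
    parents = population[rng.choice(pop_size, size=pop_size, p=fit / fit.sum())]
    # Single-point crossover on consecutive parent pairs
    children = parents.copy()
    for i in range(0, pop_size - 1, 2):
        point = rng.integers(1, n_terms)
        children[i, point:], children[i + 1, point:] = (
            parents[i + 1, point:].copy(), parents[i, point:].copy())
    # Mutation: small random perturbation of a few term weights
    mask = rng.random(children.shape) < 0.05
    children[mask] += rng.normal(0, 0.1, mask.sum())
    children = np.clip(children, 0, None)
    children[0] = elite          # elitism keeps the best query so far
    population = children

print(round(fitness(population).max(), 3))
```

With elitism, the best query's fitness is monotone nondecreasing over generations, mirroring the "optimized query chromosome" the abstract describes.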

Keywords: genetic algorithm, information retrieval, optimal queries, crossover

Procedia PDF Downloads 280
18345 Implementing a Plurilingual Approach to ELF in Primary School: An International Comparative Study

Authors: A. Chabert

Abstract:

The present paper is motivated by the current influence of communicative approaches in language policies around the globe (especially through the Common European Framework of Reference), along with the exponential spread of English as a Lingua Franca worldwide. This study focuses on English language learning and teaching in the last year of primary education in Spain (in the bilingual Valencian region), Norway (in the Trondelag region), and China (in the Hunan region) and proposes a plurilingual communicative approach to ELT in line with ELF awareness and the current retheorisation of ELF within multilingualism (Jenkins, 2018). This study, interdisciplinary in nature, attempts to find a convergence point among English Language Teaching, English as a Lingua Franca, Language Ecology and Multilingualism, breaking with the boundaries that separate languages in language teaching and acknowledging English as international communication, while protecting the mother tongue and language diversity within multilingualism. Our experiment included over 400 students across Spain, Norway, and China, and the outcomes obtained demonstrate that despite the different factors involved in different cultures and contexts, a plurilingual approach to English learning improved English scores by 20% in each of the contexts. Through our study, we reflect on the underestimated value of the mother tongue in ELT, as well as the need for a sustainable ELF perspective in education worldwide.

Keywords: English as a Lingua Franca, English language teaching, language ecology, multilingualism

Procedia PDF Downloads 125
18344 Enhancing Communicative Skills for Students in Automatics

Authors: Adrian Florin Busu

Abstract:

The communicative approach, or communicative language teaching (CLT), used for enhancing communicative skills in students in automatics, is a modern teaching approach based on the concept of learning a language through having to communicate real meaning. In the communicative approach, real communication is both the objective of learning and the means through which it takes place. This approach was initiated during the 1970s and quickly became prominent, as it proposed an alternative to the previous systems-oriented approaches. In other words, instead of focusing on the acquisition of grammar and vocabulary, the communicative approach aims at developing students' competence to communicate in the target language, with an enhanced focus on real-life situations. In a nutshell, CLT considers using the language to be just as important as actually learning it.

Keywords: communication, approach, objective, learning

Procedia PDF Downloads 148
18343 Research of Actuators of Common Rail Injection Systems with the Use of LabVIEW on a Specially Designed Test Bench

Authors: G. Baranski, A. Majczak, M. Wendeker

Abstract:

Currently, the most commonly used solution for supplying fuel to diesel engines is the Common Rail system. Compared to previous designs, and owing to their relatively simple construction and electronic control, these systems achieve favourable engine operation parameters, with particular emphasis on low emission of toxic compounds into the atmosphere. In this system, the injected fuel dose depends strictly on the parameters of the electrical impulse sent to the injector by the power amplifier of the engine controller's supply system. The article presents the construction of a laboratory test bench for examining the course of the injection process and fuel delivery in accumulator (common rail) injection systems. The test bench enables testing of injection systems with electromagnetically controlled injectors using scientific engineering tools. The developed system is based on LabVIEW software and a CompactRIO-family controller using an FPGA and a real-time microcontroller. The results of experimental research on electromagnetic injectors of the common rail system, controlled by a dedicated National Instruments card, confirm the effectiveness of the presented approach. The research described in the article shows the influence of the basic parameters of the electric impulse opening the electromagnetic injector on the injected fuel dose. Acknowledgement: This work has been realized in cooperation with The Construction Office of WSK 'PZL-KALISZ' S.A. and is part of Grant Agreement No. POIR.01.02.00-00-0002/15 financed by the Polish National Centre for Research and Development.

Keywords: fuel injector, combustion engine, fuel pressure, compression ignition engine, power supply system, controller, LabVIEW

Procedia PDF Downloads 117
18342 Machine Learning for Aiding Meningitis Diagnosis in Pediatric Patients

Authors: Karina Zaccari, Ernesto Cordeiro Marujo

Abstract:

This paper presents a Machine Learning (ML) approach to support meningitis diagnosis in patients at a children's hospital in Sao Paulo, Brazil. The aim is to use ML techniques to reduce, as much as possible, the use of invasive procedures such as cerebrospinal fluid (CSF) collection. In this study, we focus on predicting the probability of meningitis given the results of blood and urine laboratory tests, together with the analysis of pain or other complaints from the patient. We tested a number of different ML algorithms, including Adaptive Boosting (AdaBoost), Decision Tree, Gradient Boosting, K-Nearest Neighbors (KNN), Logistic Regression, Random Forest, and Support Vector Machines (SVM). The Decision Tree algorithm performed best, with 94.56% and 96.18% accuracy on training and testing data, respectively. These results represent a significant aid to doctors in diagnosing meningitis as early as possible and in sparing some children expensive and painful procedures.
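A minimal sketch of the decision-tree step the abstract highlights. Since the hospital dataset is not public, the features and labels below are synthetic stand-ins (the reported 94-96% accuracies are not reproduced here); the point is only the train/test workflow.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for the blood/urine lab features described in the paper:
# 6 numeric features and a binary meningitis label.
n = 500
X = rng.normal(size=(n, 6))
# Hypothetical rule: risk driven by two "inflammation" markers plus noise
y = ((X[:, 0] + X[:, 1] + rng.normal(0, 0.5, n)) > 1).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(f"train acc: {clf.score(X_tr, y_tr):.2f}, test acc: {clf.score(X_te, y_te):.2f}")
```

Limiting `max_depth` is one common way to keep the train/test gap small, as in the paper's closely matched 94.56%/96.18% figures.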

Keywords: machine learning, medical diagnosis, meningitis detection, pediatric research

Procedia PDF Downloads 139
18341 Developing High-Definition Flood Inundation Maps (HD-FIMs) Using Raster Adjustment with Scenario Profiles (RASP™)

Authors: Robert Jacobsen

Abstract:

Flood inundation maps (FIMs) are an essential tool for communicating flood threat scenarios to the public as well as for floodplain governance. With an increasing demand for online raster FIMs, the FIM state of the practice (SOP) is rapidly advancing to meet the dual requirements of high resolution and high accuracy, i.e., high definition. Importantly, today's technology also enables the resolution of local (neighborhood-scale) bias errors that often occur in FIMs, even with the use of SOP two-dimensional flood modeling. To facilitate the development of HD-FIMs, a new GIS method, Raster Adjustment with Scenario Profiles (RASP™), is described for adjusting kernel raster FIMs to match refined scenario profiles. With RASP™, flood professionals can prepare HD-FIMs for a wide range of scenarios from available kernel rasters, including kernel rasters prepared from vector FIMs. The paper provides detailed procedures for RASP™, along with an example of applying it to prepare an HD-FIM for the August 2016 flood in Louisiana, using both an SOP kernel raster and a kernel raster derived from an older vector-based flood insurance rate map. The accuracy of the HD-FIMs achieved by applying RASP™ to the two kernel rasters is evaluated.
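The core raster-adjustment idea (shift a kernel raster's water-surface elevations to match a refined scenario profile along the stream, then re-derive inundation depth) can be sketched schematically. The grids, the profile, and the assumption that each station's adjustment applies uniformly across its cross-section are illustrative, not the RASP™ procedure itself.

```python
import numpy as np

# Toy grids (shapes and values assumed): rows = cross-sections, cols = stations
ncols = 50
dem = np.tile(np.linspace(10, 5, ncols), (20, 1))         # terrain elevations
kernel_wse = np.full((20, ncols), 8.0)                    # kernel water surface

station = np.linspace(0, 1, ncols)                        # distance downstream
refined_profile = 8.0 + 0.5 * np.sin(2 * np.pi * station) # refined scenario WSE

# Apply the per-station correction to every row (cross-section) of the raster
adjustment = refined_profile - kernel_wse[0]
adjusted_wse = kernel_wse + adjustment[None, :]

depth = np.clip(adjusted_wse - dem, 0, None)              # wet where WSE > ground
print(f"inundated cells: {(depth > 0).sum()} of {depth.size}")
```

Depth is clipped at zero so cells where the adjusted water surface falls below the terrain are mapped as dry.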

Keywords: hydrology, mapping, high-definition, inundation

Procedia PDF Downloads 60
18340 Time Parameter Based for the Detection of Catastrophic Faults in Analog Circuits

Authors: Arabi Abderrazak, Bourouba Nacerdine, Ayad Mouloud, Belaout Abdeslam

Abstract:

In this paper, a new test technique for analog circuits using time-mode simulation is proposed for detecting single catastrophic faults. This test process is designed to overcome the problem of catastrophic faults escaping a DC-mode test applied to the inverter amplifier in previous research works. The circuit under test is a second-order low-pass filter constructed around this type of amplifier but performing a function that differs from that of the previous test. The test approach is based on two key elements. The first is a single square pulse selected as the input test vector to stimulate the fault effect in the circuit's output response. The second is the conversion of the filter response into a sequence of square pulses by an analog comparator with a fixed reference threshold voltage. The measured durations of the first three response pulses serve both as a fault-detection parameter and as a fault signature from which a full diagnosis of the analog circuit can be established. The results obtained so far are very promising, since the approach has lifted the fault coverage ratio in both modes to over 90% and has revealed the harmful side of faults that had been masked in a DC-mode test.
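The two key elements (a square test pulse, and a threshold comparator whose output pulse durations form the signature) can be illustrated with a simple simulation. The filter model (two cascaded first-order sections standing in for a second-order low-pass), the sample rate, the threshold, and the way the fault is injected (shifting the filter pole) are all illustrative assumptions, not the authors' circuit.

```python
import numpy as np

fs = 100_000                                   # sample rate, Hz (assumed)
t = np.arange(0, 0.02, 1 / fs)
stim = ((t % 0.005) < 0.0025).astype(float)    # repeated square test pulse

def lowpass2(x, alpha):
    """Two cascaded first-order IIR sections, approximating a 2nd-order low-pass."""
    y = np.zeros_like(x)
    s1 = s2 = 0.0
    for i, xi in enumerate(x):
        s1 += alpha * (xi - s1)
        s2 += alpha * (s1 - s2)
        y[i] = s2
    return y

def pulse_info(sig, threshold, n=3):
    """Rise times and durations (s) of the first n comparator output pulses."""
    above = sig > threshold
    edges = np.flatnonzero(np.diff(above.astype(int)))
    rises, falls = edges[::2], edges[1::2]
    return rises[:n] / fs, (falls[:n] - rises[:n]) / fs

# Fault modelled as a shifted filter pole; comparator threshold fixed at 0.5
h_rise, h_dur = pulse_info(lowpass2(stim, 0.05), 0.5)   # "healthy" circuit
f_rise, f_dur = pulse_info(lowpass2(stim, 0.02), 0.5)   # "faulty" circuit
print("healthy durations:", h_dur, "faulty durations:", f_dur)
```

The faulty circuit's slower response delays the first threshold crossing, so the timing of the comparator pulses separates the two cases even when a DC-level test would not.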

Keywords: analog circuits, analog faults diagnosis, catastrophic faults, fault detection

Procedia PDF Downloads 430
18339 The Russian Preposition 'за': A Cognitive Linguistic Approach

Authors: M. Kalyuga

Abstract:

Prepositions have long been considered one of the major challenges for second language learners, since they have multiple uses that differ greatly from one language to another. The traditional approach to second language teaching supplies students with a list of uses of a preposition that they have to memorise, and no explanation is provided. Contrary to the traditional grammar approach, the cognitive linguistic approach offers an explanation for the use of prepositions and provides strategies to comprehend and learn prepositions that would otherwise seem obscure. The present paper demonstrates the use of the cognitive approach for the explanation of prepositions through the example of the Russian preposition 'за'. The paper demonstrates how various spatial and non-spatial uses of this preposition are linked together through metaphorical and metonymical mapping. The diversity of expressions with 'за' is explained by the range of spatial scenes this preposition is associated with.

Keywords: language teaching, Russian, preposition 'за', cognitive approach

Procedia PDF Downloads 438
18338 Nationalist Approach to the Music Culture in Early Republic Period in Turkey

Authors: Hilmi Yazici

Abstract:

Just after the Ottoman period, the new, more homogeneous republic was struggling to form a national identity while coming to terms with the cultural and historical background of the nation. The new republic pursued the modernization and westernization that had started in the late Ottoman period. In this process, culture was an important basis for forming a new nation, and it was clearly put forward that the citizens of the new national republic were to have a modern, national culture. This intended change aimed to recover the Turkish culture preserved among the common people of Anatolia and to combine Western modernization with national culture. In this context, we can say that this approach separated the people from Ottoman culture and its roots in order to strengthen the national identity. Turkish folk music was an important basis for the new revolution; classical Turkish music, on the other hand, was alienated on the grounds that it did not belong to Turkish culture. The aim of this study is therefore to determine how successful these efforts to build a new national identity and culture were, and how effective the deliberate intervention in Turkish folk music became.

Keywords: opera, nationalism in music, Turkish music

Procedia PDF Downloads 276
18337 Impact of Curvatures in the Dike Line on Wave Run-up and Wave Overtopping, ConDike-Project

Authors: Malte Schilling, Mahmoud M. Rabah, Sven Liebisch

Abstract:

Wave run-up and overtopping are the relevant parameters for dimensioning the crest height of dikes. Various experimental as well as numerical studies have investigated these parameters under different boundary conditions (e.g. wave conditions, structure type). Particularly for dike design in Europe, a common approach is formulated in which wave and structure properties are parameterized. This approach, however, assumes equal run-up heights and overtopping discharges along the longitudinal axis, whereas convex dikes have a heterogeneous crest by definition. Hence, local differences in a convex dike line are expected to cause wave-structure interactions different from those at a straight dike. This study aims to assess both run-up and overtopping at convexly curved dikes. To cast light on the relevance of curved dikes to the design approach mentioned above, physical model tests were conducted in a 3D wave basin of the Ludwig-Franzius-Institute Hannover. A dike with a slope of 1:6 (height over length) was tested under both regular waves and TMA wave spectra. Significant wave heights ranged from 7 to 10 cm and peak periods from 1.06 to 1.79 s. Run-up and overtopping were assessed behind the curved and straight sections of the dike, and both measurements were compared to a dike with a straight longitudinal line. It was observed that convex curvatures in the longitudinal dike line cause a redirection of incident waves, leading to a concentration around the center point. Measurements prove that both run-up heights and overtopping rates are higher than on the straight dike. It can be concluded that deviations from a straight longitudinal dike line have an impact on design parameters and imply uncertainties within the design approach in force. It is therefore recommended to consider these influencing factors in such cases.

Keywords: convex dike, longitudinal curvature, overtopping, run-up

Procedia PDF Downloads 283
18336 Production and Leftovers Usage Policies to Minimize Food Waste under Uncertain and Correlated Demand

Authors: Esma Birisci, Ronald McGarvey

Abstract:

One of the common problems in the food service industry is demand uncertainty. This research presents a multi-criteria optimization approach to identify the efficient frontier of points lying between the minimum-waste and minimum-shortfall solutions in an uncertain demand environment. It also addresses correlation across demands for items (e.g., hamburgers are often demanded with french fries). Reducing overproduction food waste (and its corresponding environmental impacts) and avoiding shortfalls (leaving some customers hungry) must be treated as two conflicting objectives in an all-you-care-to-eat food service operation. We identify optimal production adjustments relative to demand forecasts, demand thresholds for the utilization of leftovers, and percentages of demand to be satisfied by leftovers, considering two alternative metrics for overproduction waste: mass and greenhouse gas emissions. Demand uncertainty and demand correlations are addressed using a kernel density estimation approach. A statistical analysis of the changes in decision variable values across each of the efficient frontiers can then be performed to identify the key variables that could be modified to reduce the amount of wasted food with minimal increase in shortfalls. We illustrate our approach with an application to empirical data from Campus Dining Services operations at the University of Missouri.
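The kernel-density step for correlated demand can be sketched as follows. The two-item demand history, the production levels, and the waste/shortfall metrics are illustrative assumptions, not the Campus Dining data; the point is that fitting a joint KDE preserves the correlation (hamburgers with fries) when sampling demand scenarios.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(1)

# Hypothetical historical demand for two correlated items (e.g. burgers, fries)
mean = [100, 80]
cov = [[90, 60], [60, 70]]          # positive off-diagonal: demands move together
history = rng.multivariate_normal(mean, cov, size=200)

kde = gaussian_kde(history.T)        # joint density preserves the correlation
np.random.seed(1)                    # resample draws from numpy's global RNG
scenarios = kde.resample(5000).T     # sampled demand scenarios, shape (5000, 2)

def waste_and_shortfall(production):
    diff = np.asarray(production) - scenarios   # + overproduction, - shortfall
    return (diff.clip(min=0).sum(axis=1).mean(),
            (-diff).clip(min=0).sum(axis=1).mean())

# Sweeping production levels traces an (expected waste, expected shortfall) frontier
for q in ([95, 75], [100, 80], [105, 85]):
    w, s = waste_and_shortfall(q)
    print(q, round(w, 1), round(s, 1))
```

Producing more shifts the point along the frontier toward higher expected waste and lower expected shortfall, which is the trade-off the efficient frontier summarizes.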

Keywords: environmental studies, food waste, production planning, uncertain and correlated demand

Procedia PDF Downloads 360
18335 Pre-Service Social Studies Teachers' Readiness in Promoting 21st Century Learning: Evidence from a Ghanaian University

Authors: Joseph Bentil

Abstract:

Successful acquisition of the 21st-century competencies that students need to navigate an ever-changing world requires that they be taught and molded by 21st-century teachers with the necessary professional competencies. Accordingly, this study sought to understand how ready and efficacious pre-service Social Studies specialism students are towards implementing the Common Core Social Studies Curriculum in Junior High Schools in Ghana. The Theory of Experience served as the theoretical lens for the study. Working within the pragmatist paradigm, the study utilized a cross-sectional descriptive survey design with a mixed-methods approach in which, through a census sampling technique, all 120 pre-service Social Studies specialism students were sampled. A structured questionnaire and an interview guide were the instruments employed for data collection. Descriptive statistics (mean, standard deviation) and inferential statistics (independent-samples t-test, one-way between-groups ANOVA, and Pearson product-moment correlation) were employed in the analysis of the research questions and hypotheses with the aid of SPSS version 28, while the qualitative data were analyzed thematically. The findings revealed that pre-service Social Studies teachers were highly ready and efficacious towards implementing the Common Core Junior High School Social Studies curriculum. However, male pre-service teachers were more efficacious and ready than their female counterparts, and pre-service teachers in the 31-40 years age bracket were more efficacious and ready than their colleagues in the 20-30 and below-20 age brackets. The findings further revealed a moderate, statistically significant positive relationship between pre-service teachers' readiness and their efficacy in implementing the Common Core Social Studies curriculum.

The study therefore recommended that intervention programmes aimed at raising the readiness and efficacy beliefs of pre-service teachers be targeted at female pre-service teachers and those below the 20-years age bracket, for successful implementation and realization of the competencies enshrined in the Common Core Social Studies curriculum.

Keywords: pre-service, readiness, social studies, teachers

Procedia PDF Downloads 67
18334 A Method for False Alarm Recognition Based on Multi-Classification Support Vector Machine

Authors: Weiwei Cui, Dejian Lin, Leigang Zhang, Yao Wang, Zheng Sun, Lianfeng Li

Abstract:

Built-in test (BIT) is an important technology in the testability field, and it is widely used in state monitoring and fault diagnosis. With the improvement of modern equipment performance and complexity, the scope of BIT becomes larger, which leads to the emergence of the false alarm problem. False alarms make health assessment unstable and reduce the effectiveness of BIT. Conventional false alarm suppression methods such as repeated testing and majority voting cannot meet the requirements of a complicated system, so intelligent algorithms such as artificial neural networks (ANN) are widely studied and used. However, false alarms have a very low frequency and a small sample size, whereas an ANN-based method requires a large training sample. To recognize false alarms, we propose a method based on a multi-classification support vector machine (SVM) in this paper. Firstly, we divide the state of a system into three states: healthy, false-alarm, and faulty. Then we use multi-classification with a '1 vs 1' policy to train and recognize the state of the system. Finally, an example of a fault injection system is taken to verify the effectiveness of the proposed method by comparison with an ANN. The results show that the method is reasonable and effective.
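The '1 vs 1' multi-class SVM step can be sketched with scikit-learn, whose `SVC` trains one binary SVM per class pair and votes at prediction time. The three-state feature vectors below are synthetic stand-ins for BIT outputs, not the paper's fault-injection data.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic BIT-style feature vectors for the three states (labels assumed):
# 0 = healthy, 1 = false alarm, 2 = faulty
centers = np.array([[0, 0], [3, 0], [0, 3]])
X = np.vstack([rng.normal(c, 0.5, size=(60, 2)) for c in centers])
y = np.repeat([0, 1, 2], 60)

# '1 vs 1' policy: SVC fits 3 pairwise classifiers for 3 classes and votes
clf = SVC(kernel="rbf", decision_function_shape="ovo").fit(X, y)
print(clf.predict([[2.9, 0.1], [0.1, 2.8], [0.0, 0.1]]))
```

With `decision_function_shape="ovo"`, the decision function exposes one column per class pair (three for three classes), matching the '1 vs 1' scheme the abstract describes.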

Keywords: false alarm, fault diagnosis, SVM, k-means, BIT

Procedia PDF Downloads 145
18333 An Analytic Network Process Approach towards Academic Staff Selection

Authors: Nasrullah Khan

Abstract:

Today's business environment is very dynamic, and most organizations are in tough competition for added value and a sustainable hold on the market. To achieve such objectives, organizations must have dynamic and creative people and optimized processes. To get such people, organizations need a strong human resource management system. Multiple approaches have been devised in the literature to hire more job-relevant and more suitable people. This study proposes an ANP (Analytic Network Process) approach to hire faculty members for a university system. The study consists of two parts. In the first part, a thorough literature survey and university interviews are conducted in order to find common criteria for the selection of academic staff. In the second part, the available candidates are prioritized on the basis of the relative weights of these criteria. According to the results, GRE and foreign-language scores, GPA, and research paper writing were the most important factors for the selection of academic staff.
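The pairwise-comparison step shared by AHP and ANP (criteria weights as the principal eigenvector of a comparison matrix) can be sketched as follows, using the three criteria the study found most important. The comparison values are illustrative assumptions, not the study's elicited judgements.

```python
import numpy as np

# Hypothetical pairwise comparisons: A[i, j] = how much more important
# criterion i is than criterion j (reciprocal matrix, Saaty-style scale)
criteria = ["GRE & foreign language", "GPA", "research paper writing"]
A = np.array([[1,   2,   3],
              [1/2, 1,   2],
              [1/3, 1/2, 1]])

# Priorities = principal right eigenvector of A, normalised to sum to 1
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Consistency ratio guards against contradictory judgements (RI = 0.58 for n = 3)
ci = (np.max(np.real(vals)) - 3) / (3 - 1)
cr = ci / 0.58

for c, p in zip(criteria, w):
    print(f"{c}: {p:.3f}")
print(f"consistency ratio: {cr:.3f}")
```

A consistency ratio below 0.1 is the usual acceptance threshold; the full ANP additionally models dependencies among criteria through a supermatrix, which this sketch omits.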

Keywords: creative people, ANP, academic staff, business environment

Procedia PDF Downloads 402
18332 Artificial Intelligence-Generated Previews of Hyaluronic Acid-Based Treatments

Authors: Ciro Cursio, Giulia Cursio, Pio Luigi Cursio, Luigi Cursio

Abstract:

Communication between practitioner and patient is of the utmost importance in aesthetic medicine: as of today, images of previous treatments are the most common tool used by doctors to describe and anticipate future results for their patients. However, using photos of other people often reduces the engagement of the prospective patient and is further limited by the number and quality of pictures available to the practitioner. Pre-existing work addresses this issue in two ways: 3D scanning of the area with manual editing of the 3D model by the doctor, or automatic prediction of the treatment by warping the image with hand-written parameters. The first approach requires significant manual work by the doctor, while the second often generates predictions that look artificial. We propose an AI-based algorithm that autonomously generates a realistic prediction of treatment results. For the purpose of this study, we focus on hyaluronic acid treatments in the facial area. Our approach takes into account the individual characteristics of each face, and the prediction system allows patients to decide which area of the face they want to modify. We show that the predictions generated by our system are realistic: first, the quality of the generated images is on par with real images; second, the prediction matches the actual results obtained after the treatment is completed. In conclusion, the proposed approach provides a valid tool for doctors to show patients what they will look like before deciding on the treatment.

Keywords: prediction, hyaluronic acid, treatment, artificial intelligence

Procedia PDF Downloads 102
18331 Application of the Building Information Modeling Planning Approach to the Factory Planning

Authors: Peggy Näser

Abstract:

Factory planning is a systematic, objective-oriented process for planning a factory, structured into a sequence of phases, each of which depends on the preceding phase and makes use of particular methods and tools, extending from the setting of objectives to the start of production. The digital factory, on the other hand, is the generic term for a comprehensive network of digital models, methods, and tools (including simulation and 3D visualisation) integrated by a continuous data management system. Its aim is the holistic planning, evaluation, and ongoing improvement of all the main structures, processes, and resources of the real factory in conjunction with the product. Digital planning has already become established in factory planning. Building Information Modeling, by contrast, has not yet been established in factory planning; it has been used predominantly in the planning of public buildings, and the concept is limited to the planning of the buildings themselves, excluding the factory equipment (machines, technical equipment) and its interfaces to the building. BIM is a cooperative method of working in which the information and data relevant to a building's lifecycle are consistently recorded, managed, and exchanged, in transparent communication between the involved parties, on the basis of digital models of the building. Both the planning approach of Building Information Modeling and the methodical approach of the digital factory are based on the use of a comprehensive data model. It is therefore necessary to examine how the BIM approach can be extended in the context of factory planning such that equipment planning and building planning are integrated in a common digital model.

For this, a number of different perspectives have to be investigated: the equipment perspective, including the tools used to implement a comprehensive digital planning process; the communication perspective between the planners of different fields; the legal perspective, covering legal certainty in each country; and the quality perspective, defining the criteria against which the planning is evaluated. The individual perspectives are examined and illustrated in the article. An approach model for the integration of factory planning into the BIM approach is developed, in particular for the integrated planning of equipment and buildings and for continuous digital planning. For this purpose, the individual factory planning phases are detailed in the sense of the BIM approach, a comprehensive software concept is presented for the tooling, and the prerequisites required for this integrated planning are laid out. With the help of the newly developed approach, better coordination between equipment and buildings is to be achieved, the continuity of digital factory planning is improved, data quality is improved, and expensive implementation errors are avoided.

Keywords: building information modeling, digital factory, digital planning, factory planning

Procedia PDF Downloads 251
18330 An Intelligent Text Independent Speaker Identification Using VQ-GMM Model Based Multiple Classifier System

Authors: Ben Soltane Cheima, Ittansa Yonas Kelbesa

Abstract:

Speaker Identification (SI) is the task of establishing the identity of an individual based on his or her voice characteristics. The SI task is typically achieved by two-stage signal processing: training and testing. The training process calculates speaker-specific feature parameters from the speech and generates speaker models accordingly. In the testing phase, speech samples from unknown speakers are compared with the models and classified. Even though the performance of speaker identification systems has improved due to recent advances in speech processing techniques, there is still room for improvement. In this paper, a Closed-Set Text-Independent Speaker Identification system (CISI) based on a Multiple Classifier System (MCS) is proposed, using Mel Frequency Cepstrum Coefficients (MFCC) as features and a suitable combination of vector quantization (VQ) and a Gaussian Mixture Model (GMM), together with the Expectation Maximization (EM) algorithm, for speaker modeling. The use of a Voice Activity Detector (VAD) with a hybrid approach based on Short-Time Energy (STE) and statistical modeling of background noise in the pre-processing step of feature extraction yields a better and more robust automatic speaker identification system. Investigating the Linde-Buzo-Gray (LBG) clustering algorithm for initializing the GMM before estimating the underlying parameters in the EM step improved the convergence rate and the system's performance. The system also uses a relative index as a confidence measure in cases where the GMM and VQ identifications contradict each other. Simulation results carried out on the voxforge.org speech database using MATLAB highlight the efficacy of the proposed method compared to earlier work.
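The GMM enrollment-and-scoring stage can be sketched with scikit-learn, whose `GaussianMixture` runs EM internally. A real system would extract MFCC frames from audio (after VAD), so the 13-dimensional features below are synthetic stand-ins; the VQ/LBG initialization and the relative-index confidence measure are omitted.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(7)

# Stand-in "MFCC frames": each speaker is modelled as a synthetic 13-dim
# feature distribution (assumed), 300 frames per enrollment utterance.
def frames(center, n=300):
    return rng.normal(center, 1.0, size=(n, 13))

speakers = {name: frames(c) for name, c in
            [("alice", 0.0), ("bob", 2.0), ("carol", -2.0)]}

# Enrollment: one GMM per speaker, fitted by EM inside scikit-learn
models = {name: GaussianMixture(n_components=4, random_state=0).fit(X)
          for name, X in speakers.items()}

def identify(test_frames):
    # Closed-set decision: the model with the highest average per-frame
    # log-likelihood on the test utterance wins
    return max(models, key=lambda name: models[name].score(test_frames))

print(identify(frames(2.0, n=50)))
```

Averaging the per-frame log-likelihood over the whole test utterance is what makes the decision text-independent: no particular phrase is assumed.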

Keywords: feature extraction, speaker modeling, feature matching, Mel frequency cepstrum coefficient (MFCC), Gaussian mixture model (GMM), vector quantization (VQ), Linde-Buzo-Gray (LBG), expectation maximization (EM), pre-processing, voice activity detection (VAD), short time energy (STE), background noise statistical modeling, closed-set text-independent speaker identification system (CISI)

Procedia PDF Downloads 295
18329 A General Framework for Knowledge Discovery Using High Performance Machine Learning Algorithms

Authors: S. Nandagopalan, N. Pradeep

Abstract:

The aim of this paper is to propose a general framework for storing, analyzing, and extracting knowledge from two-dimensional echocardiographic images, color Doppler images, non-medical images, and general data sets. A number of high-performance data mining algorithms have been used to carry out this task. Our framework encompasses four layers, namely physical storage, object identification, knowledge discovery, and user level. Techniques such as an active contour model to identify the cardiac chambers, pixel classification to segment the color Doppler echo image, a universal model for image retrieval, a Bayesian method for classification, and parallel algorithms for image segmentation were employed. Using the efficiently constructed feature vector database, one can perform various data mining tasks, such as clustering and classification, with efficient algorithms, along with image mining given a query image. All these facilities are included in the framework, which is supported by a state-of-the-art user interface (UI). The algorithms were tested with actual patient data and the Corel image database, and the results show that their performance is better than previously reported results.

Keywords: active contour, bayesian, echocardiographic image, feature vector

Procedia PDF Downloads 407
18328 Face Tracking and Recognition Using Deep Learning Approach

Authors: Degale Desta, Cheng Jian

Abstract:

The most important factor in identifying a person is their face. Even identical twins have their own distinct faces. As a result, identification and face recognition are needed to tell one person from another. A face recognition system is a verification tool used to establish a person's identity using biometrics. Nowadays, face recognition is a common technique used in a variety of applications, including home security systems, criminal identification, and phone unlock systems. Such a system is more secure because it only requires a facial image instead of other dependencies like a key or card. Face detection and face identification are the two phases that typically make up a human recognition system. This paper explains the idea behind designing and creating a face recognition system using deep learning with OpenCV in Azure ML Python. Face recognition is a task that can be accomplished using deep learning, and given the accuracy of this method, it appears to be a suitable approach. To show how accurate the suggested face recognition system is, experimental results are given: Fast R-CNN achieved 98.46% accuracy, with the performance of the algorithms evaluated under different training conditions.
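The identification phase described above can be sketched independently of the detection network. A hedged NumPy sketch, assuming a CNN has already produced fixed-length face embeddings (the embeddings, names, and threshold below are hypothetical, not from the paper):

```python
import numpy as np

def identify(probe, gallery, names, threshold=0.5):
    """Nearest-neighbour match of a probe embedding against enrolled faces."""
    g = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
    p = probe / np.linalg.norm(probe)
    sims = g @ p                      # cosine similarity per enrolled face
    best = int(sims.argmax())
    return names[best] if sims[best] >= threshold else "unknown"

# Hypothetical 4-D embeddings for two enrolled people.
gallery = np.array([[1.0, 0.1, 0.0, 0.0], [0.0, 0.0, 1.0, 0.2]])
names = ["alice", "bob"]
match = identify(np.array([0.9, 0.2, 0.1, 0.0]), gallery, names)
```

A probe whose similarity falls below the threshold is reported as "unknown", which keeps the system closed-set friendly.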

Keywords: deep learning, face recognition, identification, fast-RCNN

Procedia PDF Downloads 128
18327 Pharmaceutical Applications of Newton's Second Law and Disc Inertia

Authors: Nicholas Jensen

Abstract:

As the effort to create new drugs to treat rare conditions cost-effectively intensifies, there is a need to ensure maximum efficiency in the manufacturing process. This includes the creation of ultracompact treatment forms, which can best be achieved via applications of fundamental laws of physics. This paper reports an experiment exploring the relationship between the forms of Newton's second law appropriate to linear motion and to rotational motion. The moment of inertia of three discs was determined experimentally and compared with previous data derived from a theoretical relationship. The method used was to attach the discs to a moment arm. Comparing the results with those obtained from previous experiments, they are found to be consistent with the first law of thermodynamics. The purpose of this experiment was to explore the relationship between the form of Newton's second law appropriate to linear motion and its rotational form, in which torque (a twisting force) is determined by the position vector r and the force vector F. In this rotational form, the angular acceleration is the linear acceleration divided by the radius r of the moment arm. These findings can contribute to a fuller understanding of thermodynamics in relation to viscosity. Implications for the pharmaceutical industry are expected to be fruitful.
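The rotational form of Newton's second law used in the experiment, together with the textbook inertia of a uniform disc (I = ½MR²), can be sketched numerically; the mass, radius, and force values below are illustrative, not the paper's measured data:

```python
def disc_inertia(mass, radius):
    """Moment of inertia of a uniform solid disc about its central axis."""
    return 0.5 * mass * radius ** 2

def angular_acceleration(force, arm, inertia):
    """Rotational form of Newton's second law: alpha = torque / I."""
    torque = force * arm              # |r x F| with F perpendicular to r
    return torque / inertia

I = disc_inertia(2.0, 0.1)                    # 2 kg disc, 10 cm radius
alpha = angular_acceleration(1.0, 0.1, I)     # 1 N applied at a 10 cm arm
```

With these numbers the inertia is 0.01 kg·m² and the angular acceleration 10 rad/s².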

Keywords: Newtonian physics, inertia, viscosity, pharmaceutical applications

Procedia PDF Downloads 107
18326 Dengue Virus Infection Rate in Mosquitoes Collected in Thailand Related to Environmental Factors

Authors: Chanya Jetsukontorn

Abstract:

Dengue hemorrhagic fever is the most important mosquito-borne disease and a major public health problem in Thailand. The most important vector is Aedes aegypti. Environmental factors such as temperature, relative humidity, and biting rate affect dengue virus infection. The most effective measure for prevention is control of the vector mosquitoes. In addition, surveillance of field-caught mosquitoes is imperative for determining the natural vector and can provide an early warning sign of transmission risk in an area. In this study, Aedes aegypti mosquitoes were collected in Amphur Muang, Phetchabun Province, Thailand. The mosquitoes were collected in the rainy season and the dry season, both indoors and outdoors. During mosquito collection, data on environmental factors such as temperature, humidity, and breeding sites were observed and recorded. After identification to species, mosquitoes were pooled according to genus/species and sampling location. Pools consisted of a maximum of 10 Aedes mosquitoes. Seventy pools of 675 Aedes aegypti were screened with RT-PCR for flaviviruses. To confirm individual infection and determine the true infection rate, individual mosquitoes from pools positive for flavivirus were tested for dengue virus by RT-PCR. The infection rate was 0.59% (4 positive individuals out of 675 mosquitoes). The probability of detecting dengue virus in mosquitoes at neighboring houses was 1.25 times higher, especially where the distances between neighboring houses and patients' houses were less than 50 meters. The relative humidity in dengue-infected villages with dengue-infected mosquitoes was significantly higher than in villages free from dengue-infected mosquitoes. The indoor biting rate of Aedes aegypti was 14.87 times higher than the outdoor rate, and the biting periods 09.00-10.00, 10.00-11.00, and 11.00-12.00 yielded 1.77, 1.46, and 0.68 mosquitoes/man-hour, respectively. These findings confirm that environmental factors are related to dengue infection in Thailand. Data obtained from this study will be useful for the prevention and control of the disease.
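The individual infection rate above is a simple point estimate; for pooled screening, the standard maximum-likelihood prevalence formula for equal-size pools is a common complement. A short sketch (the number of positive pools below is hypothetical, since the abstract only reports individually confirmed positives):

```python
def infection_rate(positives, total, per=100):
    """Individual infection rate, expressed per `per` mosquitoes."""
    return positives / total * per

def pooled_mle(pos_pools, n_pools, pool_size):
    """MLE of individual prevalence from equal-size pooled testing."""
    return 1 - (1 - pos_pools / n_pools) ** (1 / pool_size)

rate = infection_rate(4, 675)    # 4 confirmed positives of 675 mosquitoes
p_hat = pooled_mle(4, 70, 10)    # hypothetical: 4 positive pools of 70
```

With 4 of 675 mosquitoes positive, the rate is roughly 0.59 per 100 mosquitoes, matching the figure in the abstract.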

Keywords: Aedes aegypti, Dengue virus, environmental factors, one health, PCR

Procedia PDF Downloads 130
18325 Analysing Time Series for a Forecasting Model to the Dynamics of Aedes Aegypti Population Size

Authors: Flavia Cordeiro, Fabio Silva, Alvaro Eiras, Jose Luiz Acebal

Abstract:

Aedes aegypti is present in the tropical and subtropical regions of the world and is a vector of several diseases such as dengue fever, yellow fever, chikungunya, and Zika. The growth in the number of arbovirus cases in recent decades has become a matter of great concern worldwide. Meteorological factors such as mean temperature and precipitation are known to influence infestation by the species through effects on physiology and ecology, altering the fecundity, mortality, lifespan, dispersion behaviour, and abundance of the vector. Models that describe the dynamics of the vector population size should therefore take the meteorological variables into account. The relationship between meteorological factors and the population dynamics of Ae. aegypti adult females is studied here to provide a good set of predictors for modelling the dynamics of the mosquito population size. Time-series data on captures of adult females from a public health surveillance program in the city of Lavras, MG, Brazil had their association with precipitation, humidity, and temperature analysed through a set of statistical methods for time series analysis commonly adopted in signal processing, information theory, and neuroscience. Cross-correlation, a multicollinearity test, and whitened cross-correlation were applied to determine at which time lags the meteorological variables influence the dynamics of mosquito abundance. Among the findings, the studied case indicated strong collinearity between humidity and precipitation, and precipitation was selected to form a pair of descriptors together with temperature. The techniques revealed significant associations between infestation indicators and both temperature and precipitation in the short, mid, and long terms, evincing that those variables should be considered in entomological models and as public health indicators. A descriptive model used to test the results exhibits a strong correlation to the data.
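The lagged cross-correlation step can be sketched as follows (NumPy, synthetic data; the real analysis additionally used whitening and multicollinearity tests, and the sub-series are not re-standardized here, so the values are approximate correlations):

```python
import numpy as np

def lagged_corr(x, y, max_lag):
    """Approximate Pearson correlation of y against x delayed by each lag."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    n = len(x)
    return {k: float(np.mean(x[: n - k] * y[k:])) for k in range(max_lag + 1)}

# Synthetic check: "abundance" follows "rainfall" with a 2-step delay.
rng = np.random.default_rng(0)
rain = rng.normal(size=300)
abundance = np.roll(rain, 2)
corr = lagged_corr(rain, abundance, max_lag=5)
best_lag = max(corr, key=corr.get)
```

The lag with the highest correlation indicates how far in advance the meteorological variable anticipates the infestation indicator.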

Keywords: Aedes aegypti, cross-correlation, multicollinearity, meteorological variables

Procedia PDF Downloads 165
18324 Identifying Protein-Coding and Non-Coding Regions in Transcriptomes

Authors: Angela U. Makolo

Abstract:

Protein-coding and Non-coding regions determine the biology of a sequenced transcriptome. Research advances have shown that Non-coding regions are important in disease progression and clinical diagnosis. Existing bioinformatics tools have been targeted towards Protein-coding regions alone. Therefore, there are challenges associated with gaining biological insights from transcriptome sequence data. These tools are also limited to computationally intensive sequence alignment, which is inadequate and less accurate to identify both Protein-coding and Non-coding regions. Alignment-free techniques can overcome the limitation of identifying both regions. Therefore, this study was designed to develop an efficient sequence alignment-free model for identifying both Protein-coding and Non-coding regions in sequenced transcriptomes. Feature grouping and randomization procedures were applied to the input transcriptomes (37,503 data points). Successive iterations were carried out to compute the gradient vector that converged the developed Protein-coding and Non-coding Region Identifier (PNRI) model to the approximate coefficient vector. The logistic regression algorithm was used with a sigmoid activation function. A parameter vector was estimated for every sample in 37,503 data points in a bid to reduce the generalization error and cost. Maximum Likelihood Estimation (MLE) was used for parameter estimation by taking the log-likelihood of six features and combining them into a summation function. Dynamic thresholding was used to classify the Protein-coding and Non-coding regions, and the Receiver Operating Characteristic (ROC) curve was determined. The generalization performance of PNRI was determined in terms of F1 score, accuracy, sensitivity, and specificity. The average generalization performance of PNRI was determined using a benchmark of multi-species organisms. 
The generalization error for identifying Protein-coding and Non-coding regions decreased from 0.514 to 0.508 and then to 0.378 over three iterations. The cost (the difference between the predicted and the actual outcome) likewise decreased from 1.446 to 0.842 and then to 0.718 over the first, second, and third iterations. The iterations terminated at the 390th epoch, with an error of 0.036 and a cost of 0.316. The computed elements of the parameter vector that maximized the objective function were 0.043, 0.519, 0.715, 0.878, 1.157, and 2.575. The PNRI gave an area under the ROC curve of 0.97, indicating an improved predictive ability. The PNRI identified both Protein-coding and Non-coding regions with an F1 score of 0.970, accuracy of 0.969, sensitivity of 0.966, and specificity of 0.973. Using 13 non-human multi-species model organisms, the average generalization performance of the traditional method was 74.4%, while that of the developed model was 85.2%, making the developed model better at identifying Protein-coding and Non-coding regions in transcriptomes. The developed Protein-coding and Non-coding region identifier model efficiently identified both transcriptomic regions and could be used in genome annotation and in the analysis of transcriptomes.
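The core of the PNRI classifier described above, logistic regression trained by gradient ascent on the log-likelihood with a sigmoid activation, can be sketched in a few lines of NumPy (toy one-feature data; the real model used six features and dynamic thresholding rather than the fixed 0.5 cut-off shown here):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, epochs=400):
    """Batch gradient ascent on the mean log-likelihood of a logistic model."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w += lr * X.T @ (y - p) / len(y)   # gradient of the log-likelihood
    return w

# Toy separable data: a bias column plus one feature.
v = np.array([-3.0, -2.0, -1.0, 1.0, 2.0, 3.0])
X = np.column_stack([np.ones_like(v), v])
y = (v > 0).astype(float)
w = fit_logistic(X, y)
pred = (sigmoid(X @ w) >= 0.5).astype(float)
```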

Keywords: sequence alignment-free model, dynamic thresholding classification, input randomization, genome annotation

Procedia PDF Downloads 53
18323 Factors Influencing Soil Organic Carbon Storage Estimation in Agricultural Soils: A Machine Learning Approach Using Remote Sensing Data Integration

Authors: O. Sunantha, S. Zhenfeng, S. Phattraporn, A. Zeeshan

Abstract:

The decline of soil organic carbon (SOC) in global agriculture is a critical issue requiring rapid and accurate estimation for informed policymaking. While it is recognized that SOC predictors vary significantly when derived from remote sensing data and environmental variables, identifying the specific parameters most suitable for accurately estimating SOC in diverse agricultural areas remains a challenge. This study utilizes remote sensing data to estimate SOC precisely and to identify influential factors in diverse agricultural areas, such as paddy, corn, sugarcane, cassava, and perennial crops. Extreme gradient boosting (XGBoost), random forest (RF), and support vector regression (SVR) models are employed to analyze these factors' impact on SOC estimation. The results show that the key factors influencing SOC estimation include slope, a vegetation index (EVI), spectral reflectance indices (red index, red edge2), temperature, land use, and surface soil moisture, as indicated by their importance scores averaged across the XGBoost, RF, and SVR models. Different machine learning algorithms thus identify different influential factors from the same remote sensing data and environmental variables, which underlines the importance of feature selection for accurate SOC estimation.
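Averaging importance scores across the three models can be sketched as follows; the feature list and scores below are hypothetical placeholders, not the study's values:

```python
import numpy as np

def average_importance(score_table, feature_names):
    """Normalise each model's scores to sum to one, then average across models."""
    s = np.asarray(score_table, dtype=float)
    s = s / s.sum(axis=1, keepdims=True)      # put models on a common scale
    mean = s.mean(axis=0)
    order = mean.argsort()[::-1]
    return [(feature_names[i], round(float(mean[i]), 3)) for i in order]

features = ["slope", "EVI", "temperature", "land_use"]   # illustrative subset
scores = [[0.40, 0.30, 0.20, 0.10],   # e.g. XGBoost importances
          [0.50, 0.20, 0.20, 0.10],   # e.g. RF importances
          [0.30, 0.40, 0.20, 0.10]]   # e.g. SVR-derived importances
ranking = average_importance(scores, features)
```

Normalising before averaging matters because raw importance scales differ between tree ensembles and kernel models.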

Keywords: factors influencing SOC estimation, remote sensing data, environmental variables, machine learning

Procedia PDF Downloads 8
18322 Failure Probability Assessment of Concrete Spherical Domes Subjected to Ventilation Controlled Fires Using BIM Tools

Authors: A. T. Kassem

Abstract:

Fires are considered a common hazard that any building may face. Most buildings' structural elements are designed with fire-safety precautions using deterministic design approaches. Public and highly important buildings are commonly designed for a standard fire rating and, in many cases, contain large compartments with central domes. Real fire scenarios are rarely brought into the structural design of buildings because of the complexity of both the scenarios and the analysis tools. This paper presents a modern approach to the analysis of spherical domes under real fire conditions via building information modelling (BIM), adopting a probabilistic approach. BIM has been implemented to bridge the gap between various software packages, enabling them to function interactively to model both the real fire and the corresponding structural response. Ventilation-controlled fire scenarios have been modeled using both Revit and Pyrosim. Monte Carlo simulation has been adopted as the probabilistic analysis approach for dealing with the various parameters. Conclusions regarding failure probability and fire endurance, in addition to the effects of various parameters, have been extracted.
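The Monte Carlo step can be illustrated with a toy limit-state check; the capacity and demand distributions below are invented for the sketch, not taken from the study:

```python
import numpy as np

def failure_probability(n_trials=100_000, seed=1):
    """Monte Carlo estimate of P(demand > capacity) for a toy limit state."""
    rng = np.random.default_rng(seed)
    # Hypothetical lognormal structural capacity vs normal fire-induced demand.
    capacity = rng.lognormal(mean=np.log(100), sigma=0.1, size=n_trials)
    demand = rng.normal(loc=80, scale=10, size=n_trials)
    return float(np.mean(demand > capacity))

p_fail = failure_probability()
```

Each trial samples one random capacity and one random demand; the failure probability is simply the fraction of trials in which demand exceeds capacity.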

Keywords: concrete, spherical domes, ventilation controlled fires, BIM, monte carlo simulation, pyrosim, revit

Procedia PDF Downloads 86
18321 An Effective Approach to Knowledge Capture in Whole Life Costing in Constructions Project

Authors: Ndibarafinia Young Tobin, Simon Burnett

Abstract:

Despite the benefits of whole life costing as a valuable technique for comparing alternative building designs, allowing operational cost benefits to be evaluated against any initial cost increases, and as part of procurement in the construction industry, its adoption has been relatively slow. This is due to the lack of tangible evidence and of 'know-how' skills and knowledge of the practice: many establishments lack professionals with knowledge of and training in the use of whole life costing. The situation is compounded by the absence of available whole life costing data from relevant projects, the lack of data collection mechanisms, and so on. This has proved very challenging to those willing to employ the technique in a construction project. The knowledge generated from a project can be considered as best practices learned on how to carry out tasks more efficiently, or as negative lessons that have led to losses and slowed down the progress and performance of the project. Knowledge management can enhance the execution of whole life costing analysis in a construction project, as lessons learned from one project can be carried over to future projects, resulting in continuous improvement and providing knowledge that can be used in the operation and maintenance phases of an asset's life span. Purpose: The purpose of this paper is to report an effective approach for capturing knowledge in whole life costing practice in a construction project. Design/methodology/approach: An extensive literature review was first conducted on the concepts of knowledge management and whole life costing. This was followed by semi-structured interviews to explore existing and good-practice knowledge management in whole life costing in construction projects. The data gathered from the semi-structured interviews were analyzed using content analysis and used to structure an effective knowledge-capturing approach. Findings: The results show that project review is the common method used for capturing knowledge and should be undertaken in an organized and accurate manner, with results presented as instructions or in a checklist format, forming short and precise insights. The approach developed advises that, irrespective of how effective the approach to knowledge capture is, the absence of an environment for sharing knowledge renders the approach ineffective. An open culture and resources are critical for providing a knowledge-sharing setting, and leadership has to sustain whole life costing knowledge capture by giving full support to its implementation. The knowledge-capturing approach has been evaluated by practitioners who are experts in whole life costing practice. The results indicate that the approach to knowledge capture is suitable and efficient.

Keywords: whole life costing, knowledge capture, project review, construction industry, knowledge management

Procedia PDF Downloads 253
18320 [Keynote Talk]: sEMG Interface Design for Locomotion Identification

Authors: Rohit Gupta, Ravinder Agarwal

Abstract:

Surface electromyographic (sEMG) signals have the potential to identify human activities and intention. This potential is further exploited to control artificial limbs using sEMG signals from the residual limbs of amputees. The paper deals with the development of a multichannel, cost-efficient sEMG signal interface for research applications, along with the evaluation of a proposed class-dependent statistical approach to feature selection. The sEMG signal acquisition interface was developed using the ADS1298 from Texas Instruments, a front-end interface integrated circuit for ECG applications. The sEMG signal was recorded from two lower limb muscles for three locomotions, namely plane walk (PW), stair ascending (SA), and stair descending (SD). A class-dependent statistical approach is proposed for feature selection, and its performance is compared with 12 pre-existing feature vectors. To make the study more extensive, the performance of five different types of classifiers is compared. The outcome of this work demonstrates the suitability of the proposed feature selection algorithm for locomotion recognition compared to the other existing feature vectors. The SVM classifier was found to be the best-performing among the compared classifiers, with an average recognition accuracy of 97.40%. Feature vector selection emerges as the most dominant factor affecting classification performance, as it accounts for 51.51% of the total variance in classification accuracy. The results demonstrate the potential of the developed sEMG signal acquisition interface together with the proposed feature selection algorithm.
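A class-dependent statistical feature score in the spirit described above can be sketched with the classic Fisher score (this is a generic sketch, not the paper's exact criterion): features whose class means are far apart relative to their within-class spread are ranked higher.

```python
import numpy as np

def fisher_scores(X, y):
    """Between-class scatter over within-class scatter, per feature."""
    classes = np.unique(y)
    overall = X.mean(axis=0)
    num = sum((y == c).sum() * (X[y == c].mean(0) - overall) ** 2 for c in classes)
    den = sum((y == c).sum() * X[y == c].var(0) for c in classes) + 1e-12
    return num / den

# Feature 0 separates the "locomotion" classes; feature 1 is pure noise.
rng = np.random.default_rng(0)
X = np.vstack([np.column_stack([rng.normal(0, 1, 50), rng.normal(0, 1, 50)]),
               np.column_stack([rng.normal(5, 1, 50), rng.normal(0, 1, 50)])])
y = np.repeat([0, 1], 50)
scores = fisher_scores(X, y)
```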

Keywords: classifiers, feature selection, locomotion, sEMG

Procedia PDF Downloads 280
18319 Normal and Peaberry Coffee Beans Classification from Green Coffee Bean Images Using Convolutional Neural Networks and Support Vector Machine

Authors: Hira Lal Gope, Hidekazu Fukai

Abstract:

The aim of this study is to develop a system that can identify and sort peaberries automatically at low cost for coffee producers in developing countries. In this paper, the focus is on the classification of peaberries and normal coffee beans using image processing and machine learning techniques. A peaberry is neither a defective bean nor a normal bean: it forms as a single, relatively round seed inside a coffee cherry instead of the usual flat-sided pair of beans, and it has its own value and flavor. To improve the taste of the coffee, it is necessary to separate peaberries from normal beans before roasting the green coffee beans; otherwise, the flavors of the beans mix and the overall taste suffers. During roasting, the beans' shape, size, and weight should be uniform; otherwise, larger beans take more time to roast through. Peaberries have a different size and shape even though they have the same weight as normal beans, and they roast more slowly than normal beans. Therefore, neither size nor weight alone provides a good criterion for selecting peaberries. Defective beans, e.g., sour, broken, black, and faded beans, are easy to check and pick out manually by hand. By contrast, picking out peaberries is very difficult even for trained specialists because the shape and color of a peaberry are similar to those of normal beans. In this study, we use image processing and machine learning techniques to discriminate normal beans from peaberries as part of the sorting system. As the first step, we applied deep Convolutional Neural Networks (CNN) and a Support Vector Machine (SVM) to discriminate peaberries from normal beans. As a result, better performance was obtained with the CNN than with the SVM. The artificial neural network, trained on high-performance CPU and GPU hardware, will then simply be installed into an inexpensive, computationally limited Raspberry Pi system. We assume that this system will be used in developing countries. The study evaluates and compares the feasibility of the methods in terms of classification accuracy and processing speed.
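Since the key visual difference is that a peaberry is rounder than a flat-sided normal bean, one simple hand-crafted feature, shown here only as an illustrative sketch alongside the learned approaches, is a roundness measure computed from the covariance of foreground pixels in a binary segmentation mask:

```python
import numpy as np

def roundness(mask):
    """Minor/major axis ratio of the foreground blob; 1.0 means circular."""
    ys, xs = np.nonzero(mask)
    cov = np.cov(np.vstack([xs, ys]))
    evals = np.linalg.eigvalsh(cov)           # ascending eigenvalues
    return float(np.sqrt(evals[0] / evals[-1]))

# Synthetic masks: a disc-like "peaberry" vs an elongated "normal" bean.
yy, xx = np.mgrid[-20:21, -20:21]
round_bean = xx ** 2 + yy ** 2 <= 15 ** 2
yy2, xx2 = np.mgrid[-20:21, -40:41]
flat_bean = (xx2 / 2) ** 2 + yy2 ** 2 <= 15 ** 2
r_peaberry, r_normal = roundness(round_bean), roundness(flat_bean)
```

A threshold on this ratio could serve as a cheap pre-filter before the CNN on a Raspberry Pi, though the study's results suggest the learned features are what make the discrimination reliable.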

Keywords: convolutional neural networks, coffee bean, peaberry, sorting, support vector machine

Procedia PDF Downloads 133
18318 The Presidential Mediator: Different Terminologies Same Missions

Authors: Khodr Fakih

Abstract:

The Ombudsman is a procedural mechanism that provides a different approach to dispute resolution. The ombudsman primarily deals with specific grievances from the public against governmental injustice and misconduct. The ombudsman concept is considered an important instrument for any democratic government, since it improves the transparency of governmental activities in a world in which executive powers are expanding. Many countries have adopted the concept of the Ombudsman, but under different terminologies. This paper presents the different types of Ombudsman and the common activities and processes through which they fulfil their mandates.

Keywords: administration, citizens, government, mediator, ombudsman, presidential mediator

Procedia PDF Downloads 321