Search results for: computerized adaptive testing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4193

3833 Application of a SubIval Numerical Solver for Fractional Circuits

Authors: Marcin Sowa

Abstract:

The paper discusses the subinterval-based numerical method for fractional derivative computations, now referred to by its acronym SubIval. The basis of the method is briefly recalled, and its applicability in time-stepping solvers is discussed, including the possibility of implementing a solver with an adaptive time step size. The solver is tested on a transient circuit example. To assess its accuracy, the results are compared with those obtained by a semi-analytical method called gcdAlpha. The adaptive time-step solver applying SubIval proves to be very accurate, as its results lie very close to the reference solution. The solver can currently handle FDEs (fractional differential equations) with a different derivative order for each equation and any type of source time function.

Keywords: numerical method, SubIval, fractional calculus, numerical solver, circuit analysis

Procedia PDF Downloads 207
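The abstract above centers on computing fractional derivatives inside a time-stepping solver. As an illustration only (this is the classical Grünwald-Letnikov scheme, not the SubIval method itself), the sketch below shows the core difficulty such solvers face: a fractional derivative at time t depends on the entire solution history.

```python
import math

def gl_weights(alpha, n):
    # Grunwald-Letnikov weights w_k = (-1)^k * C(alpha, k), computed by
    # the recurrence w_k = w_{k-1} * (1 - (alpha + 1) / k)
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def gl_fractional_derivative(f, alpha, t, h):
    # truncated Grunwald-Letnikov sum:
    # D^alpha f(t) ~ h^(-alpha) * sum_k w_k * f(t - k*h)
    n = int(round(t / h))
    w = gl_weights(alpha, n)
    return sum(w[k] * f(t - k * h) for k in range(n + 1)) / h ** alpha

# Half-order derivative of f(t) = t; the exact value is 2 * sqrt(t / pi).
approx = gl_fractional_derivative(lambda t: t, 0.5, 1.0, 1.0e-3)
exact = 2.0 * math.sqrt(1.0 / math.pi)
```

Note that every step sums over all previous samples, which is exactly why subinterval and adaptive step-size schemes such as SubIval are attractive.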
3832 Design Systems and the Need for a Usability Method: Assessing the Fitness of Components and Interaction Patterns in Design Systems Using Atmosphere Methodology

Authors: Patrik Johansson, Selina Mardh

Abstract:

The present study proposes a usability test method, Atmosphere, to assess the fitness of components and interaction patterns of design systems. The method covers the user's perception of the components of the system, the efficiency of the logic of the interaction patterns, perceived ease of use, and the user's understanding of the intended outcome of interactions. These aspects are assessed by combining measures of first impression, visual affordance, and expectancy. The method was applied to a design system developed for the design of an electronic health record system, in a study involving 15 healthcare personnel. It could be concluded that the Atmosphere method provides tangible data that enable human-computer interaction practitioners to analyze and categorize components and patterns based on perceived usability, the success rate of identifying interactive components, and the success rate of understanding the intended outcomes of components and interaction patterns.

Keywords: atomic design, atmosphere methodology, design system, expectancy testing, first impression testing, usability testing, visual affordance testing

Procedia PDF Downloads 180
3831 Effect Analysis of an Improved Adaptive Speech Noise Reduction Algorithm in Online Communication Scenarios

Authors: Xingxing Peng

Abstract:

With the development of society, online communication scenarios such as teleconferencing and online education are becoming more common. In conference communication, voice quality is a very important factor, and noise may greatly reduce the communication experience of the participants; voice noise reduction therefore has an important impact on scenarios such as voice calls. This research focuses on the key technologies of the sound transmission process, with the purpose of preserving audio quality as far as possible so that the listener hears clearer and smoother sound. To address the problem that traditional speech enhancement algorithms perform poorly on non-stationary noise, an adaptive speech noise reduction algorithm is studied in this paper. Traditional noise estimation methods are mainly designed for stationary noise. We study the spectral characteristics of different noise types, especially those of non-stationary burst noise, and design a noise estimator module to handle non-stationary noise. Noise features are extracted from non-speech segments, and the noise estimation module is adjusted in real time according to the noise characteristics. The adaptive algorithm can thus enhance speech under different noise conditions, improving on traditional algorithms for non-stationary noise and achieving a better enhancement effect. The experimental results show that the proposed algorithm is effective and adapts better to different types of noise, yielding a better speech enhancement effect.

Keywords: speech noise reduction, speech enhancement, self-adaptation, Wiener filter algorithm

Procedia PDF Downloads 59
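A minimal sketch of the adaptive idea described above: a noise power estimate is updated recursively from frames flagged as non-speech, and a Wiener-like gain is applied per frame. The frame/VAD interface and all parameter values are illustrative assumptions, not the paper's algorithm.

```python
def frame_power(frame):
    # mean squared amplitude of one frame
    return sum(x * x for x in frame) / len(frame)

def adaptive_wiener(frames, vad, alpha=0.9, gain_floor=0.05):
    # The noise power is re-estimated from frames the VAD flags as
    # non-speech, so the suppressor tracks changing noise conditions.
    noise_pow = None
    out = []
    for frame, is_speech in zip(frames, vad):
        p = frame_power(frame)
        if not is_speech:
            noise_pow = (p if noise_pow is None
                         else alpha * noise_pow + (1.0 - alpha) * p)
        n = noise_pow or 0.0
        gain = max(gain_floor, 1.0 - n / p) if p > 0 else gain_floor
        out.append([gain * x for x in frame])
    return out

# one noise-only frame followed by one louder speech frame
frames = [[0.1] * 160, [1.0] * 160]
out = adaptive_wiener(frames, vad=[False, True])
```

The noise frame is attenuated to the gain floor, while the speech frame passes nearly unchanged; a full implementation would apply the gain per frequency band rather than per frame.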
3830 Analysis of the Annual Proficiency Testing Procedure for Intermediate Reference Laboratories Conducted by the National Reference Laboratory from 2013 to 2017

Authors: Reena K., Mamatha H. G., Somshekarayya, P. Kumar

Abstract:

Objectives: The annual proficiency testing of intermediate reference laboratories is conducted by the National Reference Laboratory (NRL) to assess the laboratories' ability to correctly identify Mycobacterium tuberculosis and to determine its drug susceptibility pattern. The proficiency testing results from 2013 to 2017 were analyzed to identify laboratories that consistently reported quality results and those that had difficulty doing so. Methods: A panel of twenty cultures was sent to each of these laboratories. The laboratories were expected to grow the cultures, set up drug susceptibility testing by all the methods for which they were certified, and report the results within the stipulated time period. The turnaround time for reporting results, the specificity, sensitivity, positive and negative predictive values, and the efficiency of the laboratory in identifying the cultures were analyzed. Results: Most of the laboratories reported their results within the stipulated time period. However, there were substantial delays in reporting from a few laboratories, mainly due to improper functioning of their biosafety level III facilities. Only 40% of the laboratories achieved 100% efficiency in solid culture on Lowenstein-Jensen medium. This was expected, as solid culture and drug susceptibility testing are no longer the primary means of diagnosing drug resistance: rapid molecular methods such as line probe assay and GeneXpert are used to detect drug resistance, while automated liquid culture systems such as the mycobacterial growth indicator tube are used to monitor the patient's prognosis during treatment. It was observed that 90% of the laboratories achieved 100% efficiency in the liquid culture method, and almost all laboratories achieved 100% efficiency in the line probe assay, the method of choice for detecting drug-resistant tuberculosis.
Conclusion: Since liquid culture and line probe assay technologies are routinely used for the detection of drug-resistant tuberculosis, the laboratories exhibited higher efficiency with them than with the rarely used solid culture and drug susceptibility testing. Laboratory infrastructure should be properly maintained so that samples can be processed safely and results declared on time.

Keywords: annual proficiency testing, drug susceptibility testing, intermediate reference laboratory, national reference laboratory

Procedia PDF Downloads 182
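The panel measures named in the abstract (sensitivity, specificity, predictive values, efficiency) all follow from a 2x2 confusion matrix. A small helper, with illustrative counts for a hypothetical twenty-culture panel (the numbers are not from the study):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    # standard panel measures computed from a 2x2 confusion matrix
    total = tp + fp + fn + tn
    return {
        "sensitivity": tp / (tp + fn),   # true positives detected
        "specificity": tn / (tn + fp),   # true negatives detected
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
        "efficiency": (tp + tn) / total, # overall agreement
    }

# illustrative panel: 12 resistant and 8 susceptible cultures
m = diagnostic_metrics(tp=11, fp=1, fn=1, tn=7)
```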
3829 Comparison of the Chest X-Ray and Computerized Tomography Scans Requested from the Emergency Department

Authors: Sahabettin Mete, Abdullah C. Hocagil, Hilal Hocagil, Volkan Ulker, Hasan C. Taskin

Abstract:

Objectives and Goals: An emergency department is a place where people come for a multitude of reasons, 24 hours a day, and it is easily accessible thanks to the dedicated people who work there. But the workload and overcrowding of emergency departments are increasing day by day. Under these circumstances, it is important to choose quick, easily accessible, and effective tests for diagnosis; laboratory and imaging tests account for more than 40% of all emergency department costs. Despite all the technological advances in imaging and the availability of computerized tomography (CT), the chest X-ray, an older imaging method, has not lost its appeal and effectiveness for emergency physicians. Progress in imaging methods is very convenient, but physicians should consider radiation dose, cost, and effectiveness, and select imaging methods carefully. The aim of the study was to investigate the effectiveness of the chest X-ray for immediate diagnosis against the advancing technology, by comparing chest X-ray and chest CT results of patients in the emergency department. Methods: Patients who presented to the emergency department of Bulent Ecevit University Faculty of Medicine between 1 September 2014 and 28 February 2015 were investigated retrospectively. Data were obtained via MIAMED (Clear Canvas Image Server v6.2, Toronto, Canada), the information management system in which patients' files are saved electronically in the clinic, and were retrospectively scanned. The study included 199 patients who were 18 or older and had both chest X-ray and chest CT imaging. Chest X-ray images were evaluated by the emergency medicine senior assistant in the emergency department, and the findings were recorded on the study form. CT findings were obtained from reports already issued by the radiology department. Each chest X-ray was evaluated with seven questions in terms of technique and dose adequacy.
Patients' age, gender, presenting complaints, comorbid diseases, vital signs, physical examination findings, diagnosis, chest X-ray findings, and chest CT findings were evaluated. Data were recorded and statistical analyses performed using SPSS 19.0 for Windows, with p < 0.05 accepted as statistically significant. Results: 199 patients were included in the study. Pneumonia was the most common diagnosis, found in 38.2% (n=76) of all patients. The chest X-ray imaging technique was appropriate in 31% (n=62) of all patients. There was no statistically significant difference (p > 0.05) between the two imaging methods (chest X-ray and chest CT) in determining the rates of tracheal displacement, pneumothorax, parenchymal consolidation, increased cardiothoracic ratio, lymphadenopathy, diaphragmatic hernia, free air levels in the abdomen (in sections included in the image), pleural thickening, parenchymal cyst, parenchymal mass, parenchymal cavity, parenchymal atelectasis, and bone fractures. Conclusions: For findings that require rapid diagnosis, chest X-ray and chest CT findings matched at a high rate in patients imaged with an appropriate technique. However, chest X-rays evaluated in the emergency department were frequently taken with an inappropriate technique.

Keywords: chest x-ray, chest computerized tomography, chest imaging, emergency department

Procedia PDF Downloads 193
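Per finding, the modality comparison in the abstract reduces to how often chest X-ray and chest CT agree on presence or absence. A toy calculation with hypothetical calls (not the study's data):

```python
def agreement_rate(xray_findings, ct_findings):
    # fraction of patients for whom both modalities report the same
    # presence (1) or absence (0) of a given finding
    pairs = list(zip(xray_findings, ct_findings))
    return sum(a == b for a, b in pairs) / len(pairs)

# hypothetical presence/absence calls for one finding (e.g. consolidation)
xray = [1, 0, 1, 1, 0, 0, 1, 0]
ct = [1, 0, 0, 1, 0, 1, 1, 0]
rate = agreement_rate(xray, ct)
```

A formal comparison of paired proportions, as in the study, would then apply a significance test to the discordant pairs.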
3828 Research on the Rewriting and Adaptation in the English Translation of the Analects

Authors: Jun Xu, Haiyan Xiao

Abstract:

The Analects (Lunyu) is one of the most recognized Confucian classics and among the earliest Chinese classics translated into English and known to the West. Research on the translation of The Analects has shifted from comparison of text and language to a broader description of social and cultural contexts. Based mainly on Legge's and Waley's translations of The Analects, this paper integrates Lefevere's theory of rewriting and Verschueren's theory of adaptation to explore the influence of ideology and poetics on the translation, analysing how translators make adaptive decisions under the manipulation of ideology and poetics. It is argued that the English translation of The Analects is the translators' initiative rewriting of the original work, a selective and adaptive process in the multi-layered contexts of the target language. Research on the translation of classics should therefore consider both the manipulative factors and the translator's initiative.

Keywords: The Analects, ideology, poetics, rewriting, adaptation

Procedia PDF Downloads 278
3827 Automated User Story Driven Approach for Web-Based Functional Testing

Authors: Mahawish Masud, Muhammad Iqbal, M. U. Khan, Farooque Azam

Abstract:

Manual writing of test cases from functional requirements is a time-consuming task. Such test cases are not only difficult to write but also challenging to maintain. Test cases can be drawn from functional requirements expressed in natural language; however, manual test case generation is inefficient and subject to errors. In this paper, we present a systematic procedure that automatically derives test cases from user stories. The user stories are specified in a restricted natural language using a well-defined template, and a detailed methodology for writing these test-ready user stories is presented. Our tool, "Test-o-Matic", automatically generates the test cases by processing the restricted user stories, and the generated test cases are executed using the open-source Selenium IDE. We evaluate the approach on a case study, an open-source web-based application, and assess its effectiveness by seeding faults into the case study using known mutation operators. Results show that test case generation from restricted user stories is a viable approach for automated testing of web applications.

Keywords: automated testing, natural language, restricted user story modeling, software engineering, software testing, test case specification, transformation and automation, user story, web application testing

Procedia PDF Downloads 388
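A sketch of the first stage such a tool could perform: parsing a restricted-template user story into a structured test case. The template, regular expression, and output format here are hypothetical illustrations, not Test-o-Matic's actual design.

```python
import re

# restricted template: "As a <role>, I want to <action> so that <benefit>"
STORY_RE = re.compile(
    r"As an? (?P<role>[^,]+), I want to (?P<action>.+?) so that (?P<benefit>.+)",
    re.IGNORECASE,
)

def story_to_test_case(story, steps):
    # parse a restricted-template story plus Given/When/Then steps into
    # a simple test-case dict (hypothetical format)
    m = STORY_RE.match(story.strip())
    if not m:
        raise ValueError("story does not follow the restricted template")
    case = {"name": m.group("action").strip(), "steps": []}
    for s in steps:
        keyword, _, text = s.partition(" ")
        case["steps"].append({"keyword": keyword.capitalize(), "text": text})
    return case

case = story_to_test_case(
    "As a user, I want to log in so that I can see my dashboard",
    [
        "Given the login page is open",
        "When I submit valid credentials",
        "Then the dashboard is shown",
    ],
)
```

The resulting structure could then be translated into Selenium commands for execution.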
3826 Electro-Mechanical Response and Engineering Properties of Piezocomposite with Imperfect Interface

Authors: Rattanan Tippayaphalapholgul, Yasothorn Sapsathiarn

Abstract:

Composites of piezoelectric materials are widely used in practical applications such as nondestructive testing devices, smart adaptive structures, and medical devices. A thorough understanding of the coupled electro-elastic response and properties of piezocomposites is crucial for the development and design of piezoelectric composite materials used in advanced applications. Micromechanics analysis is employed in this paper to determine the response and engineering properties of the piezocomposite. A mechanically imperfect interface bond between the piezoelectric inclusion and the polymer matrix is taken into consideration in the analysis. The micromechanics analysis is based on the boundary element method (BEM) together with periodic micro-field micromechanics theory. A selected set of numerical results is presented to investigate the influence of volume ratio and interface bonding condition on the effective piezocomposite material coefficients and to portray basic features of the coupled electroelastic response within the piezocomposite unit cell.

Keywords: effective engineering properties, electroelastic response, imperfect interface, piezocomposite

Procedia PDF Downloads 233
3825 Anticipation of Bending Reinforcement Based on Iranian Concrete Code Using Meta-Heuristic Tools

Authors: Seyed Sadegh Naseralavi, Najmeh Bemani

Abstract:

In this paper, the concrete design codes of America, New Zealand, Mexico, Italy, India, Canada, Hong Kong, Europe (Eurocode), and Britain are compared with the Iranian concrete design code. First, using an Adaptive Neuro-Fuzzy Inference System (ANFIS), the codes having the most correlation with the ninth issue of the Iranian national regulation are determined. Two prediction methods are then used to compare the codes: an Artificial Neural Network (ANN) and multi-variable regression. The results show that the ANN performs better. Prediction is done using only the tensile steel ratio, ignoring the compression steel ratio.

Keywords: adaptive neuro fuzzy inference system, anticipate method, artificial neural network, concrete design code, multi-variable regression

Procedia PDF Downloads 286
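The multi-variable regression baseline mentioned above reduces, in the single-predictor case (tensile steel ratio only), to ordinary least squares plus the correlation coefficient R used to judge agreement between codes. The steel ratios and reinforcement demands below are made-up illustrative data, not values from any code:

```python
def fit_line(xs, ys):
    # ordinary least squares for y = a + b*x (the single-variable case
    # of the multi-variable regression used for code comparison)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def pearson_r(xs, ys):
    # correlation coefficient between two sets of predictions
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# hypothetical tensile steel ratios and demanded reinforcement areas
rho = [0.005, 0.010, 0.015, 0.020]
a_s = [210.0, 410.0, 615.0, 810.0]
intercept, slope = fit_line(rho, a_s)
r = pearson_r(rho, a_s)
```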
3824 Adaptive Threshold Adjustment of Clear Channel Assessment in LAA Down Link

Authors: Yu Li, Dongyao Wang, Xiaobao Sun, Wei Ni

Abstract:

In Long-Term Evolution (LTE), the carriers around 5 GHz are planned to be utilized without licenses to further enlarge system capacity. This feature is termed Licensed-Assisted Access (LAA). Channel sensing (clear channel assessment, CCA) is required before any transmission on these unlicensed carriers, in order to ensure the harmonious co-existence of LAA with other radio access technologies in the unlicensed band. The CCA threshold is therefore critical: it decides whether the transmission opportunity following CCA is taken in time and without collisions. An improper CCA threshold may cause buffer overflow at some eNodeBs if they are heavily loaded with traffic. To address these problems, we propose an adaptive threshold adjustment method for CCA in the LAA downlink that takes both the load and the transmission opportunities into account. The trend of LAA throughput as the threshold varies is obtained, which guides the threshold adjustment. Co-existence between LAA and Wi-Fi is specifically tested. System-level simulation results confirm the merits of our design, especially under heavy traffic.

Keywords: LTE, LAA, CCA, threshold adjustment

Procedia PDF Downloads 142
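One way such an adaptation could work, sketched with illustrative parameter values. The paper derives its adjustment from the simulated throughput trend; the simple load/collision rule below is an assumption for illustration only.

```python
def adjust_cca_threshold(threshold, buffer_load, collision_rate,
                         step=2.0, lo=-82.0, hi=-62.0,
                         load_high=0.8, coll_high=0.1):
    # One adaptation step (values in dBm). High collision rate: lower
    # the threshold to sense more conservatively. Heavy buffer load:
    # raise it so the channel is declared idle more often and more
    # transmission opportunities are taken.
    if collision_rate > coll_high:
        threshold -= step
    elif buffer_load > load_high:
        threshold += step
    return max(lo, min(hi, threshold))
```

For example, a heavily loaded eNodeB with few collisions raises its threshold, while one seeing frequent collisions backs off; the clamp keeps the threshold inside a regulatory-style range.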
3823 Investigation of Shear Strength, and Dilative Behavior of Coarse-grained Samples Using Laboratory Test and Machine Learning Technique

Authors: Ehsan Mehryaar, Seyed Armin Motahari Tabari

Abstract:

Coarse-grained soils are well known and commonly used in a wide range of geotechnical projects, including high earth dams and embankments, for their high shear strength. The most important engineering property of these soils is the friction angle, which represents the interlocking between soil particles and is applied widely in designing and constructing such earth structures. The friction angle and dilative behavior of coarse-grained soils can be estimated from empirical correlations with in-situ testing and the physical properties of the soil, or measured directly in the laboratory by direct shear or triaxial tests. Unfortunately, large-scale testing is difficult, challenging, and expensive, and is not possible in most soil mechanics laboratories, so it is common to remove the large particles before testing, which cannot be counted as an exact estimate of the parameters and behavior of the original soil. This paper describes a new methodology that scales the particle grading distribution of a well-graded gravel sample down to a sample small enough to be tested in an ordinary direct shear apparatus, and uses a machine learning method to estimate the stress-strain behavior, friction angle, and dilative behavior of the original coarse-grained soil, considering its confining pressure and relative density. A total of 72 direct shear tests were performed at 6 different sizes, 3 different confining pressures, and 4 different relative densities. The Multivariate Adaptive Regression Splines (MARS) technique was used to develop an equation predicting shear strength and dilative behavior from the size distribution of the coarse-grained soil particles. An uncertainty analysis was also performed to examine the reliability of the proposed equation.

Keywords: MARS, coarse-grained soil, shear strength, uncertainty analysis

Procedia PDF Downloads 162
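The friction angle extracted from a direct shear series can be obtained by fitting the Mohr-Coulomb line tau = sigma * tan(phi) through the origin (zero cohesion assumed, as is usual for a clean coarse-grained soil). The stresses below are synthetic illustrative values:

```python
import math

def friction_angle(normal_stresses, shear_strengths):
    # least-squares slope of tau = sigma * tan(phi) through the origin,
    # returned as phi in degrees (cohesion taken as zero)
    num = sum(s * t for s, t in zip(normal_stresses, shear_strengths))
    den = sum(s * s for s in normal_stresses)
    return math.degrees(math.atan(num / den))

# synthetic peak shear strengths consistent with phi = 35 degrees
sigmas = [50.0, 100.0, 200.0]   # normal stresses, kPa
taus = [s * math.tan(math.radians(35.0)) for s in sigmas]
phi = friction_angle(sigmas, taus)
```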
3822 Non-Destructive Testing of Selective Laser Melting Products

Authors: Luca Collini, Michele Antolotti, Diego Schiavi

Abstract:

At present, complex geometries, shrinking production times, rapidly increasing demand, and high quality standards make the non-destructive (ND) control of additively manufactured components an indispensable means of verification. On the other hand, a technology gap and the lack of standards regulating methods and acceptance criteria make the NDT of these components a stimulating field still to be fully explored. To date, penetrant testing, acoustic wave methods, tomography, radiography, and semi-automated ultrasound methods have been tried on metal-powder-based products. External defects, distortion, surface porosity, roughness, texture, internal porosity, and inclusions are the typical defects in the focus of testing. Detection of density and layer compactness has also been attempted on stainless steels by the ultrasonic scattering method. In this work, the authors present and discuss radiographic and ultrasound ND testing of additively manufactured Ti₆Al₄V and Inconel parts obtained by selective laser melting (SLM) technology. To test the possibilities offered by the radiographic method, both X-rays and γ-rays are tried on a set of specifically designed specimens produced by SLM. The specimens contain a family of defects representing the most commonly found types, such as cracks and lack of fusion. The tests are also applied to real parts of various complexity and thickness. A set of practical indications and acceptance criteria is finally drawn up.

Keywords: non-destructive testing, selective laser melting, radiography, UT method

Procedia PDF Downloads 147
3821 Three Dimensional Analysis of Cubesat Thermal Vacuum Test

Authors: Maged Assem Soliman Mossallam

Abstract:

The target of thermal vacuum testing is to qualify a space system and ensure its operability under the harsh space environment. The functionality of the cubesat was checked at extreme orbit conditions, and the test was performed for both operational and non-operational modes. An analysis was done to simulate the cubesat's thermal cycling inside the thermal vacuum chamber. The Comsol Multiphysics finite element package was used to solve the three-dimensional problem for the cubesat inside the TVAC, with the three-dimensional CAD model built in Autodesk Inventor. The boundary conditions were taken from the actual shroud temperature, and the variation of the input heat load with time was considered in solving the transient three-dimensional problem. Results show that the simulated temperature profiles are within an acceptable range of the real testing data.

Keywords: cubesat, thermal vacuum test, testing simulation, finite element analysis

Procedia PDF Downloads 151
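A single-node (lumped-capacity) transient step gives the flavor of the thermal cycling simulation, although the paper's Comsol model is fully three-dimensional. All coefficients below (thermal mass, mounting conduction, radiative coupling) are illustrative assumptions, not values from the study:

```python
def thermal_step(T, T_shroud, q_int, dt,
                 m_c=800.0, hA=0.5, eps_sigma_A=2.0e-9):
    # One explicit Euler step of m*c*dT/dt = q_int + q_cond + q_rad.
    # Temperatures are in deg C; q_cond stands in for conduction
    # through the mounting fixture, q_rad for shroud radiation.
    q_rad = eps_sigma_A * ((T_shroud + 273.15) ** 4 - (T + 273.15) ** 4)
    q_cond = hA * (T_shroud - T)
    return T + dt * (q_int + q_cond + q_rad) / m_c

# cold-plateau phase: shroud at -60 C, payload switched off
T = 20.0
for _ in range(1000):
    T = thermal_step(T, -60.0, 0.0, 10.0)
```

The node relaxes toward the shroud temperature; in the real analysis the shroud temperature profile and internal heat loads are applied as time-varying boundary conditions.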
3820 Design of a Tool for Generating Test Cases from BPMN

Authors: Prat Yotyawilai, Taratip Suwannasart

Abstract:

Business Process Model and Notation (BPMN) is increasingly important in business process modeling and in creating functional models; it is an OMG standard that has become popular in various organizations and in education. Research on model-based software testing is prominent, but although most studies use UML models, few use the BPMN model for creating test cases. Therefore, this research proposes the design of a tool for generating test cases from BPMN. The model is analyzed and the details of its various components are extracted before creating a flow graph; both the component details and the flow graph are then used in generating test cases.

Keywords: software testing, test case, BPMN, flow graph

Procedia PDF Downloads 556
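Once a flow graph has been extracted from the BPMN model, candidate test cases correspond to simple start-to-end paths. A minimal path enumeration sketch over a hypothetical approval process:

```python
def all_paths(graph, start, end):
    # depth-first enumeration of simple start-to-end paths; each path
    # is a candidate test case derived from the flow graph
    paths, stack = [], [(start, [start])]
    while stack:
        node, path = stack.pop()
        if node == end:
            paths.append(path)
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:  # avoid re-entering loops
                stack.append((nxt, path + [nxt]))
    return paths

# toy flow graph: an exclusive gateway with two outcomes
flow = {
    "start": ["check"],
    "check": ["approve", "reject"],
    "approve": ["end"],
    "reject": ["end"],
}
cases = all_paths(flow, "start", "end")
```

Branch coverage is obtained when every gateway outcome appears in at least one generated path; loop handling would need a bounded revisit policy rather than the simple-path restriction used here.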
3819 Development of Adaptive Architecture Classrooms through the Application of Augmented Reality in Private Universities of Malaysia

Authors: Sara Namdarian, Hafez Salleh

Abstract:

This paper scrutinizes the application of Augmented Reality (AR) technology to enhance the adaptability of architecture classrooms in private Malaysian universities. Using an exploratory mixed-method strategy, the study contrasts the constraints of mono-functional classrooms with the potential of multi-functional classrooms derived from AR application. The paper aims to contribute towards the recognition of suitable AR techniques that can be applied in the development of Adaptive-AR-Classroom-Systems (AARCS) for architecture classrooms. The findings derived from the analysis show that current classrooms have limited functional spaces, and it is concluded that AR can be applied in design classrooms to provide the variety of visuals and virtual objects required for conducting architecture projects in higher education centers.

Keywords: design activity, space enhancement, design education, architectural design, augmented reality

Procedia PDF Downloads 448
3818 Efficient Wind Fragility Analysis of Concrete Chimney under Stochastic Extreme Wind Incorporating Temperature Effects

Authors: Soumya Bhattacharjya, Avinandan Sahoo, Gaurav Datta

Abstract:

Wind fragility analysis of chimneys is often carried out disregarding temperature effects. However, the combined effect of wind and temperature is the most critical limit state for chimney design. Hence, in the present paper, an efficient fragility analysis of a concrete chimney is explored under combined wind and temperature effects. Wind time histories are generated from Davenport's power spectral density function using the weighted amplitude wave superposition technique. Fragility analysis is often carried out in a full Monte Carlo simulation framework, which requires extensive computational time. Thus, in the present paper, an efficient adaptive metamodelling technique is adopted to judiciously approximate the limit state function, which is subsequently used in the simulation framework. This saves substantial computational time and makes the approach computationally efficient. Uncertainty in wind speed, wind-load-related parameters, and resistance-related parameters is considered. The results of the full simulation approach, the conventional metamodelling approach, and the proposed adaptive metamodelling approach are compared, and the effect of disregarding temperature in wind fragility analysis is highlighted.

Keywords: adaptive metamodelling technique, concrete chimney, fragility analysis, stochastic extreme wind load, temperature effect

Procedia PDF Downloads 215
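A sketch of the wind record generation named above: cosine waves with random phases, each amplitude weighted by the Davenport PSD ordinate at its frequency. The spectrum constants (mean speed, drag coefficient) are illustrative assumptions:

```python
import math
import random

def davenport_psd(f, u10=30.0, kappa=0.005):
    # Davenport along-wind gust spectrum S(f); u10 is the 10 m mean
    # wind speed (m/s), kappa a surface drag coefficient (illustrative)
    x = 1200.0 * f / u10
    return (4.0 * kappa * u10 ** 2 * x ** 2
            / (f * (1.0 + x ** 2) ** (4.0 / 3.0)))

def synth_wind(psd, f_lo, f_hi, n_waves, duration, dt, seed=0):
    # superpose cosines with random phases; the amplitude of each wave
    # carries the variance of its frequency band: a_k^2/2 = S(f_k)*df
    rng = random.Random(seed)
    df = (f_hi - f_lo) / n_waves
    comps = []
    for k in range(n_waves):
        f = f_lo + (k + 0.5) * df
        amp = math.sqrt(2.0 * psd(f) * df)
        comps.append((amp, 2.0 * math.pi * f, rng.uniform(0.0, 2.0 * math.pi)))
    n = int(round(duration / dt))
    return [
        sum(a * math.cos(w * i * dt + ph) for a, w, ph in comps)
        for i in range(n)
    ]

record = synth_wind(davenport_psd, 0.01, 1.0, 50, 60.0, 0.1)
```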
3817 Quality is the Matter of All

Authors: Mohamed Hamza, Alex Ohoussou

Abstract:

At JAWDA, our primary focus is on ensuring the satisfaction of our clients worldwide. We are committed to delivering new features on our SaaS platform as quickly as possible while maintaining high quality standards. In this paper, we highlight two key aspects of testing that represent an evolution of current methods and a potential trend for the future, which have enabled us to uphold this commitment effectively: "One Sandbox per Pull Request" (dynamic test environments instead of static ones) and "QA for All".

Keywords: QA for all, dynamic sandboxes, QAOPS, CICD, continuous testing, all testers, QA matters for all, 1 sandbox per PR, utilization rate, coverage rate

Procedia PDF Downloads 34
3816 Adaptive Online Object Tracking via Positive and Negative Models Matching

Authors: Shaomei Li, Yawen Wang, Chao Gao

Abstract:

To reduce the tracking drift that often occurs in adaptive tracking, an algorithm based on the fusion of tracking and detection is proposed in this paper. Firstly, object tracking is posed as a binary classification problem and modeled by partial least squares (PLS) analysis. Secondly, the object is tracked frame by frame via particle filtering. Thirdly, tracking reliability is validated by matching against both positive and negative models. Finally, when drift occurs, the object is relocated by SIFT feature matching and voting, and the object appearance model is updated at the same time. The algorithm can not only sense tracking drift but also relocate the object whenever needed. Experimental results demonstrate that this algorithm outperforms state-of-the-art algorithms on many challenging sequences.

Keywords: object tracking, tracking drift, partial least squares analysis, positive and negative models matching

Procedia PDF Downloads 532
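The frame-by-frame particle filtering stage can be sketched in one dimension, as a toy stand-in for image-plane tracking; the PLS appearance model is replaced here by a hypothetical Gaussian observation likelihood:

```python
import math
import random

def pf_step(particles, weights, likelihood, motion_std, rng):
    # predict: diffuse particles with a random-walk motion model
    particles = [p + rng.gauss(0.0, motion_std) for p in particles]
    # update: reweight each particle by the observation likelihood
    weights = [w * likelihood(p) for w, p in zip(weights, particles)]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # resample proportionally to the weights, then reset to uniform
    particles = rng.choices(particles, weights=weights, k=len(particles))
    return particles, [1.0 / len(particles)] * len(particles)

rng = random.Random(42)
target = 5.0  # true (stationary) object position
like = lambda p: math.exp(-(p - target) ** 2 / 0.5)
particles = [rng.uniform(0.0, 10.0) for _ in range(300)]
weights = [1.0 / 300] * 300
for _ in range(10):
    particles, weights = pf_step(particles, weights, like, 0.2, rng)
estimate = sum(particles) / len(particles)
```

In the paper's setting the positive/negative model matching would gate whether this estimate is trusted or a SIFT-based relocation is triggered.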
3815 Modeling of Age Hardening Process Using Adaptive Neuro-Fuzzy Inference System: Results from Aluminum Alloy A356/Cow Horn Particulate Composite

Authors: Chidozie C. Nwobi-Okoye, Basil Q. Ochieze, Stanley Okiy

Abstract:

This research reports on the modeling of the age hardening process using an adaptive neuro-fuzzy inference system (ANFIS). The age hardening output (hardness) was predicted with ANFIS from three input parameters: ageing time, temperature, and percentage composition of cow horn particles (CHp%). The correlation coefficient (R) of the predicted versus measured hardness values was 0.9985. Subsequently, values outside the experimental data points were predicted. When the temperature was kept constant and the other input parameters were varied, the average relative error of the predicted values was 0.0931%. When the temperature was varied and the other input parameters kept constant, the average relative error of the hardness predictions was 80%. The results show that ANFIS trained on coarse experimental data points is not very effective at predicting process outputs for the age hardening of the A356 alloy/CHp particulate composite; the fine experimental data that ANFIS requires makes it more expensive for modeling and optimizing age hardening operations of this composite.

Keywords: adaptive neuro-fuzzy inference system (ANFIS), age hardening, aluminum alloy, metal matrix composite

Procedia PDF Downloads 155
3814 An AK-Chart for the Non-Normal Data

Authors: Chia-Hau Liu, Tai-Yue Wang

Abstract:

Traditional multivariate control charts assume that measurements from manufacturing processes follow a multivariate normal distribution. However, this assumption may not hold, or may be difficult to verify, because in practice not all measurements from manufacturing processes are normally distributed. This study develops a new multivariate control chart for monitoring processes with non-normal data. We propose a mechanism that integrates a one-class classification method with an adaptive technique; the adaptive technique is used to improve the sensitivity of one-class classification to small shifts in statistical process control. In addition, the design provides an easy way to allocate the type I error rate, making the chart easier to implement. Finally, a simulation study and real industrial data are used to demonstrate the effectiveness of the proposed control chart.

Keywords: multivariate control chart, statistical process control, one-class classification method, non-normal data

Procedia PDF Downloads 423
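The chart's core idea, monitoring a one-class score against a limit set at an empirical quantile so that the type I error is allocated directly, can be sketched with a kNN distance statistic (a common one-class scoring choice; the paper's exact classifier and adaptive mechanism may differ):

```python
import random

def knn_score(x, train, k=3):
    # monitoring statistic: average Euclidean distance from x to its
    # k nearest in-control training observations
    dists = sorted(
        sum((a - b) ** 2 for a, b in zip(x, t)) ** 0.5 for t in train
    )
    return sum(dists[:k]) / k

def control_limit(train, alpha=0.05, k=3):
    # empirical (1 - alpha) quantile of leave-one-out scores, so the
    # in-control false-alarm (type I error) rate is allocated directly
    scores = sorted(
        knn_score(t, train[:i] + train[i + 1:], k)
        for i, t in enumerate(train)
    )
    idx = min(len(scores) - 1, int((1 - alpha) * len(scores)))
    return scores[idx]

rng = random.Random(7)
# in-control reference data from a skewed (non-normal) process
train = [(rng.lognormvariate(0.0, 0.3), rng.lognormvariate(0.0, 0.3))
         for _ in range(60)]
ucl = control_limit(train)
out_of_control = knn_score((8.0, 8.0), train) > ucl
```

No normality assumption enters anywhere: the limit comes entirely from the empirical score distribution of the in-control data.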
3813 Implementation of Conceptual Real-Time Embedded Functional Design via Drive-By-Wire ECU Development

Authors: Ananchai Ukaew, Choopong Chauypen

Abstract:

Design concepts for a real-time embedded system can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early in the conceptual and preliminary phases to formulate design requirements and perform quick real-time verification. The design and analysis methodology includes simulation analysis, model-based testing, and in-the-loop testing. The design of a conceptual drive-by-wire (DBW) algorithm for an electronic control unit (ECU) is presented to demonstrate the conceptual design process, analysis, and functionality evaluation. The DBW ECU function can be implemented in the vehicle system to improve the drivability of an electric vehicle (EV) conversion. Within a new development process, however, conceptual ECU functions and parameters need to be evaluated, so a testing system was employed to support the evaluation of the conceptual DBW ECU functions. In the current setup, the system components consisted of the actual DBW ECU hardware, electric vehicle models, and the controller area network (CAN) protocol. The vehicle models and the CAN bus interface were both implemented as real-time applications, and the ECU and CAN protocol functionality were verified according to the design requirements. The proposed system could potentially benefit rapid real-time analysis of design parameters for conceptual system or software algorithm development.

Keywords: drive-by-wire ECU, in-the-loop testing, model-based design, real-time embedded system

Procedia PDF Downloads 350
3812 Performance Comparison of Different Regression Methods for a Polymerization Process with Adaptive Sampling

Authors: Florin Leon, Silvia Curteanu

Abstract:

Developing complete mechanistic models for polymerization reactors is not easy, because complex reactions occur simultaneously, a large number of kinetic parameters is involved, and the chemical and physical phenomena of mixtures involving polymers are sometimes poorly understood. To overcome these difficulties, empirical models based on sampled data can be used instead, namely regression methods typical of the machine learning field. They have the ability to learn the trends of a process without any knowledge of its particular physical and chemical laws, and are therefore useful for modeling complex processes such as the free radical polymerization of methyl methacrylate carried out in a batch bulk process. The goal is to generate accurate predictions of monomer conversion, number average molecular weight and weight average molecular weight. This process is associated with nonlinear gel and glass effects. For this purpose, an adaptive sampling technique is presented, which can select more samples around the regions where the values have a higher variation. Several machine learning methods are used for the modeling and their performance is compared: support vector machines, k-nearest neighbor and random forest, as well as an original algorithm, large margin nearest neighbor regression. The suggested method provides very good results compared to the other well-known regression algorithms.
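The adaptive sampling idea, placing more samples where the output varies most, can be sketched as follows. The bisection rule and the steep sigmoid test function (standing in for the sharp conversion rise at the onset of the gel effect) are illustrative assumptions, not the authors' exact scheme:

```python
import numpy as np

def adaptive_sample(f, lo, hi, n_init=20, n_total=60):
    """Start from a coarse uniform grid, then repeatedly insert a new
    sample at the midpoint of the interval with the largest output jump."""
    x = list(np.linspace(lo, hi, n_init))
    y = [f(v) for v in x]
    while len(x) < n_total:
        jumps = [abs(y[i + 1] - y[i]) for i in range(len(x) - 1)]
        i = int(np.argmax(jumps))               # steepest region so far
        xm = 0.5 * (x[i] + x[i + 1])
        x.insert(i + 1, xm)
        y.insert(i + 1, f(xm))
    return np.array(x), np.array(y)

# Steep sigmoid around x = 0.7, mimicking a sharp rise in conversion.
f = lambda v: 1.0 / (1.0 + np.exp(-80.0 * (v - 0.7)))
x, _ = adaptive_sample(f, 0.0, 1.0)
# Most of the budget ends up in the steep region around the transition.
print(np.mean((x > 0.6) & (x < 0.8)))
```

A regression model trained on such a sample set sees the high-variation region in much finer detail than a uniform grid of the same size would allow.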

Keywords: batch bulk methyl methacrylate polymerization, adaptive sampling, machine learning, large margin nearest neighbor regression

Procedia PDF Downloads 305
3811 Prioritization of Mutation Test Generation with Centrality Measure

Authors: Supachai Supmak, Yachai Limpiyakorn

Abstract:

Mutation testing can be applied for the quality assessment of test cases, and prioritization of mutation test generation has been a critical element of industry practice that contributes to this evaluation. The industry generally delivers products under time-to-market pressure and thus inevitably sacrifices software testing tasks, even though many test cases are required for software verification. This paper presents an approach that applies a social network centrality measure, PageRank, to prioritize mutation test generation. The source code with the highest PageRank values is focused on first when developing test cases, as these modules are vulnerable to defects or anomalies that may cause consequent defects in many other associated modules. Moreover, the approach helps identify the reducible test cases in the test suite while maintaining the same criteria as the original set of test cases.
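A minimal sketch of the prioritization step, assuming a toy module dependency graph and a plain power-iteration PageRank (the module names are hypothetical):

```python
def pagerank(links, d=0.85, iters=100):
    """Power-iteration PageRank over a dict {node: [nodes it depends on]}.
    An edge u -> v transfers importance to v, so heavily depended-upon
    modules score highest."""
    nodes = list(links)
    n = len(nodes)
    rank = {v: 1.0 / n for v in nodes}
    for _ in range(iters):
        new = {v: (1.0 - d) / n for v in nodes}
        for u, outs in links.items():
            if outs:
                share = rank[u] / len(outs)
                for v in outs:
                    new[v] += d * share
            else:                               # dangling node: spread evenly
                for v in nodes:
                    new[v] += d * rank[u] / n
        rank = new
    return rank

# Hypothetical dependency graph: most modules depend on 'core'.
deps = {"ui": ["core"], "api": ["core"], "db": ["core"], "core": []}
ranks = pagerank(deps)
# 'core' is mutated and tested first: defects there propagate most widely.
print(max(ranks, key=ranks.get))  # -> core
```

Ordering mutation test generation by these scores directs the limited testing budget at the modules whose defects would ripple through the most dependants.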

Keywords: software testing, mutation test, network centrality measure, test case prioritization

Procedia PDF Downloads 113
3810 Space Time Adaptive Algorithm in Bi-Static Passive Radar Systems for Clutter Mitigation

Authors: D. Venu, N. V. Koteswara Rao

Abstract:

Space-time adaptive processing (STAP) is an effective tool for detecting a moving target in spaceborne or airborne radar systems. Airborne passive radar systems utilize broadcast, navigation and communication signals to perform various surveillance tasks, and they have attracted significant interest in recent years because they offer cost-effective alternatives to conventional active radar systems. Moreover, the requirement of only a small number of secondary samples for effective clutter suppression in bi-static passive radar offers abundant illuminator resources for passive surveillance radar systems. This paper presents a framework for incorporating knowledge sources directly into the space-time beamformer of airborne adaptive radars. The STAP algorithm for clutter mitigation in passive bi-static radar better quantifies the reduction in sample size by amalgamating an earlier data bank with the existing radar data sets. We also propose a novel method to estimate the clutter covariance matrix and perform STAP for efficient clutter suppression based on a small sample size. Furthermore, the effectiveness of the proposed algorithm is verified using MATLAB simulations in order to validate the STAP algorithm for passive bi-static radar. In conclusion, this study highlights the value, for various applications, of augmenting traditional active radars with cost-effective passive measures.

Keywords: bistatic radar, clutter, covariance matrix, passive radar, STAP

Procedia PDF Downloads 296
3809 A Review of End-of-Term Oral Tests for English-Majored Students of HCMC Open University

Authors: Khoa K. Doan

Abstract:

Assessment plays an essential role in teaching and learning English as it aims to measure the learning outcomes. Designing appropriate test types and procedures for four skills, especially productive skills, is a very challenging task for teachers of English. The assessment scheme is supposed to provide precise measures and fair opportunities for students to demonstrate what they can do with their language skills. This involves content domains, measurement techniques, administrative feasibility, target populations, and potential sources of testing bias. Based on these elements, a review of end-of-term speaking tests for English-majored students at Ho Chi Minh City Open University (Viet Nam) was undertaken for the purpose of analyzing the strengths and limitations of the testing tool for the speaking assessment. It helped to identify what could be done to facilitate the process of teaching and learning in that context.

Keywords: assessment, oral tests, speaking, testing

Procedia PDF Downloads 320
3808 Comparison of Crossover Types to Obtain Optimal Queries Using Adaptive Genetic Algorithm

Authors: Wafa’ Alma'Aitah, Khaled Almakadmeh

Abstract:

This study presents an information retrieval system that uses a genetic algorithm to increase retrieval efficiency. Using the vector space model, information retrieval is based on the similarity measurement between a query and the documents. Documents with high similarity to the query are judged more relevant and should be retrieved first. Using genetic algorithms, each query is represented by a chromosome; these chromosomes are fed into the genetic operator process of selection, crossover, and mutation until an optimized query chromosome is obtained for document retrieval. Results show that information retrieval with an adaptive crossover probability, single-point crossover, and roulette wheel selection gives the highest recall. The proposed approach is verified using 242 proceedings abstracts collected from the Saudi Arabian national conference.
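The GA loop described above, roulette-wheel selection, single-point crossover and bit-flip mutation, can be sketched on a toy retrieval setup. The binary term-selection chromosome and the cosine fitness against a relevant-document centroid are illustrative simplifications, not the authors' exact formulation; elitism is added so that progress is monotone:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical setup: a query chromosome is a binary term-selection vector
# and fitness is its cosine similarity to the centroid of known relevant
# documents (a stand-in for the recall measure used in the paper).
centroid = np.array([1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 1], float)

def fitness(q):
    denom = np.linalg.norm(q) * np.linalg.norm(centroid)
    return (q @ centroid) / denom if denom else 0.0

n_pop, n_bits = 30, len(centroid)
pop = rng.integers(0, 2, (n_pop, n_bits)).astype(float)
initial_best = max(fitness(q) for q in pop)

for _ in range(60):
    fit = np.array([fitness(q) for q in pop])
    elite = pop[fit.argmax()].copy()            # remember the best chromosome
    probs = fit / fit.sum()                     # roulette-wheel selection
    parents = pop[rng.choice(n_pop, size=n_pop, p=probs)]
    children = []
    for i in range(0, n_pop, 2):
        cut = rng.integers(1, n_bits)           # single-point crossover
        a, b = parents[i], parents[i + 1]
        children.append(np.r_[a[:cut], b[cut:]])
        children.append(np.r_[b[:cut], a[cut:]])
    pop = np.array(children)
    flip = rng.random(pop.shape) < 0.02         # bit-flip mutation
    pop[flip] = 1.0 - pop[flip]
    pop[0] = elite                              # elitism

best = max(fitness(q) for q in pop)
print(best >= initial_best)  # -> True
```

Swapping the crossover operator (e.g., two-point or uniform) in the inner loop is all that is needed to reproduce the kind of comparison the study performs.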

Keywords: genetic algorithm, information retrieval, optimal queries, crossover

Procedia PDF Downloads 294
3807 Online Prediction of Nonlinear Signal Processing Problems Based Kernel Adaptive Filtering

Authors: Hamza Nejib, Okba Taouali

Abstract:

This paper presents two of the best-known kernel adaptive filtering (KAF) approaches, kernel least mean squares (KLMS) and kernel recursive least squares (KRLS), used to predict new outputs in nonlinear signal processing. Both methods implement a nonlinear transfer function using kernel methods in a particular space named the reproducing kernel Hilbert space (RKHS), where the model is a linear combination of kernel functions applied to transform the observed data from the input space to a high-dimensional feature space; this idea is known as the kernel trick. KAF then consists of developing filters in the RKHS. We use two nonlinear signal processing problems, Mackey-Glass chaotic time series prediction and nonlinear channel equalization, to assess the performance of the presented approaches and, finally, to determine which of them is better suited.
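The KLMS branch of the KAF family can be sketched directly from the description above: each prediction is a growing linear combination of Gaussian kernels centred at past inputs, with coefficients set by the LMS error. The two-tone time series, step size and kernel width below are illustrative choices, not the paper's Mackey-Glass setup:

```python
import numpy as np

def klms_predict(X, y, eta=0.2, sigma=1.0):
    """Kernel LMS: online one-step-ahead prediction in the RKHS induced by
    a Gaussian kernel; returns the sequence of a-priori predictions."""
    gauss = lambda a, b: np.exp(-np.sum((a - b) ** 2) / (2 * sigma ** 2))
    centers, alphas, preds = [], [], []
    for x, d in zip(X, y):
        f = sum(al * gauss(cn, x) for al, cn in zip(alphas, centers))
        preds.append(f)
        alphas.append(eta * (d - f))            # new expansion coefficient
        centers.append(x.copy())                # ... with its kernel centre
    return np.array(preds)

# Two-tone test signal; four lags make the next sample fully predictable.
t = np.arange(500)
s = np.sin(0.3 * t) * np.cos(0.11 * t)
X = np.column_stack([s[:-4], s[1:-3], s[2:-2], s[3:-1]])
y = s[4:]

preds = klms_predict(X, y)
mse_late = np.mean((preds[-50:] - y[-50:]) ** 2)
naive = np.mean((y[-50:] - X[-50:, -1]) ** 2)   # "repeat previous sample"
print(mse_late < naive)                         # filter has learned the map
```

KRLS replaces the gradient update with a recursive least-squares solution over the same kernel expansion, trading higher per-sample cost for faster convergence.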

Keywords: online prediction, KAF, signal processing, RKHS, kernel methods, KRLS, KLMS

Procedia PDF Downloads 401
3806 Comparison of Spiking Neuron Models in Terms of Biological Neuron Behaviours

Authors: Fikret Yalcinkaya, Hamza Unsal

Abstract:

To understand how neurons work, experimental studies in neural science must be combined with numerical simulations of neuron models in a computer environment. In this regard, the simplicity and applicability of spiking neuron modeling functions have been of great interest in computational and numerical neuroscience in recent years. Spiking neuron models can be classified by the various neuronal behaviours they exhibit, such as spiking and bursting, and these classifications are important for researchers working in theoretical neuroscience. In this paper, three different spiking neuron models, Izhikevich, adaptive exponential integrate-and-fire (AEIF) and Hindmarsh-Rose (HR), which are based on first-order differential equations, are discussed and compared. First, the physical meanings, derivations, and differential equations of each model are provided and simulated in the Matlab environment. Then, by selecting appropriate parameters, the models were visually examined in the Matlab environment, with the aim of demonstrating which model can reproduce well-known biological neuron behaviours such as tonic spiking, tonic bursting, mixed-mode firing, spike frequency adaptation, resonating and integrating. As a result, the Izhikevich model has been shown to reproduce regular spiking, chattering, intrinsically bursting, thalamo-cortical, low-threshold spiking and resonator behaviours. The adaptive exponential integrate-and-fire model was able to produce firing patterns such as regular firing, adaptive firing, initial bursting, regular bursting, delayed firing, delayed regular bursting, transient firing and irregular firing. The Hindmarsh-Rose model showed three different dynamic neuron behaviours: spiking, bursting and chaotic firing.
From these results, the Izhikevich model may be preferred for its ability to reflect the true behaviour of the nerve cell, to produce different spike types, and to scale to larger brain models. The most important reason for choosing the adaptive exponential integrate-and-fire model is that it can create rich firing patterns with few parameters. The chaotic behaviour of the Hindmarsh-Rose neuron model, like that of other chaotic systems, is thought to be applicable in many scientific and engineering fields such as physics, secure communication and signal processing.
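A minimal sketch of the comparison workflow, using the Izhikevich model in plain Python rather than Matlab. The parameter sets are the standard ones from Izhikevich (2003); the Euler step size and input current are illustrative choices:

```python
def izhikevich(a, b, c, d, I=10.0, T=500.0, dt=0.25):
    """Euler simulation of one Izhikevich neuron; returns spike times (ms).
    v' = 0.04 v^2 + 5 v + 140 - u + I,  u' = a (b v - u),
    with the reset v <- c, u <- u + d whenever v >= 30 mV."""
    v, u, spikes = -65.0, b * (-65.0), []
    for step in range(int(T / dt)):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                           # spike detected: reset
            spikes.append(step * dt)
            v, u = c, u + d
    return spikes

# Two standard parameter sets: only c and d differ, yet the firing
# pattern changes qualitatively.
rs = izhikevich(0.02, 0.2, -65.0, 8.0)          # regular spiking
ch = izhikevich(0.02, 0.2, -50.0, 2.0)          # chattering (bursting)
print(len(rs), len(ch))                         # both cells fire repeatedly
```

Plotting the membrane potential traces for each parameter set is how the behaviours listed above (regular spiking, chattering, bursting, and so on) are identified visually.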

Keywords: Izhikevich, adaptive exponential integrate-and-fire, Hindmarsh-Rose, biological neuron behaviours, spiking neuron models

Procedia PDF Downloads 183
3805 Adaptation Mechanism and Planning Response to Resiliency Shrinking of Small Towns Based on Complex Adaptive System by Taking Wuhan as an Example

Authors: Yanqun Li, Hong Geng

Abstract:

The rapid urbanization process, with big cities as its main body, leads to an unequal configuration of urban and rural areas in terms of land supply, industrial division of labor, service supply and space allocation, and induces shrinkage of service capacity, industrial systems and population vitality in small towns. As small towns are an important spatial unit in the spectrum of urbanization that serves, connects and couples urban and rural areas, the shrinking phenomenon they face has an important influence on the healthy development of urbanization. Based on a census of small towns in the Wuhan metropolitan area, we have found that the shrinking of small towns is a passive, elastic contraction under the squeeze of cities: once affected by external forces such as policy regulation, planning guidance and population return, small towns can again achieve expansion and growth. Based on the theory of complex adaptive systems, this paper constructs a comprehensive evaluation index system of small-town development covering five aspects (population, economy, space, society and ecology), measures the shrinkage level of small towns, further analyzes its characteristics, and identifies whether the shrinkage is elastic. The paper then measures the resilience capacity of shrinking small towns in the same five aspects. Finally, it proposes an adaptive mechanism of urban-rural interactive evolution under a fine division of labor to respond to the passive shrinking of small towns in Wuhan. On this basis, the paper creatively puts forward planning response measures for small towns in terms of spatial layout, functional orientation and service support, which can provide a reference for other regions.

Keywords: complex adaptive systems, resiliency shrinking, adaptation mechanism, planning response

Procedia PDF Downloads 125
3804 Adaptive Beamforming with Steering Error and Mutual Coupling between Antenna Sensors

Authors: Ju-Hong Lee, Ching-Wei Liao

Abstract:

Owing to close antenna spacing within a compact space, part of the data in one antenna sensor leaks into the other antenna sensors when the sensors in an antenna array operate simultaneously. This phenomenon is called the mutual coupling effect (MCE). It has been shown that the performance of antenna array systems can be degraded when the antenna sensors are in close proximity; in particular, in systems equipped with massive numbers of antenna sensors, degradation of beamforming performance due to the MCE is significant and inevitable. Moreover, it has been shown that even a small angle error between the true direction angle of the desired signal and the steering angle deteriorates the effectiveness of an array beamforming system. However, the true direction vector of the desired signal may not be exactly known in some applications, e.g., in land mobile-cellular wireless systems. Therefore, it is worth developing robust techniques to deal with the problems caused by the MCE and the steering angle error in array beamforming systems. In this paper, we present an efficient technique for performing adaptive beamforming that is robust against both the MCE and the steering angle error. Only the data vector received by the antenna array is required by the proposed technique. Using the received array data vector, a correlation matrix is constructed to replace the original correlation matrix associated with the received array data vector. Then, the mutual coupling matrix due to the MCE on the antenna array is estimated through a recursive algorithm. An appropriate estimate of the direction angle of the desired signal can also be obtained during the recursive process. Based on the estimated mutual coupling matrix, the estimated direction angle, and the reconstructed correlation matrix, the proposed technique can effectively cure the performance degradation due to the steering angle error and the MCE.
The novelty of the proposed technique is that the implementation procedure is very simple while the resulting adaptive beamforming performance is satisfactory. Simulation results show that the proposed technique provides much better beamforming performance than the existing robust techniques, without requiring high computational complexity.
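To illustrate the mismatch problem the paper addresses, the sketch below builds a banded mutual-coupling matrix and a 2 degree steering error, then compares a mismatched MVDR beamformer with a diagonally loaded one. Diagonal loading is a standard robustification used here only for illustration; it is not the authors' recursive coupling-estimation technique, and all scenario parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 8
steer = lambda deg: np.exp(1j * np.pi * np.arange(n) * np.sin(np.radians(deg)))

# Banded mutual-coupling matrix: each sensor leaks into its neighbours.
C = np.eye(n, dtype=complex) + 0.3 * (np.eye(n, k=1) + np.eye(n, k=-1))

true_doa, assumed_doa = 10.0, 12.0              # 2 degree steering error
a_true = C @ steer(true_doa)                    # steering actually received
a_nom = steer(assumed_doa)                      # mismatched nominal model
a_int = C @ steer(-30.0)                        # interference steering

# Snapshots contain the desired signal, an interferer and noise.
X = np.array([
    rng.normal() * a_true + 2.0 * rng.normal() * a_int +
    (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    for _ in range(200)])
R = X.conj().T @ X / len(X)

def mvdr(R, a, load=0.0):
    """MVDR weights, optionally with diagonal loading for robustness."""
    w = np.linalg.solve(R + load * np.eye(len(a)), a)
    return w / (a.conj() @ w)                   # distortionless toward a

w_plain = mvdr(R, a_nom)                        # mismatched MVDR
w_robust = mvdr(R, a_nom, load=10.0)            # heavy diagonal loading

gain = lambda w: abs(w.conj() @ a_true)         # gain on the actual signal
print(gain(w_robust) > gain(w_plain))           # loading protects the target
```

The plain beamformer partially cancels the desired signal because the true steering vector (coupled and offset by 2 degrees) no longer matches the constraint; the paper's technique instead estimates the coupling matrix and direction angle so the constraint can be corrected rather than merely regularized.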

Keywords: adaptive beamforming, mutual coupling effect, recursive algorithm, steering angle error

Procedia PDF Downloads 323