Search results for: Scalable performance testing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15329

15149 Characterization of Martensitic Stainless Steel Japanese Grade AISI 420A

Authors: T. Z. Butt, T. A. Tabish, K. Anjum, H. Hafeez

Abstract:

A study of surgical-grade martensitic stainless steel AISI 420A produced in Japan was carried out in this research work. The sample had already been annealed at about 898˚C. The samples were subjected to chemical analysis, hardness, tensile and metallographic tests. These tests were performed on as-received annealed and heat-treated samples. In the annealed condition, the sample showed a hardness of 0 HRC; on tensile testing in the annealed condition, it showed maximum elongation. The heat treatment was carried out in a vacuum furnace within the temperature range of 980-1035°C. The quenching of the samples was carried out using liquid nitrogen. After hardening, the samples were subjected to tempering, which was carried out in a vacuum tempering furnace at a temperature of 220˚C. The hardened samples were subjected to hardness and tensile testing; they showed maximum hardness values and minimum elongation. The sample in the annealed state showed coarse plates of martensite structure. The results indicate that the studied steels can be used as biomaterials.

Keywords: biomaterials, martensitic steel, microstructure, tensile testing, hardening, tempering, bioinstrumentation

Procedia PDF Downloads 277
15148 On the Resilience of Operational Technology Devices in Penetration Tests

Authors: Marko Schuba, Florian Kessels, Niklas Reitz

Abstract:

Operational technology (OT) controls physical processes in critical infrastructures and economically important industries. With the convergence of OT with classical information technology (IT), rising cybercrime worldwide and the increasingly difficult geopolitical situation, the risks of OT infrastructures being attacked are growing. Classical penetration testing, in which testers take on the role of an attacker, has so far found little acceptance in the OT sector - the risk that a penetration test could do more harm than good seems too great. This paper examines the resilience of various OT systems using typical penetration test tools. It is shown that such a test certainly involves risks, but is also feasible in OT if a cautious approach is taken. Therefore, OT penetration testing should be considered as a tool to improve the cyber security of critical infrastructures.

Keywords: penetration testing, OT, ICS, OT security

Procedia PDF Downloads 15
15147 A Comparative Soft Computing Approach to Supplier Performance Prediction Using GEP and ANN Models: An Automotive Case Study

Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari

Abstract:

In multi-echelon supply chain networks, optimal supplier selection depends significantly on the accuracy of suppliers’ performance prediction. Different methods of multi-criteria decision making, such as ANN, GA, fuzzy logic, AHP, etc., have previously been used to predict supplier performance, but the “black-box” characteristic of these methods is still a major concern to be resolved. Therefore, the primary objective of this paper is to implement an artificial intelligence-based gene expression programming (GEP) model and compare its prediction accuracy with that of ANN. A full factorial design with a 95% confidence interval is initially applied to determine the appropriate set of criteria for supplier performance evaluation. A test-train approach is then utilized for the ANN and GEP exclusively. The training results are used to find the optimal network architecture, and the testing data determine the prediction accuracy of each method based on the root mean square error (RMSE) and the correlation coefficient (R²). The results of a case study conducted at Supplying Automotive Parts Co. (SAPCO), with more than 100 local and foreign supply chain members, revealed that, in comparison with ANN, gene expression programming has a significant advantage in predicting supplier performance, judged by the respective RMSE and R-squared values. Moreover, using GEP, a mathematical function was also derived to address the black-box structure of ANN in modeling performance prediction.
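
The two accuracy measures used to compare the models can be sketched in a few lines of Python. The scores and predictions below are invented for illustration, not data from the SAPCO case study:

```python
import math

def rmse(actual, predicted):
    """Root mean square error between observed and predicted values."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def r_squared(actual, predicted):
    """Coefficient of determination (R^2): 1 - SS_res / SS_tot."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

# Invented supplier performance scores and two sets of model predictions
actual = [80.0, 85.0, 78.0, 92.0, 88.0]
gep_pred = [81.0, 84.0, 79.0, 91.0, 87.5]
ann_pred = [83.0, 82.0, 81.0, 89.0, 85.0]

# The candidate with lower RMSE and higher R^2 is preferred
gep_better = rmse(actual, gep_pred) < rmse(actual, ann_pred)
```
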

Keywords: supplier performance prediction, ANN, GEP, automotive, SAPCO

Procedia PDF Downloads 419
15146 Navigating Cyber Attacks with Quantum Computing: Leveraging Vulnerabilities and Forensics for Advanced Penetration Testing in Cybersecurity

Authors: Sayor Ajfar Aaron, Ashif Newaz, Sajjat Hossain Abir, Mushfiqur Rahman

Abstract:

This paper examines the transformative potential of quantum computing in the field of cybersecurity, with a focus on advanced penetration testing and forensics. It explores how quantum technologies can be leveraged to identify and exploit vulnerabilities more efficiently than traditional methods and how they can enhance the forensic analysis of cyber-attacks. Through theoretical analysis and practical simulations, this study highlights the enhanced capabilities of quantum algorithms in detecting and responding to sophisticated cyber threats, providing a pathway for developing more resilient cybersecurity infrastructures.

Keywords: cybersecurity, cyber forensics, penetration testing, quantum computing

Procedia PDF Downloads 67
15145 Modelling and Control of Electrohydraulic System Using Fuzzy Logic Algorithm

Authors: Hajara Abdulkarim Aliyu, Abdulbasid Ismail Isa

Abstract:

This research paper studies an electrohydraulic system for its role in position and motion control and develops a mathematical model describing the behaviour of the system. The research further proposes fuzzy logic and conventional PID controllers in order to achieve both accurate positioning of the payload and overall improvement of system performance. The simulation results show that the fuzzy logic controller has superior tracking performance and high disturbance rejection efficiency, with a shorter settling time, less overshoot, and smaller values of the integral of absolute error and deviation errors than the conventional PID controller under all testing conditions.
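
The comparison criteria named above (settling time, overshoot, integral of absolute error) can be computed from a sampled step response along these lines. This is a minimal sketch on a toy first-order response, not the paper's electrohydraulic model:

```python
import math

def step_metrics(t, y, setpoint=1.0, tol=0.02):
    """Overshoot (%), 2%-band settling time, and IAE of a sampled step response."""
    overshoot = max(0.0, (max(y) - setpoint) / setpoint * 100.0)
    settling = t[-1]
    for i in range(len(y)):
        # first instant after which the response stays inside the tolerance band
        if all(abs(v - setpoint) <= tol * setpoint for v in y[i:]):
            settling = t[i]
            break
    dt = t[1] - t[0]
    iae = sum(abs(setpoint - v) * dt for v in y)  # integral of absolute error
    return overshoot, settling, iae

# Toy first-order response y(t) = 1 - exp(-2t): no overshoot, IAE ~ 0.5
t = [i * 0.01 for i in range(1000)]
y = [1 - math.exp(-2 * ti) for ti in t]
os_, ts, iae = step_metrics(t, y)
```

A fuzzy controller that beats a PID on these three numbers simultaneously is then "superior" in exactly the sense the abstract claims.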

Keywords: electrohydraulic, fuzzy logic, modelling, NZ-PID

Procedia PDF Downloads 470
15144 Correlates of Peer Influence and Resistance to HIV/AIDS Counselling and Testing among Students in Tertiary Institutions in Kano State, Nigeria

Authors: A. S. Haruna, M. U. Tambawal, A. A. Salawu

Abstract:

The psychological impact of peer influence on its individual group members can make them resist HIV/AIDS counselling and testing. This study investigated the correlates of peer influence and resistance to HIV/AIDS counselling and testing among students in tertiary institutions in Kano state, Nigeria. To achieve this, three null hypotheses were postulated and tested. A cross-sectional survey design was employed in which a sample of 1512 was selected from a student population of 104,841 using simple random sampling. A self-developed 20-item scale called the Peer Influence and Psychological Resistance Inventory (PIPRI) was used for data collection. The Pearson product moment correlation coefficient (PPMCC) via the test-retest method was applied to estimate a reliability coefficient of 0.86 for the scale. The data obtained were analyzed using the t-test and PPMCC at the 0.05 level of confidence. Results reveal that 26.3% (397) of the respondents were influenced by their peer group, while 39.8% showed resistance. Also, the t-test and PPMCC statistics were greater than their respective critical values, showing a significant gender difference in peer influence and a difference between peer influence and resistance to HIV/AIDS counselling and testing. Furthermore, a positive relationship between peer influence and resistance to HIV/AIDS counselling and testing was shown. A major recommendation suggests the use of reinforcement and social support for positive attitudes and maintenance of safe behaviour among students who patronize HIV/AIDS counselling.
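
The test-retest reliability estimate above rests on the Pearson product moment correlation between first and second administrations of the scale. A minimal sketch (with invented scores, not the study's data) is:

```python
import math

def pearson_r(x, y):
    """Pearson product moment correlation coefficient (PPMCC)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented test-retest scores on a 20-item inventory; the study's
# reported reliability coefficient for PIPRI was 0.86
first = [12, 15, 9, 18, 14, 11, 16]
second = [13, 14, 10, 17, 15, 10, 17]
r = pearson_r(first, second)
```
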

Keywords: peer group influence, HIV/AIDS counselling and testing, psychological resistance, students

Procedia PDF Downloads 390
15143 The Development and Testing of Greenhouse Comprehensive Environment Control System

Authors: Mohammed Alrefaie, Yaser Miaji

Abstract:

Greenhouses provide a convenient means to grow plants in the best environment. They achieve this by trapping heat from sunlight and using artificial means to enhance the environment of the greenhouse. This includes controlling factors such as air flow, light intensity and the amount of water, among others, that can have a big impact on plant growth. The aim of the greenhouse is to obtain the maximum possible yield from the plants. This report details the development and testing of a greenhouse environment control system that can regulate light intensity, airflow and power supply inside the greenhouse. The details of the module development to control these three factors, along with the results of testing, are presented.

Keywords: greenhouse, control system, light intensity, comprehensive environment

Procedia PDF Downloads 482
15142 Testing the Change in Correlation Structure across Markets: High-Dimensional Data

Authors: Malay Bhattacharyya, Saparya Suresh

Abstract:

The correlation structure associated with a portfolio is subject to variation across time. Studying the structural breaks in the time-dependent correlation matrix associated with a collection of assets has been a subject of interest for a better understanding of market movements, portfolio selection, etc. The current paper proposes a methodology for testing the change in the time-dependent correlation structure of a portfolio in high-dimensional data using the techniques of the generalized inverse, singular value decomposition and multivariate distribution theory, which has not been addressed so far. The asymptotic properties of the proposed test are derived. Also, the performance and the validity of the method are tested on a real data set. The proposed test performs well for detecting the change in the dependence of global markets in the context of high-dimensional data.
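
As a back-of-the-envelope illustration of what a change in correlation structure looks like, the sketch below compares the sample correlation matrices of two return windows by their Frobenius-norm distance. This is illustrative only: it is not the generalized-inverse/SVD-based test statistic whose asymptotics the paper derives, and the data are simulated:

```python
import numpy as np

def corr_change(window_a, window_b):
    """Frobenius-norm distance between the sample correlation matrices of
    two return windows (columns = assets). An illustrative summary of
    structural change, not a formal test statistic."""
    ca = np.corrcoef(window_a, rowvar=False)
    cb = np.corrcoef(window_b, rowvar=False)
    return np.linalg.norm(ca - cb, ord="fro")

rng = np.random.default_rng(0)
before = rng.standard_normal((250, 5))       # 5 assets, 250 days, uncorrelated
common = rng.standard_normal((250, 1))
after = before + 2.0 * common                # inject a common factor after the break
change = corr_change(before, after)          # clearly positive
```
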

Keywords: correlation structure, high dimensional data, multivariate distribution theory, singular value decomposition

Procedia PDF Downloads 125
15141 Electricity Market Categorization for Smart Grid Market Testing

Authors: Rebeca Ramirez Acosta, Sebastian Lenhoff

Abstract:

Decision makers worldwide need to determine whether the implementation of a new market mechanism will contribute to the sustainability and resilience of the power system. Due to smart grid technologies, new products in the distribution and transmission system can be traded; however, the impact of changing a market rule will differ between regions. To test those impacts systematically, a market categorization has been compiled and organized in a smart grid market testing toolbox. This toolbox maps all actual energy products and sets the basis for running a co-simulation test with the new rule to be implemented. It will help to measure the impact of the new rule based on sustainability and resilience indicators.

Keywords: co-simulation, electricity market, smart grid market, market testing

Procedia PDF Downloads 189
15140 Cost-Effective, Accuracy Preserving Scalar Characterization for mmWave Transceivers

Authors: Mohammad Salah Abdullatif, Salam Hajjar, Paul Khanna

Abstract:

The development of instrument-grade mmWave transceivers comes with many challenges. A general rule of thumb is that the performance of the instrument must be higher than that of the unit under test in terms of accuracy and stability. The calibration and characterization of mmWave transceivers are important pillars for testing commercial products. Using a vector network analyzer (VNA) with a mixer option has proven to deliver high performance as an approach to calibrating mmWave transceivers. However, this approach comes at a high cost. In this work, a reduced-cost method to calibrate mmWave transceivers is proposed, and a comparison between the proposed method and the VNA technology is provided. Significant challenges are discussed, and an approach to meet the requirements is proposed.

Keywords: mmWave transceiver, scalar characterization, coupler connection, magic tee connection, calibration, VNA, vector network analyzer

Procedia PDF Downloads 107
15139 Flame Spray Pyrolysis as a High-Throughput Method to Generate Gadolinium Doped Titania Nanoparticles for Augmented Radiotherapy

Authors: Malgorzata J. Rybak-Smith, Benedicte Thiebaut, Simon Johnson, Peter Bishop, Helen E. Townley

Abstract:

Gadolinium doped titania (TiO2:Gd) nanoparticles (NPs) can be activated by X-ray radiation to generate reactive oxygen species (ROS), which can be effective in killing cancer cells. As such, treatment with these NPs can be used to enhance the efficacy of conventional radiotherapy. Incorporation of the NPs into tumour tissue will permit the extension of radiotherapy to currently untreatable tumours deep within the body and also reduce damage to neighbouring healthy cells. In an attempt to find a fast and scalable method for the synthesis of the TiO2:Gd NPs, the use of flame spray pyrolysis (FSP) was investigated. A series of TiO2 NPs were generated with 1, 2, 5 and 7 mol% gadolinium dopant. Post-synthesis, the TiO2:Gd NPs were silica-coated to improve their biocompatibility. Physico-chemical characterisation was used to determine the size and stability of the NPs in aqueous suspension. All analysed TiO2:Gd NPs were shown to have relatively high photocatalytic activity. Furthermore, the FSP-synthesised silica-coated TiO2:Gd NPs generated enhanced ROS in chemico. Studies on rhabdomyosarcoma (RMS) cell lines (RD and RH30) demonstrated that in the absence of irradiation all TiO2:Gd NPs were inert. However, application of TiO2:Gd NPs to RMS cells, followed by irradiation, showed a significant decrease in cell proliferation. Consequently, our studies showed that the X-ray-activatable TiO2:Gd NPs can be prepared by a high-throughput scalable technique to provide a novel and affordable anticancer therapy.

Keywords: cancer, gadolinium, ROS, titania nanoparticles, X-ray

Procedia PDF Downloads 431
15138 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis

Authors: Wenbo Du, Xiaomei Ma

Abstract:

With the empowerment of cognitive diagnostic assessment (CDA), various domains of language testing and assessment have been investigated to extract more diagnostic information. Noticeably, most of the extant empirical CDA-based research puts much emphasis on individual-level diagnosis, with very little concerned with learners’ group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, as it might be more practical in a classroom setting. Additionally, the group-level diagnostic information obtained via current CDA always results in a “flat pattern”, that is, mastery/non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome offers little benefit beyond the original total score. To address these issues, the present study applies cluster analysis for group classification and quantile regression analysis to pinpoint learners’ performance at different proficiency levels (beginner, intermediate and advanced), thereby enhancing the interpretation of the CDA results extracted from a group of EFL learners’ performance on a diagnostic reading test designed by the PELDiaG research team at a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than those of CDA, and quantile regression analysis pictures more insightful characteristics of learners with different reading proficiencies. The findings are helpful and practical for instructors refining the EFL reading curriculum and instructional plans tailored to the group classification and quantile regression results. Meanwhile, these statistical methods could also compensate for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
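
For readers unfamiliar with quantile regression, the estimator minimises the pinball (quantile) loss rather than squared error, which is what lets it profile learners at a chosen quantile of the score distribution rather than at the mean. A minimal sketch of that loss (not the study's fitted model) is:

```python
def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss minimised by quantile regression at quantile q.
    Under-predictions are weighted by q, over-predictions by (1 - q)."""
    total = 0.0
    for yt, yp in zip(y_true, y_pred):
        err = yt - yp
        total += q * err if err >= 0 else (q - 1) * err
    return total / len(y_true)

# At q = 0.25 the fitted line tracks the lower (beginner) end of the
# score distribution; at q = 0.75 it tracks the upper (advanced) end.
low_q = pinball_loss([10.0, 20.0, 30.0], [15.0, 25.0, 35.0], 0.25)
```
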

Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression

Procedia PDF Downloads 146
15137 Structural Evaluation of Airfield Pavement Using Finite Element Analysis Based Methodology

Authors: Richard Ji

Abstract:

Nondestructive deflection testing has been accepted widely as a cost-effective tool for evaluating the structural condition of airfield pavements. Backcalculation of pavement layer moduli can be used to characterize the pavement's existing condition in order to compute its load-bearing capacity. This paper presents an improved best-fit backcalculation methodology based on deflection predictions obtained using the finite element method (FEM). The best-fit approach is based on minimizing the squared error between falling weight deflectometer (FWD) measured deflections and FEM-predicted deflections. The concrete elastic modulus and the modulus of subgrade reaction were then back-calculated using heavy weight deflectometer (HWD) deflections collected at the National Airport Pavement Testing Facility (NAPTF) test site. It is an alternative and more versatile method for considering concrete slab geometry and HWD testing locations compared with the methods currently available.
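
The best-fit backcalculation idea, choosing the layer modulus that minimises the squared error between measured and predicted deflection basins, can be sketched as follows. The forward model here is a deliberately simplified stand-in for the FEM predictions, and all numbers are hypothetical:

```python
def predicted_deflection(load, modulus, radii):
    """Toy forward model: deflection falls off with modulus and sensor offset.
    A stand-in for the FEM prediction, purely illustrative."""
    return [load / (modulus * (1.0 + r)) for r in radii]

def backcalculate(measured, load, radii, candidates):
    """Best-fit search: pick the candidate modulus minimising the squared
    error between measured and model-predicted deflection basins."""
    def sse(modulus):
        pred = predicted_deflection(load, modulus, radii)
        return sum((m - p) ** 2 for m, p in zip(measured, pred))
    return min(candidates, key=sse)

radii = [0.0, 0.3, 0.6, 0.9, 1.2]            # sensor offsets (m), hypothetical
true_modulus = 4000.0                         # hypothetical slab modulus
measured = predicted_deflection(40.0, true_modulus, radii)
candidates = [1000.0 * k for k in range(1, 11)]
best = backcalculate(measured, 40.0, radii, candidates)
```

On noise-free synthetic deflections the search recovers the modulus that generated them; a real implementation iterates the FEM model inside the error loop.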

Keywords: nondestructive testing, pavement moduli backcalculation, finite element method, concrete pavements

Procedia PDF Downloads 166
15136 A Study on Design for Parallel Test Based on Embedded System

Authors: Zheng Sun, Weiwei Cui, Xiaodong Ma, Hongxin Jin, Dongpao Hong, Jinsong Yang, Jingyi Sun

Abstract:

With the improvement of the performance and complexity of modern equipment, automatic test systems (ATS) have become widely used for condition monitoring and fault diagnosis. However, the conventional ATS mainly works in a serial mode and lacks the ability to test several pieces of equipment at the same time. That leads to low test efficiency and ATS redundancy. Especially for the large majority of equipment under test, the conventional ATS cannot meet the requirement of efficient testing. To reduce support resources and increase test efficiency, we propose a design method for parallel testing based on an embedded system in this paper. Firstly, we put forward the general framework of the parallel test system; the system contains a central management system (CMS) and several distributed test subsystems (DTS). Then we give a detailed design of the system. For the hardware, we use an embedded architecture to design the DTS; for the software, we use a test program set to improve test adaptability. By deploying the parallel test system, the time to test five devices is now equal to the time previously needed to test one device. Compared with the conventional test system, the proposed test system reduces the size and improves testing efficiency, which is of great significance for putting equipment into operation swiftly. Finally, we take an industrial control system as an example to verify the effectiveness of the proposed method. The results show that the method is reasonable and that the efficiency is improved by up to 500%.
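
The claimed speed-up follows directly from running the per-device test sequences concurrently instead of serially. The sketch below simulates five DTS-style workers with threads; the sleep stands in for actual measurement time, and the framework itself is an illustrative assumption, not the paper's embedded implementation:

```python
import concurrent.futures
import time

def run_test(device_id):
    """Simulated test sequence executed by one distributed test subsystem (DTS)."""
    time.sleep(0.1)                 # stand-in for the real measurement time
    return device_id, "PASS"

devices = list(range(5))
start = time.perf_counter()
# The CMS role here is just dispatching tests and collecting results
with concurrent.futures.ThreadPoolExecutor(max_workers=5) as pool:
    results = dict(pool.map(run_test, devices))
elapsed = time.perf_counter() - start   # ~0.1 s rather than the serial 5 x 0.1 s
```
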

Keywords: parallel test, embedded system, automatic test system (ATS), central management system (CMS), distributed test subsystems (DTS)

Procedia PDF Downloads 305
15135 Optimization of Geometric Parameters of Microfluidic Channels for Flow-Based Studies

Authors: Parth Gupta, Ujjawal Singh, Shashank Kumar, Mansi Chandra, Arnab Sarkar

Abstract:

Microfluidic devices have emerged as indispensable tools across various scientific disciplines, offering precise control and manipulation of fluids at the microscale. Their efficacy in flow-based research, spanning engineering, chemistry, and biology, relies heavily on the geometric design of microfluidic channels. This work introduces a novel approach to optimising these channels through response surface methodology (RSM), departing from the conventional practice of addressing one parameter at a time. Traditionally, optimising microfluidic channels involved isolated adjustments to individual parameters, limiting the comprehensive understanding of their combined effects. In contrast, our approach considers the simultaneous impact of multiple parameters, employing RSM to efficiently explore the complex design space. The outcome is an innovative microfluidic channel that consumes an optimal sample volume and minimises flow time, enhancing overall efficiency. The relevance of geometric parameter optimisation in microfluidic channels extends significantly into biomedical engineering. The flow characteristics of porous materials within these channels depend on many factors, including fluid viscosity, environmental conditions (such as temperature and humidity), and specific design parameters like sample volume, channel width, channel length, and substrate porosity. This intricate interplay directly influences the performance and efficacy of microfluidic devices, which, if not optimised, can lead to increased costs and errors in disease testing and analysis. In the context of biomedical applications, the proposed approach addresses the critical need for precision in fluid flow. It mitigates the manufacturing costs associated with trial-and-error methodologies by optimising multiple geometric parameters concurrently. The resulting microfluidic channels offer enhanced performance and contribute to a streamlined, cost-effective process for testing and analysing diseases.
A key highlight of our methodology is its consideration of the interconnected nature of geometric parameters. For instance, the volume of the sample, when optimized alongside channel width, length, and substrate porosity, creates a synergistic effect that minimizes errors and maximizes efficiency. This holistic optimization approach ensures that microfluidic devices operate at their peak performance, delivering reliable results in disease testing.
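
At the heart of RSM is fitting a second-order polynomial response surface to measurements taken at designed factor combinations. The sketch below fits such a surface by ordinary least squares over a hypothetical 3x3 factorial design in two factors (channel width and length); the factor names, levels, and coefficients are invented for illustration:

```python
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    """Ordinary least squares fit of the second-order model used in RSM:
    y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1 ** 2, x2 ** 2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Hypothetical 3x3 factorial design: channel width (mm) and length (mm)
X1, X2 = np.meshgrid([0.2, 0.4, 0.6], [10.0, 20.0, 30.0])
x1, x2 = X1.ravel(), X2.ravel()
true_b = np.array([5.0, 2.0, -0.1, 1.5, 0.002, 0.05])   # invented coefficients
y = (true_b[0] + true_b[1] * x1 + true_b[2] * x2
     + true_b[3] * x1 ** 2 + true_b[4] * x2 ** 2 + true_b[5] * x1 * x2)
coef = fit_quadratic_surface(x1, x2, y)   # recovers true_b on noise-free data
```

The fitted surface is then interrogated (analytically or by search) for the factor settings that minimise sample volume and flow time simultaneously.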

Keywords: microfluidic device, minitab, statistical optimization, response surface methodology

Procedia PDF Downloads 68
15134 A Multi-Release Software Reliability Growth Models Incorporating Imperfect Debugging and Change-Point under the Simulated Testing Environment and Software Release Time

Authors: Sujit Kumar Pradhan, Anil Kumar, Vijay Kumar

Abstract:

The testing process during software development is a crucial step, as it makes the software more efficient and dependable. To estimate software reliability through the mean value function, many software reliability growth models (SRGMs) have been developed under the assumption that the operating and testing environments are the same. Practically, this is not true, because when the software works in a natural field environment, its reliability differs. This article discusses an SRGM comprising a change-point and imperfect debugging in a simulated testing environment, and later extends it in a multi-release direction. Initially, the software is released to the market with few features; according to market demand, the software company upgrades the current version by adding new features as time passes. Therefore, we have proposed a generalized multi-release SRGM in which the change-point and imperfect-debugging concepts are addressed in a simulated testing environment. The failure-increasing-rate concept has been adopted to determine the change point for each software release. Based on nine goodness-of-fit criteria, the proposed model is validated on two real datasets. The results demonstrate that the proposed model fits the datasets better. We have also discussed the optimal release time of the software through a cost model, assuming that the testing and debugging costs are time-dependent.
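
As an illustration of an NHPP mean value function with a change-point, the sketch below uses a Goel-Okumoto-style form in which the fault detection rate switches from b1 to b2 at time tau. The functional form and parameter values are illustrative assumptions, not the exact model proposed in the paper:

```python
import math

def mean_value(t, a, b1, b2, tau):
    """Expected cumulative faults detected by time t under an NHPP-based
    SRGM with total fault content a and a single change-point tau where
    the detection rate switches from b1 to b2 (Goel-Okumoto family)."""
    if t <= tau:
        return a * (1 - math.exp(-b1 * t))
    # continuous at tau: the accumulated exposure b1*tau carries over
    return a * (1 - math.exp(-b1 * tau - b2 * (t - tau)))

a, b1, b2, tau = 100.0, 0.05, 0.12, 10.0   # invented parameters
m_early = mean_value(5.0, a, b1, b2, tau)
m_late = mean_value(50.0, a, b1, b2, tau)
```

The multi-release extension repeats this structure per release, with each release's change-point located via the failure-increasing-rate criterion.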

Keywords: software reliability growth models, non-homogeneous Poisson process, multi-release software, mean value function, change-point, environmental factors

Procedia PDF Downloads 74
15133 A Picture Naming Study of European Portuguese-English Bilinguals on Cognates Switch Effects

Authors: Minghui Zou

Abstract:

This study investigates whether and how cognate status influences switching costs in bilingual language production. Two picture naming tasks will be conducted in this proposed study by manipulating how cognates and non-cognates are presented, i.e., separately in two testing blocks vs. intermixed in a single testing block. Participants in each experiment will be 24 unbalanced bilinguals (L1 European Portuguese, L2 English). Stimuli will include 12 pictures of cognate nouns and 12 of non-cognate nouns. It is hypothesized that there will be cognate switch facilitation effects among unbalanced bilinguals in both of their languages, whether stimuli are presented in two separate testing blocks or in one mixed testing block. Shorter reaction times and higher naming accuracy are expected in cognate switch trials than in non-cognate switch trials.
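
The switching cost at issue is conventionally computed as the mean naming latency on switch trials minus that on repeat (non-switch) trials, per condition. A minimal sketch with invented latencies follows; the facilitation hypothesis predicts a smaller cost for cognates:

```python
def switch_cost(trials):
    """Mean naming latency on switch trials minus mean latency on repeat trials."""
    def mean(values):
        return sum(values) / len(values)
    switch = [rt for kind, rt in trials if kind == "switch"]
    repeat = [rt for kind, rt in trials if kind == "repeat"]
    return mean(switch) - mean(repeat)

# Invented naming latencies (ms), not data from the proposed experiments
cognate = [("repeat", 620), ("switch", 650), ("repeat", 610), ("switch", 640)]
noncognate = [("repeat", 630), ("switch", 720), ("repeat", 625), ("switch", 705)]
cognate_cost = switch_cost(cognate)          # smaller cost = facilitation
noncognate_cost = switch_cost(noncognate)
```
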

Keywords: cognates, language switching costs, picture naming, European Portuguese, cognate facilitation effect

Procedia PDF Downloads 38
15132 Design Development and Qualification of a Magnetically Levitated Blower for C0₂ Scrubbing in Manned Space Missions

Authors: Larry Hawkins, Scott K. Sakakura, Michael J. Salopek

Abstract:

The Marshall Space Flight Center is designing and building a next-generation CO₂ removal system, the Four Bed Carbon Dioxide Scrubber (4BCO₂), which will use the International Space Station (ISS) as a testbed. The current ISS CO₂ removal system has faced many challenges in both performance and reliability. Given that CO₂ removal is an integral Environmental Control and Life Support System (ECLSS) subsystem, the 4BCO₂ scrubber has been designed to eliminate the shortfalls identified in the current ISS system. One of the key required upgrades was to improve the performance and reliability of the blower that provides the airflow through the CO₂ sorbent beds. A magnetically levitated blower, capable of higher airflow and pressure than the previous system, was developed to meet this need. The design and qualification testing of this next-generation blower are described here. The new blower features a high-efficiency permanent magnet motor, a five-axis active magnetic bearing system, and a compact controller containing both a variable speed drive and a magnetic bearing controller. The blower uses a centrifugal impeller to pull air from the inlet port and drive it through an annular space around the motor and magnetic bearing components to the exhaust port. Technical challenges of the blower and controller development include survival of the blower system under launch random vibration loads, operation in microgravity, packaging under strict size and weight requirements, and successful operation during 4BCO₂ operational changeovers. An ANSYS structural dynamic model of the controller was used to predict the response to the NASA-defined random vibration spectrum and to drive minor design changes. The simulation results are compared to measurements from qualification testing of the controller on a vibration table. Predicted blower performance is compared to flow loop testing measurements.
Dynamic response of the system to valve changeovers is presented and discussed using high bandwidth measurements from dynamic pressure probes, magnetic bearing position sensors, and actuator coil currents. The results presented in the paper show that the blower controller will survive launch vibration levels, the blower flow meets the requirements, and the magnetic bearings have adequate load capacity and control bandwidth to maintain the desired rotor position during the valve changeover transients.

Keywords: blower, carbon dioxide removal, environmental control and life support system, magnetic bearing, permanent magnet motor, validation testing, vibration

Procedia PDF Downloads 135
15131 Long-Term Field Performance of Paving Fabric Interlayer Systems to Reduce Reflective Cracking

Authors: Farshad Amini, Kejun Wen

Abstract:

The formation of reflective cracking in pavement overlays has confronted highway engineers for many years. Stress-relieving interlayers, such as paving fabrics, have been used in an attempt to reduce or delay reflective cracking. The effectiveness of paving fabrics in reducing reflective cracking is related to joint or crack movement in the underlying pavement, crack width, overlay thickness, subgrade conditions, climate, and traffic volume. The nonwoven geotextiles are installed between the old and new asphalt layers. Paving fabrics enhance performance through two mechanisms: stress relief and waterproofing. Several factors affect performance, including proper installation, remedial work performed before overlay, overlay thickness, variability of pavement strength, existing pavement condition, base/subgrade support condition, and traffic volume. The primary objective of this study was to conduct long-term monitoring of paving fabric interlayer systems to evaluate their effectiveness and performance. A comprehensive testing, monitoring, and analysis program was undertaken, in which twelve 500-ft pavement sections of a four-lane highway were rehabilitated and then monitored for seven years. A comparison between the performance of the paving fabric treatment systems and the control sections is reported. Lessons learned and the various influencing factors are discussed.

Keywords: monitoring, paving fabrics, performance, reflective cracking

Procedia PDF Downloads 333
15130 Diagnosis of the Heart Rhythm Disorders by Using Hybrid Classifiers

Authors: Sule Yucelbas, Gulay Tezel, Cuneyt Yucelbas, Seral Ozsen

Abstract:

In this study, we attempted to identify some heart rhythm disorders from electrocardiography (ECG) data taken from the MIT-BIH arrhythmia database by extracting the required features and presenting them to artificial neural network (ANN), artificial immune system (AIS), artificial neural network based on artificial immune system (AIS-ANN) and particle swarm optimization based artificial neural network (PSO-ANN) classifier systems. The main purpose of this study is to evaluate the performance of the hybrid AIS-ANN and PSO-ANN classifiers with regard to ANN and AIS. For this purpose, the normal sinus rhythm (NSR), atrial premature contraction (APC), sinus arrhythmia (SA), ventricular trigeminy (VTI), ventricular tachycardia (VTK) and atrial fibrillation (AF) data for each of the RR intervals were found. These data were then arranged in pairs (NSR-APC, NSR-SA, NSR-VTI, NSR-VTK and NSR-AF), a discrete wavelet transform was applied to each of the two groups in every pair, and two different data sets with 9 and 27 features were obtained after data reduction. Afterwards, the data were first shuffled, and then the 4-fold cross-validation method was applied to create the training and testing data. The training and testing accuracy rates and the training times were compared with each other. As a result, the performances of the hybrid classification systems, AIS-ANN and PSO-ANN, were seen to be close to the performance of the ANN system, and the results of the hybrid systems were much better than those of AIS. However, ANN had a much shorter training time than the other systems; in terms of training time, ANN was followed by PSO-ANN, AIS-ANN and AIS, respectively. Also, the features extracted from the data affected the classification results significantly.
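
The 4-fold cross-validation step described above can be sketched as follows. The fold construction is generic; the toy scoring function stands in for training and testing an actual classifier on the wavelet features:

```python
def k_fold_indices(n, k=4):
    """Split indices 0..n-1 into k nearly equal contiguous folds
    (shuffle the data beforehand, as done in the study)."""
    base, extra = divmod(n, k)
    folds, start = [], 0
    for i in range(k):
        size = base + (1 if i < extra else 0)
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(n_samples, train_and_score, k=4):
    """Average held-out score over k train/test partitions."""
    scores = []
    for test_idx in k_fold_indices(n_samples, k):
        held_out = set(test_idx)
        train_idx = [i for i in range(n_samples) if i not in held_out]
        scores.append(train_and_score(train_idx, test_idx))
    return sum(scores) / k

# Toy scorer: "accuracy" is just the fraction of even indices in the test fold
avg = cross_validate(12, lambda tr, te: sum(i % 2 == 0 for i in te) / len(te))
```
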

Keywords: AIS, ANN, ECG, hybrid classifiers, PSO

Procedia PDF Downloads 442
15129 Symmetry of Performance across Lower Limb Tests between the Dominant and Non-Dominant Legs

Authors: Ghulam Hussain, Herrington Lee, Comfort Paul, Jones Richard

Abstract:

Background: To determine the functional limitations of the lower limbs or readiness to return to sport, most rehabilitation programs use some form of testing; however, the pass criteria remain unknown. This study investigates the differences between dominant and non-dominant leg performance across several lower limb tasks: hop tests, two-dimensional (2D) frontal plane projection angle (FPPA) tests, and isokinetic muscle tests. It also provides reference values for the limb symmetry index (LSI) for the hop and isokinetic muscle strength tests. Twenty recreationally active participants were recruited, 11 males and 9 females (age 23.65±2.79 years; height 169.9±3.74 cm; body mass 74.72±5.81 kg). All tests were undertaken on both the dominant and non-dominant legs: (1) hop tests, comprising the horizontal hop for distance and crossover hop tests; (2) FPPA, captured in 2D during two tasks, forward hop landing and squatting; and (3) isokinetic muscle strength tests of four muscle groups: quadriceps, hamstrings, ankle plantar flexors, and hip extensors. The main outcome measures were (1) for the hop tests, the maximum distance in the single/crossover hop for distance, recorded with a standard tape measure; (2) for the FPPA, the knee valgus angle at maximum knee flexion, measured with a single 2D camera; and (3) for the isokinetic tests, peak torque, peak torque to body weight, and total work to body weight. All muscle strength tests were performed in both concentric and eccentric muscle actions at a speed of 60°/sec.
This study revealed no differences between dominant and non-dominant leg performance; the majority of subjects achieved an LSI of 85% in both the hop and isokinetic muscle tests, and therefore one leg's hop performance can define the other.
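The limb symmetry index used here is a simple ratio. A minimal sketch, assuming the common definition (non-dominant score as a percentage of the dominant score) and hypothetical hop distances:

```python
def limb_symmetry_index(non_dominant: float, dominant: float) -> float:
    """LSI as commonly defined for hop tests: the non-dominant leg's
    score expressed as a percentage of the dominant leg's score."""
    return non_dominant / dominant * 100.0

# Hypothetical hop-for-distance scores in cm; 85% is the pass
# threshold reported in the study.
lsi = limb_symmetry_index(non_dominant=135.0, dominant=150.0)
passes = lsi >= 85.0  # True for this pair of scores
```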

Keywords: 2D FPPA, hop tests, isokinetic testing, LSI

Procedia PDF Downloads 66
15128 Kazakh Language Assessment in a New Multilingual Kazakhstan

Authors: Karlygash Adamova

Abstract:

This article is focused on the KazTest as one of the most important high-stakes tests and the key tool in Kazakh language assessment. The research will also include the brief introduction to the language policy in Kazakhstan. Particularly, it is going to be changed significantly and turn from bilingualism (Kazakh, Russian) to multilingual policy (three languages - Kazakh, Russian, English). Therefore, the current status of the abovementioned languages will be described. Due to the various educational reforms in the country, the language evaluation system should also be improved and moderated. The research will present the most significant test of Kazakhstan – the KazTest, which is aimed to evaluate the Kazakh language proficiency. Assessment is an ongoing process that encompasses a wide area of knowledge upon the productive performance of the learners. Test is widely defined as a standardized or standard method of research, testing, diagnostics, verification, etc. The two most important characteristics of any test, as the main element of the assessment - validity and reliability - will also be described in this paper. Therefore, the preparation and design of the test, which is assumed to be an indicator of knowledge, and it is highly important to take into account all these properties.

Keywords: multilingualism, language assessment, testing, language policy

Procedia PDF Downloads 136
15127 Effect of Blast Furnace Iron Slag on the Mechanical Performance of Hot Mix Asphalt (HMA)

Authors: Ayman M. Othman, Hassan Y. Ahmed

Abstract:

This paper discusses the effect of using blast furnace iron slag as part of the fine aggregate on the mechanical performance of hot mix asphalt (HMA). The mechanical performance was evaluated based on various mechanical properties, including Marshall/stiffness, indirect tensile strength and unconfined compressive strength. The effect of iron slag content on the mechanical properties of the mixtures was also investigated. Four HMA mixtures with various iron slag contents, namely 0%, 5%, 10% and 15% by weight of total mixture, were studied. Laboratory testing revealed an enhancement in the compressive strength of HMA when iron slag was used. Within the tested range of iron slag content, a considerable increase in the compressive strength of the mixtures was observed with increasing slag content. No significant improvement in Marshall/stiffness or indirect tensile strength of the mixtures was observed when slag was used. Even so, blast furnace iron slag can still be used in asphalt paving for its environmental advantages.
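The Marshall/stiffness property mentioned above is commonly reported as the Marshall quotient, stability divided by flow. A small sketch with hypothetical values (the paper does not give its raw stability/flow numbers):

```python
def marshall_quotient(stability_kn: float, flow_mm: float) -> float:
    """Marshall quotient (often used as a stiffness indicator for HMA):
    Marshall stability divided by flow, in kN/mm."""
    return stability_kn / flow_mm

# Hypothetical values for a control mix and a 10% slag mix; the study
# found no significant change in Marshall/stiffness with slag content.
mq_control = marshall_quotient(stability_kn=11.0, flow_mm=3.2)
mq_slag_10 = marshall_quotient(stability_kn=11.2, flow_mm=3.3)
```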

Keywords: blast furnace iron slag, compressive strength, HMA, indirect tensile strength, marshall/stiffness, mechanical performance, mechanical properties

Procedia PDF Downloads 438
15126 Pharmacodynamic Enhancement of Repetitive rTMS Treatment Outcomes for Major Depressive Disorder

Authors: A. Mech

Abstract:

Repetitive transcranial magnetic stimulation (rTMS) has proven to be a valuable treatment option for patients who have failed to respond to multiple courses of antidepressant medication. In fact, the American Psychiatric Association recommends TMS after one failed course of antidepressant medication. Genetic testing has proven valuable for pharmacokinetic variables, which, if understood, could lead to more efficient dosing of psychotropic medications and improved outcomes. Pharmacodynamic testing can identify biomarkers which, if addressed, can improve patients' outcomes in antidepressant therapy. Monotherapy of major depressive disorder with methylated B vitamins has been shown to be safe and effective in patients with MTHFR polymorphisms, without waiting for multiple failed medication trials, and has demonstrated remission rates similar to those of antidepressant clinical trials. Combining pharmacodynamic testing with repetitive TMS treatment with NeuroStar has shown promising potential for enhancing remission rates and the durability of treatment. In this study, a retrospective chart review (ongoing) of patients who received repetitive TMS treatment enhanced by dietary supplementation guided by pharmacodynamic testing showed a greater remission rate (90%) than patients treated with NeuroStar TMS alone (62%).

Keywords: improved remission rate, major depressive disorder, pharmacodynamic testing, rTMS outcomes

Procedia PDF Downloads 57
15125 An Empirical Evaluation of Performance of Machine Learning Techniques on Imbalanced Software Quality Data

Authors: Ruchika Malhotra, Megha Khanna

Abstract:

The development of change prediction models can help software practitioners plan testing and inspection resources at early phases of software development. However, a major challenge during the training of any classification model is the imbalanced nature of software quality data: a dataset with very few instances of the minority outcome categories leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change prone whereas a majority of classes may be non-change prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. To empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android dataset and evaluates the performance of six different machine learning techniques. The results indicate extensive improvement in the performance of the classification models when resampling methods and robust performance measures are used.
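One of the simplest resampling strategies of the kind the study evaluates is random oversampling, where minority-class instances are duplicated until the class counts match. A minimal sketch, not the study's implementation, using made-up change-proneness labels:

```python
import random
from collections import Counter

def random_oversample(X, y, seed=0):
    """Naive random oversampling: duplicate randomly chosen minority-class
    samples until every class matches the majority-class count."""
    rng = random.Random(seed)
    counts = Counter(y)
    target = max(counts.values())
    X_out, y_out = list(X), list(y)
    for cls, n in counts.items():
        idx = [i for i, label in enumerate(y) if label == cls]
        for _ in range(target - n):
            i = rng.choice(idx)  # resample an existing minority instance
            X_out.append(X[i])
            y_out.append(cls)
    return X_out, y_out

# Imbalanced toy data: 6 non-change-prone (0) vs. 2 change-prone (1) classes
X = [[0.1], [0.2], [0.3], [0.4], [0.5], [0.6], [0.9], [1.0]]
y = [0, 0, 0, 0, 0, 0, 1, 1]
X_bal, y_bal = random_oversample(X, y)  # now 6 of each class
```

In practice a library such as imbalanced-learn offers this and more sophisticated variants (e.g. SMOTE), which is the kind of alternative the study compares.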

Keywords: change proneness, empirical validation, imbalanced learning, machine learning techniques, object-oriented metrics

Procedia PDF Downloads 418
15124 Usability Testing with Children: BatiKids Case Study

Authors: Hestiasari Rante, Leonardo De Araújo, Heidi Schelhowe

Abstract:

Usability testing with children is similar in many aspects to usability testing with adults. However, there are a few differences one needs to be aware of in order to get the most out of the sessions and to ensure that children are comfortable and enjoy the process. This paper presents the need to acquire methodological knowledge for involving children as test users in usability testing, with consideration of Piaget’s theory of cognitive growth. As a case study, we use BatiKids, an application developed to evoke children’s enthusiasm for involvement in cultural heritage preservation. The usability test was applied to 24 children aged 9 to 10 years. The children were divided into two groups: one interacted with the application through a graphic tablet with pen, the other through a touch screen. Both groups had to accomplish the same set of tasks, and at the end the children were asked to give feedback. The results suggested that children who interacted using the graphic tablet with pen had more difficulties than children who interacted through the touch screen. However, the difficulty brought by the graphic tablet with pen serves an important learning objective, as it helps children understand the difficulty of using the canting, which is an important tool in batik.

Keywords: batikids, children, child-computer interaction, usability test

Procedia PDF Downloads 296
15123 Effect of Temperature on the Production of Fructose and Bioethanol from Date’s Syrup using S. cerevisiae ATCC 36859

Authors: M. A. Zeinelabdeen, A. E. Abasaeed, M. H. Gaily, A. K. Sulieman, M. D. Putra

Abstract:

The effect of temperature on the production of fructose and bioethanol from date syrup via selective fermentation by the S. cerevisiae ATCC 36859 strain was studied. Three temperatures were tested (27, 30 and 33 °C). The fermentation experiments were conducted in a water shaker bath at the three temperatures under testing and 120 rpm. The results showed that a high fructose yield can be achieved at all tested temperatures, with the optimum at 27 °C (84% fructose yield). A high ethanol yield was also obtained at all tested temperatures; however, the maximum biomass concentration and ethanol yield (86.22%) were obtained at 30 °C.
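Ethanol yield percentages like those quoted above are conventionally expressed against the theoretical stoichiometric maximum of 0.511 g ethanol per g of hexose sugar. A small sketch of that calculation, with hypothetical batch numbers (the abstract does not give its raw masses):

```python
def ethanol_yield_pct(ethanol_g: float, sugar_consumed_g: float) -> float:
    """Fermentation yield as a percentage of the theoretical maximum,
    using the stoichiometric factor of 0.511 g ethanol per g hexose."""
    return ethanol_g / (0.511 * sugar_consumed_g) * 100.0

# Hypothetical batch: 100 g of sugar consumed, 44 g of ethanol produced
y_pct = ethanol_yield_pct(ethanol_g=44.0, sugar_consumed_g=100.0)
```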

Keywords: dates, ethanol, fructose, fermentation, S. cerevisiae

Procedia PDF Downloads 402
15122 Enhancing Scalability in Ethereum Network Analysis: Methods and Techniques

Authors: Stefan K. Behfar

Abstract:

The rapid growth of the Ethereum network has brought forth the urgent need for scalable analysis methods to handle the increasing volume of blockchain data. In this research, we propose efficient methodologies for making Ethereum network analysis scalable. Our approach leverages a combination of graph-based data representation, probabilistic sampling, and parallel processing techniques to achieve unprecedented scalability while preserving critical network insights. Data Representation: We develop a graph-based data representation that captures the underlying structure of the Ethereum network. Each block transaction is represented as a node in the graph, while the edges signify temporal relationships. This representation ensures efficient querying and traversal of the blockchain data. Probabilistic Sampling: To cope with the vastness of the Ethereum blockchain, we introduce a probabilistic sampling technique. This method strategically selects a representative subset of transactions and blocks, allowing for concise yet statistically significant analysis. The sampling approach maintains the integrity of the network properties while significantly reducing the computational burden. Graph Convolutional Networks (GCNs): We incorporate GCNs to process the graph-based data representation efficiently. The GCN architecture enables the extraction of complex spatial and temporal patterns from the sampled data. This combination of graph representation and GCNs facilitates parallel processing and scalable analysis. Distributed Computing: To further enhance scalability, we adopt distributed computing frameworks such as Apache Hadoop and Apache Spark. By distributing computation across multiple nodes, we achieve a significant reduction in processing time and enhanced memory utilization. Our methodology harnesses the power of parallelism, making it well-suited for large-scale Ethereum network analysis. 
Evaluation and Results: We extensively evaluate our methodology on real-world Ethereum datasets covering diverse time periods and transaction volumes. The results demonstrate its superior scalability, outperforming traditional analysis methods. Our approach successfully handles the ever-growing Ethereum data, empowering researchers and developers with actionable insights from the blockchain. Case Studies: We apply our methodology to real-world Ethereum use cases, including detecting transaction patterns, analyzing smart contract interactions, and predicting network congestion. The results showcase the accuracy and efficiency of our approach, emphasizing its practical applicability in real-world scenarios. Security and Robustness: To ensure the reliability of our methodology, we conduct thorough security and robustness evaluations. Our approach demonstrates high resilience against adversarial attacks and perturbations, reaffirming its suitability for security-critical blockchain applications. Conclusion: By integrating graph-based data representation, GCNs, probabilistic sampling, and distributed computing, we achieve network scalability without compromising analytical precision. This approach addresses the pressing challenges posed by the expanding Ethereum network, opening new avenues for research and enabling real-time insights into decentralized ecosystems. Our work contributes to the development of scalable blockchain analytics, laying the foundation for sustainable growth and advancement in the domain of blockchain research and application.
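The first two steps above, graph-based representation with temporal edges and probabilistic sampling, can be sketched in a few lines. This is an illustrative toy, not the paper's pipeline: the transactions are fabricated, edges link consecutive transactions within a block, and uniform random sampling stands in for the paper's probabilistic sampling scheme.

```python
import random
from collections import defaultdict

# Fabricated transactions: 50 blocks, 4 transactions each, with fake hashes
transactions = [{"block": b, "tx": f"0x{b:02d}{i:02d}"}
                for b in range(50) for i in range(4)]

# Each transaction is a node; temporal edges link consecutive
# transactions within the same block.
edges = defaultdict(list)
for prev, curr in zip(transactions, transactions[1:]):
    if prev["block"] == curr["block"]:
        edges[prev["tx"]].append(curr["tx"])

# Uniform 10% sample of transactions for downstream (e.g. GCN) analysis
rng = random.Random(42)
sample = rng.sample(transactions, k=20)
```

At Ethereum scale this adjacency structure would live in a distributed store and the sampling would be stratified to preserve network properties, which is where the Hadoop/Spark layer described above comes in.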

Keywords: Ethereum, scalable network, GCN, probabilistic sampling, distributed computing

Procedia PDF Downloads 76
15121 The Role of Validity and Reliability in the Development of Online Testing

Authors: Ani Demetrashvili

Abstract:

The purpose of this paper is to examine how much students trust online tests and to determine validity and reliability in the development of online testing. The pandemic changed every field in the world, and it changed education as well. Educational institutions moved into the online space, which was the only decision they could make at that time. Online assessment through online proctoring was a totally new challenge for educational institutions, and they needed to deal with it successfully. Participants were chosen from an English language center. The validity of the questionnaire was established using a Likert scale and its reliability using Cronbach’s alpha; the data from the participants were then analyzed. The article summarizes the available literature on online assessment and will interest readers concerned with this kind of assessment. Based on the research findings, students favor in-person testing over online assessment due to their lack of experience and skills in the latter.
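Cronbach’s alpha, the reliability statistic used above, is straightforward to compute from a respondents-by-items score matrix. A minimal sketch with hypothetical Likert responses (perfectly consistent answers give alpha = 1.0):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical responses: 4 respondents x 3 items, perfectly consistent
scores = np.array([[1, 1, 1], [2, 2, 2], [3, 3, 3], [4, 4, 4]], dtype=float)
alpha = cronbach_alpha(scores)  # 1.0 for this idealized matrix
```

Values around 0.7 or higher are conventionally taken as acceptable internal consistency for a questionnaire.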

Keywords: online assessment, online proctoring

Procedia PDF Downloads 40
15120 An Investigation into the Effect of Broken Homes on Students Academic Performance

Authors: Hafsat Mustapha Hanga

Abstract:

The purpose of this study was to investigate the effect of broken homes on students' academic performance. It therefore focused on the academic performance and parental care of students from broken and intact homes from a cognitive-motivational perspective, and on whether the two groups differ in parental care. The study used 376 subjects drawn from a population of 21,378; the sample was obtained using stratified random sampling, as the population contained sub-groups, and the study design was ex post facto. Data were collected using three kinds of instruments. To test the first and second hypotheses, junior secondary school placement examination results were obtained to compare the academic performance of boys from broken homes with boys from intact homes, and of girls from broken homes with girls from intact homes; t-tests were used for the analysis. For the third hypothesis, two different questionnaires were developed: the first identified students from broken homes, while the second tested parental care between the subjects; chi-square was used for the analysis. All three hypotheses were tested and rejected, with the results all in favor of students from intact homes. The study found a significant difference in the academic performance of boys from broken and intact homes, with boys from intact homes performing better than those from broken homes. It also revealed that students from intact homes receive more parental care, love and concern than those from broken homes. On the strength of these findings, the need to establish an institution to help parents who have parenting problems was stressed, and the need to foster home-school partnership was also stressed and advocated.

Keywords: broken homes, academic performance, parental care, foster

Procedia PDF Downloads 463