Search results for: paper pencil based testing
44671 Early Diagnosis of Alzheimer's Disease Using a Combination of Images Processing and Brain Signals
Authors: E. Irankhah, M. Zarif, E. Mazrooei Rad, K. Ghandehari
Abstract:
Alzheimer's prevalence is on the rise, and the disease comes with problems such as treatment discontinuation, high treatment costs, and the lack of early detection methods. The pathology of this disease causes protein deposits called amyloid plaques to form in the patient's brain. Generally, the disease is diagnosed by performing tests such as cerebrospinal fluid analysis, CT scans, MRI, mental status tests, and eye-tracking tests. In this paper, we use the Medial Temporal Atrophy (MTA) method and the Leave-One-Out (LOO) cycle to extract the statistical properties of the Fz, Pz, and Cz channels of ERP signals for early diagnosis of this disease. In processing the CT scan images, the accuracy of the results is 81% for healthy subjects and 88% for severe patients. After processing the ERP signals, the accuracy for healthy subjects is 81% in the delta band of the Cz channel and 90% in the alpha band of the Pz channel. For severe patients, the accuracy is 89% in the delta band of the Cz channel and 92% in the alpha band of the Pz channel.
Keywords: Alzheimer's disease, image and signal processing, LOO cycle, medial temporal atrophy
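The Leave-One-Out (LOO) cycle used above can be sketched as a plain cross-validation loop. The sketch below is a minimal illustration, assuming a hypothetical one-dimensional feature and a nearest-centroid classifier; it is not the authors' actual MTA/ERP pipeline.

```python
# Minimal leave-one-out (LOO) cross-validation sketch.
# The features and labels are illustrative stand-ins for the
# ERP channel statistics described in the abstract.

def nearest_centroid_predict(train, test_x):
    """Predict the label whose class mean is closest to test_x."""
    centroids = {}
    for x, y in train:
        centroids.setdefault(y, []).append(x)
    return min(centroids,
               key=lambda y: abs(sum(centroids[y]) / len(centroids[y]) - test_x))

def loo_accuracy(samples):
    """Hold out each sample once, train on the rest, count correct predictions."""
    correct = 0
    for i, (x, y) in enumerate(samples):
        train = samples[:i] + samples[i + 1:]
        if nearest_centroid_predict(train, x) == y:
            correct += 1
    return correct / len(samples)

# Toy data: (feature value, label) pairs, e.g. a delta-band statistic.
data = [(0.9, "healthy"), (1.1, "healthy"), (1.0, "healthy"),
        (2.0, "patient"), (2.2, "patient"), (1.9, "patient")]
print(loo_accuracy(data))  # 1.0 on this cleanly separated toy set
```

Each held-out sample is classified by a model trained on the remaining ones, so every observation serves once as a test case, which is the property that makes LOO attractive for small clinical datasets.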
Procedia PDF Downloads 198
44670 Towards Modern Approaches of Intelligence Measurement for Clinical and Educational Practices
Authors: Alena Kulikova, Tatjana Kanonire
Abstract:
Intelligence research is one of the oldest fields of psychology. Many factors have made research on intelligence, defined as reasoning and problem solving [1, 2], an acute and urgent problem. It has been repeatedly shown that intelligence is a predictor of academic, professional, and social achievement in adulthood (for example, [3]); moreover, intelligence predicts these achievements better than any other trait or ability [4]. At the individual level, a comprehensive assessment of intelligence is a necessary criterion for the diagnosis of various mental conditions. For example, it is a necessary condition for psychological, medical, and pedagogical commissions when deciding on educational needs and the most appropriate educational programs for school children. Assessment of intelligence is crucial in clinical psychodiagnostics and requires high-quality intelligence measurement tools. It is therefore not surprising that the development of intelligence tests is an essential part of psychological science and practice. Many modern intelligence tests have a long history and have been used for decades, for example, the Stanford-Binet test or the Wechsler test. However, the vast majority of these tests are based on the classic linear test structure, in which all respondents receive all tasks (see, for example, a critical review by [5]). This understanding of the testing procedure is a legacy of the pre-computer era, in which blank (paper-and-pencil) testing was the only diagnostic procedure available [6]; it has significant limitations that affect the reliability of the data obtained [7] and increase time costs. Another problem with measuring IQ is that classical linear-structured tests do not fully allow measuring a respondent's intellectual progress [8], which is undoubtedly a critical limitation. Advances in modern psychometrics make it possible to avoid the limitations of existing tools. 
However, as in any rapidly developing field, psychometrics does not yet offer ready-made and straightforward solutions and requires additional research. In our presentation we would like to discuss the strengths and weaknesses of the current approaches to intelligence measurement and highlight “points of growth” for creating a test in accordance with modern psychometrics. Is it possible to create an instrument that uses all the achievements of modern psychometrics while remaining valid and practically oriented? What would be the possible limitations of such an instrument? The theoretical framework and study design for creating and validating an original Russian comprehensive computer test measuring the intellectual development of school-age children will be presented.
Keywords: Intelligence, psychometrics, psychological measurement, computerized adaptive testing, multistage testing
Procedia PDF Downloads 80
44669 Analysis of Control by Flattening of the Welded Tubes
Authors: Hannachi Med Tahar, H. Djebaili, B. Daheche
Abstract:
In this work, we describe the flattening of welded tubes and its experimental application. The test was carried out at the National Product Processing Company (dishes and tubes production). Usually, the final products (tubes) undergo a series of non-destructive inspections of the weld, online and offline, as well as destructive mechanical testing (bending, flattening, flaring, etc.). For the purpose of implementing the flattening test, which applies to the processing of round tubes into other forms, four sections were taken from draft welded tubes (before hot drawing) and from finished welded tubes (after hot drawing and annealing). It was also noted that the 'health' report of the flattened tubes must show neither cracks nor tears. The test is considered failed if it reveals a lack of ductility of the metal.
Keywords: flattening, destructive testing, tube drafts, finished tube, Castem 2001
Procedia PDF Downloads 446
44668 ANFIS Based Technique to Estimate Remnant Life of Power Transformer by Predicting Furan Contents
Authors: Priyesh Kumar Pandey, Zakir Husain, R. K. Jarial
Abstract:
Condition monitoring and diagnostics are important in the testing of power transformers in order to estimate remnant life. The concentration of furan content in transformer oil can be a promising indirect measure of the aging of transformer insulation. The oil becomes contaminated mainly due to ageing. The present paper introduces an adaptive neuro-fuzzy technique to correlate the furanic compounds obtained by the high performance liquid chromatography (HPLC) test with the remnant life of the power transformer. The results were obtained by conducting the HPLC test at the TIFAC-CORE lab, NIT Hamirpur, on thirteen power transformer oil samples taken from the Himachal State Electricity Board, India.
Keywords: adaptive neuro fuzzy technique, furan compounds, remnant life, transformer oil
Procedia PDF Downloads 464
44667 Attention and Memory in the Music Learning Process in Individuals with Visual Impairments
Authors: Lana Burmistrova
Abstract:
Introduction: The influence of visual impairments on the cognitive processes used in music learning is an increasingly important area in special education and cognitive musicology. Many children have visual impairments due to refractive errors and irreversible inhibitors. However, owing to compensatory neuroplasticity and functional reorganization, congenitally blind (CB) and early blind (EB) individuals use several areas of the occipital lobe to perceive and process auditory and tactile information. CB individuals have greater memory capacity and memory reliability, rely less on false-memory mechanisms while executing several tasks, and have better working memory (WM) and short-term memory (STM). Blind individuals use several strategies while executing tactile and working-memory n-back tasks: a verbalization strategy (mental recall), a tactile strategy (tactile recall), and combined strategies. Methods and design: The aim of the pilot study was to substantiate similar tendencies in blind and sighted individuals while they executed the attention, memory, and combined auditory tasks constructed for this study, and to investigate the attention, memory, and combined mechanisms used in the music learning process. Eight (n=8) blind and eight (n=8) sighted individuals aged 13-20 were chosen. All respondents had more than five years of music performance and music learning experience. In the attention task, all respondents had to identify pitch changes in tonal and randomized melodic pairs. The memory task was based on the mismatch negativity (MMN) proportion theory: 80 percent standard (unchanged) and 20 percent deviant (changed) stimuli (sequences). Every sequence was named (na-na, ra-ra, za-za) and several items (pencil, spoon, tealight) were assigned to each sequence. Respondents had to recall the sequences, associate them with the items, and detect possible changes. 
While executing the combined task, all respondents had to focus attention on the pitch changes and had to detect and describe them during the recall. Results and conclusion: The results support specific features in CB and EB individuals, and similarities between late blind (LB) and sighted individuals. While executing the attention and memory tasks, CB and EB individuals tended to use more precise execution tactics and more advanced periodic memory while focusing on auditory and tactile stimuli. While executing the memory and combined tasks, CB and EB individuals used passive working memory to recall standard sequences, active working memory to recall deviant sequences, and combined strategies. Based on the observation results, the assessment of blind respondents, and the recording specifics, the following attention and memory correlations were identified: reflective attention and STM, reflective attention and periodic memory, auditory attention and WM, tactile attention and WM, auditory-tactile attention and STM. The results and the summary of findings highlight the attention and memory features used in the music learning process in the context of blindness, and the tendency of several attention and memory types to correlate depending on the task, strategy, and individual features.
Keywords: attention, blindness, memory, music learning, strategy
Procedia PDF Downloads 184
44666 A Comparative Study of Generalized Autoregressive Conditional Heteroskedasticity (GARCH) and Extreme Value Theory (EVT) Model in Modeling Value-at-Risk (VaR)
Authors: Longqing Li
Abstract:
The paper addresses the inefficiency of the classical model in measuring Value-at-Risk (VaR) using a normal distribution or a Student’s t distribution. Specifically, the paper focuses on the one-day-ahead Value-at-Risk (VaR) of major stock markets' daily returns in the US, UK, China, and Hong Kong over the most recent ten years, at the 95% confidence level. To improve predictive power and search for the best performing model, the paper proposes two leading alternatives, Extreme Value Theory (EVT) and a family of GARCH models, and compares their relative performance. The main contribution can be summarized in two aspects. First, the paper extends the GARCH family by incorporating EGARCH and TGARCH to shed light on the differences between them in estimating one-day-ahead Value-at-Risk (VaR). Second, to account for the non-normality of financial market returns, the paper applies the Generalized Error Distribution (GED), instead of the normal distribution, to govern the innovation term. A dynamic back-testing procedure is employed to assess the performance of each model in the GARCH family and the conditional EVT. The conclusion is that Exponential GARCH yields the best estimate in out-of-sample one-day-ahead Value-at-Risk (VaR) forecasting, while the performance gap between the GARCH models and the conditional EVT is negligible.
Keywords: Value-at-Risk, Extreme Value Theory, conditional EVT, backtesting
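A minimal sketch of the GARCH(1,1) variance recursion behind a one-day-ahead VaR forecast may help. The parameter values below are illustrative assumptions, not estimates from the paper's markets, and a normal quantile stands in for the paper's GED innovation.

```python
import math

def garch11_forecast_var(returns, omega, alpha, beta):
    """One-step-ahead conditional variance from the GARCH(1,1) recursion
    sigma2[t] = omega + alpha*r[t-1]**2 + beta*sigma2[t-1].
    Parameters are taken as given here; in practice they come from
    maximum-likelihood fitting."""
    var = omega / (1.0 - alpha - beta)   # start at the unconditional variance
    for r in returns:
        var = omega + alpha * r * r + beta * var
    return var

def one_day_var95(returns, omega, alpha, beta):
    """95% one-day-ahead VaR with a normal innovation (z = 1.645).
    The paper governs innovations with a GED instead; the normal
    quantile is used here only to keep the sketch short."""
    return 1.645 * math.sqrt(garch11_forecast_var(returns, omega, alpha, beta))

# Illustrative parameters (not fitted to any of the paper's markets).
print(one_day_var95([0.01, -0.02, 0.015], omega=1e-6, alpha=0.1, beta=0.85))
```

Back-testing then amounts to rolling this forecast forward one day at a time and counting how often the realized loss exceeds the predicted VaR, which should happen about 5% of the time at this confidence level.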
Procedia PDF Downloads 321
44665 Improvement of the Analysis of Vertical Oil Exploration Wells (Case Study)
Authors: Azza Hashim Abbas, Wan Rosli Wan Suliman
Abstract:
In the old school of well testing, reservoir engineers used transient pressure analysis to obtain certain parameters and variable factors of the reservoir's physical properties, such as permeability-thickness. Recently, the difficulty facing newly discovered areas is that the exploration and production (E&P) team must have sufficiently accurate and appropriate data to work with, given the different sources of errors. The well-test analyst often works without well-informed and reliable data from colleagues, which may consequently cause immense environmental damage and unnecessary financial losses, as well as opportunity losses, to the project. In 2003, in the new potential oil field (Moga), well-22 faced a circulation problem but was safely completed. However, the high mud density caused extensive damage to the near-wellbore area, which also distorted the apparent oil flow rate so that it was not representative of the real reservoir characteristics. This paper presents methods to analyze and interpret the production rate and pressure data of an oil field, specifically for well-22, using the deconvolution technique to enhance the transient pressure analysis. Deconvolution is applied to obtain the best range of certainty of the results needed for the next operation. The determined range and the analysis of the skin factor range were reasonable.
Keywords: well testing, exploration, deconvolution, skin factor, uncertainty
Procedia PDF Downloads 445
44664 A Nonlinear Approach for System Identification of a Li-Ion Battery Based on a Non-Linear Autoregressive Exogenous Model
Authors: Meriem Mossaddek, El Mehdi Laadissi, El Mehdi Loualid, Chouaib Ennawaoui, Sohaib Bouzaid, Abdelowahed Hajjaji
Abstract:
An electrochemical system is a subset of mechatronic systems that includes a wide variety of batteries, such as nickel-cadmium, lead-acid, and lithium-ion. These systems exhibit several non-linear behaviors and uncertainties over their operating range. This paper studies an effective technique for modeling Lithium-Ion (Li-Ion) batteries using a Nonlinear Auto-Regressive model with exogenous input (NARX). An Artificial Neural Network (ANN) is trained on the data collected from the battery testing process. The proposed model is implemented on a Li-Ion battery cell. Simulation of this model in MATLAB shows good accuracy of the proposed model.
Keywords: lithium-ion battery, neural network, energy storage, battery model, nonlinear models
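The NARX idea can be illustrated with a linear-in-parameters surrogate: the paper trains an ANN, but the same lagged-output/lagged-input structure can be sketched with a least-squares fit. The system coefficients and signals below are invented for illustration, not taken from a real battery.

```python
import random

# Minimal NARX-style identification sketch. The paper trains an ANN;
# here the same structure, y[k] = a*y[k-1] + b*u[k-1], is fitted by
# least squares (normal equations via Cramer's rule) to stay
# dependency-free. The "true" coefficients 0.9 and 0.3 are invented.

random.seed(0)
u = [random.gauss(0, 1) for _ in range(200)]   # exogenous input (e.g. current)
y = [0.0] * 200                                # output (e.g. terminal voltage)
for k in range(1, 200):
    y[k] = 0.9 * y[k - 1] + 0.3 * u[k - 1]

# Normal equations for the regressors (y[k-1], u[k-1]).
syy = sum(y[k - 1] * y[k - 1] for k in range(1, 200))
syu = sum(y[k - 1] * u[k - 1] for k in range(1, 200))
suu = sum(u[k - 1] * u[k - 1] for k in range(1, 200))
sy_t = sum(y[k - 1] * y[k] for k in range(1, 200))
su_t = sum(u[k - 1] * y[k] for k in range(1, 200))

det = syy * suu - syu * syu
a_hat = (sy_t * suu - su_t * syu) / det
b_hat = (syy * su_t - syu * sy_t) / det
print(round(a_hat, 3), round(b_hat, 3))  # recovers 0.9 and 0.3 (up to rounding)
```

An ANN-based NARX replaces this linear map with a learned nonlinear function of the same lagged inputs, which is what lets it capture the battery's non-linear behavior.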
Procedia PDF Downloads 114
44663 E-learning resources for radiology training: Is an ideal program available?
Authors: Eric Fang, Robert Chen, Ghim Song Chia, Bien Soo Tan
Abstract:
Objective and Rationale: The training of radiology residents hinges on practical, on-the-job training in all facets and modalities of diagnostic radiology. Although residency is structured to be comprehensive, clinical exposure depends on the case mix available locally and during the posting period. To supplement clinical training, several e-learning resources are available that allow greater exposure to radiological cases. The objective of this study was to survey residents and faculty on the usefulness of these e-learning resources. Methods: E-learning resources were shortlisted with input from radiology residents, Google searches, and online discussion groups, and screened by their purported focus. Twelve e-learning resources were found to meet the criteria. Both radiology residents and experienced radiology faculty were then surveyed electronically. The e-survey asked for ratings on breadth, depth, testing capability, and user-friendliness for each resource, as well as rankings of the top 3 resources. Statistical analysis was performed using SAS 9.4. Results: Seventeen residents and fifteen faculty members completed the e-survey. The mean response rate was 54% ± 8% (range: 14-96%). Ratings and rankings were statistically identical between residents and faculty. On a 5-point rating scale, breadth was 3.68 ± 0.18, depth was 3.95 ± 0.14, testing capability was 2.64 ± 0.16, and user-friendliness was 3.39 ± 0.13. The top-ranked resources were STATdx (first), Radiopaedia (second), and Radiology Assistant (third). Nine percent of responders singled out R-ITI as potentially good but ‘prohibitively costly’. Statistically significant predictors of higher rankings were familiarity with the resource (p = 0.001) and user-friendliness (p = 0.006). Conclusion: A good e-learning system will complement on-the-job training with a broad case base, deep discussion, and quality trainee evaluation. Based on our study of twelve e-learning resources, no single program fulfilled all requirements. 
The perception and use of radiology e-learning resources depended more on familiarity and user-friendliness than on content differences and testing capability.
Keywords: e-learning, medicine, radiology, survey
Procedia PDF Downloads 333
44662 Design and Evaluation of a Pneumatic Muscle Actuated Gripper
Authors: Tudor Deaconescu, Andrea Deaconescu
Abstract:
The deployment of pneumatic muscles in industrial applications is still in its early days, considering the relative newness of these components. The field of robotics holds particular future potential for pneumatic muscles, especially in view of their specific behaviour known as compliance. The paper presents and discusses an innovative constructive solution for a gripper system mountable on an industrial robot, actuated by a linear pneumatic muscle with motion transmitted by a gear-and-rack mechanism. The structural, operational, and constructive models of the new gripper are presented, along with some of the experimental results obtained from testing a prototype. Two control variants of the gripper system are further presented: one by means of a 3/2-way fast-switching solenoid valve, the other by means of a proportional pressure regulator. Advantages and disadvantages are discussed for both variants.
Keywords: gripper system, pneumatic muscle, structural modelling, robotics
Procedia PDF Downloads 235
44661 Predicting the Diagnosis of Alzheimer’s Disease: Development and Validation of Machine Learning Models
Authors: Jay L. Fu
Abstract:
Patients with Alzheimer's disease progressively lose their memory and thinking skills and, eventually, the ability to carry out simple daily tasks. The disease is irreversible, but early detection and treatment can slow its progression. In this research, publicly available MRI data and demographic data from 373 MRI imaging sessions were used to build models that predict dementia. Various machine learning models, including logistic regression, k-nearest neighbors, support vector machine, random forest, and a neural network, were developed. Data were divided into training and testing sets, where the training sets were used to build the predictive models and the testing sets were used to assess prediction accuracy. Key risk factors were identified, and the models were compared to arrive at the best prediction model. Among these models, the random forest model performed best, with an accuracy of 90.34%. MMSE, nWBV, and gender were the three most important contributing factors to the detection of Alzheimer’s. Across all the models used, the percentage of testing inputs for which at least 4 of the 5 models shared the same diagnosis was 90.42%. These machine learning models allow early detection of Alzheimer’s with good accuracy, which ultimately leads to early treatment of these patients.
Keywords: Alzheimer's disease, clinical diagnosis, magnetic resonance imaging, machine learning prediction
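The 4-of-5 agreement statistic reported above can be computed with a simple voting count; the sketch below uses invented toy predictions, not the study's MRI data.

```python
from collections import Counter

def agreement_rate(predictions, k=4):
    """Fraction of test inputs on which at least k of the models
    return the same diagnosis. `predictions` is a list of per-model
    prediction lists, one entry per test input."""
    n_inputs = len(predictions[0])
    agreed = 0
    for i in range(n_inputs):
        votes = Counter(model[i] for model in predictions)
        if votes.most_common(1)[0][1] >= k:
            agreed += 1
    return agreed / n_inputs

# Toy predictions from 5 hypothetical models over 4 inputs (1 = dementia).
models = [[1, 0, 1, 0],
          [1, 0, 1, 1],
          [1, 0, 0, 1],
          [1, 0, 1, 0],
          [1, 1, 1, 0]]
print(agreement_rate(models))  # 0.75: inputs 0, 1, and 2 reach 4-of-5 agreement
```

The same counting structure, applied to the study's 5 models over its testing set, would yield the reported 90.42% figure.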
Procedia PDF Downloads 143
44660 Behavior Factors Evaluation for Reinforced Concrete Structures
Authors: Muhammad Rizwan, Naveed Ahmad, Akhtar Naeem Khan
Abstract:
Seismic behavior factors are evaluated for the performance assessment of low-rise reinforced concrete (RC) frame structures, based on an experimental study involving unidirectional dynamic shake-table testing of two 1/3-reduced-scale two-storey frames: a code-conforming special moment-resisting frame (SMRF) model and a non-compliant model of similar characteristics but built with low-strength concrete. The models were subjected to a scaled accelerogram record of the 1994 Northridge earthquake, deforming the test models to the final collapse stage in order to obtain the structural response parameters. The fully compliant model exhibited a more stable beam-sway response, experiencing beam flexural yielding and ground-storey column-base yielding upon being subjected to 100% of the record. The response modification factors (R factors) obtained for the code-compliant and deficient prototype structures were 7.5 and 4.5, respectively, which is about 10% and 40% less than the UBC-97 specified value for special moment-resisting reinforced concrete frame structures.
Keywords: Northridge 1994 earthquake, reinforced concrete frame, response modification factor, shake table testing
Procedia PDF Downloads 171
44659 Design, Construction and Evaluation of Ultra-High-Performance Concrete (UHPC) Bridge Deck Overlays
Authors: Jordy Padilla
Abstract:
The New Jersey Department of Transportation (NJDOT) initiated a research project to install and evaluate Ultra-High-Performance Concrete (UHPC) as an overlay on existing bridges. The project aims to implement UHPC overlays in NJDOT bridge deck preservation and repair strategies. During design, four bridges were selected for construction. Construction involved removing the existing asphalt overlays, partially removing the existing concrete deck surface, and resurfacing the deck with a UHPC overlay. In some cases, a new asphalt riding surface was placed. Additionally, existing headers were replaced with full-depth UHPC. The UHPC overlay is monitored through coring and non-destructive testing (NDT) to ensure that the interfacial bond is intact and that the desired conditions are maintained. The NDT results show no evidence that the bond between the new UHPC overlay and the existing concrete deck is compromised. Bond strength test data demonstrate that, in general, the desired bond was achieved between the UHPC and the substrate concrete, although the results were lower than anticipated. Chloride content is also within expectations, except for one anomaly. The baseline testing was successful, and no significant defects were encountered.
Keywords: ultra-high performance concrete, rehabilitation, non-destructive testing
Procedia PDF Downloads 80
44658 The Fracture Resistance of Zirconia Based Dental Crowns from Cyclic Loading: A Function of Relative Wear Depth
Authors: T. Qasim, B. El Masoud, D. Ailabouni
Abstract:
This in vitro study investigated the fatigue resistance of veneered zirconia molar crowns with different veneering ceramic thicknesses, simulating relative wear depths under simulated cyclic loading. A mandibular first molar was prepared and then scanned using computer-aided design/computer-aided manufacturing (CAD/CAM) technology to fabricate 32 zirconia copings of uniform 0.5 mm thickness. The manufactured copings were then veneered with 1.5 mm, 1.0 mm, 0.5 mm, and 0.0 mm layers, representing 0%, 33%, 66%, and 100% relative wear of a normal ceramic thickness of 1.5 mm. All samples were thermally aged for 6000 thermo-cycles of 2 minutes each in distilled water between 5 ˚C and 55 ˚C. The samples were then subjected to cyclic fatigue and fracture testing using an SD Mechatronik chewing simulator, loaded up to 1.25×10⁶ cycles or until failure. During fatigue testing, extensive cracks were observed in samples with 0.5 mm veneering layer thickness. The 1.5-mm and 1.0-mm veneering-thickness groups did not differ in the loads necessary to cause an initial crack or final failure. All-ceramic zirconia-based crown restorations with varying occlusal veneering layer thicknesses appeared to be fatigue resistant. Fracture load measurements for all tested groups before and after fatigue loading exceeded the clinical chewing forces in the posterior region. In general, the fracture loads increased after fatigue loading and with increasing thickness of the occlusal layering ceramic.
Keywords: all ceramic, cyclic loading, chewing simulator, dental crowns, relative wear, thermal ageing
Procedia PDF Downloads 142
44657 Experimental Procedure of Identifying Ground Type by Downhole Test: A Case Study
Authors: Seyed Abolhassan Naeini, Maedeh Akhavan Tavakkoli
Abstract:
Evaluating the shear wave velocity (Vₛ) and primary wave velocity (Vₚ) is necessary to identify the ground type of a site. Identifying the soil type based on different codes can affect the dynamic analysis of geotechnical properties. This study aims to separate the underground layers at the project site based on the shear wave and primary wave velocities at different depths, and to determine the dynamic elastic moduli based on the shear wave velocity. Bandar Anzali is located in a tectonically very active area, and several active faults surround the study site. In this case, a field investigation using downhole testing is conducted as a geophysical method to identify the ground type.
Keywords: downhole, geophysics, shear wave velocity, case study
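Deriving a dynamic modulus from the measured shear wave velocity, and classifying ground type from a velocity profile, can be sketched as follows. The EC8-style Vₛ,₃₀ thresholds are shown as one common convention; the study's own code-based classification may differ, and the density and velocity values are hypothetical.

```python
def shear_modulus(rho, vs):
    """Small-strain dynamic shear modulus G = rho * Vs**2 (Pa),
    for density rho in kg/m3 and shear wave velocity Vs in m/s."""
    return rho * vs ** 2

def ground_type_ec8(vs30):
    """Illustrative Eurocode 8 ground classification by Vs,30 (m/s).
    Other codes draw the class boundaries differently."""
    if vs30 > 800:
        return "A"   # rock
    if vs30 > 360:
        return "B"   # very dense soil / soft rock
    if vs30 > 180:
        return "C"   # dense to medium-dense soil
    return "D"       # loose to soft soil

# Example: a soil layer with density 1900 kg/m3 and Vs = 250 m/s.
print(shear_modulus(1900, 250))  # 118750000 Pa, about 119 MPa
print(ground_type_ec8(250))      # C
```

In a downhole test, Vₛ at each depth comes from the travel-time difference between geophone depths, and the layer velocities are then averaged over the top 30 m to obtain Vₛ,₃₀.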
Procedia PDF Downloads 138
44656 Numerical Simulation of Supersonic Gas Jet Flows and Acoustics Fields
Authors: Lei Zhang, Wen-jun Ruan, Hao Wang, Peng-Xin Wang
Abstract:
Jet noise is generated by the rocket exhaust plume during rocket engine testing. A domain decomposition approach is applied to jet noise prediction in this paper. The aerodynamic noise coupling is based on splitting the problem into acoustic source generation and sound propagation in separate physical domains. Large Eddy Simulation (LES) is used to simulate the supersonic jet flow. Based on the simulated flow fields, the sound pressure level distribution of the jet noise is obtained by applying the Ffowcs Williams-Hawkings (FW-H) acoustic equation and a Fourier transform. The calculation results show that complex structures of expansion waves, compression waves, and the turbulent boundary layer can occur due to the strong interaction between the gas jet and the ambient air. In addition, the jet core region, the shock cells, and the sound pressure level of the gas jet increase with increasing nozzle size. Importantly, the numerical simulation results of the far-field sound are in good agreement with experimental measurements in directivity.
Keywords: supersonic gas jet, Large Eddy Simulation (LES), acoustic noise, Ffowcs Williams-Hawkings (FW-H) equations, nozzle size
Procedia PDF Downloads 413
44655 Constructing the Joint Mean-Variance Regions for Univariate and Bivariate Normal Distributions: Approach Based on the Measure of Cumulative Distribution Functions
Authors: Valerii Dashuk
Abstract:
The usage of confidence intervals in economics and econometrics is widespread. To investigate a random variable more thoroughly, joint tests are applied; one such example is the joint mean-variance test. A new approach for testing such hypotheses and constructing confidence sets is introduced. Exploring both the value of the random variable and its deviation with the help of this technique allows checking simultaneously the shift and the probability of that shift (i.e., portfolio risks). Another application is based on the normal distribution, which is fully defined by its mean and variance and can therefore be tested using the introduced approach. The method is based on the difference of probability density functions. The starting point is two sets of normal distribution parameters that should be compared (whether they may be considered identical at a given significance level). Then the absolute difference in probabilities at each 'point' of the domain of these distributions is calculated. This measure is transformed to a function of cumulative distribution functions and compared to critical values. The table of critical values was designed from simulations. The approach was compared with other techniques for the univariate case. It differs qualitatively and quantitatively in ease of implementation, computation speed, and accuracy of the critical region (theoretical vs. real significance level). Stable results when working with outliers and non-normal distributions, as well as scaling possibilities, are also strong sides of the method. The main advantage of this approach is the possibility of extending it to the infinite-dimensional case, which was not possible in most previous works. At the moment, the extension to the two-dimensional case has been completed, which allows testing jointly up to 5 parameters. 
The derived technique is therefore equivalent to classic tests in standard situations but provides more efficient alternatives in nonstandard problems and on large amounts of data.
Keywords: confidence set, cumulative distribution function, hypotheses testing, normal distribution, probability density function
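The core measure described above, the integrated absolute difference between two normal densities, can be sketched numerically. The grid bounds and step count below are arbitrary choices, and the comparison against simulated critical values is omitted.

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of the normal distribution N(mu, sigma**2)."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def abs_density_difference(mu1, s1, mu2, s2, lo=-10.0, hi=10.0, n=20000):
    """Midpoint-rule integral of |f1 - f2| over a wide grid.
    It is 0 for identical parameter sets and approaches 2 when the
    densities barely overlap."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        total += abs(normal_pdf(x, mu1, s1) - normal_pdf(x, mu2, s2)) * h
    return total

print(abs_density_difference(0, 1, 0, 1))  # 0.0 for identical distributions
print(abs_density_difference(0, 1, 5, 1))  # about 1.975, near-disjoint densities
```

In the proposed test, a statistic of this kind is compared to a simulation-based critical value: a small measure supports treating the two parameter sets as identical at the chosen significance level.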
Procedia PDF Downloads 174
44654 Temporal Case-Based Reasoning System for Automatic Parking Complex
Authors: Alexander P. Eremeev, Ivan E. Kurilenko, Pavel R. Varshavskiy
Abstract:
In this paper, the application of temporal reasoning and case-based reasoning in intelligent decision support systems is considered. A method of case-based reasoning with temporal dependences for solving problems of real-time diagnostics and forecasting in intelligent decision support systems is described. This paper demonstrates how a temporal case-based reasoning system can be used in intelligent decision support systems for car access control. This work was supported by RFBR.
Keywords: analogous reasoning, case-based reasoning, intelligent decision support systems, temporal reasoning
Procedia PDF Downloads 529
44653 A Palmprint Identification System Based on a Multi-Layer Perceptron
Authors: David P. Tantua, Abdulkader Helwan
Abstract:
Biometrics has recently been used in human identification systems that rely on biological traits such as fingerprints and iris scans. Biometrics-based identification systems show great efficiency and accuracy in such applications. However, these systems have so far been based on image processing techniques only, which may decrease the efficiency of such applications. This paper therefore aims to develop a human palmprint identification system using a multi-layer perceptron neural network, which has the capability to learn using a backpropagation learning algorithm. The developed system uses images obtained from a public database available on the internet (CASIA). The processing pipeline is as follows: image filtering using a median filter, image adjustment, image skeletonizing, edge detection using the Canny operator to extract features, and clearing unwanted components of the image. The second phase feeds those processed images into a neural network classifier, which adaptively learns and creates a class for each different image. 100 different images are used for training the system. Since this is an identification system, it should be tested with the same images; therefore, the same 100 images are used for testing it, and any image outside the training set should be unrecognized. The experimental results show that the developed system has high accuracy (100%) and can be implemented in real-life applications.
Keywords: biometrics, biological traits, multi-layer perceptron neural network, image skeletonizing, edge detection using canny operator
Procedia PDF Downloads 371
44652 Mechanical Testing on Bioplastics Obtained from Banana and Potato Peels in the City of Bogotá, Colombia
Authors: Juan Eduardo Rolon Rios, Fredy Alejandro Orjuela, Alexander Garcia Mariaca
Abstract:
Banana and potato peels are processed to make animal feed, on the condition that the wastes have not begun to decompose. One alternative for taking advantage of those wastes is to obtain a bioplastic based on the starch from banana and potato peels. These products are 100% biodegradable, and researchers have been studying them for different applications, helping to reduce organic wastes and ordinary plastic wastes. Since petroleum does not affect the prices of bioplastics, the bioplastics market shows a growing tendency, which it is expected to maintain in the medium term by up to 350%. This work presents the results for the elastic modulus and percent elongation of bioplastics obtained from a mixture of starch from banana and potato peels, with glycerol as plasticizer. The experimental variables were the plasticizer percentage and the ratio of banana starch to potato starch. The results show that the bioplastics obtained can be used in different applications, such as plastic bags or straws, subject to verifying their admissible degradation percentages for each of these applications. The results also agree with the data found in the literature, in that mixtures with a larger amount of potato starch had the best mechanical properties because of the characteristics of potato starch.
Keywords: bioplastics, fruit waste, mechanical testing, mechanical properties
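The two reported quantities follow from standard tensile-test arithmetic: the elastic modulus is the slope of the initial linear stress-strain region, and percent elongation is the relative gauge-length change at break. The readings below are hypothetical, not the study's measurements.

```python
def elastic_modulus(stress, strain):
    """Slope of the initial linear region: E = d_sigma / d_epsilon.
    With stress in MPa and strain dimensionless, E is in MPa."""
    return (stress[1] - stress[0]) / (strain[1] - strain[0])

def percent_elongation(l0, l_final):
    """Elongation at break relative to the original gauge length l0."""
    return 100.0 * (l_final - l0) / l0

# Hypothetical tensile-test readings for a starch bioplastic specimen:
# 1.2 MPa at 4% strain in the linear region, breaking at 62.5 mm
# from a 50 mm gauge length.
print(elastic_modulus([0.0, 1.2], [0.0, 0.04]))  # about 30.0 MPa
print(percent_elongation(50.0, 62.5))            # 25.0 %
```

Varying the glycerol percentage typically trades stiffness for ductility, which is why both quantities are tracked across the starch mixtures.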
Procedia PDF Downloads 293
44651 Air-Coupled Ultrasonic Testing for Non-Destructive Evaluation of Various Aerospace Composite Materials by Laser Vibrometry
Authors: J. Vyas, R. Kazys, J. Sestoke
Abstract:
Air-coupled ultrasonic testing is a contactless ultrasonic measurement approach that has become widespread for material characterization in the aerospace industry, where minimal weight is always essential without compromising durability. To achieve these requirements, composite materials are widely used. This paper presents an analysis of air-coupled ultrasonics for composite materials used in the design of modern aircraft, such as CFRP (carbon fibre reinforced polymer), GLARE (glass fibre metal laminate), and honeycombs. Laser vibrometry could be a key characterization tool for aerospace components. The fundamentals of air-coupled ultrasonics, including the principles, working modes, and transducer arrangements used for this purpose, are also recounted briefly. The emphasis of this paper is on NDT techniques based on ultrasonic guided-wave applications and on the possibilities of using laser vibrometry for non-contact measurement of guided waves in different materials. A 3D assessment technique is presented that employs a single-point laser head with automatic scanning relocation of the material to assess mechanical displacement, along with the pros and cons of composite materials with defects and delaminations for aerospace applications.
Keywords: air-coupled ultrasonics, contactless measurement, laser interferometry, NDT, ultrasonic guided waves
Procedia PDF Downloads 239
44650 Cognitive Weighted Polymorphism Factor: A New Cognitive Complexity Metric
Authors: T. Francis Thamburaj, A. Aloysius
Abstract:
Polymorphism is one of the main pillars of the object-oriented paradigm. It induces hidden forms of class dependency that may impact software quality, resulting in higher costs for comprehending, debugging, testing, and maintaining the software. In this paper, a new cognitive complexity metric called the Cognitive Weighted Polymorphism Factor (CWPF) is proposed. Apart from structural software complexity, it includes cognitive complexity on the basis of polymorphism type. The cognitive weights are calibrated based on 27 empirical studies with 120 participants. A case study and experimentation with the new software metric show positive results. Further, a comparative study is made, and the correlation test proves that the CWPF complexity metric is a better, more comprehensive, and more realistic indicator of software complexity than Abreu's Polymorphism Factor (PF) complexity metric.
Keywords: cognitive complexity metric, object-oriented metrics, polymorphism factor, software metrics
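To make the contrast with Abreu's PF concrete, the sketch below computes PF (each overriding method counts as 1) next to a cognitively weighted variant in the spirit of CWPF (each override contributes a weight reflecting its comprehension difficulty). The formula shape for the weighted variant and the weight values are placeholders for illustration, not the calibrated weights or exact definition from the paper.

```python
def pf(n_overriding, new_methods, descendants):
    """Abreu's Polymorphism Factor: actual overriding methods divided by
    the maximum possible overrides, sum(new methods of class i * descendants of class i)."""
    possible = sum(n * d for n, d in zip(new_methods, descendants))
    return n_overriding / possible if possible else 0.0

def cwpf_like(override_weights, new_methods, descendants):
    """CWPF-style variant (illustrative): each overriding method contributes
    a cognitive weight instead of a flat count of 1."""
    possible = sum(n * d for n, d in zip(new_methods, descendants))
    return sum(override_weights) / possible if possible else 0.0

# Two base classes, each declaring 2 new methods and having 2 descendants;
# 4 methods are actually overridden, with assumed cognitive weights.
plain = pf(4, new_methods=[2, 2], descendants=[2, 2])
weighted = cwpf_like([1.2, 1.5, 1.2, 1.5], new_methods=[2, 2], descendants=[2, 2])
```

The weighted value exceeds the plain PF whenever the average cognitive weight is above 1, which is how such a metric penalizes harder-to-comprehend overrides.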
Procedia PDF Downloads 458
44649 IoT Based Information Processing and Computing
Authors: Mannan Ahmad Rasheed, Sawera Kanwal, Mansoor Ahmad Rasheed
Abstract:
The Internet of Things (IoT) has revolutionized the way we collect and process information, making it possible to gather data from a wide range of connected devices and sensors. This has led to the development of IoT-based information processing and computing systems capable of handling large amounts of data in real time. This paper provides a comprehensive overview of the current state of IoT-based information processing and computing, as well as the key challenges and gaps that need to be addressed. It discusses the potential benefits of such systems, including improved efficiency, enhanced decision-making, and cost savings. Despite these numerous benefits, several challenges must be addressed to realize the full potential of these systems, including security and privacy concerns, interoperability issues, the scalability and reliability of IoT devices, and the need for standardization and regulation of IoT technologies. Moreover, this paper identifies several gaps in the current research on IoT-based information processing and computing. One major gap is the lack of a comprehensive framework for designing and implementing such systems.
Keywords: IoT, computing, information processing, IoT computing
Procedia PDF Downloads 185
44648 Optimizing Usability Testing with Collaborative Method in an E-Commerce Ecosystem
Authors: Markandeya Kunchi
Abstract:
Usability testing (UT) is one of the vital steps in the user-centred design (UCD) process when designing a product. In an e-commerce ecosystem, UT becomes primary, as new products, features, and services are launched very frequently, and the company incurs losses if an unusable, inefficient product is put on the market and rejected by customers. This paper examines why UT is important in the product life-cycle of an e-commerce ecosystem. Secondary user research was conducted to find out the work patterns, development methods, types of stakeholders, technology constraints, etc. of a typical e-commerce company. Qualitative user interviews were conducted with product managers and designers to find out the structure, project planning, product management method, and role of the design team in a mid-level company. The paper addresses the usual apprehensions companies have about inculcating UT within the team, stressing factors like limited monetary resources, the lack of a usability expert, narrow timelines, and a lack of understanding from higher management as some of the primary reasons. Outsourcing UT to vendors is also very prevalent among mid-level e-commerce companies, but it has severe repercussions: very little team involvement, huge cost, misinterpretation of the findings, elongated timelines, and a lack of empathy towards the customer. The shortfalls of having no UT process in place within the team, or of conducting UT only through vendors, are bad user experiences for customers interacting with the product and badly designed products that are neither useful nor utilitarian. As a result, companies see dipping conversion rates in apps and websites, huge bounce rates, and increased uninstall rates. Thus, there was a need for a leaner UT system that could solve all these issues for the company. This paper highlights optimizing the UT process with a collaborative method.
The degree of optimization and the structure of the collaborative method are the highlights of this paper. The collaborative method of UT is one in which the company's centralised design team takes responsibility for conducting and analysing the UT. The UT is usually formative: designers take the findings into account and use them in the ideation process. The success of the collaborative method of UT is due to its ability to sync with the product management method employed by the company or team. The collaborative method focuses on engaging various teams (design, marketing, product, administration, IT, etc.), each with its own defined roles and responsibilities, in conducting a smooth UT with users in-house. The paper finally highlights the positive results of the collaborative UT method after conducting more than 100 in-lab interviews with users across the different lines of business. These include improved interaction between stakeholders and the design team, empathy towards users, improved design iteration, better sanity checks of design solutions, optimization of time and money, and effective, efficient design solutions. The future scope of collaborative UT is to make this method leaner by reducing the number of days to complete the entire project, from planning between teams to publishing the UT report.
Keywords: collaborative method, e-commerce, product management method, usability testing
Procedia PDF Downloads 119
44647 Efficient Moment Frame Structure
Authors: Mircea I. Pastrav, Cornelia Baera, Florea Dinu
Abstract:
A different concept for the design and detailing of reinforced concrete precast frame structures is analyzed in this paper. The new detailing of the joints derives from the special hybrid moment frame joints. The special reinforcements of this alternative detailing, named the modified special hybrid joint, are bondless with respect to both the column and the beams. Full-scale tests were performed on a plane model representing part of a 5-story structure, cropped at the mid-spans of the beams and columns. A theoretical approach was developed based on the test results for the twice-repaired model subjected to lateral seismic-type loading. A discussion of the behavior of the modified special hybrid joint, and of the further research needed, concludes the presentation.
Keywords: modified hybrid joint, repair, seismic loading type, acceptance criteria
Procedia PDF Downloads 523
44646 Development of Biodegradable Plastic as Mango Fruit Bag
Authors: Andres M. Tuates Jr., Ofero A. Caparino
Abstract:
Plastics have achieved a dominant position in agriculture because of their transparency, light weight, impermeability to water, and resistance to microbial attack. However, this generates a large quantity of waste that is difficult for farmers to dispose of. To address these problems, the project aims to develop and evaluate a biodegradable film for mango fruit bags used during fruit development. PBS and starch were melt-blended in a twin-screw extruder and then blown in a film extrusion machine. The physico-chemical and mechanical properties of the biodegradable fruit bag were determined following standard test methods. Field testing of the fruit bag was also conducted to evaluate its durability and efficiency under field conditions. The PHilMech-FiC fruit bag is made of biodegradable material measuring 6 x 8 inches with a thickness of 150 microns. Its tensile strength is within the range of LDPE, while its elongation is within the range of HDPE. It is projected that after thirty-six (36) weeks the film will be totally degraded. The field testing results show that the quality of fruits harvested using the PHilMech-FiC biodegradable fruit bag, in terms of percent marketable, non-marketable, and export grade, peel color at the ripe stage, flesh color, total soluble solids (TSS, °Brix), and percent edible portion, is comparable with existing bagging materials such as the Chinese brown paper bag and old newspaper.
Keywords: cassava starch, PBS, biodegradable, chemical, mechanical properties
Procedia PDF Downloads 277
44645 Analysis and Prediction of COVID-19 by Using Recurrent LSTM Neural Network Model in Machine Learning
Authors: Grienggrai Rajchakit
Abstract:
Coronavirus has been declared a pandemic by the WHO, and it spread all over the world within a few days. To control this spread, maintaining social distance and self-preventive measures are the best strategies for every citizen. As of now, many researchers and scientists are continuing their research into finding an exact vaccine. Machine learning models find that the coronavirus disease behaves in an exponential manner. To abolish the consequences of this pandemic, efficient steps should be taken to analyze the disease. In this paper, a recurrent neural network model is chosen to predict the number of active cases in a particular state. This prediction requires a database: the COVID-19 database is downloaded from the KAGGLE website and analyzed by applying a recurrent LSTM neural network with univariate features to predict the number of active cases of patients suffering from the coronavirus. The downloaded database is divided into sets for training and testing the chosen neural network model. The model is trained with the training data set and tested with the testing data set to predict the number of active cases in a particular state; here, we have concentrated on the state of Andhra Pradesh.
Keywords: COVID-19, coronavirus, KAGGLE, LSTM neural network, machine learning
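The supervised framing described above — turning a univariate case-count series into (input window, next value) pairs and splitting them chronologically for an LSTM forecaster — can be sketched as follows. The lookback length, the toy series, and the 75/25 split are assumptions for illustration, not the paper's actual settings.

```python
def make_windows(series, lookback):
    """Turn a univariate series into (input window, next value) pairs,
    the usual supervised framing for an LSTM forecaster."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return X, y

# Toy, roughly exponential active-case series (hypothetical numbers).
active_cases = [10, 14, 20, 28, 40, 56, 79, 111]
X, y = make_windows(active_cases, lookback=3)

# Chronological split: never shuffle time series before splitting,
# or the model would be tested on data older than its training data.
split = int(len(X) * 0.75)
X_train, y_train = X[:split], y[:split]
X_test, y_test = X[split:], y[split:]
```

Each `X` row would then be reshaped to (samples, timesteps, 1 feature) before being fed to an LSTM layer.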
Procedia PDF Downloads 160
44644 A Novel Approach of NPSO on Flexible Logistic (S-Shaped) Model for Software Reliability Prediction
Authors: Pooja Rani, G. S. Mahapatra, S. K. Pandey
Abstract:
In this paper, we propose a novel approach combining neural network and particle swarm optimization methods for software reliability prediction. We first explain how to apply a compound function in a neural network so that we can derive a Flexible Logistic (S-shaped) Growth Curve (FLGC) model. This model mathematically represents software failure as a random process and can be used to evaluate software development status during testing. To avoid becoming trapped in local minima, we apply the particle swarm optimization method to train the proposed model using failure test data sets. We derive our proposed model using computation-based intelligence modeling; thus, the proposed model becomes a Neuro-Particle Swarm Optimization (NPSO) model. We test results with different inertia weights for the particle position and velocity updates, and obtain results based on the best inertia weight, compared with personal-best-oriented PSO (pPSO), which helps choose the local best in the network neighborhood. The applicability of the proposed model is demonstrated on a real-time test data failure set. The results obtained from the experiments show that the proposed model has a fairly accurate prediction capability for software reliability.
Keywords: software reliability, flexible logistic growth curve model, software cumulative failure prediction, neural network, particle swarm optimization
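The PSO training loop at the heart of such a model — inertia-weighted velocity updates pulled toward each particle's personal best and the swarm's global best — can be sketched on a toy objective as below. This is a generic PSO core under assumed parameters (w=0.7, c1=c2=1.5), shown here minimizing a sphere function, not the paper's FLGC training code.

```python
import random

def pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (minimization).

    Velocity update: inertia term w, cognitive pull c1 toward each
    particle's personal best, social pull c2 toward the global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function: global minimum 0 at the origin.
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```

In the paper's setting, `f` would instead be the fitting error of the FLGC model on the failure data set, and each particle would encode the model's parameters.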
Procedia PDF Downloads 344
44643 Characterization of Filled HNBR Elastomers for Sealing Application in Cold Climate Areas
Authors: Anton G. Akulichev, Avinash Tiwari, Ben Alcock, Andreas Echtermeyer
Abstract:
Low temperatures are known to pose a major threat to polymers; many are prone to excessive stiffness or even brittleness. There is a technology gap between the properties of existing elastomeric sealing materials and the properties needed for service in extremely cold regions. Moreover, some aspects of the low-temperature behaviour of rubber are not thoroughly studied and understood. This paper presents the results of laboratory testing of a conventional oilfield HNBR (hydrogenated nitrile butadiene rubber) elastomer at low climatic temperatures above and below its glass transition point, as well as the performance of some filled HNBR formulations. Particular emphasis in the experiments is put on the viscoelastic characteristics of the rubber, studied by dynamic mechanical analysis (DMA), and on quasi-static mechanical test results at low temperatures. As demonstrated by the stress relaxation and DMA experiments, the transition region near the Tg of the studied compound has the most striking features, such as rapid stress relaxation, compared to the glassy and rubbery plateaus. In addition, the quasi-static experiments show that molecular movement below Tg is not completely frozen but is rather evident, manifested in a certain stress decay as well. The effect of temperature and filler additions on typical mechanical and other properties of the materials is also discussed.
Keywords: characterization, filled elastomers, HNBR, low temperature
Procedia PDF Downloads 313
44642 Study of the Effect of Sewing on Non Woven Textile Waste at Dry and Composite Scales
Authors: Wafa Baccouch, Adel Ghith, Xavier Legrand, Faten Fayala
Abstract:
Textile waste recycling has become a necessity, considering the increasing amount of waste generated each year and the ecological problems that landfilling and burning can cause. Textile waste can be recycled into many different forms according to its composition and its final use. Using this waste as reinforcement for composite panels is a new recycling area under study. Compared to virgin fabrics, recycled ones have the disadvantage of lower structural characteristics, while being eco-friendly and low cost. The objective of this work is to transform textile waste into a composite material with good characteristics and a low price. In this study, sewing was used as a method to improve the characteristics of recycled textile waste so that it could serve as reinforcement for a composite material. Non-woven textile waste was provided by a local textile recycling company. Performance was evaluated on a tensile testing machine, along the testing direction for both reinforcements and composite panels: the machine and transverse directions. Tensile tests were conducted on sewn and non-sewn fabrics, which were then used as reinforcements for composite panels made via the epoxy resin infusion method. The rule of mixtures is used to predict composite characteristics, which are then compared to experimental ones.
Keywords: composite material, epoxy resin, non-woven waste, recycling, sewing, textile
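The rule of mixtures used above to predict composite properties is the standard volume-weighted average of the constituent moduli. A minimal sketch, with hypothetical moduli for the recycled non-woven reinforcement and the epoxy matrix (the study's real values come from its tensile tests):

```python
def rule_of_mixtures(E_fibre, E_matrix, v_fibre):
    """Longitudinal composite modulus by the rule of mixtures:
    E_c = E_f * V_f + E_m * (1 - V_f), with V_f the fibre volume fraction."""
    return E_fibre * v_fibre + E_matrix * (1.0 - v_fibre)

# Hypothetical values (GPa): recycled non-woven reinforcement in an
# epoxy matrix at 40% fibre volume fraction.
E_c = rule_of_mixtures(E_fibre=5.0, E_matrix=3.0, v_fibre=0.4)
```

This upper-bound estimate assumes loading parallel to the reinforcement; the experimental moduli along the machine and transverse directions would typically fall at or below it.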
Procedia PDF Downloads 586