Search results for: dispensing errors
620 Compression-Extrusion Test to Assess Texture of Thickened Liquids for Dysphagia
Authors: Jesus Salmeron, Carmen De Vega, Maria Soledad Vicente, Mireia Olabarria, Olaia Martinez
Abstract:
Dysphagia, or difficulty in swallowing, mostly affects elderly people: 56-78% of the institutionalized and 44% of the hospitalized. Thickening liquid food is a necessary measure in this situation because it reduces the risk of penetration-aspiration. Until now, and as proposed by the American Dietetic Association in 2002, possible consistencies have been categorized into three groups according to their viscosity: nectar (50-350 mPa·s), honey (350-1750 mPa·s) and pudding (>1750 mPa·s). The adequate viscosity level should be identified for every patient according to her/his impairment. Nevertheless, a systematic review on the dysphagia diet performed recently indicated that there is no evidence of any clinically relevant transition between the three levels proposed. It was also stated that other physical properties of the bolus (slipperiness, density or cohesiveness, among others) could influence swallowing in affected patients and could contribute to the amount of remaining residue. Texture parameters therefore need to be evaluated as a possible alternative to viscosity. The aim of this study was to evaluate the instrumental extrusion-compression test as a possible tool to characterize changes over time in water thickened with various products at the three theoretical consistencies. Six commercial thickeners were used: NM® (NM), Multi-thick® (M), Nutilis Powder® (Nut), Resource® (R), Thick&Easy® (TE) and Vegenat® (V), all of them with a modified starch base. Only one of them, Nut, also contained 6.4% gum (guar, tara and xanthan). They were prepared as indicated in the instructions of each product, dispensing the corresponding amount for nectar, honey and pudding consistencies into 300 mL of tap water at 18ºC-20ºC. The mixture was stirred for about 30 s. Once it was homogeneously dispersed, it was dispensed into 30 mL plastic glasses, always to the same height. Each of these glasses was used as a measuring point. Viscosity was measured using a rotational viscometer (ST-2001, Selecta, Barcelona). The extrusion-compression test was performed using a TA.XT2i texture analyzer (Stable Micro Systems, UK) with a 25 mm diameter cylindrical probe (SMSP/25). Penetration distance was set at 10 mm and speed at 3 mm/s. Measurements were made at 1, 5, 10, 20, 30, 40, 50 and 60 minutes from the moment the samples were mixed. From the force (g)–time (s) curves obtained in the instrumental assays, the maximum force peak (F) was chosen as the reference parameter. Viscosity (mPa·s) and F (g) proved to be highly correlated and developed similarly over time, following time-dependent quadratic models. It was possible to predict viscosity using F as an independent variable, as they were linearly correlated. In conclusion, the compression-extrusion test could be an alternative and useful tool to assess the physical characteristics of thickened liquids.
Keywords: compression-extrusion test, dysphagia, texture analyzer, thickener
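As an illustration of the linear viscosity–F relation described above, the following sketch fits a straight line to hypothetical paired measurements; the numbers are invented for demonstration and are not the study's data.

```python
import numpy as np

# Hypothetical paired measurements from one thickener at nectar consistency:
# maximum force peak F (g) from the extrusion-compression test and the
# viscosity (mPa·s) measured with the rotational viscometer.
F = np.array([18.0, 22.5, 27.1, 31.4, 35.8, 39.2, 42.0, 44.5])   # g
viscosity = np.array([120, 160, 205, 248, 290, 325, 352, 375])   # mPa·s

# Fit the linear relation viscosity = a*F + b suggested in the abstract.
a, b = np.polyfit(F, viscosity, 1)
predicted = a * F + b

# Coefficient of determination as a quick check of the correlation.
ss_res = np.sum((viscosity - predicted) ** 2)
ss_tot = np.sum((viscosity - viscosity.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot

print(f"viscosity ≈ {a:.2f}·F + {b:.2f}  (R² = {r2:.3f})")
```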
Procedia PDF Downloads 368
619 Localization of Mobile Robots with Omnidirectional Cameras
Authors: Tatsuya Kato, Masanobu Nagata, Hidetoshi Nakashima, Kazunori Matsuo
Abstract:
Localization of mobile robots is an important task in developing autonomous mobile robots. This paper proposes a method to estimate the position of a mobile robot using an omnidirectional camera mounted on the robot. Landmarks serving as reference points are set up on the field where the robot works. The omnidirectional camera, which can capture images of the full 360° surroundings, takes photographs of these landmarks. The positions of the robot are estimated from the directions of these landmarks, which are extracted from the images by image processing. This method obtains the robot positions without accumulating position errors. The accuracy of the robot positions estimated by the proposed method is evaluated through experiments. The results show that it can obtain the positions with small standard deviations. Therefore, the method offers the possibility of even more accurate localization through tuning of appropriate offset parameters.
Keywords: mobile robots, localization, omnidirectional camera, estimating positions
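A minimal sketch of bearing-only position estimation of the kind described: each landmark bearing defines a line through that landmark, and the robot position is the least-squares intersection of those lines. The landmark coordinates, noise level and assumption of a known heading are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical landmark positions on the field (known, in metres).
landmarks = np.array([[0.0, 0.0],
                      [4.0, 0.0],
                      [4.0, 3.0],
                      [0.0, 3.0]])

# Hypothetical absolute bearing angles (rad) to each landmark, extracted
# from the omnidirectional image (robot heading assumed known/compensated).
true_pos = np.array([1.5, 1.0])
bearings = np.arctan2(landmarks[:, 1] - true_pos[1],
                      landmarks[:, 0] - true_pos[0])
bearings += rng.normal(0.0, 0.01, size=bearings.shape)   # measurement noise

# Each bearing constrains the robot to a line through the landmark:
#   sin(theta)*x - cos(theta)*y = sin(theta)*Lx - cos(theta)*Ly
A = np.column_stack([np.sin(bearings), -np.cos(bearings)])
b = np.sin(bearings) * landmarks[:, 0] - np.cos(bearings) * landmarks[:, 1]

estimate, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", estimate)
```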
Procedia PDF Downloads 442
618 Process Safety Management Digitalization via SHEQTool based on Occupational Safety and Health Administration and Center for Chemical Process Safety, a Case Study in Petrochemical Companies
Authors: Saeed Nazari, Masoom Nazari, Ali Hejazi, Siamak Sanoobari Ghazi Jahani, Mohammad Dehghani, Javad Vakili
Abstract:
More than ever, digitization is an imperative for businesses to keep their competitive advantages, foster innovation and reduce paperwork. To design and successfully implement digital transformation initiatives within process safety management system, employees need to be equipped with the right tool, frameworks, and best practices. we developed a unique full stack application so-called SHEQTool which is entirely dynamic based on our extensive expertise, experience, and client feedback to help business processes particularly operations safety management. We use our best knowledge and scientific methodologies published by CCPS and OSHA Guidelines to streamline operations and integrated them into task management within Petrochemical Companies. We digitalize their main process safety management system elements and their sub elements such as hazard identification and risk management, training and communication, inspection and audit, critical changes management, contractor management, permit to work, pre-start-up safety review, incident reporting and investigation, emergency response plan, personal protective equipment, occupational health, and action management in a fully customizable manner with no programming needs for users. We review the feedback from main actors within petrochemical plant which highlights improving their business performance and productivity as well as keep tracking their functions’ key performance indicators (KPIs) because it; 1) saves time, resources, and costs of all paperwork on our businesses (by Digitalization); 2) reduces errors and improve performance within management system by covering most of daily software needs of the organization and reduce complexity and associated costs of numerous tools and their required training (One Tool Approach); 3) focuses on management systems and integrate functions and put them into traceable task management (RASCI and Flowcharting); 4) helps the entire enterprise be resilient to any change of your processes, technologies, assets with minimum costs (through Organizational Resilience); 5) reduces significantly incidents and errors via world class safety management programs and elements (by Simplification); 6) gives the companies a systematic, traceable, risk based, process based, and science based integrated management system (via proper Methodologies); 7) helps business processes complies with ISO 9001, ISO 14001, ISO 45001, ISO 31000, best practices as well as legal regulations by PDCA approach (Compliance).Keywords: process, safety, digitalization, management, risk, incident, SHEQTool, OSHA, CCPS
Procedia PDF Downloads 66
617 Critical Comparison of Two Teaching Methods: The Grammar Translation Method and the Communicative Teaching Method
Authors: Aicha Zohbie
Abstract:
The purpose of this paper is to critically compare two teaching methods: the communicative method and the grammar-translation method. The paper presents the importance of language awareness as an approach to teaching and learning language and some challenges that language teachers face. In addition, the paper strives to determine whether the adoption of the communicative teaching method or the grammar-translation method would be more effective for teaching a language. A variety of features are considered in comparing the two methods: the purpose of each method, techniques used, teachers’ and students’ roles, the use of L1, the skills that are emphasized, the correction of students’ errors, and the students’ assessments. Finally, the paper includes suggestions and recommendations for implementing an approach that best meets the students’ needs in a classroom.
Keywords: language teaching methods, language awareness, communicative method, grammar translation method, advantages and disadvantages
Procedia PDF Downloads 151
616 Understanding Project Failures in Construction: The Critical Impact of Financial Capacity
Authors: Nnadi Ezekiel Oluwaseun Ejiofor
Abstract:
This research investigates the effects of poor cost estimation, material cost variations, and payment punctuality on the financial health and execution of construction projects in Nigeria. To achieve the objectives of the study, a quantitative research approach was employed, and data were gathered through an online survey of 74 construction industry professionals consisting of quantity surveyors, contractors, and other professionals. The study surveyed input on cost estimation errors, price fluctuations, and payment delays, among other factors. The responses were analyzed using a five-point Likert scale and the Relative Importance Index (RII). The findings demonstrated that errors in cost estimating in the Bill of Quantities (BOQ) have a highly negative impact on the reputation and image of the participants in the projects. The greatest effect was on the likelihood of contractors obtaining future work (mean value = 3.42), followed by the likelihood of quantity surveyors obtaining new commissions (mean value = 3.40). Cost underestimation exposes participants to risks, the most serious relating to the ease of construction and the effects of a shortage of funds, with the associated fear of bankruptcy (mean value = 3.78). There was also considerable financial damage as a result of cost underestimation, with contractors suffering the worst loss in profit (mean value = 3.88). Every expense carries its own risk and uncertainty, and pressure on the cost of materials and every other expense attributed to the building and completion of a structure adds risk to the performance figures of a project. The greatest weight (mean importance score = 4.92) was attributed to market inflation in building materials, while the second greatest weight (mean importance score = 4.76) was due to increased transportation charges. Delays in payment arising from client issues such as poor availability of funds (RII = 0.71) and contractual issues such as disagreements on the valuation of works done (RII = 0.72) were also found to lead to project delays and additional costs. The results affirm the importance of proper cost estimation for the health of organizational finances, for project risk, and for finishing within set time limits. It is proposed to improve costing methods, foster better communication with stakeholders, and manage delays through contractual and financial controls. This study enhances the existing literature on construction project management by suggesting ways to deal with adverse cost inaccuracies and with material availability problems caused by payment delays, which, if addressed, would greatly improve the economic performance of the construction business.
Keywords: cost estimation, construction project management, material price fluctuations, payment delays, financial impact
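For readers unfamiliar with the Relative Importance Index used in the analysis, the sketch below computes RII = ΣW/(A·N) from a set of five-point Likert responses; the responses are randomly generated placeholders, not the survey data.

```python
import numpy as np

def relative_importance_index(responses, max_scale=5):
    """RII = sum(W) / (A * N), where W are the Likert weights (1..A),
    A is the highest possible weight and N the number of respondents."""
    responses = np.asarray(responses)
    return responses.sum() / (max_scale * len(responses))

# Hypothetical five-point Likert responses from 74 professionals for one factor
# (e.g. "delays in payment due to poor availability of funds").
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=74)

print(f"RII  = {relative_importance_index(responses):.2f}")
print(f"mean = {responses.mean():.2f}")
```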
Procedia PDF Downloads 8
615 Determinants of Profitability in Indian Pharmaceutical Firms in the New Intellectual Property Rights Regime
Authors: Shilpi Tyagi, D. K. Nauriyal
Abstract:
This study investigates the firm-level determinants of profitability in the Indian drug and pharmaceutical industry. The study uses inflation-adjusted panel data for the period 2000-2013 and applies an OLS regression model with Driscoll-Kraay standard errors. It has been found that export intensity, A&M intensity, the firm’s market power and the stronger patent regime dummy exercised a positive influence on profitability. The negative and statistically significant influence of R&D intensity and raw material import intensity points to the need for firms to adopt suitable investment strategies. The study suggests that firms need to pay far more attention to optimizing their operating expenditures and advertisement and marketing expenditures, and to improving their export orientation, as part of their long-term strategy.
Keywords: Indian pharmaceutical industry, profits, TRIPS, performance
Procedia PDF Downloads 436
614 A Novel Approach to Design of EDDR Architecture for High Speed Motion Estimation Testing Applications
Authors: T. Gangadhararao, K. Krishna Kishore
Abstract:
Motion estimation (ME) plays a critical role in a video coder, so testing such a module is of primary concern. Focusing on the testing of ME in a video coding system, this work presents an error detection and data recovery (EDDR) design, based on the residue-and-quotient (RQ) code, to be embedded into ME for video coding testing applications. Errors in the processing elements (PEs), i.e. the key components of an ME, can be detected and the data recovered effectively by using the proposed EDDR design. The proposed EDDR design for ME testing can detect errors and recover data with an acceptable area overhead and timing penalty.
Keywords: area overhead, data recovery, error detection, motion estimation, reliability, residue-and-quotient (RQ) code
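A minimal sketch of the residue-and-quotient idea underlying the EDDR design: an operand is split into a quotient and a residue, and a mismatch between the recomputed value and the PE output flags an error. The modulus and word values are illustrative; the actual hardware encoding in the paper is more involved.

```python
def rq_encode(x: int, m: int = 64) -> tuple[int, int]:
    """Residue-and-quotient (RQ) encoding: x is split into a quotient and a
    residue with respect to the modulus m, so that x == q * m + r."""
    return divmod(x, m)          # (quotient q, residue r)

def rq_check(x: int, q: int, r: int, m: int = 64) -> bool:
    """Error detection: the value recomputed from (q, r) must equal x."""
    return q * m + r == x and 0 <= r < m

# Hypothetical PE output word and its RQ code computed in parallel.
x = 1234
q, r = rq_encode(x)
print(rq_check(x, q, r))            # True  -> no error detected

# A bit flip in the PE output no longer matches (q, r), so it is detected;
# the faulty word can then be recovered from the RQ code as q * m + r.
corrupted = x ^ 0b100000
print(rq_check(corrupted, q, r))    # False -> error detected
print("recovered:", q * 64 + r)     # data recovery
```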
Procedia PDF Downloads 432
613 The Mitigation of Quercetin on Lead-Induced Neuroinflammation in a Rat Model: Changes in Neuroinflammatory Markers and Memory
Authors: Iliyasu Musa Omoyine, Musa Sunday Abraham, Oladele Sunday Blessing, Iliya Ibrahim Abdullahi, Ibegbu Augustine Oseloka, Nuhu Nana-Hawau, Animoku Abdulrazaq Amoto, Yusuf Abdullateef Onoruoiza, Sambo Sohnap James, Akpulu Steven Peter, Ajayi Abayomi
Abstract:
The neuroprotective role of inflammation from detrimental intrinsic and extrinsic factors has been reported. However, the overactivation of astrocytes and microglia due to lead toxicity produce excessive pro-inflammatory cytokines, mediating neurodegenerative diseases. The present study investigated the mitigatory effects of quercetin on neuroinflammation, correlating with memory function in lead-exposed rats. In this study, Wistar rats were administered orally with Quercetin (Q: 60 mg/kg) and Succimer as a standard drug (S: 10 mg/kg) for 21 days after lead exposure (Pb: 125 mg/kg) of 21 days or in combination with Pb, once daily for 42 days. Working and reference memory was assessed using an Eight-arm radial water maze (8-ARWM). The changes in brain lead level, the neuronal nitric oxide synthase (nNOS) activity, and the level of neuroinflammatory markers such as tumour necrosis factor-alpha (TNF-α) and Interleukin 1 Beta (IL-1β) were determined. Immunohistochemically, astrocyte expression was evaluated. The results showed that the brain level of lead was increased significantly in lead-exposed rats. The expression of astrocytes increased in the CA3 and CA1 regions of the hippocampus, and the levels of brain TNF-α and IL-1β increased in lead-exposed rats. Lead impaired reference and working memory by increasing reference memory errors and working memory incorrect errors in lead-exposed rats. However, quercetin treatment effectively improved memory and inhibited neuroinflammation by reducing astrocytes’ expression and the levels of TNF-α and IL-1β. The expression of astrocytes and the levels of TNF-α and IL-1β correlated with memory function. The possible explanation for quercetin’s anti-neuroinflammatory effect is that it modulates the activity of cellular proteins involved in the inflammatory response; inhibits the transcription factor of nuclear factor-kappa B (NF-κB), which regulates the expression of proinflammatory molecules; inhibits kinases required for the synthesis of Glial fibrillary acidic protein (GFAP) and modifies the phosphorylation of some proteins, which affect the structure and function of intermediate filament proteins; and, lastly, induces Cyclic-AMP Response Element Binding (CREB) activation and neurogenesis as a compensatory mechanism for memory deficits and neuronal cell death. In conclusion, the levels of neuroinflammatory markers negatively correlated with memory function. Thus, quercetin may be a promising therapy in neuroinflammation and memory dysfunction in populations prone to lead exposure.Keywords: lead, quercetin, neuroinflammation, memory
Procedia PDF Downloads 54
612 Case Study Analysis of 2017 European Railway Traffic Management Incident: The Application of System for Investigation of Railway Interfaces Methodology
Authors: Sanjeev Kumar Appicharla
Abstract:
This paper presents the results of the modelling and analysis of the European Railway Traffic Management (ERTMS) safety-critical incident to raise awareness of biases in the systems engineering process on the Cambrian Railway in the UK using the RAIB 17/2019 as a primary input. The RAIB, the UK independent accident investigator, published the Report- RAIB 17/2019 giving the details of their investigation of the focal event in the form of immediate cause, causal factors, and underlying factors and recommendations to prevent a repeat of the safety-critical incident on the Cambrian Line. The Systems for Investigation of Railway Interfaces (SIRI) is the methodology used to model and analyze the safety-critical incident. The SIRI methodology uses the Swiss Cheese Model to model the incident and identify latent failure conditions (potentially less than adequate conditions) by means of the management oversight and risk tree technique. The benefits of the systems for investigation of railway interfaces methodology (SIRI) are threefold: first is that it incorporates the “Heuristics and Biases” approach advanced by 2002 Nobel laureate in Economic Sciences, Prof Daniel Kahneman, in the management oversight and risk tree technique to identify systematic errors. Civil engineering and programme management railway professionals are aware of the role “optimism bias” plays in programme cost overruns and are aware of bow tie (fault and event tree) model-based safety risk modelling techniques. However, the role of systematic errors due to “Heuristics and Biases” is not appreciated as yet. This overcomes the problems of omission of human and organizational factors from accident analysis. Second, the scope of the investigation includes all levels of the socio-technical system, including government, regulatory, railway safety bodies, duty holders, signaling firms and transport planners, and front-line staff such that lessons are learned at the decision making and implementation level as well. Third, the author’s past accident case studies are supplemented with research pieces of evidence drawn from the practitioner's and academic researchers’ publications as well. This is to discuss the role of system thinking to improve the decision-making and risk management processes and practices in the IEC 15288 systems engineering standard and in the industrial context such as the GB railways and artificial intelligence (AI) contexts as well.Keywords: accident analysis, AI algorithm internal audit, bounded rationality, Byzantine failures, heuristics and biases approach
Procedia PDF Downloads 188
611 Cubic Trigonometric B-Spline Approach to Numerical Solution of Wave Equation
Authors: Shazalina Mat Zin, Ahmad Abd. Majid, Ahmad Izani Md. Ismail, Muhammad Abbas
Abstract:
The generalized wave equation models various problems in science and engineering. In this paper, a new three-time-level implicit approach based on the cubic trigonometric B-spline is developed for the approximate solution of the wave equation. The usual finite difference approach is used to discretize the time derivative, while the cubic trigonometric B-spline is applied as an interpolating function in the space dimension. Von Neumann stability analysis is used to analyze the proposed method. Two problems are discussed to exhibit the feasibility and capability of the method. The absolute errors and the maximum error are computed to assess the performance of the proposed method. The results were found to be in good agreement with known solutions and with existing schemes in the literature.
Keywords: collocation method, cubic trigonometric B-spline, finite difference, wave equation
Procedia PDF Downloads 542
610 Secure Optical Communication System Using Quantum Cryptography
Authors: Ehab AbdulRazzaq Hussein
Abstract:
Quantum cryptography (QC) is an emerging technology for secure key distribution with single-photon transmissions. In contrast to classical cryptographic schemes, the security of QC schemes is guaranteed by the fundamental laws of nature. Their security stems from the impossibility of distinguishing non-orthogonal quantum states with certainty. A potential eavesdropper introduces errors into the transmissions, which can later be discovered by the legitimate participants of the communication. In this paper, a modeling approach is proposed for the QC protocol BB84 using polarization coding. A single-photon source is assumed in the designed models, so Eve cannot use a beam-splitting strategy to eavesdrop on the quantum channel. The only eavesdropping strategy available to Eve is the intercept/resend strategy. After the quantum transmission of the QC protocol, the quantum bit error rate (QBER) is estimated and compared with a threshold value. If it is above this value, the procedure must be aborted and repeated later.
Keywords: security, key distribution, cryptography, quantum protocols, Quantum Cryptography (QC), Quantum Key Distribution (QKD)
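The following simulation sketches the BB84 sifting and QBER estimation step described above, including an optional intercept/resend eavesdropper; the photon count and the otherwise noise-free channel are simplifying assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000                                   # number of single photons sent

# Alice: random bits and random bases (0 = rectilinear, 1 = diagonal).
alice_bits  = rng.integers(0, 2, n)
alice_bases = rng.integers(0, 2, n)

# Optional intercept/resend eavesdropper measuring in random bases.
eve_present = True
if eve_present:
    eve_bases = rng.integers(0, 2, n)
    eve_bits = np.where(eve_bases == alice_bases,
                        alice_bits, rng.integers(0, 2, n))
    channel_bits, channel_bases = eve_bits, eve_bases
else:
    channel_bits, channel_bases = alice_bits, alice_bases

# Bob measures in random bases; a wrong basis gives a random outcome.
bob_bases = rng.integers(0, 2, n)
bob_bits = np.where(bob_bases == channel_bases,
                    channel_bits, rng.integers(0, 2, n))

# Sifting: keep only positions where Alice's and Bob's bases match.
sift = alice_bases == bob_bases
qber = np.mean(alice_bits[sift] != bob_bits[sift])
print(f"sifted key length = {sift.sum()}, QBER = {qber:.3f}")   # ~0.25 with Eve
```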
Procedia PDF Downloads 406
609 Modification of Fick’s First Law by Introducing the Time Delay
Authors: H. Namazi, H. T. N. Kuan
Abstract:
Fick's first law relates the diffusive flux to the concentration field by postulating that the flux goes from regions of high concentration to regions of low concentration, with a magnitude proportional to the concentration gradient (spatial derivative). Clearly, the diffusion of flux cannot be instantaneous, and there should be some time delay in this propagation. However, Fick’s first law does not consider this delay, which results in errors, especially when there is a considerable time delay in the process. In this paper, we introduce a time delay into Fick’s first law. With this modification, we account for the fact that the diffusion of flux cannot be instantaneous. In order to verify this claim, an example application in fluid diffusion is discussed, and the results of the modified Fick’s first law, the original Fick’s first law and the experimental results are compared. The results of this comparison attest to the accuracy of the modified model. The modified model can be used in any application where the time delay is considerable and neglecting its effect would lead to undesirable errors.
Keywords: Fick's first law, flux, diffusion, time delay, modified Fick’s first law
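The abstract does not give the exact delayed form; one plausible way to write the modification, assuming the delay enters as a retarded argument of the concentration gradient, is:

```latex
% Classical Fick's first law: the flux responds instantaneously
% to the concentration gradient.
J(x,t) = -D\,\frac{\partial \varphi(x,t)}{\partial x}

% A delayed form consistent with the abstract's argument: the flux at
% time t responds to the gradient at the earlier time t - \tau.
J(x,t) = -D\,\frac{\partial \varphi(x,\,t-\tau)}{\partial x}
```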
Procedia PDF Downloads 408
608 Behavioral and EEG Reactions in Native Turkic-Speaking Inhabitants of Siberia and Siberian Russians during Recognition of Syntactic Errors in Sentences in Native and Foreign Languages
Authors: Tatiana N. Astakhova, Alexander E. Saprygin, Tatyana A. Golovko, Alexander N. Savostyanov, Mikhail S. Vlasov, Natalia V. Borisova, Alexandera G. Karpova, Urana N. Kavai-ool, Elena D. Mokur-ool, Nikolay A. Kolchanov, Lubomir I. Aftanas
Abstract:
The aim of the study is to compare behaviorally and EEG reactions in Turkic-speaking inhabitants of Siberia (Tuvinians and Yakuts) and Russians during the recognition of syntax errors in native and foreign languages. 63 healthy aboriginals of the Tyva Republic, 29 inhabitants of the Sakha (Yakutia) Republic, and 55 Russians from Novosibirsk participated in the study. All participants completed a linguistic task, in which they had to find a syntax error in the written sentences. Russian participants completed the task in Russian and in English. Tuvinian and Yakut participants completed the task in Russian, English, and Tuvinian or Yakut, respectively. EEG’s were recorded during the solving of tasks. For Russian participants, EEG's were recorded using 128-channels. The electrodes were placed according to the extended International 10-10 system, and the signals were amplified using ‘Neuroscan (USA)’ amplifiers. For Tuvinians and Yakuts EEG's were recorded using 64-channels and amplifiers Brain Products, Germany. In all groups 0.3-100 Hz analog filtering, sampling rate 1000 Hz were used. Response speed and the accuracy of recognition error were used as parameters of behavioral reactions. Event-related potentials (ERP) responses P300 and P600 were used as indicators of brain activity. The accuracy of solving tasks and response speed in Russians were higher for Russian than for English. The P300 amplitudes in Russians were higher for English; the P600 amplitudes in the left temporal cortex were higher for the Russian language. Both Tuvinians and Yakuts have no difference in accuracy of solving tasks in Russian and in their respective national languages (Tuvinian and Yakut). However, the response speed was faster for tasks in Russian than for tasks in their national language. Tuvinians and Yakuts showed bad accuracy in English, but the response speed was higher for English than for Russian and the national languages. With Tuvinians, there were no differences in the P300 and P600 amplitudes and in cortical topology for Russian and Tuvinian, but there was a difference for English. In Yakuts, the P300 and P600 amplitudes and topology of ERP for Russian were the same as Russians had for Russian. In Yakuts, brain reactions during Yakut and English comprehension had no difference and were reflected foreign language comprehension -while the Russian language comprehension was reflected native language comprehension. We found out that the Tuvinians recognized both Russian and Tuvinian as native languages, and English as a foreign language. The Yakuts recognized both English and Yakut as a foreign language, only Russian as a native language. According to the inquirer, both Tuvinians and Yakuts use the national language as a spoken language, whereas they don’t use it for writing. It can well be a reason that Yakuts perceive the Yakut writing language as a foreign language while writing Russian as their native.Keywords: EEG, language comprehension, native and foreign languages, Siberian inhabitants
Procedia PDF Downloads 532
607 Kinetic Study of Thermal Degradation of a Lignin Nanoparticle-Reinforced Phenolic Foam
Authors: Juan C. Domínguez, Belén Del Saz-Orozco, María V. Alonso, Mercedes Oliet, Francisco Rodríguez
Abstract:
In the present study, the kinetics of thermal degradation of a phenolic foam, a lignin-reinforced phenolic foam, and the lignin used as reinforcement were studied, and the activation energies of their degradation processes were obtained by a DAEM model. The mean activation energies, averaged over five heating rates, were: 99.1, 128.2, and 144.0 kJ·mol-1 for the phenolic foam; 109.5, 113.3, and 153.0 kJ·mol-1 for the lignin reinforcement; and 82.1, 106.9, and 124.4 kJ·mol-1 for the lignin-reinforced phenolic foam. The standard deviation ranges calculated for each sample were 1.27-8.85, 2.22-12.82, and 3.17-8.11 kJ·mol-1 for the phenolic foam, the lignin and the reinforced foam, respectively. The DAEM model showed low mean square errors (< 1x10-5), proving that it is a suitable model to study the kinetics of thermal degradation of the foams and the reinforcement.
Keywords: kinetics, lignin, phenolic foam, thermal degradation
Procedia PDF Downloads 488
606 Sliding Mode Control of Autonomous Underwater Vehicles
Authors: Ahmad Forouzantabar, Mohammad Azadi, Alireza Alesaadi
Abstract:
This paper describes a sliding mode controller for autonomous underwater vehicles (AUVs). The dynamics of the AUV model are highly nonlinear because of many factors, such as hydrodynamic drag, damping and lift forces, Coriolis and centripetal forces, gravity and buoyancy forces, as well as forces from the thrusters. To address these difficulties, a nonlinear sliding mode controller is designed to approximate the nonlinear dynamics of the AUV and improve trajectory tracking. Moreover, the proposed controller can profoundly attenuate the effects of uncertainties and external disturbances in the closed-loop system. Using Lyapunov theory, the boundedness of the AUV tracking errors and the stability of the proposed control system are also guaranteed. Numerical simulation studies of an AUV are included to illustrate the effectiveness of the presented approach.
Keywords: Lyapunov stability, autonomous underwater vehicle, sliding mode controller, electronics engineering
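A minimal 1-DOF sketch of a sliding mode control law of the kind described, applied to a surge-velocity model with quadratic drag; the model, gains and disturbance are hypothetical and stand in for the full 6-DOF AUV dynamics used in the paper. A tanh boundary layer replaces the pure sign function to limit chattering.

```python
import numpy as np

# Hypothetical 1-DOF surge model of an AUV:  m*dv/dt + d*|v|*v = u
m_true, d_true = 80.0, 40.0          # true mass and quadratic drag
d_hat = 30.0                         # uncertain drag estimate used by the controller

dt, T = 0.01, 20.0
t = np.arange(0.0, T, dt)
v_des = 1.0                          # constant desired surge velocity (m/s)

K, phi = 60.0, 0.05                  # switching gain and boundary-layer width
v = 0.0
errors = []
for _ in t:
    e = v - v_des                    # tracking error = sliding variable here
    # Equivalent control from the nominal model plus the switching term.
    u = d_hat * abs(v) * v - K * np.tanh(e / phi)
    # Plant update with the true (uncertain) dynamics and a constant disturbance.
    dv = (u - d_true * abs(v) * v + 5.0) / m_true
    v += dv * dt
    errors.append(e)

print(f"final tracking error: {errors[-1]:.4f} m/s")
```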
Procedia PDF Downloads 612
605 Power Control of DFIG in WECS Using Backstepping and Sliding Mode Controller
Authors: Abdellah Boualouch, Ahmed Essadki, Tamou Nasser, Ali Boukhriss, Abdellatif Frigui
Abstract:
This paper presents a power control scheme for a Doubly Fed Induction Generator (DFIG) used in a Wind Energy Conversion System (WECS) connected to the grid. The proposed control strategy employs two nonlinear controllers, a backstepping controller (BSC) and a sliding-mode controller (SMC), to directly calculate the required rotor control voltage so as to eliminate the instantaneous errors of active and reactive powers. In this paper the advantages of the BSC and SMC are presented, and the performance and robustness of the two control strategies are compared. First, we present a model of the wind turbine and the DFIG machine, then a synthesis of the controllers and their application to the DFIG power control. Simulation results on a 1.5 MW grid-connected DFIG system are provided by MATLAB/Simulink.
Keywords: backstepping, DFIG, power control, sliding-mode, WECS
Procedia PDF Downloads 594
604 A Formal Verification Approach for Linux Kernel Designing
Authors: Zi Wang, Xinlei He, Jianghua Lv, Yuqing Lan
Abstract:
The kernel, though widely used, is complicated, and errors caused by bugs are often costly. Statistically, more than half of the mistakes occur in the design phase. Thus, we introduce a modeling method, KMVM (Linux Kernel Modeling and Verification Method), based on type theory, for the proper design and correct exploitation of the kernel. In the model, the kernel is separated into six levels: subsystem, dentry, file, struct, func, and base. Each level is treated as a type. The types are specified in terms of their structure and relationships. At the same time, we use a demanding path to express the function to be implemented. The correctness of the design is verified by recursively checking the type relationships and type existence. The method has been applied to verify the OPEN operation of the VFS (virtual file system) in the Linux kernel. We have also designed and developed a set of secure communication mechanisms in the kernel with verification.
Keywords: formal approach, type theory, Linux Kernel, software program
Procedia PDF Downloads 137
603 Shared Vision System Support for Maintenance Tasks of Wind Turbines
Authors: Buket Celik Ünal, Onur Ünal
Abstract:
Communication is the most challenging part of maintenance operations. Communication between expert and fieldworker is crucial for effective maintenance and this also affects the safety of the fieldworkers. To support a machine user in a remote collaborative physical task, both, a mobile and a stationary device are needed. Such a system is called a shared vision system and the system supports two people to solve a problem from different places. This system reduces the errors and provides a reliable support for qualified and less qualified users. Through this research, it was aimed to validate the effectiveness of using a shared vision system to facilitate communication between on-site workers and those issuing instructions regarding maintenance or inspection works over long distances. The system is designed with head-worn display which is called a shared vision system. As a part of this study, a substitute system is used and implemented by using a shared vision system for maintenance operation. The benefits of the use of a shared vision system are analyzed and results are adapted to the wind turbines to improve the occupational safety and health for maintenance technicians. The motivation for the research effort in this study can be summarized in the following research questions: -How can expert support technician over long distances during maintenance operation? -What are the advantages of using a shared vision system? Experience from the experiment shows that using a shared vision system is an advantage for both electrical and mechanical system failures. Results support that the shared vision system can be used for wind turbine maintenance and repair tasks. Because wind turbine generator/gearbox and the substitute system have similar failures. Electrical failures, such as voltage irregularities, wiring failures and mechanical failures, such as alignment, vibration, over-speed conditions are the common and similar failures for both. Furthermore, it was analyzed the effectiveness of the shared vision system by using a smart glasses in connection with the maintenance task performed by a substitute system under four different circumstances, namely by using a shared vision system, an audio communication, a smartphone and by yourself condition. A suitable method for determining dependencies between factors measured in Chi Square Test, and Chi Square Test for Independence measured for determining a relationship between two qualitative variables and finally Mann Whitney U Test is used to compare any two data sets. While based on this experiment, no relation was found between the results and the gender. Participants` responses confirmed that the shared vision system is efficient and helpful for maintenance operations. From the results of the research, there was a statistically significant difference in the average time taken by subjects on works using a shared vision system under the other conditions. Additionally, this study confirmed that a shared vision system provides reduction in time to diagnose and resolve maintenance issues, reduction in diagnosis errors, reduced travel costs for experts, and increased reliability in service.Keywords: communication support, maintenance and inspection tasks, occupational health and safety, shared vision system
Procedia PDF Downloads 260
602 Quality Control of Automotive Gearbox Based On Vibration Signal Analysis
Authors: Nilson Barbieri, Bruno Matos Martins, Gabriel de Sant'Anna Vitor Barbieri
Abstract:
In more complex systems, such as an automotive gearbox, a rigorous treatment of the data is necessary because there are several moving parts (gears, bearings, shafts, etc.), and hence several possible sources of error as well as noise. The basic objective of this work is the detection of damage in automotive gearboxes. The detection methods used are the wavelet method, the bispectrum, advanced filtering techniques (selective filtering) of vibrational signals, and mathematical morphology. Vibration tests were performed on gearboxes (in good condition and with defects) from the production line of a large vehicle assembler. The vibration signals were obtained using five accelerometers in different positions on the sample. The results obtained using the kurtosis, bispectrum, wavelet and mathematical morphology showed that it is possible to identify the existence of defects in automotive gearboxes.
Keywords: automotive gearbox, mathematical morphology, wavelet, bispectrum
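As a small illustration of one of the indicators mentioned (kurtosis), the sketch below compares a smooth, hypothetical gearbox vibration signal with one containing periodic impacts; the signal parameters are invented, not measured.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
fs, T = 10_000, 1.0                        # sample rate (Hz) and duration (s)
t = np.arange(0, T, 1 / fs)

# Hypothetical accelerometer signals: a healthy gearbox (mesh tone plus noise)
# and a damaged one with periodic impacts from a gear fault.
healthy = np.sin(2 * np.pi * 350 * t) + 0.5 * rng.standard_normal(t.size)
damaged = healthy.copy()
impact_idx = np.arange(0, t.size, fs // 25)    # impacts at 25 Hz
damaged[impact_idx] += 8.0                     # impulsive content

# Kurtosis (Fisher definition) rises sharply with impulsive content,
# which is why it is used as a damage indicator.
print(f"kurtosis healthy: {kurtosis(healthy):.2f}")
print(f"kurtosis damaged: {kurtosis(damaged):.2f}")
```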
Procedia PDF Downloads 474
601 Application of Statistical Linearized Models for Investigations of Digital Dynamic Pulse-Frequency Control Systems
Authors: B. H. Aitchanov, Sh. K. Aitchanova, O. A. Baimuratov
Abstract:
This paper is focused on dynamic pulse-frequency modulation (DPFM) control systems. Currently, the control law based on DPFM control signals is widely used in direct digital control subsystems introduced into the automated control systems of technological processes. Statistical analysis of automatic control systems reduces to the construction of functional relationships between the statistical characteristics of the error processes and the input processes. Structural and dynamic Volterra models of digital pulse-frequency control systems can be used to develop methods for generating these dependencies, which differ in accuracy, in the amount of information required about the statistical characteristics of the input processes, and in the computational labor intensity of their use.
Keywords: digital dynamic pulse-frequency control systems, dynamic pulse-frequency modulation, control object, discrete filter, impulse device, microcontroller
Procedia PDF Downloads 495
600 A Consideration on the Offset Frontal Impact Modeling Using Spring-Mass Model
Authors: Jaemoon Lim
Abstract:
To construct a lumped spring-mass model that includes the occupants for the offset frontal crash, the SISAME software and NHTSA test data were used. The data from the 56 kph, 40% offset frontal vehicle-to-deformable-barrier crash test of a MY2007 Mazda 6 4-door sedan were obtained from the NHTSA test database. The overall behaviors of the B-pillar and engine in the simulation models agreed very well with the test data. The trends of the accelerations at the driver and passenger head were similar, but there were large differences in the peak values. These differences in peak values caused large errors in the HIC36 and 3 ms chest g’s. To predict the behavior of the dummies well, the spring-mass model for the offset frontal crash needs to be improved.
Keywords: chest g’s, HIC36, lumped spring-mass model, offset frontal impact, SISAME
Procedia PDF Downloads 457
599 Rule-Based Expert System for Headache Diagnosis and Medication Recommendation
Authors: Noura Al-Ajmi, Mohammed A. Almulla
Abstract:
With the increased utilization of technology devices around the world, healthcare and medical diagnosis are critical issues that people worry about these days. Doctors do their best to avoid medical errors while diagnosing diseases and to avoid prescribing the wrong medication. Consequently, artificial intelligence applications that can be installed on mobile devices, such as rule-based expert systems, facilitate the task of assisting doctors in several ways. Due to their many advantages, the usage of expert systems has increased recently in the health sciences. This work presents a backward rule-based expert system that can be used for headache diagnosis and medication recommendation. The structure of the system consists of three main modules, namely the input unit, the processing unit, and the output unit.
Keywords: headache diagnosis system, prescription recommender system, expert system, backward rule-based system
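A minimal backward-chaining sketch showing how such a rule base can be queried; the rules and facts are illustrative placeholders only, not clinical rules from the described system.

```python
# Minimal backward-chaining sketch (illustrative rules, not medical advice).
# Each rule maps a goal to a list of premise sets that establish it.
RULES = {
    "migraine":          [["one_sided_pain", "nausea", "light_sensitivity"]],
    "tension_headache":  [["band_like_pain", "stress"]],
    "recommend_triptan": [["migraine"]],
    "recommend_rest":    [["tension_headache"]],
}

def backward_chain(goal, facts, rules=RULES):
    """Prove `goal` by recursively proving the premises of some rule for it."""
    if goal in facts:
        return True
    for premises in rules.get(goal, []):
        if all(backward_chain(p, facts, rules) for p in premises):
            facts.add(goal)          # cache the derived fact
            return True
    return False

patient_facts = {"one_sided_pain", "nausea", "light_sensitivity"}
print(backward_chain("recommend_triptan", set(patient_facts)))   # True
print(backward_chain("recommend_rest", set(patient_facts)))      # False
```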
Procedia PDF Downloads 215
598 Scalable Performance Testing: Facilitating The Assessment Of Application Performance Under Substantial Loads And Mitigating The Risk Of System Failures
Authors: Solanki Ravirajsinh
Abstract:
In the software testing life cycle, failing to conduct thorough performance testing can result in significant losses for an organization due to application crashes and improper behavior under high user loads in production. Simulating large volumes of requests, such as 5 million within 5-10 minutes, is challenging without a scalable performance testing framework. Leveraging cloud services to implement a performance testing framework makes it feasible to handle 5-10 million requests in just 5-10 minutes, helping organizations ensure their applications perform reliably under peak conditions. Implementing a scalable performance testing framework using cloud services and tools like JMeter, EC2 instances (Virtual machine), cloud logs (Monitor errors and logs), EFS (File storage system), and security groups offers several key benefits for organizations. Creating performance test framework using this approach helps optimize resource utilization, effective benchmarking, increased reliability, cost savings by resolving performance issues before the application is released. In performance testing, a master-slave framework facilitates distributed testing across multiple EC2 instances to emulate many concurrent users and efficiently handle high loads. The master node orchestrates the test execution by coordinating with multiple slave nodes to distribute the workload. Slave nodes execute the test scripts provided by the master node, with each node handling a portion of the overall user load and generating requests to the target application or service. By leveraging JMeter's master-slave framework in conjunction with cloud services like EC2 instances, EFS, CloudWatch logs, security groups, and command-line tools, organizations can achieve superior scalability and flexibility in their performance testing efforts. In this master-slave framework, JMeter must be installed on both the master and each slave EC2 instance. The master EC2 instance functions as the "brain," while the slave instances operate as the "body parts." The master directs each slave to execute a specified number of requests. Upon completion of the execution, the slave instances transmit their results back to the master. The master then consolidates these results into a comprehensive report detailing metrics such as the number of requests sent, encountered errors, network latency, response times, server capacity, throughput, and bandwidth. Leveraging cloud services, the framework benefits from automatic scaling based on the volume of requests. Notably, integrating cloud services allows organizations to handle more than 5-10 million requests within 5 minutes, depending on the server capacity of the hosted website or application.Keywords: identify crashes of application under heavy load, JMeter with cloud Services, Scalable performance testing, JMeter master and slave using cloud Services
Procedia PDF Downloads 27
597 Analysis of Simple Mechanisms to Continuously Vary Mach Number in a Supersonic Wind Tunnel Facility
Authors: Prateek Kishore, T. M. Muruganandam
Abstract:
Supersonic wind tunnel nozzles are generally capable of producing a constant Mach number flow in the test section of the wind tunnel. As a result, most supersonic vehicles are designed using steady-state flow characteristics, which may lead to errors when facing unsteady situations. This study aims to explore the possibility of varying the Mach number of the flow during wind tunnel operation. The nozzle walls are restricted to be inflexible, to allow cooling near the throat, because of the high stagnation temperature required of the flow to simulate the conditions experienced by the vehicle. Two simple independent mechanisms, rotation and translation of the nozzle walls, have been analyzed, and the nozzle ranges have been optimized to vary the Mach number from Mach 2 to Mach 5 using the minimum number of nozzles in the wind tunnel.
Keywords: method of characteristics, nozzle, supersonic wind tunnel, variable Mach number
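Although the paper relies on the method of characteristics for nozzle contour design, the isentropic area–Mach relation gives a quick feel for the geometry change needed across the quoted Mach range; the sketch below is illustrative only and does not reproduce the paper's mechanisms.

```python
import numpy as np
from scipy.optimize import brentq

def area_ratio(M, gamma=1.4):
    """Isentropic area-Mach relation A/A* for a quasi-1D nozzle."""
    term = (2.0 / (gamma + 1.0)) * (1.0 + 0.5 * (gamma - 1.0) * M ** 2)
    return term ** ((gamma + 1.0) / (2.0 * (gamma - 1.0))) / M

def supersonic_mach(A_ratio, gamma=1.4):
    """Supersonic root of the area-Mach relation for a given A/A*."""
    return brentq(lambda M: area_ratio(M, gamma) - A_ratio, 1.0 + 1e-6, 20.0)

# Exit-to-throat area ratios needed over the quoted Mach 2 to Mach 5 range.
for M_target in (2.0, 3.0, 4.0, 5.0):
    A = area_ratio(M_target)
    print(f"M = {M_target:.1f}  ->  A_exit/A* = {A:6.2f}  "
          f"(check: M = {supersonic_mach(A):.3f})")
```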
Procedia PDF Downloads 295
596 Mathematical Modeling for Diabetes Prediction: A Neuro-Fuzzy Approach
Authors: Vijay Kr. Yadav, Nilam Rathi
Abstract:
Accurate prediction of the glucose level in diabetes mellitus is required to avoid affecting the functioning of major organs of the human body. This study describes the fundamental assumptions and two different methodologies for blood glucose prediction. The first is based on the back-propagation algorithm of an Artificial Neural Network (ANN), and the second is based on a neuro-fuzzy technique called the Fuzzy Inference System (FIS). Errors of the proposed methods are further discussed through various statistical measures such as the mean square error (MSE) and the normalised mean absolute error (NMAE). The main objective of the present study is to develop a mathematical model for blood glucose prediction 12 hours in advance, using a data set of three patients over 60 days. Comparative studies of the accuracy with other existing models are also made with the same data set.
Keywords: back-propagation, diabetes mellitus, fuzzy inference system, neuro-fuzzy
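A short sketch of the error measures mentioned (MSE and NMAE) on hypothetical measured vs. predicted glucose values; normalising by the observed range is one common convention and may differ from the paper's exact definition.

```python
import numpy as np

def mse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean((y_true - y_pred) ** 2)

def nmae(y_true, y_pred):
    """Mean absolute error normalised by the observed range
    (one common convention; the paper's normalisation may differ)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.mean(np.abs(y_true - y_pred)) / (y_true.max() - y_true.min())

# Hypothetical measured vs. predicted blood glucose values (mg/dL).
measured  = np.array([110, 145, 180, 160, 130, 120, 150, 170])
predicted = np.array([115, 150, 172, 158, 138, 118, 146, 176])

print(f"MSE  = {mse(measured, predicted):.2f}")
print(f"NMAE = {nmae(measured, predicted):.3f}")
```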
Procedia PDF Downloads 257
595 GPU-Accelerated Triangle Mesh Simplification Using Parallel Vertex Removal
Authors: Thomas Odaker, Dieter Kranzlmueller, Jens Volkert
Abstract:
We present an approach to triangle mesh simplification designed to be executed on the GPU. We use a quadric error metric to calculate an error value for each vertex of the mesh and order all vertices based on this value. This step is followed by the parallel removal of a number of vertices with the lowest calculated error values. To allow for the parallel removal of multiple vertices, we use a set of per-vertex boundaries that prevent mesh foldovers even when simplification operations are performed on neighbouring vertices. We execute multiple iterations of the calculation of the vertex errors, ordering of the error values and removal of vertices until either a desired number of vertices remains in the mesh or a minimum error value is reached. This parallel approach is used to speed up the simplification process while maintaining mesh topology and avoiding foldovers at every step of the simplification.
Keywords: computer graphics, half edge collapse, mesh simplification, precomputed simplification, topology preserving
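A compact sketch of the quadric error metric step: accumulate the plane quadrics of a vertex's incident faces and evaluate the cost of collapsing the vertex onto each neighbour (half-edge collapse). The one-ring geometry is invented for illustration and the parallel scheduling of the paper is not reproduced.

```python
import numpy as np

def plane_quadric(p0, p1, p2):
    """Fundamental error quadric K = p p^T of a triangle's supporting plane,
    p = (a, b, c, d) with a*x + b*y + c*z + d = 0 and (a, b, c) of unit length."""
    n = np.cross(p1 - p0, p2 - p0)
    n /= np.linalg.norm(n)
    p = np.append(n, -np.dot(n, p0))
    return np.outer(p, p)

def quadric_error(Q, x):
    """Summed squared distance of point x to the planes accumulated in Q."""
    xh = np.append(x, 1.0)
    return float(xh @ Q @ xh)

# Hypothetical vertex v with its one-ring of neighbours (a shallow bump).
v = np.array([0.0, 0.0, 0.05])
ring = [np.array([1.5, 0.0, 0.0]), np.array([0.0, 1.0, 0.0]),
        np.array([-1.0, 0.0, 0.0]), np.array([0.0, -1.0, 0.0])]

# Accumulate the quadrics of the faces incident to v.
Qv = np.zeros((4, 4))
for a, b in zip(ring, ring[1:] + ring[:1]):
    Qv += plane_quadric(v, a, b)

# Half-edge collapse cost of removing v by merging it into each neighbour:
# the cheapest target is the one that best fits v's incident planes.
costs = [quadric_error(Qv, u) for u in ring]
print("collapse costs:", np.round(costs, 4), "-> best target:", int(np.argmin(costs)))
```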
Procedia PDF Downloads 367
594 Liver Lesion Extraction with Fuzzy Thresholding in Contrast Enhanced Ultrasound Images
Authors: Abder-Rahman Ali, Adélaïde Albouy-Kissi, Manuel Grand-Brochier, Viviane Ladan-Marcus, Christine Hoeffl, Claude Marcus, Antoine Vacavant, Jean-Yves Boire
Abstract:
In this paper, we present a new segmentation approach for focal liver lesions in contrast-enhanced ultrasound imaging. This approach, based on a two-cluster Fuzzy C-Means methodology, considers type-II fuzzy sets to handle uncertainty due to the image modality (presence of speckle noise, low contrast, etc.) and to calculate the optimum inter-cluster threshold. Fine boundaries are detected by a local recursive merging of ambiguous pixels. The method has been tested on a representative database. Compared to both the Otsu and type-I Fuzzy C-Means techniques, the proposed method significantly reduces the segmentation errors.
Keywords: defuzzification, fuzzy clustering, image segmentation, type-II fuzzy sets
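A sketch of the underlying two-cluster (type-I) Fuzzy C-Means thresholding on grey levels; the type-II extension and the recursive boundary refinement of the paper are not reproduced, and the intensity distributions are synthetic.

```python
import numpy as np

def fcm_two_cluster(intensities, m=2.0, n_iter=100, tol=1e-6):
    """Two-cluster fuzzy C-means on grey-level intensities.
    Returns the two cluster centres and an inter-cluster threshold, taken
    here as the intensity where the two memberships are equal (the midpoint)."""
    x = np.asarray(intensities, dtype=float)
    c = np.array([x.min(), x.max()])                  # initial centres
    for _ in range(n_iter):
        d = np.abs(x[:, None] - c[None, :]) + 1e-12   # distances to centres
        u = 1.0 / (d ** (2.0 / (m - 1.0)))            # unnormalised memberships
        u /= u.sum(axis=1, keepdims=True)
        new_c = (u ** m * x[:, None]).sum(axis=0) / (u ** m).sum(axis=0)
        if np.max(np.abs(new_c - c)) < tol:
            c = new_c
            break
        c = new_c
    return c, c.mean()

# Hypothetical grey levels: darker lesion pixels vs. brighter parenchyma.
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(60, 10, 2000), rng.normal(140, 15, 6000)])
centres, threshold = fcm_two_cluster(pixels)
mask = pixels < threshold                              # candidate lesion pixels
print(f"centres = {np.round(centres, 1)}, threshold = {threshold:.1f}, "
      f"candidate lesion pixels = {mask.sum()}")
```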
Procedia PDF Downloads 485
593 Analytical Study of Data Mining Techniques for Software Quality Assurance
Authors: Mariam Bibi, Rubab Mehboob, Mehreen Sirshar
Abstract:
Satisfying customer requirements is the ultimate goal of producing or developing any product, and the quality of the product is judged by the level of customer satisfaction. Different techniques reported in the survey enhance the quality of the product through software defect prediction and by locating missing software requirements. Some mining techniques have been proposed to assess individual performance indicators in a collaborative environment in order to reduce errors at the individual level. The basic intention is to produce a product with zero or few defects, thereby producing the best possible product quality. In the analysis of the survey, techniques like the genetic algorithm, artificial neural networks, classification and clustering techniques, and decision trees are studied. After analysis, it has been found that these techniques contribute much to the improvement and enhancement of product quality.
Keywords: data mining, defect prediction, missing requirements, software quality
Procedia PDF Downloads 468
592 A Review on Web-Based Attendance Management System
Authors: Arvind Lal, Chumphila Bhutia, Bidhan Pradhan, Retika Sharma, Monisha Limboo
Abstract:
There have been many proposals to optimize student management systems in higher education. Managing student attendance during lecture periods has become a difficult challenge, and manual calculation of attendance produces errors and wastes a lot of time. The proposed system manages students' attendance in a web portal, and the attendance records are stored in a database. The attendance of the students is then forwarded to their HOD (Head of Department), class teacher and parents/guardians. The system uses MySQL for the database. The template of the website is built using HTML and CSS (Cascading Style Sheets) code, and JavaScript is added to improve the usability of the system. Students' details are stored in the database, along with the details of the teachers according to the subjects and classes they teach. The system is responsive, so it can be used on mobile phones. The development of this project is also user-friendly, with clear and understandable tabs. Hence, this website will be beneficial to institutes.
Keywords: website, student's attendance, MySQL database, HTML, CSS, PHP, JavaScript
Procedia PDF Downloads 182
591 Enhancing Students' Utilization of Written Corrective Feedback through Teacher-Student Writing Conferences: A Case Study in English Writing Instruction
Authors: Tsao Jui-Jung
Abstract:
Previous research findings have shown that most students do not fully utilize the written corrective feedback provided by teachers (Stone, 2014). This common phenomenon results in the ineffective utilization of teachers' written corrective feedback. As Ellis (2010) points out, the effectiveness of written corrective feedback depends on the level of student engagement with it. Therefore, it is crucial to understand how students utilize the written corrective feedback from their teachers. Previous studies have confirmed the positive impact of teacher-student writing conferences on students' engagement in the writing process and their writing abilities (Hum, 2021; Nosratinia & Nikpanjeh, 2019; Wong, 1996; Yeh, 2016, 2019). However, due to practical constraints such as time limitations, this instructional activity is not fully utilized in writing classrooms (Alfalagg, 2020). Therefore, to address this research gap, the purpose of this study was to explore several aspects of teacher-student writing conferences, including the frequency of meaning negotiation (i.e., comprehension checks, confirmation checks, and clarification checks) and teacher scaffolding techniques (i.e., feedback, prompts, guidance, explanations, and demonstrations) in teacher-student writing conferences, examining students’ self-assessment of their writing strengths and weaknesses in post-conference journals and their experiences with teacher-student writing conferences (i.e., interaction styles, communication levels, how teachers addressed errors, and overall perspectives on the conferences), and gathering insights from their responses to open-ended questions in the final stage of the study (i.e., their preferences and reasons for different written corrective feedback techniques used by teachers and their perspectives and suggestions on teacher-student writing conferences). Data collection methods included transcripts of audio recordings of teacher-student writing conferences, students’ post-conference journals, and open-ended questionnaires. The participants of this study were sophomore students enrolled in an English writing course for a duration of one school year. Key research findings are as follows: Firstly, in terms of meaning negotiation, students attempted to clearly understand the corrective feedback provided by the teacher-researcher twice as often as the teacher-researcher attempted to clearly understand the students' writing content. Secondly, the most commonly used scaffolding technique in the conferences was prompting (indirect feedback). Thirdly, the majority of participants believed that teacher-student writing conferences had a positive impact on their writing abilities. Fourthly, most students preferred direct feedback from the teacher-research as it directly pointed out their errors and saved them time in revision. However, some students still preferred indirect feedback, as they believed it encouraged them to think and self-correct. Based on the research findings, this study proposes effective teaching recommendations for English writing instruction aimed at optimizing teaching strategies and enhancing students' writing abilities.Keywords: written corrective feedback, student engagement, teacher-student writing conferences, action research
Procedia PDF Downloads 77