Search results for: Errors and Mistakes
681 The Correlation between Users’ Star Rating and Usability on Mobile Applications
Authors: Abdulmohsen A. AlBesher, Richard T. Stone
Abstract:
Star ratings for mobile applications are a very useful way to differentiate between the best and worst rated applications. However, the question is whether the rating reflects the level of usability or not. The aim of this paper is to find out whether users' star ratings on mobile apps correlate with the usability of those apps. Thus, we tested three mobile apps with different star ratings: low, medium, and high. Fifteen mobile phone users participated in the study and were asked to perform a single task in each of the three tested apps. After each task, the participant evaluated the app by answering a survey based on the System Usability Scale (SUS). The results showed no major correlation between the star rating and usability. However, task completion time and the number of errors made while completing the task were significantly correlated with usability.
Keywords: mobile applications, SUS, star rating, usability
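The SUS questionnaire mentioned above has a standard scoring rule: ten 1-5 Likert items, with odd items scored as the response minus 1, even items as 5 minus the response, and the sum scaled by 2.5. A minimal sketch of that computation, with purely illustrative responses rather than the study's data:

```python
def sus_score(responses):
    """System Usability Scale score from ten 1-5 Likert responses.

    Odd-numbered (positively worded) items contribute (response - 1);
    even-numbered (negatively worded) items contribute (5 - response).
    The summed contributions are scaled by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Example: a fairly positive set of responses
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # 85.0
```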
Procedia PDF Downloads 321
680 Improvement of Piezoresistive Pressure Sensor Accuracy by Means of Current Loop Circuit Using Optimal Digital Signal Processing
Authors: Peter A. L’vov, Roman S. Konovalov, Alexey A. L’vov
Abstract:
The paper presents an advanced digital modification of the conventional current loop circuit for piezoresistive pressure transducers. Optimal DSP algorithms, which process the current loop responses by the maximum likelihood method, are applied to reduce measurement errors. The loop circuit has additional advantages, such as the ability to operate with any type of resistance or reactance sensor and a considerable increase in measurement accuracy and quality compared with AC bridges. The results obtained are intended to allow high-accuracy, expensive measuring bridges to be replaced with current loop circuits.
Keywords: current loop, maximum likelihood method, optimal digital signal processing, precise pressure measurement
Procedia PDF Downloads 529
679 New HCI Design Process Education
Authors: Jongwan Kim
Abstract:
Human-Computer Interaction (HCI) is a subject covering the study, planning, and design of interactions between humans and computers. The prevalent use of digital mobile devices is increasing the need for education and research on HCI. This work focuses on a new education method, geared towards reducing errors while developing application programs, that incorporates role-changing brainstorming techniques into the HCI design process. The proposed method was applied to a capstone design course in the last spring semester. Students discovered examples of UI design improvements, and their capability to discover and reduce errors was promoted. A UI design improvement, PC voice control for people with disabilities as an assistive technology exemplar, is presented. The improvement in these students' design ability will be helpful in real field work.
Keywords: HCI, design process, error reducing education, role-changing brainstorming, assistive technology
Procedia PDF Downloads 490
678 Digitalization and High Audit Fees: An Empirical Study Applied to US Firms
Authors: Arpine Maghakyan
Abstract:
The purpose of this paper is to study the relationship between the level of industry digitalization and audit fees, and especially the relationship between Big 4 auditor fees and industry digitalization level. On the one hand, automation of business processes decreases internal control weaknesses and manual mistakes and increases work effectiveness and integration. On the other hand, it may cause serious misstatements, high business risks, or even bankruptcy, typically in the early stages of automation. Incomplete automation can bring high audit risk, especially if the auditor does not fully understand the client’s business automation model. Higher audit risk consequently causes higher audit fees. Higher audit fees for clients with a high automation level are most evident in Big 4 auditors’ behavior. Using data on US firms from 2005-2015, we found that industry-level digitalization interacts with auditor quality in determining audit fees. Moreover, the choice of Big 4 or non-Big 4 is correlated with the client’s industry digitalization level. A Big 4 client with a higher digitalization level pays more than one with a low digitalization level. In addition, a highly digitalized firm that has a Big 4 auditor pays a higher audit fee than a non-Big 4 client. We use audit fees and firm-specific variables from the Audit Analytics and Compustat databases. We analyze the collected data using fixed effects regression methods and Wald tests for sensitivity checks. We use fixed effects regression models to determine the connections between technology use in business and audit fees. We control for firm size, complexity, inherent risk, profitability, and auditor quality. We chose the fixed effects model because it makes it possible to control for variables that have not been or cannot be measured.
Keywords: audit fees, auditor quality, digitalization, Big4
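A fixed effects specification of the kind described can be sketched with firm and year dummies; the variable names and the simulated panel below are placeholders, not the study's Audit Analytics/Compustat data, and the interaction term stands in for the digitalization-by-auditor-quality effect of interest:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical firm-year panel (2005-2015); names and coefficients are illustrative only.
rng = np.random.default_rng(7)
n_firms, years = 50, np.arange(2005, 2016)
df = pd.DataFrame({
    "firm": np.repeat(np.arange(n_firms), len(years)),
    "year": np.tile(years, n_firms),
})
df["digital"] = rng.uniform(0, 1, len(df))   # industry digitalization level
df["big4"] = rng.integers(0, 2, len(df))     # Big 4 auditor indicator
df["size"] = rng.normal(8, 1, len(df))       # log total assets (control)
df["ln_fee"] = (1 + 0.4 * df["size"] + 0.3 * df["digital"] + 0.2 * df["big4"]
                + 0.25 * df["digital"] * df["big4"] + rng.normal(0, 0.2, len(df)))

# Firm and year fixed effects via dummies; digital:big4 is the interaction of interest.
model = smf.ols("ln_fee ~ digital * big4 + size + C(firm) + C(year)", data=df).fit()
print(model.params[["digital", "big4", "digital:big4", "size"]])
```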
Procedia PDF Downloads 302
677 The Impact of Failure-tolerant Restaurant Culture on Curbing Employees’ Withdrawal Behavior: The Roles of Psychological Empowerment and Mindful Leadership
Authors: Omar Alsetoohy, Mohamed Ezzat, Mahmoud Abou Kamar
Abstract:
The success of a restaurant or hotel depends very much on the quality and quantity of its human resources. Thus, establishing a competitive edge through human assets requires careful attention to the practices that best leverage these assets. Usually, hotel or restaurant employees perceive customer defection as an unfavorable or unpleasant occurrence associated with failure. These failures could be in handling, communication, learning, or encouragement. Besides, employees may fear blame from their colleagues and managers, which prevents them from freely discussing these mistakes with them. Such behaviors, in turn, push employees to withdraw from the workplace. However, while we have good knowledge of leadership outcomes, less is known about how and why these effects occur. Mindful leaders usually analyze the causes and underlying mechanisms of failures for work improvement. Despite the extensive literature in the field of leadership and employee behaviors, to date, no research studies have investigated the impact of a failure-tolerant restaurant culture on employees’ withdrawal behaviors considering the moderating roles of psychological empowerment and mindful leadership. Thus, this study investigates the impact of a failure-tolerant culture on employees’ withdrawal behaviors in fast-food restaurants in Egypt, considering the moderating effects of employee empowerment and mindful leaders. This study may contribute to the existing literature by filling the gap between failure-tolerant cultures and employee withdrawal behaviors in the hospitality literature. The study may also identify best practices for restaurant operators and managers to deal with employees' failures as an improvement tool for their performance.
Keywords: failure-tolerant culture, employees’ withdrawal behaviors, psychological empowerment, mindful leadership, restaurants
Procedia PDF Downloads 108
676 Development of a Web Exploration Support System Focusing on Accumulation of Search Contexts
Authors: T. Yamazaki, R. Onuma, H. Kaminaga, Y. Miyadera, S. Nakamura
Abstract:
Web exploration has increasingly diversified in accordance with the development of browsing environments on the Internet. Moreover, advanced exploration is often conducted in intellectual activities such as surveys in research. This kind of exploration is carried out over a long period with trial and error. In such cases, it is extremely important for a user to accumulate the search contexts and understand them. However, existing support systems have not been effective enough, since most systems cannot handle the various factors involved in the exploration. This research aims to develop a novel system to support web exploration, focusing on the accumulation of search contexts. This paper mainly describes the outline of the system. An experiment using the system is also described. Finally, features of the system are discussed based on the results.
Keywords: web exploration context, refinement of search intention, accumulation of context, exploration support, information visualization
Procedia PDF Downloads 309
675 Fault Diagnosis in Induction Motor
Authors: Kirti Gosavi, Anita Bhole
Abstract:
The paper demonstrates the simulation and steady-state performance of a three-phase squirrel cage induction motor and the detection of a broken rotor bar fault using MATLAB. This simulation model is successfully used for the detection of broken rotor bar faults in induction machines. A dynamic model using a PWM inverter and a mathematical model of the motor are developed. The dynamic simulation of a small-power induction motor is one of the key steps in validating the design process of the motor drive system, and it is needed to eliminate inadvertent design errors and the resulting errors in prototype construction and testing. The simulation model is helpful in detecting faults in a three-phase induction motor using motor current signature analysis.
Keywords: squirrel cage induction motor, pulse width modulation (PWM), fault diagnosis, induction motor
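Motor current signature analysis, referenced above, typically looks for sidebands at (1 ± 2s)·f_s around the supply frequency in the stator current spectrum. A minimal illustrative sketch on a synthetic current signal (an assumed stand-in, not the paper's MATLAB model):

```python
import numpy as np

# A broken rotor bar produces sidebands at (1 +/- 2*s)*f_s around the supply frequency.
f_s, slip, fs = 50.0, 0.03, 5000.0           # supply frequency [Hz], slip, sampling rate [Hz]
t = np.arange(0, 2.0, 1.0 / fs)
current = np.sin(2 * np.pi * f_s * t)                                # fundamental
current += 0.02 * np.sin(2 * np.pi * (1 - 2 * slip) * f_s * t)       # lower sideband (fault)
current += 0.02 * np.sin(2 * np.pi * (1 + 2 * slip) * f_s * t)       # upper sideband (fault)

# Windowed spectrum of the stator current; peaks near the sideband frequencies flag the fault.
spectrum = np.abs(np.fft.rfft(current * np.hanning(t.size)))
freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
for f_side in ((1 - 2 * slip) * f_s, (1 + 2 * slip) * f_s):
    idx = np.argmin(np.abs(freqs - f_side))
    print(f"amplitude near {f_side:.1f} Hz: {spectrum[idx]:.2f}")
```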
Procedia PDF Downloads 633
674 A New Reliability Allocation Method Based on Fuzzy Numbers
Authors: Peng Li, Chuanri Li, Tao Li
Abstract:
Reliability allocation is quite important during the early design and development stages of a system, in order to apportion its specified reliability goal to subsystems. This paper improves the fuzzy reliability allocation method and gives concrete procedures for determining the factor set, the factor weight set, the judgment set, and the multi-grade fuzzy comprehensive evaluation. To determine the weights of the factor set, modified trapezoidal fuzzy numbers are proposed to reduce errors caused by subjective factors. To decrease the fuzziness in the fuzzy division, an approximation method based on linear programming is employed. To compute the explicit values of fuzzy numbers, the centroid method of defuzzification is used. An example is provided to illustrate the application of the proposed reliability allocation method based on fuzzy arithmetic.
Keywords: reliability allocation, fuzzy arithmetic, allocation weight, linear programming
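The centroid defuzzification step mentioned above reduces a fuzzy weight to a crisp value; a minimal sketch for a trapezoidal fuzzy number, computed by numerical integration on a grid (the (0.1, 0.2, 0.4, 0.5) weight is purely illustrative, not taken from the paper):

```python
import numpy as np

def trapezoidal_membership(x, a, b, c, d):
    """Membership function of a trapezoidal fuzzy number (a, b, c, d)."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

def centroid_defuzzify(a, b, c, d, n=100001):
    """Centroid (center-of-gravity) defuzzification evaluated on a uniform grid."""
    x = np.linspace(a, d, n)
    mu = trapezoidal_membership(x, a, b, c, d)
    return float((x * mu).sum() / mu.sum())

# Illustrative allocation weight expressed as a trapezoidal fuzzy number
print(round(centroid_defuzzify(0.1, 0.2, 0.4, 0.5), 4))  # 0.3 (symmetric trapezoid)
```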
Procedia PDF Downloads 342
673 Residential High-Rises and Meaningful Places: Missing Actions in the Isle of Dogs Regeneration
Authors: Elena Kalcheva, Ahmad Taki, Yuri Hadi
Abstract:
Urban regeneration often includes residential high-rises as a way of making optimum use of land. However, high-rises are in many cases connected to placelessness; this is not due to some intrinsic characteristic of the typology, but rather to a failure to provide meaningful places in connection with them. The reason to study the Isle of Dogs regeneration is the successful process that led to a vibrant area with a strong identity and social sustainability. Therefore, the purpose of this research is to identify the gaps in the otherwise sound strategy for the development of the area, and in its implementation, which will make the place more sustainable. The paper addresses four research questions: do the residential high-rises support a proper physical form; is a properly scaled mix of land uses and functions deployed in connection with the residential high-rises; are quality activities possible in quality places near the residential high-rises; and is a strong sense of place created with the residential high-rise buildings and their surroundings. The methodology relies on an observational survey of the studied area, together with structured questions, to evaluate the external qualities of the residential high-rises and their surroundings. Visual information can help identify the mistakes and omissions of the examined project examples and can provide insight into how imageability, legibility, and human scale can be improved. In this connection, the paper argues that although the quality of the architecture of the high-rises is superb, there is a failure to create a meaningful, high-quality public realm in connection with them. As such, the area does not function as well as the designers intended: the functional quality of the public realm is quite low. The implications of the study suggest that actions need to be taken in order to improve and foster further regeneration of the area.
Keywords: high-rises, Isle of Dogs, public realm, regeneration
Procedia PDF Downloads 282
672 The Effect of Written Corrective Feedback on the Accurate Use of Grammatical Forms by Japanese Low-Intermediate EFL Learners
Authors: Ayako Hasegawa, Ken Ubukata
Abstract:
The purpose of this study is to investigate whether corrective feedback has any significant effect on Japanese low-intermediate EFL learners’ performance on a specific set of linguistic features. The subjects are Japanese college students majoring in English. They have studied English for about 7 years, but their interlanguage seems to have fossilized, as non-target-like errors are frequently observed in the traditional deductive, teacher-fronted approach. It has been reported that corrective feedback plays an important role in diminishing or overcoming interlanguage fossilization and achieving TL competency. Therefore, we examined how corrective feedback (the focus of this study was metalinguistic feedback) and self-correction raised the students’ awareness and helped them notice the gaps between their interlanguage and the TL.
Keywords: written corrective feedback, fossilized error, grammar teaching, language teaching
Procedia PDF Downloads 360
671 A Research on Inference from Multiple Distance Variables in Hedonic Regression Focus on Three Variables
Authors: Yan Wang, Yasushi Asami, Yukio Sadahiro
Abstract:
In an urban context, urban nodes such as amenities or hazards certainly affect house prices, and classic hedonic analysis employs distance variables measured from each urban node. However, the effects of distances to facilities on house prices generally do not represent the true price of the property. Distance variables measured on the same surface suffer from a problem called multicollinearity, which usually appears in the regression as inflated variance and shifted mean values of the estimates, i.e., errors caused by instability. In this paper, we provide a theoretical framework to identify and gather data with less bias, as well as a specific sampling method for locating the sample region to avoid the spatial multicollinearity problem in the three-distance-variable case.
Keywords: hedonic regression, urban node, distance variables, multicollinearity, collinearity
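Multicollinearity among distance variables of the kind discussed above is commonly diagnosed with variance inflation factors; a minimal sketch on synthetic distances (the data, column meanings, and the rule-of-thumb threshold are illustrative assumptions, not the paper's method):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of X (n_samples x n_vars).

    VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing column j on the
    remaining columns; values above roughly 10 signal strong multicollinearity.
    """
    n, k = X.shape
    out = []
    for j in range(k):
        y = X[:, j]
        Z = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        r2 = 1.0 - resid.var() / y.var()
        out.append(1.0 / (1.0 - r2))
    return out

# Hypothetical distances (km) from each house to three urban nodes; the first two
# nodes sit close together, so their distance variables are nearly collinear.
rng = np.random.default_rng(0)
base = rng.uniform(0, 5, size=200)
dists = np.column_stack([base, base + rng.normal(0, 0.3, 200), rng.uniform(0, 5, 200)])
print([round(v, 1) for v in vif(dists)])
```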
Procedia PDF Downloads 465
670 Passive Voice in SLA: Armenian Learners’ Case Study
Authors: Emma Nemishalyan
Abstract:
It is believed that learners’ mother tongue (L1 hereafter) has a huge impact on their second language acquisition (L2 hereafter). This hypothesis has been exposed to both positive and negative criticism. Based on the research results of a wide range of learner corpora (Chinese, Japanese, and Spanish, among others), the hypothesis has either been proved or disproved. However, no such study has been conducted on Armenian learners. The aim of this paper is to understand the implications of the hypothesis for an Armenian learners’ corpus in terms of the use of the passive voice. To this end, the method of Contrastive Interlanguage Analysis (hereafter CIA) has been applied to a native speakers’ corpus (the Louvain Corpus of Native English Essays (LOCNESS)) and an Armenian learners’ corpus, which I compiled in compliance with the International Corpus of Learner English (ICLE) guidelines. CIA compares the interlanguage (the language produced by learners) with the language produced by native speakers. With the help of this method, it is possible not only to highlight the mistakes that learners make, but also to underline underuses and overuses. The choice of the grammar issue (the passive voice) is conditioned by the fact that, typologically, Armenian and English are drastically different, as they belong to different branches. Moreover, the passive voice is considered one of the most problematic grammar topics for learners of English to acquire. Based on this difference, we hypothesized that Armenian learners would either overuse or underuse some types of the passive voice. With the help of the Lancsbox software, we identified the frequency rates of passive voice usage in LOCNESS and the Armenian learners’ corpus to understand whether the latter have the same usage pattern of the passive voice as native speakers. Secondly, we identified the types of the passive voice used by the Armenian learners, trying to trace the reasons back to their mother tongue. The results of the study showed that Armenian learners underused the passive voice in contrast to native speakers. Furthermore, the hypothesis that learners’ L1 has an impact on learners’ L2 acquisition and production was confirmed.
Keywords: corpus linguistics, applied linguistics, second language acquisition, corpus compilation
Procedia PDF Downloads 109
669 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with its prominent features, and the significant problems and issue domains that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics.
Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
Procedia PDF Downloads 90
668 Verification of Satellite and Observation Measurements to Build Solar Energy Projects in North Africa
Authors: Samy A. Khalil, U. Ali Rahoma
Abstract:
Satellite data have been routinely utilized to estimate solar radiation and solar energy. However, the temporal coverage of satellite data has some limits. Reanalysis, also known as a "retrospective analysis" of the atmosphere's parameters, is produced by fusing the output of NWP (Numerical Weather Prediction) models with observation data from a variety of sources, including ground, satellite, ship, and aircraft observations. The result is a comprehensive record of the parameters affecting weather and climate. The effectiveness of the reanalysis dataset (ERA-5) for North Africa was evaluated against high-quality surface measurements using statistical analysis. The distribution of global solar radiation (GSR) was estimated over five chosen areas in North Africa over the ten-year period from 2011 to 2020. To investigate seasonal change in dataset performance, a seasonal statistical analysis was conducted, which showed a considerable difference in errors throughout the year. Altering the temporal resolution of the data used for comparison alters the performance of the dataset. Better performance is indicated by the data's monthly mean values, but data accuracy is degraded. Solar resource assessment and power estimation are discussed using the ERA-5 solar radiation data. The average values of the mean bias error (MBE), root mean square error (RMSE), and mean absolute error (MAE) of the reanalysis solar radiation data vary from 0.079 to 0.222, 0.055 to 0.178, and 0.0145 to 0.198, respectively, over the study period. The correlation coefficient (R²) varies from 0.93 to 0.99 over the study period. The objective of this research is to provide a reliable representation of the world's solar radiation to aid the use of solar energy in all sectors.
Keywords: solar energy, ERA-5 analysis data, global solar radiation, North Africa
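The validation statistics named above have standard definitions; a minimal sketch of their computation (the daily GSR values are hypothetical stand-ins, and the paper's reported figures may additionally be normalized):

```python
import numpy as np

def validation_stats(observed, estimated):
    """MBE, RMSE, MAE and the coefficient of determination R^2
    between ground observations and reanalysis estimates."""
    obs, est = np.asarray(observed, float), np.asarray(estimated, float)
    diff = est - obs
    mbe = diff.mean()                     # mean bias error
    rmse = np.sqrt((diff ** 2).mean())    # root mean square error
    mae = np.abs(diff).mean()             # mean absolute error
    r2 = np.corrcoef(obs, est)[0, 1] ** 2
    return mbe, rmse, mae, r2

# Hypothetical daily GSR values (kWh/m^2): station measurements vs ERA-5 estimates
obs = [5.1, 6.3, 7.0, 6.8, 5.9, 4.7]
era5 = [5.0, 6.5, 7.2, 6.6, 6.1, 4.9]
print([round(v, 3) for v in validation_stats(obs, era5)])
```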
Procedia PDF Downloads 98
667 Positive Psychology and Parenting: A Case Study
Authors: Victor William Harris
Abstract:
Objective. This study examined the impact of the Positive Behavioral Management Skills (PBMS) online educational program on participants (n = 624) in a Southeastern region of the United States. The PBMS program combines established positive psychology behavioral management principles with new research-based practices designed to promote healthy and satisfying relationships between adults and children while constructively managing and preventing problematic behavior. Additionally, the PBMS program assists parents and teachers in recognizing the motivations behind a wide range of misbehaviors. The program also forewarns against some of the most common mistakes (or “parent traps”) in child behavioral management and describes how they can be avoided. It also describes how to recognize and capitalize on “teachable moments,” which are indispensable in the developmental process. Design. A retrospective pre-test-then-post-test design was used to reduce response shift bias when assessing knowledge and skill intervention outcomes for twenty-two behavioral management variables. Results. The PBMS program was shown to be effective for increasing knowledge and skills related to managing misbehavior while reinforcing interpersonal relationships and fostering a sense of responsibility and capability within the child. Large standardized mean effect size changes from before to after the program intervention were documented for PBMS participants on all twenty-two variables studied. Conclusion. The PBMS program showed initial positive outcomes, assisting participants in the studied sample to increase their knowledge and skills in managing child behavior successfully. Implications for parents, educators, and practitioners are discussed.
Keywords: behavioral management, discipline, parent education, positive parenting, positive psychology-parenting
Procedia PDF Downloads 221
666 Numerical Method for Heat Transfer Problem in a Block Having an Interface
Authors: Beghdadi Lotfi, Bouziane Abdelhafid
Abstract:
A finite volume method for quadrilateral unstructured meshes is developed to predict the two-dimensional steady-state solutions of the conduction equation. In this scheme, based on integration around the polygonal control volume, the derivatives of the conduction equation must be converted into closed line integrals using a formulation of the Stokes theorem. To validate the accuracy of the method, two numerical experiments are used: conduction in a regular block (with a known analytical solution) and conduction in a rotated block (a case with curved boundaries). The numerical results show good agreement with analytical results. To demonstrate the accuracy of the method, the absolute and root-mean-square errors versus the grid size are examined quantitatively.
Keywords: Stokes theorem, unstructured grid, heat transfer, complex geometry
Procedia PDF Downloads 290
665 Translation Skills and Language Acquisition
Authors: Frieda Amitai
Abstract:
The field of Translation Studies includes both descriptive and applied aspects, one of which is developing curricula. Within this topic, there are theories dealing with curricula aimed at translator training, and theories meant to explore teaching translation as a means through which awareness of language is developed in order to enhance language knowledge. An example of the latter is a unique study program in Israeli high schools – the Teaching Translation Skills Program (TTSP). This study program has been taught in Israel for more than two decades and is aimed at raising students' metalinguistic awareness as well as their language proficiency in both the source language and the target language in order to enable them to become better language learners. The objective of the current research was to examine whether the goals of this program are achieved – an increase in students' metalinguistic awareness and language proficiency. A follow-up case study was aimed at examining which level of proficiency develops most through this way of teaching English. The study was conducted in two stages – before and after participating in the program. 400 subjects took part in the first stage, and 100 took part in the second. In both parts of the study, participants were given the same five tasks in both Hebrew and English, in addition to a questionnaire in which they were asked about their own knowledge of Hebrew in comparison to that of their peers. Their teachers were asked about the success of the program and about the methodology they use in class. Findings show a significant change in the level of metalinguistic awareness of the students as well as their language proficiency. A comparison between their answers before and after the program shows that their metalinguistic awareness increased, as did their ability to recognize linguistic mistakes. These findings serve as strong evidence for the positive effect such a study program has on the development of metalinguistic awareness and linguistic knowledge. The follow-up case study tests the change among weaker language learners.
Keywords: comparison, metalinguistic awareness, language learning, translation skills
Procedia PDF Downloads 356
664 Software Defect Analysis - Eclipse Dataset
Authors: Amrane Meriem, Oukid Salyha
Abstract:
The presence of defects or bugs in software can lead to costly setbacks, operational inefficiencies, and compromised user experiences. Machine Learning (ML) techniques have been integrated to predict and preemptively address software defects. ML represents a proactive strategy aimed at identifying potential anomalies, errors, or vulnerabilities within code before they manifest as operational issues, by analyzing historical data such as code changes, feature implementations, and defect occurrences. This enables development teams to anticipate and mitigate these issues, thus enhancing software quality, reducing maintenance costs, and ensuring smoother user interactions. In this work, we used a recommendation system to improve the performance of ML models in terms of predicting defect severity and estimating effort.
Keywords: software engineering, machine learning, bugs detection, effort estimation
Procedia PDF Downloads 87
663 Getting It Right Before Implementation: Using Simulation to Optimize Recommendations and Interventions After Adverse Event Review
Authors: Melissa Langevin, Natalie Ward, Colleen Fitzgibbons, Christa Ramsey, Melanie Hogue, Anna Theresa Lobos
Abstract:
Description: Root Cause Analysis (RCA) is used by health care teams to examine adverse events (AEs) in order to identify causes, which then leads to recommendations for prevention. Despite widespread use, RCA has limitations. Best practices have not been established for implementing recommendations or tracking the impact of interventions after AEs. During phase 1 of this study, we used simulation to analyze two fictionalized AEs that occurred in hospitalized paediatric patients to identify and understand how the errors occurred, and we generated recommendations to mitigate and prevent recurrences. Scenario A involved an error of commission (inpatient drug error), and Scenario B involved detecting an error that had already occurred (critical care drug infusion error). The recommendations generated were: improved drug labeling, specialized drug kits, alert signs, and clinical checklists. Aim: Use simulation to optimize interventions recommended after critical event analysis prior to implementation in the clinical environment. Methods: Suggested interventions from Phase 1 were designed and tested through scenario simulation in the clinical environment (medicine ward or pediatric intensive care unit). Each scenario was simulated 8 times. Recommendations were tested using different, voluntary teams, and each scenario was debriefed to understand why the error was repeated despite interventions and how the interventions could be improved. Interventions were modified with subsequent simulations until the recommendations were felt to have an optimal effect and data saturation was achieved. Along with concrete suggestions for design and process change, qualitative data pertaining to employee communication and hospital standard work were collected and analyzed. Results: Each scenario had a total of three interventions to test. In scenario 1, the error was reproduced in the initial two iterations and mitigated following key intervention changes. In scenario 2, the error was identified immediately in all cases where the intervention checklist was utilized properly. Independently of intervention changes and improvements, the simulation was beneficial for identifying which interventions should be prioritized for implementation, and it highlighted that even the potential solutions most frequently suggested by participants did not always translate into error prevention in the clinical environment. Conclusion: We conclude that interventions that help to change process (epinephrine kit or mandatory checklist) were more successful at preventing errors than passive interventions (signage, changes in memory aids). Given that even the most successful interventions needed modifications and subsequent re-testing, simulation is key to optimizing suggested changes. Simulation is a safe, practice-changing modality for institutions to use prior to implementing recommendations from RCA following AE reviews.
Keywords: adverse events, patient safety, pediatrics, root cause analysis, simulation
Procedia PDF Downloads 152
662 The Impact of the Number of Neurons in the Hidden Layer on the Performance of MLP Neural Network: Application to the Fast Identification of Toxics Gases
Authors: Slimane Ouhmad, Abdellah Halimi
Abstract:
In this work, we applied an MLP-type neural network to a database from an array of six sensors for the detection of three toxic gases. As the choice of the number of hidden neurons and the weight values has a great influence on the convergence of the learning algorithm, we propose, in this article, a mathematical formulation to determine the optimal number of neurons in the hidden layer and good weight values, based on the back-propagation of errors. The results of this modeling improved the discrimination of these gases on the one hand and optimized the computation time on the other, in comparison with other results achieved for this case.
Keywords: MLP Neural Network, back-propagation, number of neurons in the hidden layer, identification, computing time
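One common way to explore the question posed in the title is to sweep the hidden-layer width and compare cross-validated accuracy; a minimal sketch on synthetic six-sensor, three-class data (an assumed stand-in for the paper's sensor database and formulation):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical stand-in for the six-sensor array: 6 features, 3 gas classes (0, 1, 2).
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 6))
y = (X[:, :3].sum(axis=1) > 0).astype(int) + (X[:, 3] > 1).astype(int)

# Sweep the number of hidden neurons and compare mean cross-validated accuracy.
for n_hidden in (2, 5, 10, 20, 40):
    clf = MLPClassifier(hidden_layer_sizes=(n_hidden,), max_iter=2000, random_state=0)
    score = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{n_hidden:3d} hidden neurons -> mean CV accuracy {score:.3f}")
```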
Procedia PDF Downloads 347
661 Renewable Energy Interfaced Shunt Active Filter Using a Virtual Flux Direct Power Control
Authors: M. R. Bengourina, M. Rahli, L. Hassaine, S. Saadi
Abstract:
In this study, we present a control method, entitled virtual flux direct power control, for a grid-connected photovoltaic system associated with an active power filter. Virtual flux direct power control (VF-DPC) is employed to generate the reference currents. In this technique, the inverter switch states are selected from a switching table based on the instantaneous errors between the active and reactive powers and their reference values. The objectives of this paper are the reduction of the total harmonic distortion (THD) of the source current, compensating reactive power, and injecting the maximum active power available from the PV array into the load and/or grid. MATLAB/SIMULINK simulations are provided to demonstrate the performance of the proposed approach.
Keywords: shunt active power filter, VF-DPC, photovoltaic, MPPT
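THD, the main performance figure quoted above, is the ratio of the RMS of the harmonic content to the fundamental; a minimal sketch on a synthetic source current (the 5th/7th harmonic amplitudes are illustrative, not simulation results from the paper):

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=20):
    """Total harmonic distortion of a periodic signal sampled at fs with
    fundamental frequency f0: sqrt(sum of V_h^2 for h >= 2) / V_1."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    amp = lambda f: spectrum[np.argmin(np.abs(freqs - f))]
    v1 = amp(f0)
    harmonics = [amp(h * f0) for h in range(2, n_harmonics + 1)]
    return np.sqrt(sum(v ** 2 for v in harmonics)) / v1

# Hypothetical source current: 50 Hz fundamental plus 5th and 7th harmonics
fs, f0 = 10000.0, 50.0
t = np.arange(0, 1.0, 1.0 / fs)
i_s = (np.sin(2 * np.pi * f0 * t)
       + 0.08 * np.sin(2 * np.pi * 5 * f0 * t)
       + 0.05 * np.sin(2 * np.pi * 7 * f0 * t))
print(f"THD = {100 * thd(i_s, fs, f0):.1f} %")  # approx. 9.4 %
```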
Procedia PDF Downloads 323
660 Detecting Logical Errors in Haskell
Authors: Vanessa Vasconcelos, Mariza A. S. Bigonha
Abstract:
In order to facilitate both processes, this paper presents HaskellFL, a tool that uses fault localization techniques to locate a logical error in Haskell code. The Haskell subset used in this work is sufficiently expressive for those studying functional programming to get immediate help debugging their code and to answer questions about key concepts associated with the functional paradigm. HaskellFL was tested against functional programming assignments submitted by students enrolled in the functional programming class at the Federal University of Minas Gerais and against exercises from the Exercism Haskell track that are publicly available on GitHub. Furthermore, the EXAM score was chosen to evaluate the tool’s effectiveness, and the results showed that HaskellFL reduced the effort needed to locate an error in all tested scenarios. The results also showed that the Ochiai method was more effective than Tarantula.
Keywords: debug, fault localization, functional programming, Haskell
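The two suspiciousness metrics compared in the abstract have standard spectrum-based fault localization definitions; a minimal sketch (written in Python for illustration, with toy coverage counts rather than HaskellFL's data):

```python
import math

def tarantula(ef, ep, nf, np_):
    """Tarantula suspiciousness: ef/ep = failing/passing tests covering the element,
    nf/np_ = failing/passing tests not covering it."""
    total_f, total_p = ef + nf, ep + np_
    fail_ratio = ef / total_f if total_f else 0.0
    pass_ratio = ep / total_p if total_p else 0.0
    return fail_ratio / (fail_ratio + pass_ratio) if (fail_ratio + pass_ratio) else 0.0

def ochiai(ef, ep, nf, np_):
    """Ochiai suspiciousness, often sharper than Tarantula in the literature."""
    denom = math.sqrt((ef + nf) * (ef + ep))
    return ef / denom if denom else 0.0

# Toy coverage data: an expression covered by 4 failing and 1 passing test,
# out of 5 failing and 10 passing tests in total.
print(round(tarantula(4, 1, 1, 9), 3), round(ochiai(4, 1, 1, 9), 3))  # 0.889 0.8
```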
Procedia PDF Downloads 298
659 Study of the Behavior of Bolted Joints with and Without Reinforcement
Authors: Karim Akkouche
Abstract:
Many methods have been developed for characterizing the behavior of bolted joints. However, in the presence of a certain type of stiffener, no guidance has been given regarding their modeling. As a result, a multitude of coarse errors can arise in reproducing the propagation of forces and in representing the deformation modes. Considering these particularities, a numerical investigation was carried out in our laboratory. In this paper, we present a comparative study between three types of assemblies. Nonlinear 3D modeling was chosen, given that it takes into consideration geometric and material nonlinearity, using the finite element code ABAQUS. Initially, we evaluated the influence of the presence of each stiffener on the "global" behavior of the assemblies by analyzing their moment-rotation curves and by referring to the classification system proposed by NF EN 1993-1.8, which is based on the resisting moment Mj,Rd and the initial stiffness Sj,ini. In a second step, we evaluated the "local" behavior of their components by referring to the stress-strain curves.
Keywords: assembly, post-beam, end plate, nonlinearity
Procedia PDF Downloads 74
658 Stable Time Reversed Integration of the Navier-Stokes Equation Using an Adjoint Gradient Method
Authors: Jurriaan Gillissen
Abstract:
This work is concerned with stabilizing the numerical integration of the Navier-Stokes equation (NSE) backwards in time. Applications involve the detection of sources of, e.g., sound, heat, and pollutants. Stable reverse numerical integration of parabolic differential equations is also relevant for image de-blurring. While the literature addresses the reverse integration problem of the advection-diffusion equation, the problem of numerical reverse integration of the NSE has, to our knowledge, not yet been addressed. Owing to the presence of viscosity, the NSE is irreversible, i.e., when going backwards in time, the fluid behaves as if it had a negative viscosity. As an effect, perturbations from the perfect solution, due to round-off errors or discretization errors, grow exponentially in time, and reverse integration of the NSE is inherently unstable, regardless of using an implicit time integration scheme. Consequently, some sort of filtering is required in order to achieve a stable, numerical, reversed integration. The challenge is to find a filter with a minimal adverse effect on the accuracy of the reversed integration. In the present work, we explore an adjoint gradient method (AGM) to achieve this goal, and we apply this technique to two-dimensional (2D) decaying turbulence. The AGM solves for the initial velocity field u0 at t = 0 that, when integrated forward in time, produces a final velocity field u1 at t = 1 that is as close as is feasibly possible to some specified target field v1. The initial field u0 defines a minimum of a cost functional J, which measures the distance between u1 and v1. In the minimization procedure, u0 is updated iteratively along the gradient of J w.r.t. u0, where the gradient is obtained by transporting J backwards in time from t = 1 to t = 0, using the adjoint NSE. The AGM thus effectively replaces the backward integration by multiple forward and backward adjoint integrations. Since the viscosity is negative in the adjoint NSE, each step of the AGM is numerically stable. Nevertheless, when applied to turbulence, the AGM develops instabilities, which limit the backward integration to small times. This is due to the exponential divergence of phase space trajectories in turbulent flow, which produces a multitude of local minima in J when the integration time is large. As an effect, the AGM may select unphysical, noisy initial conditions. In order to improve this situation, we propose two remedies. First, we replace the integration by a sequence of smaller integrations, i.e., we divide the integration time into segments, where in each segment the target field v1 is taken as the initial field u0 from the previous segment. Second, we add an additional term (regularizer) to J, which is proportional to a high-order Laplacian of u0 and which dampens the gradients of u0. We show that suitable values for the segment size and for the regularizer allow a stable reverse integration of 2D decaying turbulence, with accurate results for more than O(10) turbulent integral time scales.
Keywords: time reversed integration, parabolic differential equations, adjoint gradient method, two dimensional turbulence
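The core AGM loop described above (forward solve, mismatch against the target, adjoint transport of the gradient, update of u0) can be illustrated on 1-D linear diffusion, a much simpler stand-in for the NSE in which the discrete operator is symmetric, so the adjoint run reduces to another forward diffusion of the mismatch. Everything below (grid, viscosity, step size, learning rate) is an assumed toy setup, not the paper's solver:

```python
import numpy as np

# Toy adjoint-gradient reconstruction: recover an initial field u0 whose
# forward diffusion matches a given target final field v1.
nx, nu, dt, nsteps = 128, 0.01, 0.01, 200
x = np.linspace(0, 2 * np.pi, nx, endpoint=False)
dx = x[1] - x[0]

def step(u):
    # explicit central-difference diffusion step with periodic boundaries
    return u + nu * dt * (np.roll(u, -1) - 2 * u + np.roll(u, 1)) / dx**2

def forward(u0):
    u = u0.copy()
    for _ in range(nsteps):
        u = step(u)
    return u

true_u0 = np.sin(x) + 0.5 * np.sin(3 * x)
v1 = forward(true_u0)            # target final field

u0 = np.zeros(nx)                # initial guess
lr = 2.0
for _ in range(200):
    u1 = forward(u0)
    # For this symmetric linear operator, the adjoint run equals another forward
    # diffusion of the mismatch, giving the gradient of J = 0.5*||u1 - v1||^2.
    grad = forward(u1 - v1)
    u0 -= lr * grad
print("relative reconstruction error:",
      np.linalg.norm(u0 - true_u0) / np.linalg.norm(true_u0))
```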
Procedia PDF Downloads 224
657 Safety Conditions Analysis of Scaffolding on Construction Sites
Authors: M. Pieńko, A. Robak, E. Błazik-Borowa, J. Szer
Abstract:
This paper presents the results of an analysis of 100 full-scale scaffolding structures in terms of compliance with legal acts and safety of use. In 2016 and 2017, the authors examined scaffolds in Poland located at buildings that were at the construction or renovation stage. The basic elements affecting the safety of scaffolding use, such as anchors, supports, platforms, guardrails, and toe-boards, have been taken into account. All of these elements were checked in each of the considered scaffoldings. Based on the analyzed scaffoldings, the most common errors concerning the assembly process and the use of scaffolding were collected. Legal acts on scaffolding are not always clear, and this causes many issues. In practice, people realize how dangerous the use of incomplete scaffolds is only when an accident occurs. Despite the fact that scaffolding should ensure the safety of its users, most accidents on construction sites are caused by falls from height.
Keywords: façade scaffolds, load capacity, practice, safety of people
Procedia PDF Downloads 403
656 Patient Safety of Eating Ready-Made Meals at Government Hospitals
Authors: Hala Kama Ahmed Rashwan
Abstract:
Ensuring patient safety, especially in intensive care units and for patients exposed to hospital tools and equipment, is one of the most important challenges facing healthcare today. Outbreaks of food poisoning caused by food-borne pathogens have been reported in many hospitals and care homes all over the world due to hospital meals. The safety of patients eating hospital meals is a fundamental principle of healthcare; it is a new healthcare discipline that addresses food raw materials, food storage, meal processing, and the control of kitchen errors that often lead to adverse healthcare events. The aim of this article is to help any hospital attain hygienic practices and a better quality system during the processing of ready-to-eat meals for intensive care unit patients, in accordance with the WHO safety guidelines.
Keywords: hospitals, meals, safety, intensive care
Procedia PDF Downloads 510
655 Evidence of Scientific-Ness of Scriptures
Authors: Shyam Sunder Gupta
Abstract:
Written scriptures are created out of the Words of God, revealed or inspired. This process of conversion, from revealed Words to written scriptures, happens after a long gap of time and with the involvement of a large number of persons, and unintentionally, scientific and other types of errors get into scriptures; otherwise, scriptures are, in reality, truly scientific. Descriptions of the chronology of life in the womb (fetal development), the rotation of the Universe, the spherical shape of the earth, the evolution process of non-living matter and living species, the classification of species by nature of birth, etc., most convincingly prove that scriptures are truly scientific. In fact, there are many facts for which, to date, science has not found answers but which are available in scriptures, such as the creation of the singularity from which the Big Bang took place and innumerable universes were created, and the most fundamental particle, Param-anu. These findings demonstrate that scriptures contain scientific knowledge that predates scientific discoveries.
Keywords: Big Bang, evolution, Param-anu, scientific, scriptures, singularity, universe
Procedia PDF Downloads 33
654 Critical Review of the Democracy in Pakistan in Light of Dr. Israr Ahmed and Western Philosophers
Authors: Zoaib Mirza
Abstract:
Pakistan is an Islamic country that gained its independence through partition from India in 1947 so that its people could practice the religion of Islam. The ideology of Pakistan was based on the notion that sovereignty belongs only to God Almighty (in Arabic, God is “Allah”) and that Muslims will live in accordance with Islamic principles. The Quran (the Holy Book) and the Sunnah (the authentic practices of Prophet Mohammad, Peace Be Upon Him, which explain the application of the Quran) are the foundations of these Islamic principles. It has been over 75 years, but unfortunately, Pakistan, due to its own political, social, and economic mistakes, has not been able to become a true Islamic nation and thereby justify its partition from India. The rationale for writing this paper is to analyze the factors that led to changes in the democratic movements impacting the country's political, social, and economic growth. The methodology examines the historical and political context of Pakistan by referencing the scholarly work of Israr Ahmed, a Pakistani Islamic theologian, philosopher, and Islamic scholar. From a Western perspective, the philosophies of Karl Marx, Max Weber, Hannah Arendt, Sheldon Wolin, Paulo Freire, and Jacques Ranciere on totalitarianism, politics, military rule, religion, capitalism, and superpowers are used as the framework to analyze Pakistan’s democracy. The findings of the study conclude that Pakistan's democracy is unstable and has been impacted by military and civilian governance, which led to political, social, and economic downfall. To improve the current situation, the citizens of Pakistan have to realize that the success of a nation depends only on the level of consciousness of the leader and not on the political system. Therefore, it is the responsibility of every citizen to be conscious of how they select their leader and to take responsibility for the current situation in Pakistan.
Keywords: Pakistan, Islam, democracy, totalitarianism, military, religion, capitalism
Procedia PDF Downloads 90
653 Finding the Free Stream Velocity Using Flow Generated Sound
Authors: Saeed Hosseini, Ali Reza Tahavvor
Abstract:
Sound processing is one of the subjects that has recently attracted many researchers. It is efficient and usually less expensive than other methods. In this paper, the flow-generated sound is used to estimate the speed of free flows. Many sound samples were gathered. After analyzing the data, a parameter named wave power was chosen. For all samples, the wave power was calculated and averaged for each flow speed. A curve was fitted to the averaged data, and a correlation between the wave power and flow speed was found. Test data were used to validate the method, and the errors for all test data were under 10 percent. The speed of the flow can be estimated by calculating the wave power of the flow-generated sound and using the proposed correlation.
Keywords: the flow generated sound, free stream, sound processing, speed, wave power
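The calibration idea described above (average an acoustic power feature per known speed, fit a curve, then invert it for new measurements) can be sketched as follows; the mean-square-amplitude definition of "wave power", the quadratic fit, and all numbers are assumptions for illustration, not the paper's data or correlation:

```python
import numpy as np

def wave_power(samples):
    """Mean-square amplitude of an audio frame, used here as the 'wave power' feature."""
    samples = np.asarray(samples, float)
    return float(np.mean(samples ** 2))

# Hypothetical calibration data: averaged wave power measured at known flow speeds (m/s)
speeds = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
powers = np.array([0.8e-4, 3.1e-4, 7.2e-4, 12.9e-4, 20.1e-4])

# Fit a correlation speed = f(power) and evaluate it for a new measurement
coeffs = np.polyfit(powers, speeds, deg=2)
estimate_speed = np.poly1d(coeffs)
print(f"estimated speed for power 9e-4: {estimate_speed(9e-4):.2f} m/s")
```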
Procedia PDF Downloads 415
652 Hybrid EMPCA-Scott Approach for Estimating Probability Distributions of Mutual Information
Authors: Thuvanan Borvornvitchotikarn, Werasak Kurutach
Abstract:
Mutual information (MI) is widely used in medical image registration. In the analysis of different medical images, it is difficult to choose an optimal number of bins for calculating the probability distributions in MI. Therefore, this paper presents a new adaptive bin-number selection approach, named the hybrid EMPCA-Scott approach. This work combines expectation-maximization principal component analysis (EMPCA) with a modified Scott’s rule. The proposed approach solves the binning problem caused by the varied intensity values in medical images. Experimental results of this work show lower registration errors compared to other adaptive binning approaches.
Keywords: mutual information, EMPCA, Scott, probability distributions
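Scott's rule and the histogram-based MI estimate referred to above have standard forms; a minimal sketch using the classic rule (bin width 3.49·σ·n^(-1/3)), without the paper's EMPCA modification, on synthetic image data:

```python
import numpy as np

def scott_bins(values):
    """Number of histogram bins from Scott's rule: h = 3.49 * sigma * n^(-1/3)."""
    values = np.asarray(values, float)
    h = 3.49 * values.std() * values.size ** (-1.0 / 3.0)
    return max(1, int(np.ceil((values.max() - values.min()) / h)))

def mutual_information(img_a, img_b, bins):
    """MI estimated from the joint intensity histogram of two images."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_xy = joint / joint.sum()
    p_x, p_y = p_xy.sum(axis=1), p_xy.sum(axis=0)
    nz = p_xy > 0
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / np.outer(p_x, p_y)[nz])))

# Hypothetical pair of 'images': the second is a noisy remapping of the first
rng = np.random.default_rng(1)
a = rng.normal(100, 20, size=(64, 64))
b = 0.5 * a + rng.normal(0, 5, size=(64, 64))
bins = scott_bins(a.ravel())
print(f"Scott bins: {bins}, MI: {mutual_information(a, b, bins):.3f}")
```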
Procedia PDF Downloads 249