Search results for: testing order
15374 The Impact of Language Anxiety on EFL Learners' Proficiency: Case Study of University of Jeddah
Authors: Saleh Mohammad Alqahtani
Abstract:
Foreign language anxiety has been found to be a key issue in learning English as a foreign language in the classroom. This study investigated the impact of foreign language anxiety on Saudi EFL learners' proficiency in the classroom. A total of 197 respondents participated in the study, comprising 96 males and 101 females enrolled in the preparatory year, first year, second year, and fourth year of the English language department at the University of Jeddah. Two instruments were used to answer the study questions. The Foreign Language Classroom Anxiety Scale (FLCAS) was used to identify the levels of foreign language (FL) anxiety of Saudi learners. Moreover, an International English Language Testing System (IELTS) test was used as an objective measure of the learners' English language proficiency. The data were analyzed using descriptive analyses, t-test, one-way ANOVA, correlation, and regression analysis. The findings revealed that Saudi EFL learners experience a level of anxiety in the classroom and that there are significant differences between the course levels in their level of language anxiety. Moreover, it was also found that female students are less anxious in learning English as a foreign language than male students. The results show that foreign language anxiety and English proficiency are negatively related to each other. Furthermore, the study revealed that there were significant differences between Saudi learners in language use anxiety, while there were no significant differences in language class anxiety. The study suggests that teachers should employ a diversity of designed techniques to improve the classroom environment in order to control learners' FLA, which in turn will improve their EFL proficiency.
Keywords: foreign language anxiety, FLA, language use anxiety, language class anxiety, gender, L2 proficiency
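As an illustration of the analysis pipeline listed above (descriptive statistics, independent-samples t-test, one-way ANOVA, and correlation), the following is a minimal sketch on simulated scores; the data, variable names and effect sizes are assumptions, not the Jeddah data set.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated stand-in data (assumed values, not the study's records)
rng = np.random.default_rng(0)
n = 197
df = pd.DataFrame({
    "gender": rng.choice(["male", "female"], n),
    "level": rng.choice(["prep", "year1", "year2", "year4"], n),
    "flcas": rng.normal(100, 15, n),          # FLCAS anxiety score
})
df["ielts"] = 7 - 0.02 * (df["flcas"] - 100) + rng.normal(0, 0.5, n)  # proficiency

# t-test by gender, one-way ANOVA across course levels, Pearson correlation
t, p_t = stats.ttest_ind(df.loc[df.gender == "male", "flcas"],
                         df.loc[df.gender == "female", "flcas"])
groups = [g["flcas"].values for _, g in df.groupby("level")]
f, p_f = stats.f_oneway(*groups)
r, p_r = stats.pearsonr(df["flcas"], df["ielts"])
print(f"t-test p={p_t:.3f}, ANOVA p={p_f:.3f}, r={r:.2f} (p={p_r:.3f})")
```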
Procedia PDF Downloads 260
15373 Solid State Drive End to End Reliability Prediction, Characterization and Control
Authors: Mohd Azman Abdul Latif, Erwan Basiron
Abstract:
A flaw or drift from expected operational performance in one component (NAND, PMIC, controller, DRAM, etc.) may affect the reliability of the entire Solid State Drive (SSD) system. Therefore, it is important to ensure the required quality of each individual component through qualification testing specified using standards or user requirements. Qualification testing is time-consuming and comes at a substantial cost for product manufacturers. A highly technical team drawn from all the key stakeholders embarks on reliability prediction from the beginning of new product development, identifies critical-to-reliability parameters, performs full-blown characterization to embed margin into product reliability, and establishes control to ensure the product reliability is sustainable in mass production. The paper discusses a comprehensive development framework covering the SSD end to end, from design to assembly, in-line inspection and in-line testing, that is able to predict and validate product reliability at the early stage of new product development. During the design stage, the SSD goes through an intense reliability margin investigation with a focus on assembly process attributes, process equipment control and in-process metrology, while also considering the forward-looking product roadmap. Once these pillars are completed, the next step is to perform process characterization and build up a reliability prediction model. Next, for the design validation process, the reliability prediction, specifically a solder joint simulator, is established. The SSDs are stratified into non-operating and operating tests with a focus on solder joint reliability and connectivity/component latent failures, addressed by prevention through design intervention and containment through the Temperature Cycle Test (TCT). Some of the SSDs are subjected to physical solder joint analyses called Dye and Pry (DP) and cross-section analysis. The results are fed back to the simulation team for any corrective actions required to further improve the design. Once the SSD is validated and proven working, it enters the monitor phase, in which the Design for Assembly (DFA) rules are updated. At this stage, the design changes and the process and equipment parameters are in control. Predictable product reliability at early product development enables on-time sample qualification delivery to the customer, optimizes product development validation and development resources, and avoids forced late investment to patch end-of-life product failures. Understanding the critical-to-reliability parameters earlier allows a focus on increasing the product margin, which increases customer confidence in product reliability.
Keywords: e2e reliability prediction, SSD, TCT, solder joint reliability, NUDD, connectivity issues, qualifications, characterization and control
Procedia PDF Downloads 174
15372 Predictive Analytics for Theory Building
Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim
Abstract:
Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict the future, or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed from causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big data predictive analytics platform™ based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics, such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For a demonstration, a total of 13,254 metabolic syndrome training observations were loaded into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, associated with sociodemographics, habits, and activities. Some are intentionally included to gain predictive-analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset including 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a part of the observations, such as bootstrap resampling with an appropriate sample size.
Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building
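To make the co-occurrence idea above concrete, the following is a minimal sketch, not the analytics platform itself: baskets of discretised predictor values are scanned for item pairs, and a simple "surprise" metric compares the observed co-occurrence count with the count expected under independence. All items and values are hypothetical.

```python
from collections import Counter
from itertools import combinations

# Hypothetical baskets: each row is one observation's discretised predictor values
rows = [
    {"smoker", "low_activity", "met_syndrome"},
    {"smoker", "low_activity", "met_syndrome"},
    {"non_smoker", "high_activity"},
    {"smoker", "high_activity", "met_syndrome"},
    {"non_smoker", "low_activity"},
]
n = len(rows)
item_freq = Counter(i for row in rows for i in row)
pair_freq = Counter(frozenset(p) for row in rows for p in combinations(sorted(row), 2))

for pair, obs in pair_freq.most_common():
    a, b = tuple(pair)
    expected = item_freq[a] * item_freq[b] / n   # expected count under independence
    print(f"{a} & {b}: observed={obs}, expected={expected:.1f}, surprise={obs / expected:.2f}")
```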
Procedia PDF Downloads 276
15371 Global Mittag-Leffler Stability of Fractional-Order Bidirectional Associative Memory Neural Network with Discrete and Distributed Transmission Delays
Authors: Swati Tyagi, Syed Abbas
Abstract:
Fractional-order Hopfield neural networks are generally used to model the information processing among interacting neurons. To show the constancy of the processed information, it is required to analyze the stability of these systems. In this work, we perform a Mittag-Leffler stability analysis for the corresponding Caputo fractional-order bidirectional associative memory (BAM) neural networks with various time delays. We derive sufficient conditions to ensure the existence and uniqueness of the equilibrium point by using topological degree theory. By applying the fractional Lyapunov method and Mittag-Leffler functions, we derive sufficient conditions for global Mittag-Leffler stability, which further imply the global asymptotic stability of the network equilibrium. Finally, we present two suitable examples to show the effectiveness of the obtained results.
Keywords: bidirectional associative memory neural network, existence and uniqueness, fractional-order, Lyapunov function, Mittag-Leffler stability
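For readers unfamiliar with the terminology, the standard definitions underlying the abstract are summarised below in LaTeX; the notation is generic and assumed here rather than reproduced from the paper.

```latex
% Two-parameter Mittag-Leffler function and Caputo fractional derivative (standard definitions):
E_{\alpha,\beta}(z) = \sum_{k=0}^{\infty} \frac{z^{k}}{\Gamma(\alpha k + \beta)}, \qquad \alpha > 0,\ \beta > 0,
\qquad
{}^{C}D_{t}^{\alpha} x(t) = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{x'(s)}{(t-s)^{\alpha}}\, ds, \quad 0 < \alpha < 1.
% Mittag-Leffler stability of an equilibrium x^{*}: solutions satisfy
\lVert x(t) - x^{*} \rVert \;\le\; \left[ m\big(x(0) - x^{*}\big)\, E_{\alpha}\!\left(-\lambda t^{\alpha}\right) \right]^{b},
\qquad \lambda \ge 0,\ b > 0,
% with m(0) = 0, m(\cdot) \ge 0 and m locally Lipschitz; this bound implies asymptotic stability.
```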
Procedia PDF Downloads 364
15370 All-In-One Universal Cartridge Based Truly Modular Electrolyte Analyzer
Authors: S. Dalvi, N. Sane, V. Patil, D. Bansode, A. Tharakan, V. Mathur
Abstract:
Measurement of routine clinical electrolyte tests is common in labs worldwide for screening of illness or disease. All analyzers for the measurement of electrolyte parameters have sensors, reagents, a sampler, pump tubing, valves and other tubing as separate parts that are either expensive, require heavy maintenance, or have a short shelf life. Moreover, the cost required to maintain such lab instrumentation is high, and this limits the use of the devices to highly specialized personnel and sophisticated labs. In order to provide healthcare diagnostics to all at affordable cost, there is a need for an all-in-one universal modular cartridge that contains the sensors, reagents, sampler, valve, pump tubing and other tubing in one single integrated module-in-module cartridge that is affordable, reliable, easy to use, requires a very low sample volume, and is truly modular and maintenance-free. DiaSys India has developed the world's first, patent-pending, versatile all-in-one universal module-in-module cartridge-based electrolyte analyzer (QDx InstaLyte) that can perform sodium, potassium, chloride, calcium, pH and lithium tests. QDx InstaLyte incorporates a high-performance, inexpensive all-in-one universal cartridge for rapid quantitative measurement of electrolytes in body fluids. Our proposed methodology utilizes advanced and improved long-life ISE sensors to provide a sensitive and accurate result in 120 s with just 100 µl of sample volume. The all-in-one universal cartridge has a very low reagent consumption, is capable of a maximum of 1000 tests with a use-life of 3-4 months, and has a long shelf life of 12-18 months at 4-25°C, making it very cost-effective. Methods: QDx InstaLyte analyzers with all-in-one universal modular cartridges were independently evaluated with three R&D lots for method performance (linearity, precision, method comparison, cartridge stability) to measure sodium, potassium and chloride. Method comparison was done against the Medica EasyLyte Plus Na/K/Cl electrolyte analyzer, a mid-size lab-based clinical chemistry analyzer, with N = 100 samples run over 10 days. The within-run precision study was done using modified CLSI guidelines with N = 20 samples, and the day-to-day precision study was done for 7 consecutive days using Trulab N & P quality control samples. Accelerated stability testing was done at 45°C for 4 weeks with production lots. Results: Data analysis indicates that the CV for within-run precision is ≤ 1% for Na, ≤ 2% for K, and ≤ 2% for Cl, with R2 ≥ 0.95 for the method comparison. Further, the all-in-one universal cartridge is stable for up to 12-18 months at a 4-25°C storage temperature, based on preliminary extrapolated data. Conclusion: The developed technology platform of the all-in-one universal module-in-module cartridge-based QDx InstaLyte is reliable, meets all the performance specifications of the lab, and is truly modular and maintenance-free. Hence, it can be easily adapted for low-cost, sensitive and rapid measurement of electrolyte tests in low-resource settings such as urban, semi-urban and rural areas in developing countries and can be used as a point-of-care testing system for worldwide applications.
Keywords: all-in-one modular cartridge, electrolytes, maintenance free, QDx instalyte
Procedia PDF Downloads 31
15369 Evaluation of the Grammar Questions at the Undergraduate Level
Authors: Preeti Gacche
Abstract:
A considerable part of undergraduate-level English examination papers is devoted to grammar. Hence, the grammar questions in the question papers were evaluated, and the opinions of both students and teachers about them were obtained and analyzed. A grammar test of 100 marks was administered to 43 students to check their performance. The question papers were evaluated by 10 different teachers and their scores compared. The analysis of 38 university question papers reveals that, on average, 20 percent of marks are allotted to grammar. Almost all the grammar topics are tested. Abundant use of grammatical terminology is observed in the questions. Decontextualization, repetition, the possibility of multiple correct answers, and grammatical errors in framing the questions have been observed. Opinions of teachers and students about grammar questions vary in many respects. The students' responses are analyzed medium-wise and sex-wise. The medium of instruction at the school level and the sex of the students are found to play no role as far as interest in the study of grammar is concerned. English-medium students solve grammar questions intuitively, whereas non-English-medium students are required to recall the rules of grammar. Prepositions, verbs, articles and modal auxiliaries are found to be easy topics for most students, whereas the use of conjunctions is the most difficult topic. Out-of-context grammar items are difficult to answer in comparison with contextualized grammar items. Hence, contextualized texts to test grammar items are desirable. No formal training in setting questions is imparted to teachers by the competent authorities, such as the university. They need to be trained in testing. Statistically, there is no significant change of score with a change of rater in the testing of grammar items. There is scope for future improvement. The question papers need to be evaluated, and feedback needs to be obtained from students and teachers for future improvement.
Keywords: context, evaluation, grammar, tests
Procedia PDF Downloads 353
15368 Predicting Foreign Direct Investment of IC Design Firms from Taiwan to East and South China Using Lotka-Volterra Model
Authors: Bi-Huei Tsai
Abstract:
This work explores the inter-region investment behaviors of the integrated circuit (IC) design industry from Taiwan to China using the amount of foreign direct investment (FDI). Given the mutual dependence among different IC design industrial locations, the Lotka-Volterra model is utilized to explore the FDI interactions between South and East China. Effects of inter-regional collaborations on FDI flows into China are considered. Evolutions of FDI into South China for the IC design industry significantly inspire the subsequent FDI into East China, while FDI into East China for Taiwan's IC design industry significantly hinders the subsequent FDI into South China. The supply chain of the IC industry includes IC design, manufacturing, packaging and testing enterprises. The IC manufacturing, packaging and testing industries depend on the IC design industry to gain advanced business benefits. The FDI amount from Taiwan's IC design industry into East China is the greatest among the four regions: North, East, Mid-West and South China. The FDI amount from Taiwan's IC design industry into South China is the second largest. If IC design houses buy more equipment and bring more capital into South China, those in East China will be under pressure to undertake more FDI into East China to maintain the leading-position advantages of the supply chain in East China. On the other hand, as the FDI in East China rises, the FDI in South China will successively decline, since capital has concentrated in East China. The prediction of the Lotka-Volterra model for FDI trends is accurate because the industrial interactions between the two regions are included. Finally, this work confirms that the FDI flows cannot reach a stable equilibrium point, so the FDI inflows into East and South China will expand in the future.
Keywords: Lotka-Volterra model, foreign direct investment, competition, equilibrium analysis
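A generic two-region Lotka-Volterra interaction model of the kind described above can be written as follows; the notation and the sign interpretation are assumptions for illustration, not the paper's exact specification.

```latex
% Cumulative FDI into East China (x_E) and South China (x_S):
\frac{dx_E}{dt} = x_E\left(a_E - b_E\, x_E + c_{ES}\, x_S\right), \qquad
\frac{dx_S}{dt} = x_S\left(a_S - b_S\, x_S + c_{SE}\, x_E\right),
% where a_i are intrinsic growth rates, b_i are self-limitation (saturation) terms, and the
% interaction coefficients carry the reported findings: c_{ES} > 0 (South-China FDI inspires
% subsequent East-China FDI) while c_{SE} < 0 (East-China FDI hinders South-China FDI).
```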
Procedia PDF Downloads 363
15367 In vitro Skin Model for Enhanced Testing of Antimicrobial Textiles
Authors: Steven Arcidiacono, Robert Stote, Erin Anderson, Molly Richards
Abstract:
There are numerous standard test methods for antimicrobial textiles that measure activity against specific microorganisms. However, many times these results do not translate to the performance of treated textiles when worn by individuals. Standard test methods apply a single target organism grown under optimal conditions to a textile, then recover the organism to quantitate it and determine activity; this does not reflect the actual performance environment, which consists of polymicrobial communities in less-than-optimal conditions, or the interaction of the textile with the skin substrate. Here we propose the development of an in vitro skin model method to bridge the gap between lab testing and wear studies. The model will consist of a defined polymicrobial community of 5-7 commensal microbes simulating the skin microbiome, seeded onto a solid tissue platform to represent the skin. The protocol would entail adding a non-commensal test organism of interest to the defined community and applying a textile sample to the solid substrate. Following incubation, the textile would be removed and the organisms recovered, which would then be quantitated to determine antimicrobial activity. Important parameters to consider include identification and assembly of the defined polymicrobial community, growth conditions that allow the establishment of a stable community, and the choice of skin surrogate. This model could answer the following questions: 1) Is the treated textile effective against the target organism? 2) How is the defined community affected? And 3) does the textile cause unwanted effects on the skin simulant? The proposed model would determine activity under conditions comparable to the intended application and provide expanded knowledge relative to current test methods.
Keywords: antimicrobial textiles, defined polymicrobial community, in vitro skin model, skin microbiome
Procedia PDF Downloads 137
15366 A Unique Professional Development of Teacher Educators: Teaching Colleagues
Authors: Naomi Weiner-Levy
Abstract:
The Mofet Institute of Research established a School of Professional Development, the only one of its kind in Israel and throughout the world. It offers specialized programs for teacher educators, providing them with professional knowledge and skills. The studies aim at updating teachers about rapidly changing knowledge and skills. Teacher educators are conceptualized as shifting from first-order practitioners (school teachers) to second-order practitioners. Those who train teacher educators are referred to as third-order practitioners. The instructors in the School of Professional Development are third-order practitioners: teacher educators specializing in teaching their colleagues. Collegial guidance of teachers' college staff members is no simple task: tutors must be expert in their field of specialization, as well as in instruction. Moreover, although colleagues, they have to position themselves within the group as authoritative figures in terms of instruction and knowledge. To date, the role and professional identity of these third-order practitioners have not been studied. To understand the nature and development of professional identity, a qualitative study was conducted in which 12 tutors of various subjects were interviewed. The interviews were analyzed by categorical content analysis. The findings assessed professional identity through a post-modern prism, while examining the interplay among the events that tutors experienced, the knowledge they acquired and the structuring of their professional identity. The tutors' identity was transformed through negotiating with 'self' and 'other' in the class, and constructed by their mutual experiences as tutors and learners. Understanding the function and identity of tutors facilitates comprehension of this unique training process for teacher educators.
Keywords: professional development, professional identity, teacher education, tutoring
Procedia PDF Downloads 223
15365 Edible Active Antimicrobial Coatings onto Plastic-Based Laminates and Its Performance Assessment on the Shelf Life of Vacuum Packaged Beef Steaks
Authors: Andrey A. Tyuftin, David Clarke, Malco C. Cruz-Romero, Declan Bolton, Seamus Fanning, Shashi K. Pankaj, Carmen Bueno-Ferrer, Patrick J. Cullen, Joe P. Kerry
Abstract:
Prolonging shelf life is essential in order to address issues such as supplier demands across continents, economic profit, customer satisfaction, and the reduction of food waste. Smart packaging solutions, presented in the form of naturally occurring antimicrobially active packaging, may be a solution to these and other issues. A gelatin film-forming solution with the addition of naturally sourced antimicrobials is a promising tool for active smart packaging. The objective of this study was to coat a conventional hydrophobic plastic packaging material with a hydrophilic antimicrobial active beef gelatin coating and conduct shelf-life trials on beef sub-primal cuts. The minimal inhibitory concentrations (MIC) of caprylic acid sodium salt (SO) and the commercially available Auranta FV (AFV) (a bitter orange extract with a mixture of nutritive organic acids) were found to be 1% and 1.5%, respectively, against the bacterial strains Bacillus cereus, Pseudomonas fluorescens, Escherichia coli, Staphylococcus aureus and aerobic and anaerobic beef microflora. Therefore, SO or AFV was incorporated into the beef gelatin film-forming solution at twice the MIC, and this was coated onto a conventional plastic LDPE/PA film on the inner, cold-plasma-treated polyethylene surface. Beef samples were vacuum packed in this material, stored under chilling conditions, and sampled at weekly intervals during the 42-day shelf-life study. No significant differences (p < 0.05) in cook loss were observed among the different treatments compared to control samples until day 29. Only for the AFV-coated beef samples was cook loss 3% higher (37.3%) than the control (34.4%) on day 36. It was found that the antimicrobial films did not protect the beef against discoloration. SO-containing packages significantly (p < 0.05) reduced total viable bacterial counts (TVC) compared to the control and AFV samples until day 35. No significant difference in TVC was observed between SO and AFV films on day 42, but a significant difference was observed compared to control samples, with a 1.40 log reduction of bacteria on day 42. AFV films significantly (p < 0.05) reduced TVC compared to control samples from day 14 until day 42. Control samples reached the set value of 7 log CFU/g on day 27 of testing, whereas AFV films did not reach this limit until day 35 and SO films until day 42 of testing. The antimicrobial AFV and SO coated films significantly prolonged the shelf life of beef steaks by 33% or 55% (by 7 and 14 days, respectively) compared to control film samples. It is concluded that antimicrobial coated films were successfully developed by coating the inner polyethylene layer of conventional LDPE/PA laminated films after plasma surface treatment. The results indicated that the use of antimicrobial active packaging coated with SO or AFV increased the shelf life of the beef sub-primals significantly (p < 0.05). Overall, AFV- or SO-containing gelatin coatings have the potential to be used as effective antimicrobials for active packaging applications for muscle-based food products.
Keywords: active packaging, antimicrobials, edible coatings, food packaging, gelatin films, meat science
Procedia PDF Downloads 303
15364 Integrating Dependent Material Planning Cycle into Building Information Management: A Building Information Management-Based Material Management Automation Framework
Authors: Faris Elghaish, Sepehr Abrishami, Mark Gaterell, Richard Wise
Abstract:
The collaboration and integration between all building information management (BIM) processes and tasks are necessary to ensure that all project objectives can be delivered. A literature review has been used to explore state-of-the-art BIM technologies for managing construction materials, as well as the challenges the construction process has faced when using traditional methods. Thus, this paper aims to articulate a framework that integrates traditional material planning methods, such as ABC analysis (the Pareto principle) for analysing and categorising the project materials, as well as independent material planning methods such as Economic Order Quantity (EOQ) and Fixed Order Point (FOP), into the BIM 4D and 5D capabilities, in order to articulate a dependent material planning cycle within BIM that relies on the constructability method. Moreover, we build a model to connect the material planning outputs with the BIM 4D and 5D data, to ensure that all project information is accurately presented through integrated and complementary BIM reporting formats. Furthermore, this paper presents a method to integrate the risk management output with the material management process, to ensure that all critical materials are monitored and managed across all project stages. The paper includes browsers that are proposed to be embedded in any 4D BIM platform in order to predict the EOQ as well as the FOP and alert the user during the construction stage. This enables the planner to check the status of the materials on site as well as to receive an alert when a new order should be requested. Therefore, this will lead to managing all the project information in a single context and avoiding missing any information at the early design stage. Subsequently, the planner will be capable of building a more reliable 4D schedule by allocating the categorised materials with the required EOQ and checking the optimum locations for inventory and the temporary construction facilities.
Keywords: building information management, BIM, economic order quantity, EOQ, fixed order point, FOP, BIM 4D, BIM 5D
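The two classical quantities that the proposed browsers would compute, EOQ and FOP, follow standard inventory formulas; the sketch below uses assumed illustrative figures rather than data from the paper.

```python
from math import sqrt

# Assumed illustrative figures for one material category (not from the paper)
annual_demand = 12_000     # units per year
order_cost = 150.0         # cost of placing one order
holding_cost = 2.5         # holding cost per unit per year
working_days = 300
lead_time_days = 10
safety_stock = 200

daily_demand = annual_demand / working_days
eoq = sqrt(2 * annual_demand * order_cost / holding_cost)       # Economic Order Quantity
fop = daily_demand * lead_time_days + safety_stock              # Fixed Order Point (reorder level)

print(f"EOQ = {eoq:.0f} units, FOP = {fop:.0f} units")
```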
Procedia PDF Downloads 172
15363 Modeling of Thermally Induced Acoustic Emission Memory Effects in Heterogeneous Rocks with Consideration for Fracture Development
Authors: Vladimir A. Vinnikov
Abstract:
The paper proposes a model of an inhomogeneous rock mass with an initially random distribution of microcracks on mineral grain boundaries. It describes the behavior of cracks in a medium under the effect of a thermal field, the medium being heated instantaneously to a predetermined temperature. Crack growth occurs according to the concepts of fracture mechanics, provided that the stress intensity factor K exceeds the critical value Kc. The modeling of thermally induced acoustic emission memory effects is based on the assumption that every event of crack nucleation or crack growth caused by heating is accompanied by a single acoustic emission event. Parameters of the thermally induced acoustic emission memory effect produced by cyclic heating and cooling (with the temperature amplitude increasing from cycle to cycle) were calculated for several rock texture types (massive, banded, and disseminated). The study substantiates the adaptation of the proposed model to humidity interference with the thermally induced acoustic emission memory effect. The influence of humidity on the thermally induced acoustic emission memory effect in quasi-homogeneous and banded rocks is estimated. It is shown that such modeling allows the structure and texture of rocks to be taken into account and the influence of interference factors on the distinctness of the thermally induced acoustic emission memory effect to be estimated. The numerical modeling can be used to obtain information about past thermal impacts on rocks and to determine the degree of rock disturbance by means of non-destructive testing.
Keywords: degree of rock disturbance, non-destructive testing, thermally induced acoustic emission memory effects, structure and texture of rocks
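The growth criterion mentioned above can be stated explicitly; the expressions below are generic linear elastic fracture mechanics and thermo-elasticity textbook forms, with symbols assumed for illustration rather than taken from the paper.

```latex
% Crack growth criterion: a grain-boundary crack of half-length a grows when
K = Y\, \sigma \sqrt{\pi a} \;\ge\; K_{c},
% where Y is a geometry factor and \sigma the local tensile stress. For an instantaneous
% temperature step \Delta T, a standard estimate of the constrained thermal stress is
\sigma_{\mathrm{th}} \approx \frac{E\,\alpha\,\Delta T}{1-\nu},
% with E Young's modulus, \alpha the thermal expansion coefficient and \nu Poisson's ratio.
```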
Procedia PDF Downloads 263
15362 Knowledge, Attitudes and Readiness of Students towards Higher Order Thinking Skills
Authors: Mohd Aderi Che Noh, Tuan Rahayu Tuan Lasan
Abstract:
Higher order thinking skills (HOTS) are an important element of the Malaysian education system for producing a knowledgeable generation, able to think critically and creatively in order to face the challenges of the future. The educational challenges of the 21st century require all students to have HOTS. Therefore, this study aims to identify the level of knowledge, attitude and readiness of students towards HOTS. The respondents were 127 form four students from schools in the Federal Territory of Putrajaya. This study is a quantitative survey using a questionnaire to collect data. Data were analyzed using the Statistical Package for the Social Sciences (SPSS) 23.0. The results showed that the knowledge, attitudes and readiness of students towards HOTS were at a high level. Inferential analysis showed that there was a significant relationship between knowledge and both attitude and readiness towards HOTS. This study provides information to schools and teachers to improve teaching and learning in order to increase students' HOTS and fulfil the hope of the Ministry of Education to produce human capital who can be globally competitive.
Keywords: high order thinking skills, teaching, education, Malaysia
Procedia PDF Downloads 212
15361 Clonal Dissemination of Pseudomonas aeruginosa Isolates in Kermanshah Hospitals, West of Iran
Authors: Alisha Akya, Afsaneh salami
Abstract:
Background and Objective: Pseudomonas aeruginosa is an opportunistic pathogen associated with nosocomial infections. One of the major concerns in the treatment of P. aeruginosa infections is its resistance to a variety of antibiotics. The purpose of this study was to assess the dissemination of P. aeruginosa isolates obtained from major hospitals in Kermanshah, west of Iran. Materials and Methods: Antibiotic susceptibility testing was performed using minimal inhibitory concentrations. Metallo-beta-lactamase (MBL) production was investigated using the double disk synergy test (DDST) and PCR. Molecular typing was performed by pulsed-field gel electrophoresis (PFGE). Results: Of the 60 P. aeruginosa isolates, 30 (50%) were resistant to gentamicin, 38 (63.3%) to piperacillin, 42 (70%) to ceftazidime, and 45 (75%) to cefepime. Twenty-nine (48.3%) isolates were MBL producers based on the DDST. Five (8.3%) isolates were positive for the VIM gene, and 4 of them were from burn specimens. PFGE analysis among MBL producers revealed 12 distinct genotype patterns. The pattern covering the highest number of strains was determined to be the dominant clone. Conclusions: Our study showed that P. aeruginosa strains can spread between patients in hospitals or be acquired from different environmental sources. The P. aeruginosa isolates were highly resistant to antibiotics and, therefore, the susceptibility of isolates to antibiotics should be tested before treatment. Given the clinical significance of MBL-producing isolates, identification of these organisms in hospitals is essential in order to achieve a better therapeutic response and control bacterial dissemination.
Keywords: clonal dissemination, metallo-beta-lactamase, Pseudomonas aeruginosa, PFGE
Procedia PDF Downloads 326
15360 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm
Authors: Ameur Abdelkader, Abed Bouarfa Hafida
Abstract:
Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between the explanatory variables and the predicted variables. Past occurrences are exploited to predict and to derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics in order to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis when faced with a large amount of data. In fact, because of their volume, their nature (semi-structured or unstructured) and their variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.
Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm
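As a minimal illustration of the parallelization idea, and not the authors' extended CART algorithm, the sketch below fits an independent CART tree on each horizontal partition of the data and combines the trees by majority vote; the data set and parameters are assumptions.

```python
import numpy as np
from joblib import Parallel, delayed
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Assumed synthetic data standing in for a large tabular data set
X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
partitions = np.array_split(np.arange(len(X)), 8)   # 8 horizontal data shards

def fit_shard(idx):
    # one CART tree per shard, fitted independently (map step)
    return DecisionTreeClassifier(max_depth=8, random_state=0).fit(X[idx], y[idx])

trees = Parallel(n_jobs=-1)(delayed(fit_shard)(idx) for idx in partitions)

def predict_vote(X_new):
    # combine the shard models by majority vote (reduce step, binary labels)
    votes = np.stack([t.predict(X_new) for t in trees])
    return np.round(votes.mean(axis=0)).astype(int)

print("ensemble accuracy:", (predict_vote(X) == y).mean())
```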
Procedia PDF Downloads 142
15359 Elimination of Low Order Harmonics in Multilevel Inverter Using Nature-Inspired Metaheuristic Algorithm
Authors: N. Ould Cherchali, A. Tlemçani, M. S. Boucherit, A. Morsli
Abstract:
Nature-inspired metaheuristic algorithms, particularly those founded on swarm intelligence, have attracted much attention over the past decade. The firefly algorithm appeared approximately seven years ago, and the literature on it has grown considerably, with many different applications. It is inspired by the behavior of fireflies. The aim of this paper is the application of the firefly algorithm to solving a nonlinear algebraic system. This resolution is needed to study the Selective Harmonic Elimination Pulse Width Modulation (SHEPWM) strategy used to eliminate the low-order harmonics; the results have been applied to multilevel inverters. The final simulation results indicate the elimination of the low-order harmonics as desired. Finally, experimental results are presented to confirm the simulation results and validate the effectiveness of the proposed approach.
Keywords: firefly algorithm, metaheuristic algorithm, multilevel inverter, SHEPWM
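A minimal firefly-algorithm sketch for a SHE-type nonlinear system is shown below. The three-angle, quarter-wave-symmetric equations and all parameter values are assumptions chosen for illustration; the exact system solved in the paper depends on the inverter topology and the modulation index definition.

```python
import numpy as np

rng = np.random.default_rng(0)

def she_residual(theta, m=0.8):
    # Assumed SHE system for three switching angles: fundamental set by the
    # modulation index m, 5th and 7th harmonics driven to zero.
    f1 = np.cos(theta[0]) + np.cos(theta[1]) + np.cos(theta[2]) - 3 * m
    f5 = np.cos(5 * theta[0]) + np.cos(5 * theta[1]) + np.cos(5 * theta[2])
    f7 = np.cos(7 * theta[0]) + np.cos(7 * theta[1]) + np.cos(7 * theta[2])
    return f1**2 + f5**2 + f7**2          # sum of squared residuals to minimise

n_fireflies, dim, n_iter = 25, 3, 200
alpha, beta0, gamma = 0.2, 1.0, 1.0       # randomness, base attractiveness, absorption
pos = rng.uniform(0, np.pi / 2, (n_fireflies, dim))
light = np.array([she_residual(p) for p in pos])

for _ in range(n_iter):
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if light[j] < light[i]:       # firefly i moves toward brighter firefly j
                r2 = np.sum((pos[i] - pos[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                pos[i] += beta * (pos[j] - pos[i]) + alpha * (rng.random(dim) - 0.5)
                pos[i] = np.clip(pos[i], 0, np.pi / 2)
                light[i] = she_residual(pos[i])

best = pos[np.argmin(light)]
print("switching angles (rad):", np.sort(best), "residual:", light.min())
```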
Procedia PDF Downloads 148
15358 A Reactive Flexible Job Shop Scheduling Model in a Stochastic Environment
Authors: Majid Khalili, Hamed Tayebi
Abstract:
This paper considers a stochastic flexible job-shop scheduling (SFJSS) problem in the presence of production disruptions, and reactive scheduling is implemented in order to find the optimal solution under uncertainty. In this problem, there are two main disruptions: machine failure, which influences operation time, and modification or cancellation of the order delivery date during production. In order to decrease the negative effects of these difficulties, two strategies derived from reactive scheduling are used: the first concerns the ability to allocate multiple machines to each job, and the other concerns the ability to select the best alternative process from another job when disruptions arise in the processes of a job. For this purpose, a mixed integer linear programming model is proposed.
Keywords: flexible job-shop scheduling, reactive scheduling, stochastic environment, mixed integer linear programming
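For orientation, the deterministic core of a flexible job-shop MILP (operation-to-machine assignment, job precedence, big-M no-overlap constraints, makespan objective) can be sketched on a toy instance as below; the stochastic and reactive elements of the proposed model are not represented, and all data are assumptions.

```python
import pulp

# Assumed toy instance: 2 jobs x 2 operations, each operation can run on either of 2 machines
proc = {(0, 0, 0): 3, (0, 0, 1): 4, (0, 1, 0): 2, (0, 1, 1): 3,
        (1, 0, 0): 4, (1, 0, 1): 2, (1, 1, 0): 3, (1, 1, 1): 3}
jobs, ops, machines = [0, 1], [0, 1], [0, 1]
M = 100  # big-M constant

prob = pulp.LpProblem("flexible_job_shop", pulp.LpMinimize)
x = pulp.LpVariable.dicts("assign", proc.keys(), cat="Binary")                 # op -> machine
s = pulp.LpVariable.dicts("start", [(j, o) for j in jobs for o in ops], lowBound=0)
Cmax = pulp.LpVariable("makespan", lowBound=0)
pairs = [(j1, o1, j2, o2, m) for j1 in jobs for o1 in ops
         for j2 in jobs for o2 in ops for m in machines if (j1, o1) < (j2, o2)]
y = pulp.LpVariable.dicts("before", pairs, cat="Binary")                       # sequencing on m

prob += Cmax
for j in jobs:
    for o in ops:
        prob += pulp.lpSum(x[(j, o, m)] for m in machines) == 1                # one machine per op
        dur = pulp.lpSum(proc[(j, o, m)] * x[(j, o, m)] for m in machines)
        if o + 1 in ops:
            prob += s[(j, o + 1)] >= s[(j, o)] + dur                           # precedence in a job
        prob += Cmax >= s[(j, o)] + dur

# no-overlap on each machine (constraints bind only when both operations use machine m)
for (j1, o1, j2, o2, m) in pairs:
    both = x[(j1, o1, m)] + x[(j2, o2, m)]
    prob += s[(j2, o2)] >= s[(j1, o1)] + proc[(j1, o1, m)] - M * (3 - both - y[(j1, o1, j2, o2, m)])
    prob += s[(j1, o1)] >= s[(j2, o2)] + proc[(j2, o2, m)] - M * (2 - both + y[(j1, o1, j2, o2, m)])

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("makespan:", pulp.value(Cmax))
```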
Procedia PDF Downloads 361
15357 A Retrospective Cross-Sectional Study on the Prevalence and Factors Associated with Virological Non-Suppression among HIV-Positive Adult Patients on Antiretroviral Therapy in Woliso Town, Oromia, Ethiopia
Authors: Teka Haile, Behailu Hawulte, Solomon Alemayehu
Abstract:
Background: HIV virological failure still remains a problem in HIV/AIDS treatment and care. This study aimed to describe the prevalence of, and identify the factors associated with, viral non-suppression among HIV-positive adult patients on antiretroviral therapy in Woliso Town, Oromia, Ethiopia. Methods: A retrospective cross-sectional study was conducted among 424 HIV-positive patients attending antiretroviral therapy (ART) in Woliso Town during the period from August 25, 2020 to August 30, 2020. Data collected from patient medical records were entered into Epi Info version 2.3.2.1 and exported to SPSS version 21.0 for analysis. Logistic regression analysis was done to identify factors associated with viral load non-suppression, and the statistical significance of odds ratios was declared using a 95% confidence interval and p-value < 0.05. Results: A total of 424 patients were included in this study. The mean age (± SD) of the study participants was 39.88 (± 9.995) years. The prevalence of HIV viral load non-suppression was 55 (13.0%), 95% CI (9.9-16.5). A second-line ART treatment regimen (Adjusted Odds Ratio (AOR) = 8.98, 95% Confidence Interval (CI): 2.64, 30.58) and routine viral load testing (AOR = 0.01, 95% CI: 0.001, 0.02) were significantly associated with virological non-suppression. Conclusion: Virological non-suppression was high, which hinders the achievement of the third target of the global 95-95-95 strategy. The second-line regimen and routine viral load testing were significantly associated with virological non-suppression. This suggests the need to assess the effectiveness of antiretroviral drugs for epidemic control. It also clearly shows the need to decentralize third-line ART treatment for those patients in need.
Keywords: virological non-suppression, HIV-positive, ART, Woliso town, Ethiopia
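The adjusted odds ratios reported above come from exponentiating logistic regression coefficients; the following sketch reproduces that computation on simulated data with hypothetical effect sizes, not the Woliso records.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated stand-in data (assumed prevalences and effects)
rng = np.random.default_rng(7)
n = 424
second_line = rng.binomial(1, 0.1, n)
routine_vl = rng.binomial(1, 0.8, n)
logit = -2.0 + 2.2 * second_line - 1.5 * routine_vl
non_suppressed = rng.binomial(1, 1 / (1 + np.exp(-logit)))
df = pd.DataFrame({"non_suppressed": non_suppressed,
                   "second_line": second_line, "routine_vl": routine_vl})

fit = smf.logit("non_suppressed ~ second_line + routine_vl", df).fit(disp=False)
aor = np.exp(fit.params)                  # adjusted odds ratios
ci = np.exp(fit.conf_int())               # 95% confidence intervals
print(pd.concat([aor.rename("AOR"),
                 ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```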
Procedia PDF Downloads 150
15356 Mathematical Model That Using Scrambling and Message Integrity Methods in Audio Steganography
Authors: Mohammed Salem Atoum
Abstract:
The success of audio steganography lies in ensuring the imperceptibility of the embedded message in the stego file and in withstanding any form of intentional or unintentional degradation of the message (robustness). Audio steganographic methods that utilize the LSB of the audio stream to embed the message have gained a lot of popularity over the years by meeting perceptual transparency, robustness and capacity requirements. This research proposes an XLSB technique in order to circumvent the weaknesses observed in the LSB technique. A scrambling technique is introduced in two steps: partitioning the message into blocks, followed by permuting the blocks in order to confuse the contents of the message. The message is embedded in an MP3 audio sample. After extracting the message, the permutation codebook is used to re-order it into its original form. MD5 (md5sum) and SHA-256 are used to verify whether the message has been altered during transmission. Experimental results show that XLSB performs better than LSB.
Keywords: XLSB, scrambling, audio steganography, security
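A minimal sketch of the combined scrambling, LSB embedding and integrity-checking idea is given below. It is not the authors' exact XLSB scheme: it operates on a stand-in array of 16-bit PCM samples rather than MP3 data, and the block size, key and message are assumptions.

```python
import hashlib
import numpy as np

def scramble(data: bytes, block: int, seed: int):
    # partition into blocks, then permute them; the permutation is the codebook
    blocks = [data[i:i + block] for i in range(0, len(data), block)]
    order = np.random.default_rng(seed).permutation(len(blocks))
    return b"".join(blocks[i] for i in order), order

def unscramble(data: bytes, block: int, order):
    blocks = [data[i:i + block] for i in range(0, len(data), block)]
    out = [b""] * len(blocks)
    for pos, src in enumerate(order):
        out[src] = blocks[pos]
    return b"".join(out)

def embed(samples: np.ndarray, payload: bytes):
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    stego = samples.copy()
    stego[:len(bits)] = (stego[:len(bits)] & ~1) | bits     # overwrite sample LSBs
    return stego

def extract(samples: np.ndarray, n_bytes: int):
    return np.packbits((samples[:n_bytes * 8] & 1).astype(np.uint8)).tobytes()

message = b"meet at dawn"                                    # length divisible by the block size
digest = hashlib.sha256(message).digest()                    # integrity check value
scrambled, codebook = scramble(message + digest, block=4, seed=42)

audio = np.random.randint(-2**15, 2**15, 50_000, dtype=np.int16)  # stand-in for PCM audio
stego_audio = embed(audio, scrambled)

recovered = unscramble(extract(stego_audio, len(scrambled)), block=4, order=codebook)
msg, dig = recovered[:-32], recovered[-32:]
print(msg, hashlib.sha256(msg).digest() == dig)              # True -> message unaltered
```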
Procedia PDF Downloads 363
15355 A Handheld Light Meter Device for Methamphetamine Detection in Oral Fluid
Authors: Anindita Sen
Abstract:
Oral fluid is a promising diagnostic matrix for drugs of abuse compared to urine and serum. Detection of methamphetamine in oral fluid would pave the way for the easy evaluation of impairment in drivers during roadside drug testing, as well as ensuring safe working environments by facilitating the evaluation of impairment in employees at workplaces. A membrane-based, point-of-care (POC) friendly pre-treatment technique has been developed which aids the elimination of interferences caused by salivary proteins and facilitated the demonstration of methamphetamine detection in saliva using a gold nanoparticle based colorimetric aptasensor platform. It was found that the colorimetric response in saliva was always suppressed owing to matrix effects. By navigating these challenging interference issues in saliva, we were able to detect methamphetamine at nanomolar levels in saliva, offering immense promise for the translation of these platforms into on-site diagnostic systems. This subsequently motivated the development of a handheld portable light meter device that can reliably transduce the aptasensor's colorimetric response into absorbance, facilitating quantitative detection of analyte concentrations on-site. This is crucial owing to the prevalent unreliability and sensitivity problems of conventional drug testing kits. The fabricated light meter device response was validated against a standard UV-Vis spectrometer to confirm reliability. The portable and cost-effective handheld detector device features sensitivity comparable to the well-established benchtop UV-Vis instrument, and the easy-to-use device could potentially serve as a prototype for a commercial device in the future.
Keywords: aptasensors, colorimetric gold nanoparticle assay, point-of-care, oral fluid
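The conversion from the measured light signal to absorbance follows the standard Beer-Lambert relation; the symbols below are generic and not taken from the paper.

```latex
% Absorbance from transmitted intensity, and its relation to analyte concentration:
A \;=\; \log_{10}\!\frac{I_{0}}{I} \;=\; \varepsilon\, c\, \ell,
% where I_0 is the incident (blank) intensity, I the transmitted intensity, \varepsilon the
% molar absorptivity of the assay chromophore at the measured wavelength, c its concentration
% (which depends on the analyte level via the aptasensor response), and \ell the path length.
```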
Procedia PDF Downloads 59
15354 Emergence of New Development Bank: Analyzing the Impact on BRICS Nations and the World Order
Authors: Urvi Shah, Anmol Jain
Abstract:
Talk of a new global order has been doing the rounds since the advent of the 21st century. A similar change in the global scenario was witnessed when the Bretton Woods system came into being after World War II. The changing world order has been analyzed by using Purchasing Power Parity (PPP) and nominal Gross Domestic Product (GDP) estimates. The PPP and nominal GDP methods show the purchasing power and the financial standing of countries respectively, which helps in knowing both the real and nominal financial strength of a country. Today, the rising powers of BRICS are posing new challenges to the world order shaped by the West. The BRICS countries, i.e. Brazil, Russia, India, China and South Africa, have at various instances represented the interests of developing countries at world forums. Their pooled population accounts for 41.6% of the total world population, which gives a clear idea of the workforce and human resources they can mobilize. They have a combined GDP (PPP) of around 30.57% of the total world GDP (PPP). The paper tries to analyze the prospects and impact of the New Development Bank (NDB), formerly known as the BRICS Bank, on the world economy; the bank has the potential to act as a rival to the West-dominated IMF and World Bank. The paper studies the paradigm shift in the global order and the impact of the NDB on third world nations and the developed nations. The study concluded that the relative positions of the BRICS countries in the world economy are changing, irrespective of whether the measurement methodology is nominal US$ or the PPP model.
Keywords: BRICS, New Development Bank, Nominal GDP, purchasing power parity
Procedia PDF Downloads 322
15353 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis
Authors: Wenbo Du, Xiaomei Ma
Abstract:
With the growing use of Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to extract more diagnostic information. What is noticeable is that most of the extant empirical CDA-based research puts much emphasis on individual-level diagnostic purposes, with very few studies concerned with learners' group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, in that it might be more practical in a classroom setting. Additionally, the group-level diagnostic information obtained via current CDA often results in a 'flat pattern', that is, mastery or non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome does not bring much more benefit than the original total score. To address these issues, the present study applies cluster analysis for group classification and quantile regression analysis to pinpoint learners' performance at different proficiency levels (beginner, intermediate and advanced), and thus to enhance the interpretation of CDA results extracted from a group of EFL learners' reading performance on a diagnostic reading test designed by the PELDiaG research team from a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than CDA does, and that quantile regression analysis pictures more insightful characteristics of learners with different reading proficiencies. The findings are helpful and practical for instructors in refining the EFL reading curriculum and tailoring instructional plans based on the group classification results and the quantile regression analysis. Meanwhile, these innovative statistical methods could also make up for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression
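The two statistical tools named in the title can be combined as in the following sketch, which uses simulated scores rather than the PELDiaG data: an EM-based Gaussian mixture groups learners, and quantile regression profiles the skill-total relationship at the 0.25, 0.5 and 0.75 conditional quantiles.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.mixture import GaussianMixture

# Simulated stand-in scores (assumed values)
rng = np.random.default_rng(1)
n = 500
skill = rng.normal(0, 1, n)                      # diagnostic mastery score for one skill
total = 50 + 8 * skill + rng.normal(0, 6, n)     # total reading score
df = pd.DataFrame({"skill": skill, "total": total})

# EM clustering into three proficiency groups
df["group"] = GaussianMixture(n_components=3, random_state=0).fit_predict(df[["total"]])

# quantile regression at three conditional quantiles (weaker / median / stronger readers)
for q in (0.25, 0.5, 0.75):
    fit = smf.quantreg("total ~ skill", df).fit(q=q)
    print(f"q={q}: slope={fit.params['skill']:.2f}")
```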
Procedia PDF Downloads 146
15351 Design and Analysis of Formula One Car Halo
Authors: Indira priyadarshini, B. Tulja Lal, K. Anusha, P. Sai Varun
Abstract:
Formula One cars are the fastest road-course racing cars in the world, owing to very high cornering speeds achieved through the generation of large amounts of aerodynamic downforce. The main intentions and goals of this paper are to reduce accidents and improve safety, without affecting the driver's visibility, by redesigning the Halo, which was developed by Mercedes in conjunction with the FIA to deflect flying debris, such as a loose wheel, away from a driver's head, while its hinged locking mechanism can quickly be removed for easy access. The Halo design has been modified in order to reduce the weight without affecting the aerodynamics of the car. A CFD simulation is carried out to observe the flow over the Halo. The velocity profile and pressure contours were analyzed. The Halo is designed using SOLIDWORKS. Furthermore, a 3D simulation of the airflow contours around the Halo is performed using the software ANSYS FLUENT, in order to make changes in the geometry that improve the design by reducing air resistance and improving aerodynamics. According to our assumption, the new 3D Halo model has better aerodynamic properties, allowing possible improvements to be analysed against the initial design. Structural analysis is also done using ANSYS by simulating an F1 tire colliding with the Halo at 225 km/h in order to determine the deflections in the structure.
Keywords: aerodynamics, Halo, safety, visibility
Procedia PDF Downloads 373
15350 Multimodal Optimization of Density-Based Clustering Using Collective Animal Behavior Algorithm
Authors: Kristian Bautista, Ruben A. Idoy
Abstract:
A bio-inspired metaheuristic algorithm based on the theory of collective animal behavior (CAB) was integrated into density-based clustering modeled as a multimodal optimization problem. The algorithm was tested on synthetic, Iris, Glass, Pima and Thyroid data sets in order to measure its effectiveness relative to the CDE-based clustering algorithm. Upon preliminary testing, it was found that one of the parameter settings used was ineffective in performing clustering when applied to the algorithm, prompting the researcher to investigate. It was revealed that fine-tuning the distance δ3, which determines the extent to which a given data point will be clustered, helped improve the quality of the cluster output. Even though the modification of the distance δ3 significantly improved the solution quality and cluster output of the algorithm, the results suggest that there is no difference between the population means of the solutions obtained using the original and modified parameter settings for all data sets. This implies that using either the original or the modified parameter setting will not have any effect on obtaining the best global and local animal positions. The results also suggest that the CDE-based clustering algorithm is better than the CAB-density clustering algorithm for all data sets. Nevertheless, the CAB-density clustering algorithm is still a good clustering algorithm because it correctly identified the number of classes of some data sets more frequently over a thirty-trial run with a much smaller standard deviation, showing potential for clustering high-dimensional data sets. Thus, the researcher recommends further investigation of the post-processing stage of the algorithm.
Keywords: clustering, metaheuristics, collective animal behavior algorithm, density-based clustering, multimodal optimization
Procedia PDF Downloads 230
15349 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data
Authors: M. Kharrat, G. Moreau, Z. Aboura
Abstract:
The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with a 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the applied loading. The AE technique is a well-established tool for discriminating between damage mechanisms. Suitable sensors are used during the mechanical test to monitor the structural health of the material. Relevant AE features are then extracted from the recorded signals, followed by data analysis using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation setup was used in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, Digital Image Correlation, Acoustic Emission (AE) and micro-tomography. In this study, a multi-variable AE data analysis approach was developed for discriminating between the different signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database. The latter is useful for constructing a supervised classifier that can be used for automatic recognition of the AE signals. Several materials with different constituents were tested under various loadings in order to feed and enrich the learning database. The methodology presented in this work was useful for refining the damage threshold for the new-generation materials. The damage mechanisms around this threshold were highlighted. The obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damage without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations. For the same material and the same type of loading, the identified classes are reproducible and little disturbed. The supervised classifier constructed from the learning database was able to predict the labels of the classified signals.
Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition
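The two-stage workflow described above (unsupervised clustering of multi-variable AE features, then a supervised classifier trained on the labelled clusters) can be sketched as follows on simulated features; the feature set, cluster-selection criterion and classifier choice are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Simulated stand-in AE features: amplitude, duration, rise time, counts, peak frequency
rng = np.random.default_rng(3)
features = np.vstack([rng.normal(m, 1.0, (200, 5)) for m in (0, 3, 6)])
X = StandardScaler().fit_transform(features)

# Stage 1: unsupervised clustering without a priori labels, k chosen by silhouette score
best_k, best_s, best_labels = None, -1.0, None
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    s = silhouette_score(X, labels)
    if s > best_s:
        best_k, best_s, best_labels = k, s, labels

# Stage 2: supervised classifier trained on the labelled clusters (the learning database)
clf = RandomForestClassifier(random_state=0).fit(X, best_labels)
print("clusters:", best_k, "training accuracy:", clf.score(X, best_labels))
```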
Procedia PDF Downloads 155
15348 Dry Needling Treatment in 38 Cases of Chronic Sleep Disturbance
Authors: P. Gao, Z. Q. Li, Y. G. Jin
Abstract:
In the past 10 years, computers and cellphones have become one of the most important factors in our lives, and one which has a tremendously negative impact on our muscles. Muscle tension may be one of the causes of sleep disturbance. Tension in the shoulders and neck can affect blood circulation to the muscles. This research uses a dry needling treatment to reduce muscle tension in order to determine whether strain in the head and shoulders can influence sleep duration. All 38 patients taking part in the study suffered from tinnitus and had been experiencing disturbed sleep for at least one to five years. Even after undergoing drug therapy and traditional acupuncture therapies, their sleep disturbances had not shown any improvement. After five to 10 dry needling treatments, 24 of the patients reported an improvement in their sleep duration. Five patients considered themselves to be completely recovered, while 12 patients experienced no improvement. This study investigated these pathogenic and therapeutic problems. The standard treatment for sleep disturbances is drug-based therapy; the results of most standard treatments are unfortunately negative. The results of this clinical research demonstrate that a possible cause of sleep disturbance in many patients is tension in the neck and shoulder muscles, and that blood circulation to those muscles also influences the duration of sleep. Hypertonic neck and shoulder muscles are considered to impact sleeping patterns and lead to disturbed sleep. Poor posture, often adopted while speaking on the phone, is one of the main causes of hypertonic neck and shoulder muscle problems. The dry needling treatment specifically focuses on the release of muscle tension.
Keywords: dry needling, muscle tension, sleep duration, hypertonic muscles
Procedia PDF Downloads 245
15347 Pros and Cons of Nanoparticles on Health
Authors: Amber Shahi, Ayesha Tazeen, Abdus Samad, Shama Parveen
Abstract:
Nanoparticles (NPs) are tiny particles. According to the International Organization for Standardization, the size range of NPs is in the nanometer range (1-100 nm). They show distinct properties that are not shown by larger particles of the same material. NPs are currently being used in different fields due to their unique physicochemical nature. NPs are a boon for the medical sciences, environmental sciences, electronics, and textile industries. However, there is growing concern about their potential adverse effects on human health. This poster presents a comprehensive review of the current literature on the pros and cons of NPs for human health. The poster will discuss the various types of interactions of NPs with biological systems. There are a number of beneficial uses of NPs in the field of health and environmental welfare. NPs are very useful in disease diagnosis, antimicrobial action, and the treatment of diseases such as Alzheimer's. They can also cross the blood-brain barrier, making them capable of treating brain diseases. Additionally, NPs can target specific tumors and be used for cancer treatment. For environmental health, NPs also act as catalysts in catalytic converters to reduce pollution in the environment. On the other hand, NPs also have some negative impacts on the human body, such as being cytotoxic and genotoxic. They can also affect the reproductive system, such as the testis and ovary, and sexual behavior. The poster will further discuss the routes of exposure to NPs. The poster will conclude with a discussion of the current regulations and guidelines on the use of NPs in various applications. It will highlight the need for further research and the development of standardized toxicity testing methods to ensure the safe use of NPs in various applications. When using NPs in diagnosis and treatment, we should also take into consideration their safe concentrations in the body. Overall, this poster aims to provide a comprehensive overview of the pros and cons of NPs for human health and to promote awareness and understanding of the potential risks and benefits associated with their use.
Keywords: disease diagnosis, human health, nanoparticles, toxicity testing
Procedia PDF Downloads 80
15346 Hearing Conservation Program for Vector Control Workers: Short-Term Outcomes from a Cluster-Randomized Controlled Trial
Authors: Rama Krishna Supramanian, Marzuki Isahak, Noran Naqiah Hairi
Abstract:
Noise-induced hearing loss (NIHL) is one of the most frequently recorded occupational diseases, despite being preventable. A Hearing Conservation Program (HCP) is designed to protect workers' hearing and prevent them from developing hearing impairment due to occupational noise exposure. However, there is still a lack of evidence regarding the effectiveness of this program. The purpose of this study was to determine the effectiveness of a Hearing Conservation Program (HCP) in preventing or reducing audiometric threshold changes among vector control workers. This study adopts a cluster-randomized controlled trial design, with district health offices as the unit of randomization. Nine district health offices were randomly selected, and 183 vector control workers were randomized to the intervention or control group. The intervention included a safety and health policy, noise exposure assessment, noise control, distribution of appropriate hearing protection devices, a training and education program, and audiometric testing. The control group only underwent audiometric testing. Audiometric threshold changes observed in the intervention group showed improvement in the hearing threshold level for all frequencies except 500 Hz and 8000 Hz for the left ear. The hearing threshold changes range from 1.4 dB to 5.2 dB, with the largest improvement at the higher frequencies, mainly 4000 Hz and 6000 Hz. Meanwhile, for the right ear, the mean hearing threshold level remained similar at 4000 Hz and 6000 Hz after 3 months of intervention. The Hearing Conservation Program (HCP) is effective in preserving the hearing of vector control workers involved in fogging activity, as well as increasing their knowledge, attitude and practice towards noise-induced hearing loss (NIHL).
Keywords: adult, hearing conservation program, noise-induced hearing loss, vector control worker
Procedia PDF Downloads 168
15345 The Importance of Imaging and Functional Tests for Early Detection of Occupational Diseases in Kosovo's Miners
Authors: Krenare Shabani, Kreshnike Dedushi Hoti, Serbeze Kabashi, Jeton Shatri, Arben Rroji, Mrikë Bunjaku, Leotrim Berisha, Jona Kosova, Edmond Puca, Bleriana Shabani
Abstract:
Introduction: Workers in Kosovo's mining industry are subjected to hazardous working conditions and airborne particles, such as silica dust, which can cause silicosis and other severe respiratory illnesses. The purpose of this research is to assess the health impacts of such exposures, as well as the importance of imaging and functional testing in detecting pathological changes early on. Methodology: The study is prospective and cross-sectional and was carried out during 2024. 626 people (446 miners and 180 non-miners) were enrolled in the study. Subjects underwent spirometry and chest radiography. Data were analysed with SPSS 24. Results: The average age of the participants is 48 years. Demographics and smoking: smoking was common among young miners. Radiological changes: radiographic abnormalities in the lungs were seen in 23.1% of miners and 10.6% of non-miners, including small irregular opacities and emphysematous changes. Lung function: the FEV1/FVC ratio decreased with increased exposure time, indicating a decline in pulmonary function. Impact of exposure duration: longer exposure duration was associated with a higher number of miners experiencing cough and requiring further medical investigations such as CT scans and biopsies. Conclusions: Medical imaging and functional testing are critical for early diagnosis of lung abnormalities in miners. The findings demonstrate a strong correlation between extended exposure to mine dust and the development of respiratory disorders, emphasising the importance of preventative measures and routine health monitoring.
Keywords: silicosis, miners, imaging, spirometry
Procedia PDF Downloads 28
15344 Teaching Practices for Subverting Significant Retentive Learner Errors in Arithmetic
Authors: Michael Lousis
Abstract:
The systematic identification of the most conspicuous and significant errors made by learners during three years of testing of their progress in learning Arithmetic, throughout the development of the Kassel Project in England and Greece, was accomplished. How retentive these errors were over the three years of the officially provided school instruction in Arithmetic in these countries has also been shown. The learners' errors in Arithmetic stemmed from a sample comprising two hundred (200) English students and one hundred and fifty (150) Greek students. The sample was purposefully selected according to the students' participation in each testing session in the development of the three-year project, in both domains simultaneously, Arithmetic and Algebra. Specific teaching practices have been devised and are presented in this study for subverting these learners' errors, which were found to be retentive at the level of the nationally provided mathematical education of each country. The invention and development of these proposed teaching practices were founded on the rationale of the theoretical accounts concerning the explanation, prediction and control of the errors, on the conceptual metaphor, and on an analysis that tried to identify the required cognitive components and skills of the specific tasks, in terms of Psychology and Cognitive Science as applied to information-processing. The aim of implementing these instructional practices is not only the subversion of these errors but also the achievement of mathematical competence, defined as being constituted of three elements: appropriate representations, appropriate meaning, and appropriately developed schemata. However, praxis is of paramount importance, because there is no 'real truth' independent of science, and because praxis serves as quality control when it takes the form of a cognitive method.
Keywords: arithmetic, cognitive science, cognitive psychology, information-processing paradigm, Kassel project, level of the nationally provided mathematical education, praxis, remedial mathematical teaching practices, retentiveness of errors
Procedia PDF Downloads 316