Search results for: R statistical software
7978 Software Component Identification from Its Object-Oriented Code: Graph Metrics Based Approach
Authors: Manel Brichni, Abdelhak-Djamel Seriai
Abstract:
Systems are increasingly complex. To reduce this complexity, an abstract view of the system can simplify its development. To this end, we propose a method to decompose systems into subsystems while reducing their coupling; these subsystems represent components. Starting from an existing object-oriented system, the main idea of our approach is to model all entities of the object-oriented source code as graphs. Such a model is easy to handle, so we can apply restructuring algorithms based on graph metrics. The particularity of our approach is that, in addition to standard metrics such as coupling and cohesion, it integrates graph metrics that give more precision during component identification. To address this problem, we relied on the ROMANTIC approach, which proposed component-based software architecture recovery from an object-oriented system.
Keywords: software reengineering, software component and interfaces, metrics, graphs
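As a minimal sketch of the kind of graph metrics involved, cohesion and coupling can be evaluated for a candidate component on a class-dependency graph. The graph, the candidate components, and the exact formulas below are illustrative assumptions, not the ROMANTIC metrics themselves:

```python
# Sketch: scoring a candidate component on a class-dependency graph.
# The dependency edges and the cohesion/coupling definitions are
# illustrative assumptions, not the paper's actual metrics.

def cohesion(edges, component):
    """Fraction of possible internal (undirected) edges that exist."""
    n = len(component)
    if n < 2:
        return 1.0
    internal = sum(1 for a, b in edges if a in component and b in component)
    return internal / (n * (n - 1) / 2)

def coupling(edges, component):
    """Number of edges crossing the component boundary."""
    return sum(1 for a, b in edges if (a in component) != (b in component))

# Toy object-oriented system: classes and their dependencies.
edges = [("A", "B"), ("B", "C"), ("A", "C"), ("C", "D"), ("D", "E")]

good = {"A", "B", "C"}   # tightly knit candidate component
bad = {"A", "D"}         # scattered candidate component

print(cohesion(edges, good), coupling(edges, good))  # 1.0 1
print(cohesion(edges, bad), coupling(edges, bad))    # 0.0 4
```

A restructuring algorithm would then search for partitions that maximize cohesion while minimizing coupling across all components.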
Procedia PDF Downloads 501
7977 Mathematics as the Foundation for the STEM Disciplines: Different Pedagogical Strategies Addressed
Authors: Marion G. Ben-Jacob, David Wang
Abstract:
There is a mathematics requirement for entry-level college and university students, especially those who plan to study STEM (Science, Technology, Engineering, and Mathematics). Most of them take College Algebra, and to continue their studies, they need to succeed in this course. Different pedagogical strategies are employed to promote the success of our students. There is, of course, the traditional method of teaching: lecture, examples, and problems for students to solve. The Emporium Model, another pedagogical approach, replaces traditional lectures with a learning resource center model featuring interactive software and on-demand personalized assistance. This presentation will compare these two pedagogical methods and present the study conducted on this comparison, along with its results. Math is the foundation for science, technology, and engineering. It is generally used in STEM to find patterns in data. These patterns can be used to test relationships, draw general conclusions about data, and model the real world. In STEM, solutions to problems are analyzed, reasoned, and interpreted using math abilities in an assortment of real-world scenarios. This presentation will examine specific examples of how math is used in the different STEM disciplines. Math becomes practical in science when it is used to model natural and artificial experiments to identify a problem and develop a solution for it. As we analyze data, we are using math to find the statistical correlation between a cause and its effect. Professionals who use math include data scientists, biologists, and geologists. Without math, most technology would not be possible: math is the basis of binary, and without programming, you just have the hardware. Addition, subtraction, multiplication, and division are used in almost every program written, and mathematical algorithms are inherent in software as well.
Mechanical engineers analyze scientific data to design robots by applying math and using software. Electrical engineers use math to help design and test electrical equipment; they also use math when creating computer simulations and designing new products. Chemical engineers often use mathematics in the lab, where advanced computer software aids their research and production processes by modeling theoretical synthesis techniques and the properties of chemical compounds. Mathematics mastery is crucial for success in the STEM disciplines, and pedagogical research on formative strategies and the topics that must be covered is essential.
Keywords: emporium model, mathematics, pedagogy, STEM
Procedia PDF Downloads 75
7976 Simulation Based Analysis of Gear Dynamic Behavior in Presence of Multiple Cracks
Authors: Ahmed Saeed, Sadok Sassi, Mohammad Roshun
Abstract:
Gears are important components with a vital role in many rotating machines. One of the common causes of gear failure is tooth fatigue cracking; however, its early detection remains a challenging task. The objective of this study is to develop a numerical model that simulates the effect of tooth cracks on the resulting gear vibrations and consequently permits early fault detection. In contrast to other published papers, this work incorporates the possibility of multiple simultaneous cracks with different depths. As cracks significantly alter the stiffness of the tooth, finite element software is used to determine the stiffness variation with respect to the angular position for different combinations of crack orientation and depth. A simplified six-degree-of-freedom nonlinear lumped-parameter model of a one-stage spur gear system is proposed to study the vibration with and without cracks. The stiffness model accounting for the crack is used to update the physical parameters of the second-order equations of motion describing the vibration of the gearbox. The vibration simulation results of the gearbox were obtained using Simulink/MATLAB. The effect of a single crack at different levels was studied thoroughly. The changes in mesh stiffness and vibration response were found to be consistent with previously published works. In addition, various statistical time-domain parameters were considered; they showed different degrees of sensitivity toward the crack depth. Multiple cracks were also introduced at different locations, and the vibration response, along with the statistical parameters, was obtained again for a general case of degradation (increase in crack depth, crack number, and crack locations). It was found that although some parameters increase in value as the deterioration level increases, they show almost no change, or even decrease, when the number of cracks increases.
Therefore, the use of any statistical parameter could be misleading if not considered in an appropriate way.
Keywords: spur gear, cracked tooth, numerical simulation, time-domain parameters
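To make the statistical time-domain parameters concrete, here is a hedged sketch of a typical indicator set (RMS, peak, crest factor, kurtosis) computed on a toy vibration signal; the abstract does not list the paper's exact parameters, so this selection is an assumption:

```python
# Sketch: common time-domain statistical indicators for vibration-based
# crack monitoring. The indicator set and the synthetic signals are
# illustrative assumptions, not the paper's actual data.
import math

def time_domain_indicators(signal):
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    std = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    peak = max(abs(x) for x in signal)
    kurtosis = (sum((x - mean) ** 4 for x in signal) / n) / std ** 4
    return {
        "rms": rms,
        "peak": peak,
        "crest_factor": peak / rms,   # sensitive to early impulsive faults
        "kurtosis": kurtosis,         # 1.5 for a pure sine, ~3 for Gaussian noise
    }

# A healthy-looking tooth-mesh sine vs. one with a once-per-revolution impact.
healthy = [math.sin(2 * math.pi * k / 64) for k in range(640)]
cracked = [x + (3.0 if k % 64 == 0 else 0.0) for k, x in enumerate(healthy)]

print(time_domain_indicators(healthy)["kurtosis"])   # ~1.5
print(time_domain_indicators(cracked)["kurtosis"])   # noticeably larger
```

The kurtosis and crest factor rise sharply for the impulsive (cracked) signal, which is why they are popular early-fault indicators, while RMS barely moves at low damage levels.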
Procedia PDF Downloads 266
7975 Instruct Students Effective Ways to Reach an Advanced Level after Graduation
Authors: Huynh Tan Hoi
Abstract:
Considered one of the hardest languages in the world, Japanese is still a language that many young people choose to learn. Today, with the development of technology, learning foreign languages in general, and Japanese in particular, is no longer an impossible barrier. Learning materials come not only from paper books and songs but also from software programs on smartphones or computers. In particular, students who are beginning to explore effective skills for studying this language need access to modern technologies to improve their learning. When using the software, some students may feel embarrassed and challenged, but everything goes smoothly after a few days. After completing the course, students gain more knowledge and can achieve higher qualifications such as the N2 or N1 Japanese Language Proficiency Test certificate. For this research paper, 35 students studying at Ho Chi Minh City FPT University were asked to complete a questionnaire from the beginning of July to August 2018. Through this research, we find that, with the guidance of lecturers, the use of modern software and some effective methods is indispensable for improving the quality of the teaching and learning process.
Keywords: higher knowledge, Japanese, methods, software, students
Procedia PDF Downloads 225
7974 Functional Decomposition Based Effort Estimation Model for Software-Intensive Systems
Authors: Nermin Sökmen
Abstract:
An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software, or some combination of the two, as well as high-level software solutions. This paper first focuses on functional decomposition techniques to measure the functional complexity of a computer system and investigates its impact on system development effort. Later, it examines the effects of technical difficulty and design team capability factors in order to construct the best effort estimation model. Using traditional regression analysis, the study develops a system development effort estimation model that takes functional complexity, technical difficulty, and design team capability as input parameters. Finally, the assumptions of the model are tested.
Keywords: functional complexity, functional decomposition, development effort, technical difficulty, design team capability, regression analysis
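The regression step described above can be sketched as an ordinary least-squares fit of effort against the three input factors. The data points and coefficients below are invented for illustration; the paper's dataset and fitted model are not reproduced here:

```python
# Sketch: fitting effort = b0 + b1*complexity + b2*difficulty + b3*capability
# by ordinary least squares (normal equations). All data are invented.

def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_ols(X, y):
    """Coefficients minimizing ||[1 X] b - y||^2 via the normal equations."""
    Z = [[1.0] + row for row in X]
    p = len(Z[0])
    ZtZ = [[sum(r[i] * r[j] for r in Z) for j in range(p)] for i in range(p)]
    Zty = [sum(r[i] * yi for r, yi in zip(Z, y)) for i in range(p)]
    return solve(ZtZ, Zty)

# Noise-free toy data generated from effort = 10 + 2*c + 5*d - 3*t.
X = [[4, 1, 2], [7, 2, 1], [3, 3, 3], [9, 1, 4], [5, 2, 2], [6, 3, 1]]
y = [10 + 2 * a + 5 * b - 3 * c for a, b, c in X]
coeffs = fit_ols(X, y)
print([round(v, 6) for v in coeffs])  # recovers [10, 2, 5, -3]
```

With real project data the fit would of course not be exact, and the residuals would be used to test the model assumptions mentioned in the abstract.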
Procedia PDF Downloads 293
7973 KBASE Technological Framework - Requirements
Authors: Ivan Stanev, Maria Koleva
Abstract:
Automated software development issues are addressed in this paper. Layers and packages of a Common Platform for Automated Programming (CPAP) are defined based on Service-Oriented Architecture, cloud computing, Knowledge Based Automated Software Engineering (KBASE), and the method of automated programming. Tools of seven leading companies (AWS of Amazon, Azure of Microsoft, App Engine of Google, vCloud of VMware, Bluemix of IBM, Helion of HP, OCPaaS of Oracle) are analyzed in the context of CPAP. Based on the results of the analysis, CPAP requirements are formulated.
Keywords: automated programming, cloud computing, knowledge based software engineering, service oriented architecture
Procedia PDF Downloads 301
7972 Investigating Visual Statistical Learning during Aging Using the Eye-Tracking Method
Authors: Zahra Kazemi Saleh, Bénédicte Poulin-Charronnat, Annie Vinter
Abstract:
This study examines the effects of aging on visual statistical learning, using eye-tracking techniques to investigate this cognitive phenomenon. Visual statistical learning is a fundamental brain function that enables the automatic and implicit recognition, processing, and internalization of environmental patterns over time. Previous research has suggested the robustness of this learning mechanism throughout the aging process, underscoring its importance in the context of education and rehabilitation for the elderly. The study included three distinct groups of participants: 21 young adults (Mage: 19.73), 20 young-old adults (Mage: 67.22), and 17 old-old adults (Mage: 79.34). Participants were exposed to a series of 12 arbitrary black shapes organized into 6 pairs, each with a different spatial configuration and orientation (horizontal, vertical, and oblique). These pairs were not explicitly revealed to the participants, who were instructed to passively observe 144 grids presented sequentially on the screen for a total duration of 7 min. In the subsequent test phase, participants performed a two-alternative forced-choice task in which they had to identify the most familiar pair in each of 48 trials, each consisting of a base pair and a non-base pair. Behavioral analysis using t-tests revealed notable findings. The mean score of the first group was significantly above chance, indicating the presence of visual statistical learning. Similarly, the second group also performed significantly above chance, confirming the persistence of visual statistical learning in young-old adults. Conversely, the third group, consisting of old-old adults, showed a mean score that was not significantly above chance. This lack of statistical learning in the old-old group suggests a decline in this cognitive ability with age. Preliminary eye-tracking results showed a decrease in the number and duration of fixations during the exposure phase for all groups.
The main difference was that older participants fixated empty cells more often than younger participants, likely due to a decline in the ability to ignore irrelevant information, resulting in a decrease in statistical learning performance.
Keywords: aging, eye tracking, implicit learning, visual statistical learning
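The "significantly above chance" comparison described above is a one-sample t-test of mean two-alternative forced-choice accuracy against the 0.5 chance level. The scores below are invented for illustration; the study's actual data are not reproduced:

```python
# Sketch: one-sample t statistic against the 0.5 chance level of a
# two-alternative forced-choice task. Scores are invented.
import math

def one_sample_t(scores, chance=0.5):
    """t = (mean - chance) / (sd / sqrt(n)), with df = n - 1."""
    n = len(scores)
    mean = sum(scores) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in scores) / (n - 1))
    return (mean - chance) / (sd / math.sqrt(n))

young = [0.70, 0.65, 0.58, 0.72, 0.61, 0.66, 0.63, 0.69]    # above chance
old_old = [0.52, 0.47, 0.55, 0.49, 0.51, 0.48, 0.53, 0.50]  # near chance

t_young = one_sample_t(young)
t_old = one_sample_t(old_old)
print(round(t_young, 2), round(t_old, 2))
```

A t value beyond the critical threshold for df = n - 1 (about 2.36 one-tailed at alpha = .05 for n = 8) indicates performance reliably above chance; a small t, as in the old-old sketch, does not.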
Procedia PDF Downloads 77
7971 Measuring Investigation and Computational Simulation of Cavitation Phenomenon Effects on the Industrial Centrifugal Pump Vibration
Authors: Mahdi Hamzehei, Homan Alimoradzadeh, Mahdi Shahriyari
Abstract:
In this paper, the vibration of industrial centrifugal pumps is studied through measurement analysis and computational simulation. The effects of different parameters on pump vibration were investigated, and cavitation in the centrifugal pump was simulated. First, via CF-TURBO software, the pump impeller and the fluid passing through the pump were modelled; then, the phenomenon of cavitation in the impeller was modelled with Ansys software. The effects of changes in the NPSH value and of bubble generation in the pump impeller were also investigated. The piping was simulated with pipe flow software, and the effect of fluid velocity and pressure on hydraulics and vibration was studied both computationally, by applying Computational Fluid Dynamics (CFD) techniques with Fluent software, and experimentally. This comparison showed that the model can predict hydraulic and vibration behaviour.
Keywords: cavitation, vibration, centrifugal pumps, performance curves, NPSH
Procedia PDF Downloads 543
7970 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications: A Software Testing Approach
Authors: Theertha Chandroth
Abstract:
This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its unique syntax and structure. The objective is to explore various techniques and methodologies for validating, comparing, and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers, and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
Keywords: XML, JSON, data comparison, integration testing, Python, SQL
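A minimal version of such a check, sketched with only the Python standard library, parses both documents and compares them as normalized records. The tag and key names are invented for illustration, and real tests would also cover attributes, nesting, and type coercion:

```python
# Sketch: checking a JSON document against a flat XML rendering of the
# same record. Field names are invented; this ignores XML attributes.
import json
import xml.etree.ElementTree as ET

json_doc = '{"user": {"id": "42", "name": "Ada", "active": "true"}}'
xml_doc = "<user><id>42</id><name>Ada</name><active>true</active></user>"

def xml_to_dict(element):
    """Flat XML element -> dict mapping child tag to its text."""
    return {child.tag: child.text for child in element}

def records_match(json_text, xml_text, root_key):
    json_record = json.loads(json_text)[root_key]
    xml_record = xml_to_dict(ET.fromstring(xml_text))
    return json_record == xml_record

print(records_match(json_doc, xml_doc, "user"))  # True
```

A test suite would assert `records_match(...)` for every record pair produced by the two interchange paths, flagging any field-level divergence between the JSON and XML views of the same data.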
Procedia PDF Downloads 140
7969 Software-Defined Architecture and Front-End Optimization for DO-178B Compliant Distance Measuring Equipment
Authors: Farzan Farhangian, Behnam Shakibafar, Bobda Cedric, Rene Jr. Landry
Abstract:
Many air navigation technologies are capable of increasing aviation sustainability as well as improving accuracy in Alternative Positioning, Navigation, and Timing (APNT); avionics examples include Distance Measuring Equipment (DME) and the Very high-frequency Omni-directional Range (VOR). The integration of these air navigation solutions could provide robust and efficient accuracy in air mobility, air traffic management, and autonomous operations. Designing a proper RF front-end, power amplifier, and software-defined transponder could pave the way for an optimized avionics navigation solution. In this article, the possibility of reaching an optimum front-end to be used with a single low-cost Software-Defined Radio (SDR) is investigated in order to arrive at a software-defined DME architecture. Our software-defined approach uses firmware capabilities to design a real-time software architecture compatible with a Multiple Input Multiple Output (MIMO) BladeRF, estimating an accurate time delay between the transmission (Tx) and reception (Rx) channels using synchronous scheduled communication. We designed a novel power amplifier for the transmission channel of the DME to meet the minimum transmission power requirement. This article also investigates the design of proper pulse pairs based on the DO-178B avionics standard. Various guidelines were tested, and the possibility of passing the certification process for each standard term was analyzed. Finally, the performance of the DME was tested in the laboratory environment using an IFR6000, which showed that the proposed architecture reached an accuracy of less than 0.23 nautical miles (Nmi) with 98% probability.
Keywords: avionics, DME, software defined radio, navigation
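The core Tx-to-Rx delay estimation can be sketched as a cross-correlation of sampled channels. The Gaussian pulse-pair shape, sample counts, and delay below are invented stand-ins; the paper's BladeRF scheduling and calibration details are omitted:

```python
# Sketch: estimating the Tx-to-Rx delay of a DME-style pulse pair by
# cross-correlation. Pulse shape, widths, and delay are invented.
import math

def gaussian_pair(n, center, spacing, width):
    """DME interrogations use a Gaussian pulse pair; parameters are toys."""
    return [math.exp(-((k - center) ** 2) / (2 * width ** 2))
            + math.exp(-((k - center - spacing) ** 2) / (2 * width ** 2))
            for k in range(n)]

def delay_by_xcorr(tx, rx):
    """Return the lag that maximizes the Tx/Rx cross-correlation."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(rx) - len(tx) + 1):
        val = sum(t * rx[lag + k] for k, t in enumerate(tx))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

tx = gaussian_pair(n=200, center=40, spacing=60, width=6)
true_delay = 137
rx = [0.0] * true_delay + tx + [0.0] * 100   # delayed, noise-free echo

print(delay_by_xcorr(tx, rx))  # 137
```

In the real system the recovered lag, scaled by the sample period and the speed of light (minus the transponder's fixed reply delay), yields the slant range; sub-sample interpolation and noise handling would be needed for the quoted 0.23 Nmi accuracy.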
Procedia PDF Downloads 79
7968 Project Progress Prediction in Software Development Integrating Time Prediction Algorithms and Large Language Modeling
Authors: Dong Wu, Michael Grenn
Abstract:
Managing software projects effectively is crucial for meeting deadlines, ensuring quality, and managing resources well. Traditional methods often struggle to predict project timelines accurately due to uncertain schedules and complex data. This study addresses these challenges by combining time prediction algorithms with Large Language Models (LLMs). It makes use of real-world software project data to construct and validate a model. The model takes detailed project progress data, such as task completion dynamics, team interaction, and development metrics, as its input and outputs predictions of project timelines. To evaluate the effectiveness of this model, a comprehensive methodology is employed, involving simulations and practical applications in a variety of real-world software project scenarios. This multifaceted evaluation strategy is designed to validate the model's significant role in enhancing forecast accuracy and elevating overall management efficiency, particularly in complex software project environments. The results indicate that the integration of time prediction algorithms with LLMs has the potential to optimize software project progress management, and these quantitative results suggest the effectiveness of the method in practical applications. In conclusion, this study demonstrates that integrating time prediction algorithms with LLMs can significantly improve the predictive accuracy and efficiency of software project management. This offers an advanced project management tool for the industry, with the potential to improve operational efficiency, optimize resource allocation, and ensure timely project completion.
Keywords: software project management, time prediction algorithms, large language models (LLMs), forecast accuracy, project progress prediction
Procedia PDF Downloads 79
7967 Sexual Cognitive Behavioral Therapy: Psychological Performance and Openness to Experience
Authors: Alireza Monzavi Chaleshtari, Mahnaz Aliakbari Dehkordi, Amin Asadi Hieh, Majid Kazemnezhad
Abstract:
This research was conducted with the aim of determining the effectiveness of sexual cognitive behavioral therapy on psychological performance and openness to experience in women. The research was experimental, with a pre-test/post-test design. The statistical population consisted of all working, married women following the researcher's Instagram account who had problems in marital-sexual relationships (N = 900). From this population, 30 women were selected through convenience sampling and randomly assigned to two groups (15 in the experimental group and 15 in the control group). The Depression, Anxiety and Stress Scale (DASS) and the Costa and McCrae personality questionnaire were used to collect data, and the cognitive behavioral therapy protocol of Dr. Mehrnaz Ali Akbari was used for the treatment sessions. To analyze the data, analysis of covariance was performed in SPSS 22. The results showed that sexual cognitive behavioral therapy has a positive and significant effect on psychological performance and openness to experience in women. Conclusion: interventions such as sexual cognitive behavioral therapy can be used to treat marital problems.
Keywords: sexual cognitive behavioral therapy, psychological function, openness to experience, women
Procedia PDF Downloads 78
7966 Employer Learning, Statistical Discrimination and University Prestige
Authors: Paola Bordon, Breno Braga
Abstract:
This paper investigates whether firms use university prestige to statistically discriminate among college graduates. The test is based on the employer learning literature, which suggests that if firms use a characteristic for statistical discrimination, this variable should become less important for earnings as a worker gains labor market experience. In this framework, we use a regression discontinuity design to estimate a 19% wage premium for recent graduates of two of the most selective universities in Chile. However, we find that this premium decreases by 3 percentage points per year of labor market experience. These results suggest that employers use college selectivity as a signal of workers' quality when they leave school. However, as workers reveal their productivity throughout their careers, they become rewarded based on their true quality rather than the prestige of their college.
Keywords: employer learning, statistical discrimination, college returns, college selectivity
Procedia PDF Downloads 580
7965 Examining the Relationship Between Depression and Drug and Alcohol Use in Iran
Authors: Masoumeh Kazemi
Abstract:
Depression is one of the most common mental disorders that damage mental health. In addition to mental distress, this damage affects other dimensions of human health, including physical and social health. According to the national study of diseases and injuries in Iran, depression is the country's third-ranking health problem. The purpose of this study was to measure the level of depression in people referred to Karaj psychiatric treatment centers and to investigate the relationship between depression and drug and alcohol consumption. The statistical population included 5000 people; the Morgan table was used to determine the sample size. The research questions sought to identify the relationship between depression and factors such as drug and alcohol use, employment and marital status, and gender. The standard Beck questionnaire was used to collect complete information, and Cronbach's alpha coefficient was used to confirm its reliability. To test the research hypotheses, the non-parametric Spearman rank correlation, Mann-Whitney, and Kruskal-Wallis tests were used. The results, obtained with SPSS statistical software, showed that there is a direct relationship between depression and drug and alcohol use. Depression was also higher among women, widows, and unemployed people. Finally, based on the present study, it is suggested that people combine the following treatments for effective recovery: 1. Cognitive Behavioral Therapy (CBT); 2. Interpersonal Therapy (IPT); 3. Treatment with appropriate medication; 4. Special light therapy; 5. Electric shock treatment (in acute and exceptional cases); 6. Self-help.
Keywords: alcohol, depression, drug, Iran
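The Spearman rank correlation used in the analysis above can be sketched from scratch. The sample (depression score vs. weekly substance use) is invented; the study's actual data are not reproduced:

```python
# Sketch: Spearman rank correlation on an invented sample of depression
# scores and substance-use frequency; ties receive average ranks.

def ranks(values):
    """Average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            result[order[k]] = avg
        i = j + 1
    return result

def spearman(x, y):
    """Pearson correlation applied to the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

depression = [5, 12, 9, 21, 30, 17, 8, 25]
substance_use = [0, 2, 1, 4, 7, 3, 1, 5]

print(round(spearman(depression, substance_use), 3))
```

A coefficient near +1, as in this toy sample, is what "a direct relationship" between depression and substance use would look like in rank terms; significance would still need to be checked against the sample size.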
Procedia PDF Downloads 57
7964 Hardware Implementation and Real-time Experimental Validation of a Direction of Arrival Estimation Algorithm
Authors: Nizar Tayem, AbuMuhammad Moinuddeen, Ahmed A. Hussain, Redha M. Radaydeh
Abstract:
This research paper introduces an approach for estimating the direction of arrival (DOA) of multiple noncoherent RF sources at a uniform linear array (ULA). The proposed method utilizes a Capon-like estimation algorithm and incorporates LU decomposition to enhance the accuracy of DOA estimation while significantly reducing computational complexity compared to existing methods such as the Capon method. Notably, the proposed method does not require prior knowledge of the number of sources. Its effectiveness is validated through both software simulations and practical experimentation on a prototype testbed constructed on a software-defined radio (SDR) platform with GNU Radio software. The results obtained from MATLAB simulations and real-time experiments provide compelling evidence of the proposed method's efficacy.
Keywords: DOA estimation, real-time validation, software defined radio, computational complexity, Capon's method, GNU Radio
Procedia PDF Downloads 75
7963 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization
Authors: Angad Arora
Abstract:
In production operations/manufacturing, a cell or line is typically a group of similar machines (computer numerical control (CNC) machines, advanced cutting, 3D printing, or special purpose machines). Ideally, to qualify a typical manufacturing line/cell/new process, we need a sample of parts that can be flown through the process, after which we make a judgment on the health of the line/cell. However, with huge volumes and mass production scope, such as in the mobile phone industry, the actual cells or lines can number in the thousands, and qualifying each one of them with statistical confidence means utilizing samples that are very large, eventually adding to product/manufacturing cost plus huge waste if the parts are not intended to be customer shipped. To solve this, we propose a two-step statistical approach: we start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we saw those samples early, then there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis and came up with a two-step binomial testing approach. Further, we also show through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
Keywords: statistics, data science, manufacturing process qualification, production planning
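The two-step idea can be sketched with exact binomial probabilities: stop early once the acceptance number is already exceeded, otherwise continue sampling. The plan parameters (sample sizes, acceptance number, defect rate) below are illustrative assumptions, not the paper's actual plan:

```python
# Sketch: a two-step binomial qualification rule with invented plan
# parameters (n, c, defect rates); not the paper's actual plan.
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def step1_decision(defects_so_far, c_total):
    """Stop early once the overall acceptance number is already exceeded."""
    return "reject-early" if defects_so_far > c_total else "continue"

def accept_prob(n, c, p_defect):
    """Chance a process with defect rate p_defect passes (<= c defects in n)."""
    return binom_cdf(c, n, p_defect)

# A process at a 10% defect rate against a 50-part, accept-on-<=3 plan:
print(step1_decision(defects_so_far=4, c_total=3))   # reject-early
print(round(accept_prob(n=50, c=3, p_defect=0.10), 4))
```

The early-reject branch is where the sample savings come from: a clearly bad process is stopped after the first small batch instead of consuming the full qualification sample.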
Procedia PDF Downloads 96
7962 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms
Authors: Nebi Gedik
Abstract:
One of the significant and continual public health problems in the world is breast cancer. Early detection is very important to fight the disease, and mammography has been one of the most common and reliable methods to detect the disease in its early stages. However, reading mammograms is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing both accurate and uniform evaluation of masses in mammograms. In this study, a multiresolution statistical method is used to construct a CAD system that classifies digitized mammograms as normal or abnormal. The mammogram images are represented by the wave atom transform, and this representation is divided into certain groups of coefficients, treated independently. The CAD system is designed by calculating statistical features from each group of coefficients. The classification is performed using a support vector machine (SVM).
Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram
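The feature-extraction step can be sketched in isolation: each coefficient group yields a small statistical descriptor, and the descriptors are concatenated into the classifier's input vector. The feature set chosen here (mean, standard deviation, skewness, kurtosis, energy) is a common assumption; the wave atom transform itself and the SVM are outside this sketch:

```python
# Sketch: per-group statistical features for a multiresolution CAD
# pipeline. Coefficient groups are invented stand-ins; the wave atom
# transform and SVM stages are not implemented here.
import math

def group_features(coeffs):
    """[mean, std, skewness, kurtosis, energy] for one coefficient group."""
    n = len(coeffs)
    mean = sum(coeffs) / n
    var = sum((c - mean) ** 2 for c in coeffs) / n
    std = math.sqrt(var)
    skew = sum((c - mean) ** 3 for c in coeffs) / (n * std ** 3)
    kurt = sum((c - mean) ** 4 for c in coeffs) / (n * std ** 4)
    energy = sum(c * c for c in coeffs)
    return [mean, std, skew, kurt, energy]

# One feature vector per coefficient group, concatenated for the classifier.
groups = [
    [0.1, -0.4, 0.3, 0.8, -0.2, 0.05],
    [1.2, 0.9, 1.4, 1.1, 0.8, 1.3],
]
feature_vector = [f for g in groups for f in group_features(g)]
print(len(feature_vector))  # 2 groups x 5 features = 10
```

The concatenated vector (one entry per image) would then be fed to the SVM with normal/abnormal labels for training.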
Procedia PDF Downloads 222
7961 The Metacognition Levels of Students: A Research at the School of Physical Education and Sports at Anadolu University
Authors: Dilek Yalız Solmaz
Abstract:
Metacognition is an important factor for educating conscious individuals who are aware of their cognitive processes. With this in mind, the purposes of this article are to find out the perceived metacognition level of Physical Education and Sports School students at Anadolu University and to identify whether metacognition levels display significant differences in terms of various variables. 416 Anadolu University Physical Education and Sports School students formed the research universe. The Meta-Cognitions Questionnaire, developed by Cartwright-Hatton and Wells and later reduced to a 30-item short form (MCQ-30), was used. The MCQ-30, which was adapted into Turkish by Tosun and Irak, is a four-point agreement scale. In the data analysis, the arithmetic mean, standard deviation, t-test, and ANOVA were used. There is no statistical difference between the mean scores of girl and boy students on uncontrollability and danger, cognitive awareness, cognitive confidence, and positive beliefs; there is a statistical difference in their mean scores on the need to control thinking. There is no statistical difference according to students' departments in the mean scores on uncontrollability and danger, cognitive awareness, cognitive confidence, need to control thinking, and positive beliefs. There is no statistical difference according to students' grade level in the mean scores on positive beliefs, cognitive confidence, and need to control thinking; there is a statistical difference in the mean scores on uncontrollability and danger and cognitive awareness.
Keywords: metacognition, physical education, sports school students, thinking
Procedia PDF Downloads 383
7960 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective
Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou
Abstract:
The analysis of geographic inequality relies heavily on location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link them to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status at the selected level of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the analyzed results, a problem well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan, based on spatial partition principles that seek homogeneity in the number of persons and households. Compared to traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select the appropriate dissemination level for publishing statistical data. This paper compares the results of respectively using TGSC and township units on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons at the township level, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment).
The management and analysis of the statistical data referenced to the TGSC in this research are strongly supported by Geographic Information System (GIS) technology. An integrated workflow is developed that consists of processing death certificates, geocoding street addresses, quality assurance of the geocoded results, automatic calculation of statistical measures, standardized encoding of the measures, and geo-visualization of the statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways, helping to avoid wrong decision making.
Keywords: mortality map, spatial patterns, statistical area, variation
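The ASDR values quoted above come from direct age standardization: age-specific death rates weighted by a fixed standard age structure. The age bands, counts, and weights below are invented for illustration (a fixed reference such as the WHO standard population is typically used):

```python
# Sketch: direct age standardization of a death rate. Age bands,
# populations, deaths, and standard weights are all invented.

def asdr_per_100k(deaths_by_age, pop_by_age, std_weights):
    """Sum of age-specific rates weighted by a standard age structure."""
    total = 0.0
    for age in deaths_by_age:
        rate = deaths_by_age[age] / pop_by_age[age]   # age-specific rate
        total += rate * std_weights[age]
    return total * 100_000

deaths = {"0-44": 30, "45-64": 120, "65+": 400}
population = {"0-44": 60_000, "45-64": 25_000, "65+": 10_000}
std_weights = {"0-44": 0.70, "45-64": 0.20, "65+": 0.10}  # must sum to 1

print(round(asdr_per_100k(deaths, population, std_weights), 1))  # 531.0
```

Because every spatial unit is weighted by the same standard structure, the resulting rates are comparable across townships or TGSC dissemination areas with very different age profiles.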
Procedia PDF Downloads 258
7959 Direct Translation vs. Pivot Language Translation for Persian-Spanish Low-Resourced Statistical Machine Translation System
Authors: Benyamin Ahmadnia, Javier Serrano
Abstract:
In this paper we compare two different approaches for translating from Persian to Spanish, a language pair with a scarce parallel corpus. The first approach is direct transfer using a statistical machine translation system available for this language pair. The second approach is translation through English as a pivot language, which has more translation resources and more advanced translation systems available. The results show that using English as a pivot language outperforms direct translation from Persian to Spanish. Our best result is the pivot system, which scores 1.12 BLEU points higher than direct translation. Keywords: statistical machine translation, direct translation approach, pivot language translation approach, parallel corpus
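The BLEU metric used to compare the two systems scores a candidate translation against a reference by combining modified n-gram precisions with a brevity penalty. The following is a minimal sentence-level sketch for illustration only; production evaluations (and presumably the paper's) use corpus-level BLEU from standard toolkits, and the smoothing scheme here is a simplifying assumption.

```python
# Minimal sentence-level BLEU sketch: uniform weights over n = 1..4,
# single reference, with brevity penalty. Illustrative only; not the
# paper's evaluation pipeline.
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu(candidate, reference, max_n=4):
    cand, ref = candidate.split(), reference.split()
    log_precisions = []
    for n in range(1, max_n + 1):
        cand_counts, ref_counts = ngrams(cand, n), ngrams(ref, n)
        overlap = sum((cand_counts & ref_counts).values())  # clipped matches
        total = max(sum(cand_counts.values()), 1)
        # Crude smoothing to avoid log(0) when no n-gram matches exist.
        log_precisions.append(math.log(max(overlap, 1e-9) / total))
    # Brevity penalty: penalize candidates shorter than the reference.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(log_precisions) / max_n)

score = bleu("the cat sat on the mat", "the cat sat on the mat")
```

A gap of 1.12 BLEU points, as reported above, is conventionally quoted on the 0-100 scale (i.e., the score here multiplied by 100).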
Procedia PDF Downloads 487
7958 The Effect of Voice Recognition Dictation Software on Writing Quality in Third Grade Students: An Action Research Study
Authors: Timothy J. Grebec
Abstract:
This study investigated whether using a voice dictation software program (i.e., Google Voice Typing) has an impact on student writing quality. The research took place in a third-grade general education classroom in a suburban school setting. Because the study involved minors, all data was encrypted and de-identified before analysis. The students completed a series of writings prior to the intervention to determine their thoughts about and skill level with writing. During the intervention phase, the students were introduced to the voice dictation software, given an opportunity to practice using it, and then assigned writing prompts to be completed with it. The prompts written by the nineteen student participants and surveys of student opinions on writing established a baseline for the study. The data showed that using the dictation software resulted in a 34% increase in response quality (measured against the Pennsylvania System of School Assessment [PSSA] writing guidelines). Of particular interest was the increase in students' proficiency in demonstrating mastery of English language conventions and in elaborating on content. Although this type of research is relatively new, it has the potential to reshape the strategies educators have at their disposal when instructing students in written language. Keywords: educational technology, accommodations, students with disabilities, writing instruction, 21st century education
Procedia PDF Downloads 75
7957 Comparison of the Distillation Curve Obtained Experimentally with the Curve Extrapolated by a Commercial Simulator
Authors: Lívia B. Meirelles, Erika C. A. N. Chrisman, Flávia B. de Andrade, Lilian C. M. de Oliveira
Abstract:
True Boiling Point (TBP) distillation is one of the most common experimental techniques for determining petroleum properties. The resulting curve provides information about the performance of the petroleum in terms of its cuts, but the experiment takes several days to perform. Simulation software can determine these properties faster by extrapolating the distillation curve from limited information about the crude oil. To evaluate the accuracy of this prediction, eight points of the TBP curve and the specific gravity curve (348 K and 523 K) were inserted into the HYSYS Oil Manager, and the extended curve was evaluated up to 748 K. The methods predicted the curve with errors of 0.6%-9.2% (software vs. ASTM) and 0.2%-5.1% (software vs. Spaltrohr). Keywords: distillation curve, petroleum distillation, simulation, true boiling point curve
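The error percentages reported above are relative deviations between experimental and simulator-extrapolated values at matching points of the curve. A minimal sketch of that comparison, with invented temperatures (the paper's data points are not reproduced in the abstract):

```python
# Relative percentage deviation between an experimental TBP point and a
# simulator-extrapolated value at the same cut. The temperatures below
# are hypothetical illustration values, not the paper's measurements.

def percent_error(experimental, simulated):
    """Relative deviation of the simulated value, in percent."""
    return abs(simulated - experimental) / experimental * 100.0

exp_temp_k = 523.0   # experimental boiling point at a given cut (K)
sim_temp_k = 527.5   # simulator-extrapolated value at the same cut (K)
error = percent_error(exp_temp_k, sim_temp_k)
```

Repeating this point-by-point over the extended curve yields error ranges like the 0.6%-9.2% and 0.2%-5.1% bands quoted above.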
Procedia PDF Downloads 441
7956 Analysis of the Engineering Judgement Influence on the Selection of Geotechnical Parameters Characteristic Values
Authors: K. Ivandic, F. Dodigovic, D. Stuhec, S. Strelec
Abstract:
A characteristic value of a geotechnical parameter results from an engineering assessment, and its selection must be based on technical principles and standards of engineering practice. It has been shown that the results of engineering assessments by different authors for the same problem and input data are significantly dispersed. A survey was conducted in which participants had to estimate the force that causes a 10 cm displacement at the top of an axially compressed pile in situ. Fifty experts from all over the world took part. The lowest estimated force was 42% and the highest 133% of the force measured in the corresponding static pile load test; such extreme values lead to significantly different technical solutions to the same engineering task. When selecting a characteristic value of a geotechnical parameter, the influence of individual engineering assessment can be reduced by using statistical methods. An informative annex of Eurocode 1 prescribes the method of selecting the characteristic values of material properties, followed by Eurocode 7 with certain specificities for selecting characteristic values of geotechnical parameters. The paper shows the procedure of selecting characteristic values of a geotechnical parameter using a statistical method with different initial conditions. The aim of the paper is to quantify engineering assessment in the example of determining a characteristic value of a specific geotechnical parameter. It is assumed that this assessment is a random variable whose statistical features can be determined. For this purpose, survey research was conducted among relevant experts in geotechnical engineering. Finally, the results of the survey and the application of the statistical method are compared. Keywords: characteristic values, engineering judgement, Eurocode 7, statistical methods
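The statistical selection of a characteristic value in the Eurocode framework is typically the 5% fractile estimate X_k = m(1 - k_n * V), where m is the sample mean, V the coefficient of variation, and k_n a factor depending on the sample size. The sketch below assumes a normal distribution and an unknown coefficient of variation; the k_n values and the friction-angle samples are illustrative assumptions and should be checked against the code text before any real use.

```python
# Sketch of a 5% fractile characteristic value X_k = m * (1 - k_n * V),
# in the style of the Eurocode statistical approach (V unknown, normal
# distribution). The k_n table entries and the sample values below are
# assumptions for illustration; verify against the standard before use.
import statistics

# k_n factors by sample size n (assumed values; dict lookup fails for
# sample sizes not listed here).
K_N = {3: 3.37, 4: 2.63, 5: 2.33, 10: 1.92, 20: 1.76, 30: 1.73}

def characteristic_value(samples):
    n = len(samples)
    mean = statistics.mean(samples)
    cov = statistics.stdev(samples) / mean   # coefficient of variation
    return mean * (1 - K_N[n] * cov)

# Hypothetical friction-angle measurements (degrees):
phi = [31.0, 29.5, 32.5, 30.0, 33.0]
x_k = characteristic_value(phi)
```

Because k_n grows as the sample shrinks, the characteristic value drops further below the mean for small datasets, which is exactly how the statistical method dampens the scatter of individual engineering judgement.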
Procedia PDF Downloads 296
7955 The Impacts of Local Decision Making on Customisation Process Speed across Distributed Boundaries
Authors: Abdulrahman M. Qahtani, Gary. B. Wills, Andy. M. Gravell
Abstract:
Communicating and managing customers’ requirements plays a vital role in the software development process. While this is difficult to do locally, it is even more difficult to communicate requirements across distributed boundaries and to convey them to multiple distributed customers. This paper discusses the communication of multiple distributed customers’ requirements in the context of customised software products. The main purpose is to understand the challenges of communicating and managing customisation requirements across distributed boundaries. We propose a model for Communicating Customisation Requirements of Multi-Clients in a Distributed Domain (CCRD). We then evaluate the model by presenting the findings of a case study conducted with a company running customisation projects for 18 distributed customers, and compare the outputs of the real case process with the outputs of the CCRD model using simulation methods. Our conjecture is that the CCRD model can reduce the challenge of communicating requirements across distributed organisational boundaries and reduce delays in decision making and in the overall customisation process. Keywords: customisation software products, global software engineering, local decision making, requirement engineering, simulation model
Procedia PDF Downloads 429
7954 JREM: An Approach for Formalising Models in the Requirements Phase with JSON and NoSQL Databases
Authors: Aitana Alonso-Nogueira, Helia Estévez-Fernández, Isaías García
Abstract:
This paper presents an approach to reducing some current flaws in the requirements phase of the software development process. It takes the software requirements of an application, builds a conceptual model of them, and formalises the model as JSON documents. This formal model is stored in a document-oriented NoSQL database, MongoDB, chosen for its flexibility and efficiency. In addition, the paper underlines the contributions of the approach and shows some applications and benefits for future work in the field of automatic code generation using model-driven engineering tools. Keywords: conceptual modelling, JSON, NoSQL databases, requirements engineering, software development
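To make the idea concrete, here is a sketch of a single requirement formalised as a JSON document of the kind a document-oriented store such as MongoDB could hold. The field names and structure are hypothetical; the abstract does not publish the paper's actual JSON schema.

```python
# Illustrative sketch only: one software requirement formalised as a
# JSON document. Field names are assumptions, not the paper's schema.
import json

requirement = {
    "id": "REQ-001",
    "type": "functional",
    "description": "The system shall allow users to reset their password.",
    "priority": "high",
    "relations": ["REQ-007"],   # links to other requirements in the model
}

# Serialise and restore to confirm the model round-trips as JSON.
# A real system would insert the dict directly with a MongoDB client
# (e.g., pymongo's collection.insert_one(requirement)).
doc = json.dumps(requirement)
restored = json.loads(doc)
```

Storing requirements as schema-flexible documents is what makes it straightforward for downstream model-driven tooling to query and transform them without a fixed relational schema.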
Procedia PDF Downloads 378
7953 Factors Associated with Weight Loss Maintenance after an Intervention Program
Authors: Filipa Cortez, Vanessa Pereira
Abstract:
Introduction: The main challenge of obesity treatment is long-term weight loss maintenance. The 3 phases method is a weight loss program that combines a low-carb, moderately high-protein diet, food supplements, and a weekly one-to-one consultation with a certified nutritionist. Sustained weight control is the ultimate goal of phase 3. The success criterion was a minimum loss of 10% of initial weight and its maintenance after 12 months. Objective: The aim of this study was to identify factors associated with successful weight loss maintenance 12 months after the end of the 3 phases method. Methods: The study included 199 subjects who achieved their weight loss goal (phase 3). Weight and body mass index (BMI) were obtained at baseline and every week until the end of the program. Therapeutic adherence was measured weekly on a Likert scale from 1 to 5; subjects were considered in compliance with nutritional recommendations and supplementation when their classification was ≥ 4. After 12 months, the current weight and number of previous weight-loss attempts were collected by telephone interview. Statistical significance was assumed at p-values < 0.05, and analyses were performed using SPSS software v.21. Results: 65.3% of subjects met the success criterion. The factors that significantly predicted weight loss maintenance were a greater initial percentage weight loss during the intervention (OR=1.44) and a higher number of consultations in phase 3 (OR=1.10). Conclusion: These findings suggest that the percentage weight loss during the intervention and the number of consultations in phase 3 may facilitate maintenance of weight loss after the 3 phases method. Keywords: obesity, weight maintenance, low-carbohydrate diet, dietary supplements
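The odds ratios reported above come from logistic regression, where the OR for a predictor is the exponential of its fitted coefficient: a one-unit increase in the predictor multiplies the odds of success by exp(beta). A minimal sketch of that relationship; the coefficients below are back-calculated from the reported ORs for illustration and are not the study's fitted model.

```python
# Odds ratio from a logistic regression coefficient: OR = exp(beta).
# The betas below are back-calculated from the abstract's reported ORs
# purely to illustrate the relationship; they are not the fitted model.
import math

def odds_ratio(beta):
    """Multiplicative change in odds per one-unit increase in a predictor."""
    return math.exp(beta)

beta_pct_weight_loss = math.log(1.44)  # implies OR = 1.44
beta_consultations   = math.log(1.10)  # implies OR = 1.10

or_wl = odds_ratio(beta_pct_weight_loss)
```

Read this way, each additional percentage point of initial weight loss multiplies the odds of maintenance by about 1.44, and each extra phase-3 consultation by about 1.10.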
Procedia PDF Downloads 150
7952 Measurement and Analysis of Radiation Doses to Radiosensitive Organs from CT Examination of the Cervical Spine Using Radiochromic Films and Monte Carlo Simulation Based Software
Authors: Khaled Soliman, Abdullah Alrushoud, Abdulrahman Alkhalifah, Raed Albathi, Salman Altymiat
Abstract:
The radiation dose received by patients undergoing Computed Tomography (CT) examination of the cervical spine was evaluated using Gafchromic XR-QA2 films and the CT-Expo software (ver. 2.3), in order to document our clinical dose values and compare our results with benchmarks reported in the current literature. Radiochromic films have recently been used as a practical dosimetry tool that provides dose profile information not available from the standard ionisation chamber routinely used in CT dosimetry. We developed an in-house program that uses the films to calculate the Entrance Dose Length Product (EDLP, in mGy.cm) and relates the EDLP to various organ doses calculated with the CT-Expo software. We also calculated a conversion factor (in mSv/mGy.cm) relating the EDLP to the effective dose (ED) from the examination. Variability among different types of CT scanners and dose modulation methods is reported for at least three major CT brands available at our medical institution. The method can be used for in-vivo dosimetry, but this work reports only results obtained from adult female anthropomorphic phantom studies. Keywords: CT dosimetry, Gafchromic films, XR-QA2, CT-Expo software
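The conversion factor described above works the same way as the familiar DLP-to-effective-dose coefficients: effective dose (mSv) is the dose-length product (mGy.cm) multiplied by a region-specific factor k (mSv/mGy.cm). A sketch of that calculation; the k value used below is a commonly cited literature coefficient for adult neck CT, used here only as an assumption, since the paper derives its own factor with CT-Expo.

```python
# Effective dose from a dose-length product:
#   ED (mSv) = k (mSv/mGy.cm) * DLP (mGy.cm)
# The k value below (0.0059 for the adult neck region) is a widely
# quoted literature coefficient used as an illustrative assumption;
# it is not the conversion factor derived in the paper.

def effective_dose(edlp_mgy_cm, k_msv_per_mgy_cm):
    return edlp_mgy_cm * k_msv_per_mgy_cm

edlp = 300.0      # hypothetical entrance dose-length product (mGy.cm)
k_neck = 0.0059   # conversion coefficient for the neck region (assumption)
ed = effective_dose(edlp, k_neck)
```

This linear relation is why a single film-derived EDLP, combined with one calibrated k factor per protocol, suffices for a quick effective-dose estimate.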
Procedia PDF Downloads 471
7951 An Architectural Model of Multi-Agent Systems for Student Evaluation in Collaborative Game Software
Authors: Monica Hoeldtke Pietruchinski, Andrey Ricardo Pimentel
Abstract:
Teaching computer programming to beginners is widely recognised as a non-trivial task. Several methodologies and research tools have been developed; however, the problem remains. This paper presents a multi-agent system architecture to be incorporated into educational collaborative game software for teaching programming, one that monitors, evaluates, and encourages collaboration by the participants. A literature review covers the concepts of collaborative learning, multi-agent systems, and collaborative games, as well as techniques for teaching programming that use these concepts simultaneously. Keywords: architecture of multi-agent systems, collaborative evaluation, collaboration assessment, gamifying educational software
Procedia PDF Downloads 463
7950 Evaluation of SDS (Software Defined Storage) Controller (CoprHD) for Various Storage Demands
Authors: Shreya Bokare, Sanjay Pawar, Shika Nema
Abstract:
Growth in cloud applications is generating tremendous amounts of data, placing a load on traditional storage management systems. Software Defined Storage (SDS) is a storage management concept that is becoming popular for handling this large amount of data. CoprHD is an open-source SDS controller available for experimentation and development in the storage industry. In this paper, the storage management techniques provided by CoprHD to manage heterogeneous storage platforms are tested and analyzed. Various storage management parameters, such as time to provision, storage capacity measurement, and heterogeneity, are experimentally evaluated, along with theoretical expressions, to assess the completeness of the CoprHD controller for storage management. Keywords: software defined storage, SDS, CoprHD, open source, SMI-S simulator, CLARiiON, Symmetrix
Procedia PDF Downloads 313
7949 Bug Localization on Single-Line Bugs of Apache Commons Math Library
Authors: Cherry Oo, Hnin Min Oo
Abstract:
Software bug localization is one of the most costly tasks in program repair. There is therefore high demand for automated bug localization techniques that can guide programmers to the locations of bugs with minimal human intervention. Spectrum-based bug localization aims to help software developers discover bugs rapidly by investigating abstractions of program traces to produce a ranking of the most likely buggy modules. Using the Apache Commons Math library project, we study the diagnostic accuracy of our spectrum-based bug localization metric. Our results show that a specific similarity coefficient, used to inspect the program spectra, is most effective at localizing single-line bugs. Keywords: software testing, bug localization, program spectra, bug
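Spectrum-based techniques rank program elements by a similarity coefficient computed from coverage spectra: how many failing and passing tests execute each element. The sketch below uses the well-known Ochiai coefficient as an example of the kind of coefficient such studies compare; the abstract does not name which coefficient performed best, and the coverage data below is invented.

```python
# Spectrum-based fault localization with the Ochiai coefficient:
#   ochiai = ef / sqrt((ef + nf) * (ef + ep))
# ef/nf = failing tests that do/do not execute the element,
# ep    = passing tests that execute the element.
# The coverage spectra below are hypothetical illustration data.
import math

def ochiai(ef, nf, ep):
    denom = math.sqrt((ef + nf) * (ef + ep))
    return ef / denom if denom else 0.0

# Hypothetical spectra: (element, ef, nf, ep)
spectra = [
    ("lineA", 4, 0, 1),   # executed by all 4 failing tests, 1 passing
    ("lineB", 1, 3, 5),
    ("lineC", 0, 4, 6),   # never executed by a failing test
]

# Rank elements from most to least suspicious.
ranking = sorted(spectra, key=lambda s: ochiai(*s[1:]), reverse=True)
```

An element executed by all failing tests and few passing tests rises to the top of the ranking, which is precisely why such coefficients work well on single-line bugs with distinctive coverage.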
Procedia PDF Downloads 143