Search results for: multiple data
26820 The Impacts of Local Decision Making on Customisation Process Speed across Distributed Boundaries
Authors: Abdulrahman M. Qahtani, Gary B. Wills, Andy M. Gravell
Abstract:
Communicating and managing customers’ requirements in software development projects play a vital role in the software development process. While this is difficult to do locally, it is even more difficult to communicate requirements across distributed boundaries and to convey them to multiple distributed customers. This paper discusses the communication of multiple distributed customers’ requirements in the context of customised software products. The main purpose is to understand the challenges of communicating and managing customisation requirements across distributed boundaries. We propose a model for Communicating Customisation Requirements of Multi-Clients in a Distributed Domain (CCRD). Thereafter, we evaluate the model by presenting the findings of a case study conducted with a company running customisation projects for 18 distributed customers. We then compare the outputs of the real case process with those of the CCRD model using simulation methods. Our conjecture is that the CCRD model can reduce the challenges of communicating requirements across distributed organisational boundaries, as well as delays in decision making and in the overall customisation process time.
Keywords: customisation software products, global software engineering, local decision making, requirement engineering, simulation model
Procedia PDF Downloads 429
26819 An Investigation about the Health-Promoting Lifestyle of 1389 Emergency Nurses in China
Authors: Lei Ye, Min Liu, Yong-Li Gao, Jun Zhang
Abstract:
Purpose: The aims of the study are to investigate the status of health-promoting lifestyle and to compare the healthy lifestyle of emergency nurses in different levels of hospitals in Sichuan province, China. The investigation mainly covers the dimensions of a health-promoting lifestyle: spiritual growth, health responsibility, physical activity, nutrition, interpersonal relations, and stress management. The factors influencing the health-promoting lifestyle of emergency nurses in hospitals of Sichuan province were then analyzed in order to find relevant models and provide reference evidence for intervention. Study Design: A cross-sectional research method was adopted. Stratified cluster sampling, based on geographical location, was used to select 1389 emergency nurses in 54 hospitals from Sichuan province in China. Method: The 52-item, six-factor Health-Promoting Lifestyle Profile II (HPLP-II) instrument was used to explore participants’ self-reported health-promoting behaviors and measure the dimensions of health responsibility, physical activity, nutrition, interpersonal relations, spiritual growth, and stress management. Demographic characteristics, education, work duration, emergency nursing work duration, and self-rated health status were documented. Analysis: Data were analyzed with SPSS software ver. 17.0. Frequency, percentage, and mean ± standard deviation were used to describe the general information, while a nonparametric test was used to compare the constituent ratio of general data across hospitals. One-way ANOVA was used to compare health-promoting lifestyle scores across hospital levels. A multiple linear regression model was established. P values less than 0.05 were considered statistically significant in all analyses. Result: The survey showed that the total health-promoting lifestyle score of nurses at emergency departments in Sichuan Province was 120.49 ± 21.280.
The dimensions, ranked by score in descending order, are: interpersonal relations, nutrition, health responsibility, physical activity, stress management, spiritual growth. The total score of the three-A hospitals was the highest (121.63 ± 0.724), followed by the senior class hospitals (119.7 ± 1.362) and the three-B hospitals (117.80 ± 1.255); the difference was statistically significant (P = 0.024). The nurses' general data were used as independent variables: age, gender, marital status, living conditions, nursing income, hospital level, length of service in nursing, length of service in emergency, professional title, education background, and the average number of night shifts. The total health-promoting lifestyle score was used as the dependent variable, and multiple linear regression analysis was adopted to establish the regression model. For the regression equation, F = 20.728, R² = 0.061, P < 0.05; age, gender, nursing income, turnover intention, and status of coping with stress affected the health-promoting lifestyle of nurses in the emergency department, and the result was statistically significant (P < 0.05). Conclusion: The results of the investigation indicate that further research will help to develop health-promoting interventions for emergency nurses at all levels of hospital in Sichuan Province. Managers need to pay more attention to emergency nurses’ exercise, stress management, and self-realization, and to conduct interventions in nurse training programs.
Keywords: emergency nurse, health-promoting lifestyle profile II, health behaviors, lifestyle
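The multiple linear regression reported in this abstract (total HPLP-II score regressed on demographic predictors) can be sketched numerically. The data below are synthetic stand-ins, not the study's data; only the sample size (n = 1389) is taken from the abstract, and the predictor names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1389  # sample size reported in the abstract

# Hypothetical standardised predictors (stand-ins for age, income, night shifts).
X = rng.normal(size=(n, 3))
beta_true = np.array([2.0, 1.5, -1.0])
y = 120.5 + X @ beta_true + rng.normal(scale=20.0, size=n)  # HPLP-II-like totals

# Ordinary least squares with an intercept column.
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient of determination R^2 of the fitted model.
y_hat = A @ beta_hat
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(beta_hat.round(2), round(r2, 3))
```

A small R² with a significant F, as reported in the study (R² = 0.061, F = 20.728), is typical for demographic predictors of lifestyle scores: individual predictors can be significant even though the model explains little of the overall variance.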
Procedia PDF Downloads 282
26818 The Effect of Rosella Flower Flour (Hibiscus sabdariffa L.) Utilization in Ration on Performance of Broiler Chicken
Authors: Nurlisa Uke Dessy, Dwi Septian Erwinsyah, Zuprizal
Abstract:
This experiment was aimed at investigating the effect of rosella flower flour in the diet on broiler chicken performance. The materials used in this experiment were 72 broiler chickens, divided into six treatments: R0 = no rosella flower flour addition, R1 = 0.5%, R2 = 1.0%, R3 = 1.5%, R4 = 2.0%, and R5 = 2.5% rosella flower flour addition. Each treatment consisted of three replications and each replication of four broiler chickens. Data were collected over 35 days. Parameters measured were feed intake, rosella flower flour consumption, body weight gain, feed conversion, and mortality. The collected data were analyzed using a Completely Randomized Design (CRD), and differences between means were tested with Duncan’s New Multiple Range Test (DMRT). The results showed average feed consumption of 2154, 2154, 2034, 2154, 2034, and 2154 g/bird for broiler chickens fed 0.0, 0.5, 1.0, 1.5, 2.0, and 2.5% rosella flower flour, respectively. The average consumption of rosella flower flour was 0, 10.77, 20.34, 32.31, 40.68, and 53.85 g/bird, respectively. Body weight gains were 1263.33±70.40, 1422.42±36.33, 1443.75±30.00, 1387.42±35.30, 1411.17±29.58, and 1457.08±40.75 g/bird. Feed conversion results were 1.71±0.94, 1.51±0.37, 1.47±0.62, 1.55±0.40, 1.53±0.30, and 1.48±0.40. The experiment concluded that rosella flower flour at levels up to 2.5% in the diet was able to improve broiler chicken performance and to decrease feed conversion.
Keywords: feed intake, rosella flower flour consumption, broiler chickens, body weight gain, feed conversion
Procedia PDF Downloads 634
26817 Competition between Regression Technique and Statistical Learning Models for Predicting Credit Risk Management
Authors: Chokri Slim
Abstract:
This research attempts to answer the question: Is there a significant difference between the regression model and statistical learning models in predicting credit risk management? A Multiple Linear Regression (MLR) model was compared with neural networks, including a Multi-Layer Perceptron (MLP), and Support Vector Regression (SVR). The population of this study includes 50 banks listed on the Tunis Stock Exchange (TSE) from 2000 to 2016. Firstly, we show the factors that have a significant effect on the quality of the loan portfolios of banks in Tunisia. Secondly, we attempt to establish that the systematic use of objective techniques and methods designed to apprehend and assess risk when considering applications for granting credit has a positive effect on the quality of banks' loan portfolios and their future collectability. Finally, we try to show that bank governance has an impact on the choice of methods and techniques for analyzing and measuring the risks inherent in the banking business, including the risk of non-repayment. The results of empirical tests confirm our claims.
Keywords: credit risk management, multiple linear regression, principal components analysis, artificial neural networks, support vector machines
Procedia PDF Downloads 150
26816 Performance of Rural and Urban Adult Participants on Neuropsychological Tests in Zambia
Authors: Happy Zulu
Abstract:
Neuropsychological examination is an important way of formally assessing brain function. While there is much documentation about the influence that factors such as age and education have on neuropsychological (NP) tests, not much has been done to assess the influence of residency (rural/urban). The specific objectives of this study were to establish whether there is a significant difference in mean NP test scores between rural and urban participants, to assess which tests on the Zambia Neurobehavioural Test Battery (ZNTB) are more affected by participants’ residency, and to determine the extent to which education, gender, and age predict NP test performance for rural and urban participants. Participants (324) were drawn from both urban and rural areas of Zambia (Rural = 152, Urban = 172). However, only 234 participants (Rural = 152, Urban = 82) were used in the analyses for this particular study, because the actual rural-to-urban population ratio in Zambia was 65% : 35% (CSO, 2003). Thus, all 152 rural participants were included, and 90 of the 172 urban participants were randomly excluded, so that the sample matched this ratio and appropriately represented the actual population of Zambia. Data on NP tests were therefore analyzed from 234 participants: rural (N = 152), reflecting 65%, and urban (N = 82), reflecting 35%. T-tests indicated that urban participants had superior performance in all seven NP test domains, and all the mean differences were statistically significant. Residency had a large or moderate effect in five domains, while its effect size was small in only two of the domains.
A standard multiple regression revealed that education, age, and residency as predictor variables made a significant contribution to variance in performance on various domains of the ZNTB. However, the gender of participants was not a major factor in determining one’s performance on neuropsychological tests. This report is part of an ongoing, larger study aimed at formulating normative data for Zambia with regard to performance on neuropsychological tests. This is necessary for appropriate, effective, and efficient assessment and diagnosis of the various neurocognitive and neurobehavioural deficits that a number of people may currently be suffering from. This study has shown that it is vital to analyze carefully the variables that may be associated with one’s performance on neuropsychological tests.
Keywords: neuropsychology, neurobehavioural, residency, Zambia
Procedia PDF Downloads 55
26815 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders the process of finding useful information relevant to their dynamic interests. Current research works consider noise to be any data that does not form part of the main web page and propose noise web data reduction tools that mainly focus on eliminating noise related to the content and layout of web data. This paper argues that not all data that forms part of the main web page is of interest to a user, and not all noise data is actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile, but also a decrease in the loss of useful information, and hence improves the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in a web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented. The results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.
Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
Procedia PDF Downloads 265
26814 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study
Authors: Zeba Mahmood
Abstract:
Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in information technology systems have changed the way business is conducted today. Business operations today rely more on the data they obtain, and this data is continuously increasing in volume. Data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketers and customer relationship departments of firms use data mining techniques to make relevant decisions. This paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues of executing these techniques are also discussed and critically analyzed.
Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining
Procedia PDF Downloads 538
26813 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of datasets so large and complex that they become difficult to process using database management tools. Operations such as search, analysis, and visualization are performed on big data using data mining, the process of extracting patterns or knowledge from large data sets. In recent years, the results of data mining applications have tended to become stale and obsolete over time. Incremental processing is a promising approach to refreshing mining results: it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value pair level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grain computation states. To evaluate the mining results, i2MapReduce is assessed using a one-step algorithm and three iterative algorithms with diverse computation characteristics.
Keywords: big data, map reduce, incremental processing, iterative computation
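The key-value level incremental idea can be illustrated with a toy word-count job in plain Python. This is a sketch of the principle only, not the actual i2MapReduce framework: when one input changes, the preserved state is patched by subtracting the old input's contribution and adding the new one's, instead of recomputing everything from scratch.

```python
from collections import Counter

def full_compute(docs):
    """Batch MapReduce-style word count over all documents."""
    counts = Counter()
    for doc in docs.values():
        counts.update(doc.split())
    return counts

def incremental_update(state, old_doc, new_doc):
    """Key-value level incremental refresh: subtract the old document's
    contribution and add the new one's, reusing the preserved state."""
    state.subtract(Counter(old_doc.split()))
    state.update(Counter(new_doc.split()))
    return +state  # unary + drops zero and negative counts

docs = {"d1": "big data mining", "d2": "map reduce mining"}
state = full_compute(docs)

# d2 changes; refresh only the affected key-value pairs.
new_d2 = "map reduce incremental mining"
state = incremental_update(state, docs["d2"], new_d2)
docs["d2"] = new_d2

print(state == full_compute(docs))  # prints True: patched state matches a full recompute
```

The real framework applies the same delta idea per key-value pair across a cluster and additionally preserves fine-grain states for iterative algorithms, which this single-process sketch does not attempt.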
Procedia PDF Downloads 350
26812 Multivariate Output-Associative RVM for Multi-Dimensional Affect Predictions
Authors: Achut Manandhar, Kenneth D. Morton, Peter A. Torrione, Leslie M. Collins
Abstract:
The current trends in affect recognition research are to consider continuous observations from spontaneous natural interactions using multiple feature modalities, to represent affect in terms of continuous dimensions, to incorporate spatio-temporal correlation among affect dimensions, and to provide fast affect predictions. These research efforts have been propelled by a growing effort to develop affect recognition systems that can enable seamless real-time human-computer interaction in a wide variety of applications. Motivated by these desired attributes of an affect recognition system, this work proposes a multi-dimensional affect prediction approach that integrates the multivariate Relevance Vector Machine (MVRVM) with the recently developed Output-associative Relevance Vector Machine (OARVM). The resulting approach can provide fast continuous affect predictions by jointly modeling the multiple affect dimensions and their correlations. Experiments on the RECOLA database show that the proposed approach performs competitively with the OARVM while providing faster predictions during testing.
Keywords: dimensional affect prediction, output-associative RVM, multivariate regression, fast testing
Procedia PDF Downloads 286
26811 Traditional Management Systems and the Conservation of Cultural and Natural Heritage: Multiple Case Studies in Zimbabwe
Authors: Nyasha Agnes Gurira, Petronella Katekwe
Abstract:
Traditional management systems (TMSs) are a vital source of knowledge for conserving cultural and natural heritage and are renowned for their ability to preserve both tangible and intangible manifestations of heritage. They are a construct of the intricate relationship that exists between heritage and host communities, where communities are recognized as owners of heritage and set up management mechanisms to ensure its adequate conservation. Multiple heritage condition surveys were conducted to assess the effectiveness of using TMSs in the conservation of both natural and cultural heritage. Surveys were done at Nharira Hills, Mahwemasimike, Dzimbahwe and Manjowe rock art sites, and Norumedzo forest, which are heritage places in Zimbabwe. The study assessed the state of conservation of the five case studies and the role that host communities play in the management of these heritage places. It was revealed that TMSs are effective in the conservation of natural heritage; however, for heritage forms with cultural manifestations there are major disparities. These range from differences in the appreciation and perception of value within communities, leading to vandalism, to an over-emphasis on conserving the intangible element as opposed to the tangible, which leaves the tangible element at risk. Despite these issues, TMSs are a reliable knowledge base that enables more holistic conservation approaches for cultural and natural heritage.
Keywords: communities, cultural intangible, tangible heritage, traditional management systems, natural
Procedia PDF Downloads 558
26810 Power Integrity Analysis of Power Delivery System in High Speed Digital FPGA Board
Authors: Anil Kumar Pandey
Abstract:
Power plane noise is the most significant source of signal integrity (SI) issues in a high-speed digital design. In this paper, a power integrity (PI) analysis of the multiple power planes in the power delivery system of a 12-layer high-speed FPGA board is presented. All 10 power planes of the HSD board are analyzed separately using a 3D electromagnetic-based PI solver; a transient simulation is then performed on the combined PI data of all planes, along with the voltage regulator modules (VRMs) and 70 current-drawing chips, to obtain the board-level power noise coupling on different high-speed signals. Decoupling capacitors are placed between the power planes and ground to reduce power noise coupling with signals.
Keywords: power integrity, power-aware signal integrity analysis, electromagnetic simulation, channel simulation
Procedia PDF Downloads 436
26809 A Survey of Types and Causes of Medication Errors and Related Factors in Clinical Nurses
Authors: Kouorsh Zarea, Fatemeh Hassani, Samira Beiranvand, Akram Mohamadi
Abstract:
Background and Objectives: Medication error in hospitals is a major cause of the errors that disrupt the health care system. The aim of this study was to assess nurses’ medication errors and related factors. Material and methods: This was a descriptive study of 225 nurses in various hospitals, selected through multistage random sampling. Data were collected with three researcher-made tools: demographic, medication error, and related factors questionnaires. Data were analyzed by descriptive statistics, Chi-square, Kruskal-Wallis, and one-way analysis of variance. Results: The most common types of medication errors were giving drugs to patients later or earlier than scheduled (55.6%), giving multiple oral medications together regardless of their interactions (36%), and giving postoperative analgesics without a prescription (34.2%). In addition, factors such as a shortage in the nurse-to-patient ratio (57.3%), a high workload (51.1%), and fatigue caused by extra work (40.4%) were the most important factors affecting the incidence of medication errors. Fear of legal issues (40%) was the most important factor behind the lack of reporting of medication errors. Conclusions: Based on the results, effective management and promotion can motivate nurses. Therefore, increasing scientific and clinical expertise in the field of nursing medication orders is recommended to prevent medication errors in various states of nursing intervention. Employing experienced staff in areas with a high risk of medication errors, and supervising less-experienced staff through competent personnel, are also suggested.
Keywords: medication error, nurse, clinical care, drug errors
Procedia PDF Downloads 266
26808 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Analyzing large-scale recurrent event data currently poses many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method using parametric frailty models is proposed. Specifically, the data is randomly divided into many subsets, and the maximum likelihood estimator is obtained from each individual subset. A weighted method is then proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method. The approach is applied to a large real dataset of repeated heart failure hospitalizations.
Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
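As a toy numerical illustration of the combine step (using a simple exponential mean as the parameter to estimate, not the paper's parametric frailty model), a size-weighted average of per-subset maximum likelihood estimates recovers the full-data MLE, since the subsets partition the data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Full dataset: exponential gap times whose mean we want to estimate.
data = rng.exponential(scale=2.0, size=100_000)

# Divide: split the shuffled data into K subsets.
K = 10
subsets = np.array_split(rng.permutation(data), K)

# Conquer: MLE on each subset (for an exponential model, the MLE
# of the mean is the sample mean).
estimates = np.array([s.mean() for s in subsets])
sizes = np.array([len(s) for s in subsets])

# Combine: weight each subset estimator by its sample size
# (equivalent to inverse-variance weighting in this model).
combined = np.sum(sizes * estimates) / np.sum(sizes)

print(abs(combined - data.mean()))  # agrees with the full-data MLE up to rounding
```

For general frailty models the per-subset MLEs no longer combine exactly, and the weights come from estimated information matrices; the asymptotic-equivalence claim in the abstract is the formal version of the same idea.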
Procedia PDF Downloads 165
26807 Teacher Candidates' Beliefs About Inclusive Teaching Practices
Authors: Charlotte Brenner
Abstract:
Teachers’ beliefs about inclusion are foundational to their implementation of inclusive teaching practices. Utilizing a longitudinal design and a multiple case study methodology, this study investigates how teacher candidates’ instructional and practicum experiences shape their beliefs about inclusion in one teacher education program located in western Canada (N = 20). Interview questions were developed through the lens of self-determination theory and theory about teachers’ beliefs and inclusion. Preliminary thematic analysis indicates that a 36-hour course focused on diversity and inclusion supports teacher candidates in deepening their understanding of the need for inclusion in classrooms and of strategies to promote inclusion. Furthermore, teacher candidates identified course components that fostered their developing understanding of inclusion. Future data will examine the stability of teacher candidates’ beliefs about inclusion and their implementation of inclusive teaching strategies throughout their practicum experiences.
Keywords: teacher candidates, inclusion, teacher education programs, beliefs
Procedia PDF Downloads 88
26806 Resilience, Mental Health, and Life Satisfaction
Authors: Saba Harati, Nasrin Arian Parsa
Abstract:
The current research investigated the effect of resilience on mental health and life satisfaction. In a cross-sectional study, 287 students of Tehran University (173 females and 114 males) participated; their average age was 23.17 years (SD = 4.9). The instruments used for assessing the research variables were the Connor-Davidson Resilience Scale (CD-RISC), the short form of the Depression-Anxiety-Stress Scale, and a life satisfaction scale. Data analysis took the form of a structural equation model. The results of a simultaneous hierarchical multiple regression analysis indicated a significant mediating role of negative emotions (depression, anxiety, and stress) in the relationship between resilience (p < 0.001) and satisfaction with life (p < 0.001). Resilience leads to life satisfaction by reducing emotional problems (or increasing the level of mental health); the effect of resilience on life satisfaction was indirect.
Keywords: resilience, negative emotion, mental health, life satisfaction
Procedia PDF Downloads 498
26805 Retrieving Similar Segmented Objects Using Motion Descriptors
Authors: Konstantinos C. Kartsakalis, Angeliki Skoura, Vasileios Megalooikonomou
Abstract:
The fuzzy composition of objects depicted in images acquired through MR imaging or bio-scanners has often been a point of controversy for field experts attempting to effectively delineate the visualized objects. Modern approaches to medical image segmentation tend to consider fuzziness as a characteristic and inherent feature of the depicted object, instead of an undesirable trait. In this paper, a novel technique is presented for efficient image retrieval in the context of images in which segmented objects are either crisply or fuzzily bounded. Moreover, the proposed method is applied to the case of multiple, even conflicting, segmentations from field experts. Experimental results demonstrate the efficiency of the suggested method in retrieving similar objects from the aforementioned categories while taking into account the fuzzy nature of the depicted data.
Keywords: fuzzy object, fuzzy image segmentation, motion descriptors, MRI imaging, object-based image retrieval
Procedia PDF Downloads 375
26804 Using Photogrammetry to Survey the Côa Valley Iron Age Rock Art Motifs: Vermelhosa Panel 3 Case Study
Authors: Natália Botica, Luís Luís, Paulo Bernardes
Abstract:
The Côa Valley, listed as World Heritage since 1998, presents more than 1300 open-air engraved rock panels. The Archaeological Park of the Côa Valley recorded the rock art motifs, testing various techniques based on direct tracing processes on the rock, using natural and artificial lighting. In this work, integrated in the "Open Access Rock Art Repository" (RARAA) project, we present the methodology adopted for the vector drawing of the rock art motifs based on orthophotos taken from the photogrammetric survey and 3D models of the rocks. We also present the information system designed to integrate the vector drawings and the characterization data of the motifs, as well as their open access sharing, in order to promote reuse in multiple areas. The 3D models themselves constitute a very detailed record, ensuring the digital preservation of the rock and its iconography. Thus, even if a rock or motif disappears, it can continue to be studied and even recreated.
Keywords: rock art, archaeology, iron age, 3D models
Procedia PDF Downloads 83
26803 Assessing Project Performance through Work Sampling and Earned Value Analysis
Authors: Shobha Ramalingam
Abstract:
The majority of infrastructure projects are affected by time overrun, resulting in project delays and subsequently cost overruns. Time overrun may vary from a few months to five or more years, placing project viability at risk. One probable reason noted in the literature for this outcome is poor productivity. Researchers contend that productivity in construction has only marginally increased over the years. While studies in the literature have extensively focused on the time and cost parameters of projects, few studies integrate time and cost with productivity to assess project performance. To this end, a study was conducted to understand project delay factors concerning cost, time, and productivity. A case-study approach was adopted to collect rich data from a nuclear power plant project site over two months through observation, interviews, and document review. The data were analyzed using three different approaches for a comprehensive understanding. Foremost, a root-cause analysis was performed on the data using Ishikawa’s fish-bone diagram technique to identify the various factors impacting the delay concerning time. Based on it, a questionnaire was designed and circulated to concerned executives, including project engineers and contractors, to determine the frequency of occurrence of the delays, which was then compiled and presented to the management for possible mitigation. Second, a productivity analysis was performed on select activities, including rebar bending and concreting, through a time-motion study. Third, data on the cost of construction for three years allowed analyzing cost performance using the earned value management technique. Together, the three techniques allowed the key factors that deter project performance and cause productivity loss in the construction of the nuclear power plant project to be identified systematically and comprehensively.
The findings showed that improper planning and coordination between multiple trades, concurrent operations, improper workforce and material management, and fatigue due to overtime were some of the key factors that led to delays and poor productivity. The findings are expected to act as a stepping stone for further research and have implications for practitioners.
Keywords: earned value analysis, time performance, project costs, project delays, construction productivity
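The earned value computation used in the third analysis can be sketched with hypothetical figures (all numbers below are invented for illustration and are not taken from the nuclear power plant project):

```python
# Hypothetical project status; monetary units in millions.
BAC = 120.0          # budget at completion
planned_pct = 0.50   # share of work scheduled to date
earned_pct = 0.40    # share of work actually completed to date
AC = 55.0            # actual cost incurred to date

PV = planned_pct * BAC   # planned value
EV = earned_pct * BAC    # earned value

SV = EV - PV             # schedule variance (negative: behind schedule)
CV = EV - AC             # cost variance (negative: over budget)
SPI = EV / PV            # schedule performance index
CPI = EV / AC            # cost performance index
EAC = BAC / CPI          # estimate at completion, assuming current efficiency

print(SV, CV, round(SPI, 2), round(CPI, 2), round(EAC, 1))
```

Here SPI < 1 and CPI < 1 together flag a project that is both behind schedule and over budget, which is exactly the pattern the delay factors above would produce.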
Procedia PDF Downloads 97
26802 Adoption of Big Data by Global Chemical Industries
Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta
Abstract:
The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Despite the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores how professional competencies and data science influence BD adoption in chemical industries, helping them move towards intelligent manufacturing quickly and reliably. The article draws on a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.
Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science
Procedia PDF Downloads 85
26801 Psychometric Examination of Atma Jaya's Multiple Intelligence Batteries for University Students
Authors: Angela Oktavia Suryani, Bernadeth Gloria, Edwin Sutamto, Jessica Kristianty, Ni Made Rai Sapitri, Patricia Catherine Agla, Sitti Arlinda Rochiadi
Abstract:
It was found that some blogs and personal websites in Indonesia sell standardized intelligence tests (for example, Progressive Matrices (PM), the Intelligence Structure Test (IST), and the Culture Fair Intelligence Test (CFIT)) and other psychological tests, together with the manuals and answer keys, to the public. Individuals can buy these and prepare themselves for selection or recruitment with the real test. This practice drives people to lie to the institution (school or company) and also to themselves. It was also found that those tests are old: some items are not relevant to the current context, for example, a question about the diameter of a coin that no longer exists. These problems motivated us to develop a new intelligence test battery, namely the Multiple Aptitude Battery (MAB). The battery was built using Thurstone's Primary Mental Abilities theory and is intended for high school students, university students, and job applicants. It consists of nine subtests. In the current study, we examine six subtests, namely Reading Comprehension, Verbal Analogies, Numerical Inductive Reasoning, Numerical Deductive Reasoning, Mechanical Ability, and Two-Dimensional Spatial Reasoning, for university students. The study included data from 1,424 students recruited by convenience sampling from eight faculties at Atma Jaya Catholic University of Indonesia. Classical and modern (Item Response Theory) test approaches were applied to identify item difficulties, and confirmatory factor analysis was applied to examine internal validity. The validity of each subtest was inspected using the convergent-discriminant method, whereas reliability was examined using the Kuder-Richardson formula. The results showed that the majority of the subtests were of medium difficulty, and only one subtest, Verbal Analogies, was categorized as easy.
The items were found to be homogeneous and valid in measuring their constructs; however, at the subtest level, the construct validity examined by the convergent-discriminant method indicated that the subtests were not unidimensional, meaning they measured not only their own constructs but also other constructs. Three of the subtests were able to predict academic performance with a small effect size, namely Reading Comprehension, Numerical Inductive Reasoning, and Two-Dimensional Spatial Reasoning. GPAs at the intermediate level (the third semester and above) were considered a factor in predictive invalidity. The Kuder-Richardson formula showed that the reliability coefficients for both numerical reasoning subtests and spatial reasoning were superior, in the range 0.84-0.87, whereas the reliability coefficients for the other three subtests were relatively below the standard for ability tests, in the range of 0.65-0.71. It can be concluded that some of the subtests are ready to be used, whereas others still need revision. This study also demonstrated that the convergent-discriminant method is useful for identifying general intelligence.
Keywords: intelligence, psychometric examination, multiple aptitude battery, university students
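The Kuder-Richardson reliability used above (KR-20, for dichotomous items) can be computed directly from a matrix of 0/1 item scores. A minimal sketch, with a made-up response matrix rather than the study's data:

```python
# Kuder-Richardson formula 20 for dichotomous (0/1) item scores.
# The response matrix below is invented for illustration.
def kr20(responses):
    """responses: one list of 0/1 item scores per examinee."""
    n = len(responses)
    k = len(responses[0])
    totals = [sum(resp) for resp in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n  # population variance
    pq = 0.0
    for i in range(k):
        p = sum(resp[i] for resp in responses) / n      # proportion correct
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_t)

data = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
r = kr20(data)   # → 0.75
```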
Procedia PDF Downloads 436
26800 Use of Numerical Tools Dedicated to Fire Safety Engineering for the Rolling Stock
Authors: Guillaume Craveur
Abstract:
This study shows the opportunity to use numerical tools dedicated to fire safety engineering for rolling stock. Indeed, some legal requirements can now be demonstrated using numerical tools. The first part of this study presents the use of an evacuation modelling tool to satisfy the evacuation time criteria for rolling stock. The buildingEXODUS software is used to model and simulate the evacuation of rolling stock. First, in order to demonstrate the reliability of this tool for calculating the complete evacuation time, a comparative study was carried out between a real test and simulations done with buildingEXODUS. Multiple simulations are performed to capture the stochastic variations in egress times. Then, a new study calculates the complete evacuation time of a train with the same geometry but a different interior architecture. The second part of this study shows some applications of computational fluid dynamics. This work presents a multi-scale validation of numerical simulations of standardized tests with the Fire Dynamics Simulator (FDS) software developed by the National Institute of Standards and Technology (NIST). It first addresses the cone calorimeter test, described in the standard ISO 5660, which characterizes the fire reaction of materials. The aim of this process is to readjust measurement results from the cone calorimeter test in order to create a data set usable at the seat scale. In the second step, the modelling concerns the fire seat test described in the standard EN 45545-2. The data set obtained through the validation of the cone calorimeter test was used in the fire seat test. In the third step, after checking the data obtained for the seat from the cone calorimeter test, a larger-scale simulation with a real section of a train is carried out.
Keywords: fire safety engineering, numerical tools, rolling stock, multi-scale validation
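The repeated-simulation step can be illustrated with a toy Monte Carlo model of egress time. All parameters below (reaction times, walking times, exit service rate) are invented, and the model is far simpler than what buildingEXODUS computes; it only shows how repeated runs yield a distribution of egress times rather than a single value:

```python
import random

# Toy Monte Carlo egress model: total evacuation time is the last passenger's
# readiness time plus a simple queueing term at the exits. All parameters
# are invented for illustration.
random.seed(7)

def simulate_egress(n_passengers, exits=2, service=1.5):
    # readiness = reaction time (uniform) + walking time (gaussian), seconds
    ready = [random.uniform(5, 20) + random.gauss(30, 5)
             for _ in range(n_passengers)]
    return max(ready) + (n_passengers / exits) * service

runs = sorted(simulate_egress(80) for _ in range(500))
mean_time = sum(runs) / len(runs)
p95_time = runs[int(0.95 * len(runs))]   # 95th-percentile egress time
```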
Procedia PDF Downloads 303
26799 Secure Multiparty Computations for Privacy Preserving Classifiers
Authors: M. Sumana, K. S. Hareesha
Abstract:
Secure computations are essential when performing privacy-preserving data mining. Distributed privacy-preserving data mining involves two or more sites that cannot pool their data with a third party due to laws protecting individual privacy. Hence, in order to model the private data without compromising privacy or incurring information loss, secure multiparty computations are used. Secure computations of the product, mean, variance, dot product, and sigmoid function using the additive and multiplicative homomorphic properties are discussed. The computations are performed on vertically partitioned data, with a single site holding the class value.
Keywords: homomorphic property, secure product, secure mean and variance, secure dot product, vertically partitioned data
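The additive homomorphic property behind a secure dot product can be sketched with a toy Paillier cryptosystem (tiny, insecure primes, for demonstration only; this is an illustration of the general technique, not the authors' exact protocol): one site encrypts its vector, and the other raises each ciphertext to its own plaintext component, yielding an encryption of the dot product without either side revealing its data.

```python
import math
import random

# Toy Paillier cryptosystem illustrating the additive homomorphic property;
# the primes are tiny and insecure, for demonstration only.
p, q = 11, 13
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)      # Python 3.9+
g = n + 1

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)   # modular inverse (Python 3.8+)

def enc(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    return (L(pow(c, lam, n2)) * mu) % n

def secure_dot(enc_x, y):
    # prod_i Enc(x_i)^y_i = Enc(sum_i x_i * y_i): the holder of y never
    # sees x, and only the final dot product is revealed on decryption.
    acc = 1
    for cx, yi in zip(enc_x, y):
        acc = (acc * pow(cx, yi, n2)) % n2
    return acc

x, y = [3, 1, 4], [2, 0, 5]
dot = dec(secure_dot([enc(xi) for xi in x], y))   # → 26
```

Note that plaintexts must stay below n (143 here); a real deployment would use keys of 2048 bits or more.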
Procedia PDF Downloads 412
26798 Contrastive Learning for Unsupervised Object Segmentation in Sequential Images
Authors: Tian Zhang
Abstract:
Unsupervised object segmentation aims at segmenting objects in sequential images and obtaining the mask of each object without any manual intervention. Unsupervised segmentation remains a challenging task due to the lack of prior knowledge about the objects. Previous methods often require manually specifying the action of each object, which is often difficult to obtain. Instead, the method proposed in this paper needs no action information and automatically learns the actions and relations among objects from the structured environment. To obtain the object segmentation of sequential images, the relationships between objects and images are extracted to infer the actions and interactions of objects based on the multi-head attention mechanism. Three types of object relationships in the segmentation task are proposed: the relationship between objects in the same frame, the relationship between objects in two frames, and the relationship between objects and historical information. Based on these relationships, the proposed model (1) is effective in multiple-object segmentation tasks, (2) needs only images as input, and (3) produces better segmentation results as more relationships are considered. Experimental results on multiple datasets show that the method achieves state-of-the-art performance. Quantitative and qualitative analyses of the results are conducted, and the proposed method can easily be extended to other similar applications.
Keywords: unsupervised object segmentation, attention mechanism, contrastive learning, structured environment
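The multi-head attention mechanism underlying the relationship extraction builds on scaled dot-product attention. A minimal single-head sketch in pure Python, with invented toy matrices (a real model would use a tensor library and learned projections):

```python
import math

# Single-head scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V.
# The tiny Q/K/V matrices below are invented for illustration.
def softmax(row):
    m = max(row)
    exps = [math.exp(v - m) for v in row]
    s = sum(exps)
    return [e / s for e in exps]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def attention(Q, K, V):
    d_k = len(K[0])
    scores = matmul(Q, [list(col) for col in zip(*K)])   # Q K^T
    weights = [softmax([s / math.sqrt(d_k) for s in row]) for row in scores]
    return matmul(weights, V), weights

Q = [[1.0, 0.0]]                      # one query
K = [[1.0, 0.0], [0.0, 1.0]]          # two keys
V = [[10.0, 0.0], [0.0, 10.0]]        # two values
out, w = attention(Q, K, V)           # the query attends mostly to key 0
```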
Procedia PDF Downloads 109
26797 A Validated UPLC-MS/MS Assay Using Negative Ionization Mode for High-Throughput Determination of Pomalidomide in Rat Plasma
Authors: Muzaffar Iqbal, Essam Ezzeldin, Khalid A. Al-Rashood
Abstract:
Pomalidomide is a second-generation oral immunomodulatory agent used for the treatment of multiple myeloma in patients with disease refractory to lenalidomide and bortezomib. In this study, a sensitive UPLC-MS/MS assay was developed and validated for high-throughput determination of pomalidomide in rat plasma using celecoxib as an internal standard (IS). Liquid-liquid extraction with dichloromethane as the extracting agent was employed to extract pomalidomide and the IS from 200 µL of plasma. Chromatographic separation was carried out on an Acquity BEH C18 column (50 × 2.1 mm, 1.7 µm) using an isocratic mobile phase of acetonitrile:10 mM ammonium acetate (80:20, v/v) at a flow rate of 0.250 mL/min. Pomalidomide and the IS eluted at 0.66 ± 0.03 and 0.80 ± 0.03 min, respectively, with a total run time of only 1.5 min. Detection was performed on a triple quadrupole tandem mass spectrometer using electrospray ionization in negative mode. The precursor-to-product ion transitions of m/z 272.01 → 160.89 for pomalidomide and m/z 380.08 → 316.01 for the IS were used for quantification in multiple reaction monitoring mode. The method was validated according to regulatory guidelines for bioanalytical method validation. Linearity in plasma was achieved over the concentration range of 0.47-400 ng/mL (r² ≥ 0.997). Intra- and inter-day precision values were ≤ 11.1% (RSD, %), whereas accuracy values ranged from -6.8% to 8.5% (RE, %). Other validation results were also within the acceptance criteria, and the method was successfully applied in a pharmacokinetic study of pomalidomide in rats.
Keywords: pomalidomide, pharmacokinetics, LC-MS/MS, celecoxib
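The precision (RSD, %) and accuracy (RE, %) figures reported above follow standard definitions. A small sketch with hypothetical quality-control replicates, not the study's measurements:

```python
import statistics

# Relative standard deviation (precision) and relative error (accuracy)
# as used in bioanalytical method validation; the QC replicate values and
# nominal concentration below are hypothetical.
def rsd_percent(values):
    return statistics.stdev(values) / statistics.mean(values) * 100

def re_percent(values, nominal):
    return (statistics.mean(values) - nominal) / nominal * 100

qc = [98.0, 102.0, 100.0, 104.0, 96.0]   # measured ng/mL, nominal 100 ng/mL
precision = rsd_percent(qc)              # ≈ 3.16 %
accuracy = re_percent(qc, 100.0)         # 0.0 %
```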
Procedia PDF Downloads 391
26796 A webGIS Methodology to Support Sediments Management in Wallonia
Authors: Nathalie Stephenne, Mathieu Veschkens, Stéphane Palm, Christophe Charlemagne, Jacques Defoux
Abstract:
According to Europe’s first River Basin Management Plans (RBMPs), 56% of European rivers failed to achieve the good-status targets of the Water Framework Directive (WFD). In central European countries such as Belgium, even more than 80% of rivers failed to achieve the WFD quality targets. Although the RBMPs should reduce the stressors and improve water body status, their potential to address multiple-stress situations is limited due to insufficient knowledge of combined effects, multi-stress, prioritization of measures, impact on ecology, and implementation effects. This paper describes a webGIS prototype developed for the Walloon administration to improve the communication and management of the sediment dredging actions carried out in rivers and lakes in the frame of the RBMPs. A large number of stakeholders are involved in the management of rivers and lakes in Wallonia. They are in charge of technical aspects (clients and dredging operators, organizations involved in the treatment of waste…), management (managers involved in WFD implementation at the communal, provincial or regional level) or policy making (people responsible for policy compliance or legislation revision). These different kinds of stakeholders need different information and data to carry out their duties but have to interact closely at different levels. Moreover, information has to be shared between them to improve the management quality of dredging operations within the ecological system. Under Walloon legislation, levelling dredged sediments on banks requires an official authorization from the administration. This request refers to spatial information such as the official land use map, the cadastral map, and the distance to potential pollution sources. The production of a collective geodatabase can facilitate the management of these authorizations from both sides.
The proposed internet system integrates documents, data input, integration of data from disparate sources, map representation, database queries, analysis of monitoring data, presentation of results, and cartographic visualization. A prototype web application using the geoviewer API chosen by the Geomatics department of the SPW has been developed and discussed with some potential users to facilitate communication, management, and data quality. The structure of the paper states the why, what, who, and how of this communication tool.
Keywords: sediments, web application, GIS, rivers management
Procedia PDF Downloads 405
26795 Feature Analysis of Predictive Maintenance Models
Authors: Zhaoan Wang
Abstract:
Research in predictive maintenance modeling has improved in recent years, predicting failures and needed maintenance with high accuracy, saving cost, and improving manufacturing efficiency. However, classic prediction models provide little insight into the most important features contributing to failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict multiple classes of failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear explainer SHAP (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of the more important features over the less important ones.
Keywords: automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation
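The ranking-and-combining step can be sketched with a simple Borda count over the orderings produced by different feature selection algorithms. The feature names and rankings below are invented, and Borda counting is one common aggregation choice rather than necessarily the one used in the study:

```python
# Combine feature rankings from several selection algorithms via Borda count:
# a feature ranked first among k features earns k points, second earns k-1,
# and so on; scores are summed across rankings.
def borda_combine(rankings):
    """rankings: list of lists, each ordered best-to-worst feature name."""
    scores = {}
    for ranking in rankings:
        k = len(ranking)
        for pos, feat in enumerate(ranking):
            scores[feat] = scores.get(feat, 0) + (k - pos)
    return sorted(scores, key=lambda f: -scores[f])

combined = borda_combine([
    ["temp", "vibration", "torque"],   # e.g. mutual-information ranking
    ["vibration", "temp", "torque"],   # e.g. chi-squared ranking
    ["temp", "torque", "vibration"],   # e.g. tree-based importance ranking
])
# → ["temp", "vibration", "torque"]
```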
Procedia PDF Downloads 133
26794 The Determinants of Financial Stability: Evidence from Jordan
Authors: Wasfi Al Salamat, Shaker Al-Kharouf
Abstract:
This study examines the determinants of financial stability for 13 commercial banks listed on the Amman Stock Exchange (ASE) over the period 2007-2016, controlling for the independent variables return on equity (ROE), return on assets (ROA), earnings per share (EPS), growth in gross domestic product (GDP), inflation rate, and debt ratio, and measuring financial stability by three main variables: capital adequacy, non-performing loans, and the number of returned checks. The balanced panel data statistical approach was used for data analysis, and results were estimated using multiple regression models. The empirical results suggest a statistically significant negative effect of the inflation rate and debt ratio on capital adequacy, and a statistically significant positive effect of GDP growth on capital adequacy. In contrast, there is a statistically significant negative effect of return on equity and GDP growth on non-performing loans, and a statistically significant positive effect of the inflation rate on non-performing loans. Finally, there is a statistically significant negative effect of GDP growth on the number of returned checks, and a statistically significant positive effect of the inflation rate on the number of returned checks.
Keywords: capital adequacy, financial stability, non-performing loans, number of returned checks, ASE
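Each regression in the study relates a stability measure to bank-level and macro predictors. A minimal one-predictor least-squares sketch, with invented figures chosen so the slope is negative (echoing the reported inflation/capital-adequacy relationship; a full multiple regression would solve the normal equations for several predictors at once):

```python
# Ordinary least squares for a single predictor; the inflation and capital
# adequacy figures are invented to illustrate a negative relationship.
def ols_simple(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))
    alpha = my - beta * mx
    return alpha, beta

inflation = [1.0, 2.0, 3.0, 4.0]          # hypothetical rates (%)
capital_adequacy = [10.0, 8.0, 6.0, 4.0]  # hypothetical ratios (%)
alpha, beta = ols_simple(inflation, capital_adequacy)   # beta = -2.0
```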
Procedia PDF Downloads 224
26793 Time Travel Testing: A Mechanism for Improving Renewal Experience
Authors: Aritra Majumdar
Abstract:
While organizations strive to expand their new customer base, retaining existing relationships is key to improving overall profitability and also showcases how successful an organization is at holding on to its customers. It is widely observed that the lion's share of profit comes from existing customers, so seamless management of renewal journeys across different channels goes a long way towards improving trust in the brand. From a quality assurance standpoint, time travel testing gives both business and technology teams an approach to enhancing the customer experience when customers look to extend their partnership with the organization for a defined period of time. This whitepaper focuses on three key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it calls out some best practices and common accelerator implementation ideas that are generic across verticals like healthcare, insurance, etc. This abstract provides a high-level snapshot of these pillars. Time Travel Planning: The first step in setting up a time travel testing roadmap is appropriate planning. Planning includes identifying the impacted systems that need to be time travelled backward or forward depending on the business requirement, aligning time travel with other releases, deciding the frequency of time travel testing, preparing to handle renewal issues in production after time travel testing is done and, most importantly, planning for test automation during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover the required customer segments and narrowing it down to multiple offer sequences based on defined parameters are key to successful time travel testing.
Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section talks about the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board. Along with that, the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section talks about the focus areas of enterprise automation and how automation testing can be leveraged to improve overall quality without compromising the project schedule. Along with the above-mentioned items, the white paper elaborates on the best practices that need to be followed during time travel testing and some ideas pertaining to accelerator implementation. To sum up, this paper is based on the author's real-world experience of time travel testing. While actual customer names and program-related details are not disclosed, the paper highlights key learnings that will help other teams implement time travel testing successfully.
Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas
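One common mechanic behind time travel testing is shifting the date fields of test records backward or forward so they land inside a renewal window. A hedged sketch of such a data-preparation helper; the record layout and field names are hypothetical, not drawn from any particular system:

```python
from datetime import datetime, timedelta

# Shift the date fields of a test record to simulate time travel;
# the record layout and field names below are hypothetical.
DATE_FIELDS = ("policy_start", "renewal_due")

def time_travel(record, days):
    """Return a copy of record with its date fields shifted by `days`."""
    shifted = dict(record)
    for field in DATE_FIELDS:
        if field in shifted:
            shifted[field] += timedelta(days=days)
    return shifted

rec = {"customer_id": 42,
       "policy_start": datetime(2024, 1, 1),
       "renewal_due": datetime(2025, 1, 1)}
moved = time_travel(rec, days=-30)   # travel 30 days backward
```

Working on a copy keeps the original record intact, so the same seed data can be travelled to several points in time within one test run.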
Procedia PDF Downloads 159
26792 South African Multiple Deprivation-Concentration Index Quantiles Differentiated by Components of Success and Impediment to Tuberculosis Control Programme Using Mathematical Modelling in Rural O. R. Tambo District Health Facilities
Authors: Ntandazo Dlatu, Benjamin Longo-Mbenza, Andre Renzaho, Ruffin Appalata, Yolande Yvonne Valeria Matoumona Mavoungou, Mbenza Ben Longo, Kenneth Ekoru, Blaise Makoso, Gedeon Longo Longo
Abstract:
Background: The gap between the complexities of integrating tuberculosis/HIV control and evidence-based knowledge motivated this study. Its objective was to explore correlations between national TB management guidelines, multiple deprivation indexes, quantiles, components, and levels of the tuberculosis control programme using mathematical modelling in rural O.R. Tambo District health facilities, South Africa. Methods: The study used mixed secondary data analysis and cross-sectional analysis between 2009 and 2013 across O.R. Tambo District, Eastern Cape, South Africa, using univariate/bivariate analysis, linear multiple regression models, and multivariate discriminant analysis. Health inequality indicators and components of impediment to the tuberculosis control programme were evaluated. Results: In total, 62,400 TB notification records were analyzed for the period 2009-2013. There was a significant but negative correlation between financial year expenditure (r = -0.894; P = 0.041), seropositive HIV status (r = -0.979; P = 0.004), population density (r = -0.881; P = 0.048) and the number of TB defaulters among all TB cases. Unsuccessful control of the TB management programme was shown through correlations between the numbers of new PTB smear-positive cases, TB defaulters among new smear-positive cases, TB failures among all TB cases, the pulmonary tuberculosis case-finding index, and the deprivation-concentration-dispersion index. Successful TB programme control was shown through significant negative associations between declining numbers of deaths from HIV-TB co-infection, TB deaths among all TB cases, and the SMIAD gradient/deprivation-concentration-dispersion index. The multivariate linear model was summarized by an unadjusted r of 96%, an adjusted R² of 95%, a standard error of the estimate of 0.110, an R² change of 0.959, and a significance of the variance change of P = 0.004, predicting TB defaulters among all TB cases with the equation y = 8.558 - 0.979 × (number of HIV seropositive).
After adjusting for confounding factors (PTB case-finding index, TB defaulters among new smear-positive cases, TB deaths among all TB cases, TB defaulters among all TB cases, and TB failures among all TB cases), HIV and TB deaths, as well as new PTB smear-positive cases, were identified by discriminant analysis as the most important, significant, and independent indicators discriminating the most deprived deprivation index from deprivation quintiles 2-5. Conclusion: Eliminating poverty conditions such as overcrowding, lack of sanitation, and environments with the highest burden of HIV might end the TB threat in O.R. Tambo District, Eastern Cape, South Africa. Furthermore, an ongoing, adequately budgeted, comprehensive, holistic and collaborative initiative towards the Sustainable Development Goals (SDGs) is necessary for the complete elimination of TB in the impoverished O.R. Tambo District.
Keywords: tuberculosis, HIV/AIDS, success, failure, control program, health inequalities, South Africa
Procedia PDF Downloads 170
26791 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created using source code, processed metrics from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct a fault prediction model for such a company, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data is available. We analyze seven data sets from the NASA Metrics Data Program, which offer design as well as code metrics. Overall, the results of cross-project learning are comparable to within-company learning.
Keywords: software metrics, fault prediction, cross project, within project
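The Naïve Bayes evaluation can be illustrated with a minimal Gaussian Naive Bayes classifier over two numeric design metrics. The training rows below are invented for illustration, not taken from the NASA MDP data sets:

```python
import math
from collections import defaultdict

# Minimal Gaussian Naive Bayes: per-class priors plus per-feature mean and
# variance, combined as a sum of log-likelihoods at prediction time.
# The two-metric training data below is made up for illustration.
def fit(X, y):
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    stats = {}
    for c, rows in by_class.items():
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        vars_ = [sum((v - m) ** 2 for v in col) / n + 1e-9
                 for col, m in zip(zip(*rows), means)]
        stats[c] = (n / len(X), means, vars_)
    return stats

def predict(stats, x):
    best, best_lp = None, -math.inf
    for c, (prior, means, vars_) in stats.items():
        lp = math.log(prior)
        for v, m, s2 in zip(x, means, vars_):
            lp += -0.5 * math.log(2 * math.pi * s2) - (v - m) ** 2 / (2 * s2)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

X = [[10, 1.0], [12, 1.2], [30, 3.0], [28, 2.8]]  # e.g. coupling, complexity
y = ["clean", "clean", "faulty", "faulty"]
model = fit(X, y)
pred = predict(model, [29, 2.9])   # → "faulty"
```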
Procedia PDF Downloads 344