Search results for: electroacoustic methods
14931 An Efficient Propensity Score Method for Causal Analysis With Application to Case-Control Study in Breast Cancer Research
Authors: Ms Azam Najafkouchak, David Todem, Dorothy Pathak, Pramod Pathak, Joseph Gardiner
Abstract:
Propensity score (PS) methods have recently become a standard tool for causal inference in observational studies, where exposure is not randomly assigned and confounding can therefore bias the estimation of the treatment effect on the outcome. For a binary outcome, the effect of treatment can be estimated by odds ratios, relative risks, and risk differences. However, different PS methods may give different estimates of the treatment effect. Several PS methods are in common use, including matching, inverse probability weighting, stratification, and covariate adjustment on the PS. Because of the dangers of discretizing continuous variables (exposure, covariates), the focus of this paper is on how variation in cut-points or boundaries affects the average treatment effect (ATE) estimated by the stratification-of-PS method. We therefore try to avoid choosing arbitrary cut-points; instead, we continuously discretize the PS and accumulate information across all cut-points for inference. We use Monte Carlo simulation to evaluate the ATE, focusing on two PS methods, stratification and covariate adjustment on the PS. We then show how this can be observed in analyses of data from a case-control study of breast cancer, the Polish Women’s Health Study.
Keywords: average treatment effect, propensity score, stratification, covariate adjustment, Monte Carlo estimation, breast cancer, case-control study
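As a concrete illustration of the stratification approach, here is a minimal sketch, assuming simulated data, a logistic PS model, and five quintile cut-points (none of which come from the paper); it estimates the ATE as a risk difference averaged over strata and does not reproduce the authors' continuous-discretization estimator.

```python
# Illustrative sketch only: ATE via propensity-score stratification on
# simulated data. The data-generating model, logistic PS model, and the
# five quintile cut-points are assumptions for demonstration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=(n, 2))                                   # confounders
t = rng.binomial(1, 1 / (1 + np.exp(-(x[:, 0] - 0.5 * x[:, 1]))))  # exposure
y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * t + x[:, 0]))))        # outcome

ps = LogisticRegression().fit(x, t).predict_proba(x)[:, 1]   # propensity score

# Stratify on PS quintiles; these cut-points are arbitrary, which is exactly
# the sensitivity the paper studies.
edges = np.quantile(ps, np.linspace(0, 1, 6))
strata = np.clip(np.digitize(ps, edges[1:-1]), 0, 4)
ate = np.mean([y[(strata == s) & (t == 1)].mean() -
               y[(strata == s) & (t == 0)].mean() for s in range(5)])
print(f"stratified ATE estimate (risk difference): {ate:.3f}")
```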
Procedia PDF Downloads 107
14930 An Approach to Capture, Evaluate and Handle Complexity of Engineering Change Occurrences in New Product Development
Authors: Mohammad Rostami Mehr, Seyed Arya Mir Rashed, Arndt Lueder, Magdalena Missler-Behr
Abstract:
This paper presents the notion that complex problems do not necessarily need a similarly complex solution in order to cope with the complexity; a simple solution based on established methods can be a sufficient way to deal with it. To test this notion, the paper focuses on the field of change management as a part of the new product development process in the automotive sector. In this field, dealing with increasing complexity is essential, while only rigid, inflexible processes that are not designed to handle complexity are available. The basic methodology of this paper can be divided into four main sections: 1) analyzing the complexity of change management, 2) reviewing the literature to identify potential solutions and methods, 3) capturing and implementing the expertise of experts from the change management field of an automobile manufacturing company, and 4) systematically comparing the methods identified in the literature and connecting them with the defined requirements of change management complexity in order to develop a solution. As a practical outcome, this paper provides a method to capture the complexity of engineering changes (EC) and include it in the EC evaluation process, following case-related process guidance to cope with the complexity. Furthermore, this approach supports the notion that dealing with complexity is possible with rather simple and established methods, combined into a powerful tool.
Keywords: complexity management, new product development, engineering change management, flexibility
Procedia PDF Downloads 198
14929 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods
Authors: H. J. Wattimanela, U. S. Passaribu, A. N. T. Puspito, S. W. Indratno
Abstract:
The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific, and Philippine plates. Between the Sangihe arc to the west of the collision zone and the Halmahera arc to the east, the collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone in terms of the distribution of earthquakes in each partition region, the type of distribution of earthquake occurrences in each partition region, the mean earthquake occurrence in each partition region, and the correlation between partition regions. We count earthquakes using a partition method and analyze their behavior with conventional statistical methods. The data used are shallow earthquakes with magnitudes ≥ 4 on the Richter scale for the period 1964-2013 in the Molluca Collision Zone. From the results, we can classify partition regions based on the correlation into two classes: strong and very strong. This classification can be used for an early warning system in disaster management.
Keywords: molluca collision zone, partition regions, conventional statistical methods, earthquakes, classifications, disaster management
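The classification step can be illustrated with a hedged sketch: correlate yearly earthquake counts between partition regions and label each pair by correlation strength. The Poisson counts and the 0.6/0.8 class boundaries below are assumptions for demonstration, not the study's values.

```python
# Hedged illustration of correlation-based classification of partition regions;
# the counts and the 0.6/0.8 class boundaries are invented for demonstration.
import numpy as np

rng = np.random.default_rng(1)
years, n_regions = 50, 4
counts = rng.poisson(lam=20, size=(years, n_regions))  # yearly counts per region

r = np.corrcoef(counts, rowvar=False)                  # pairwise correlations
for i in range(n_regions):
    for j in range(i + 1, n_regions):
        r_ij = abs(r[i, j])
        label = ("very strong" if r_ij >= 0.8
                 else "strong" if r_ij >= 0.6 else "weaker")
        print(f"regions {i}-{j}: r={r[i, j]:+.2f} ({label})")
```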
Procedia PDF Downloads 499
14928 The Effect of Simultaneous Application of Laser Beam and Magnet in Treatment of Intervertebral Disc Herniation
Authors: Alireza Moghtaderi, Negin Khakpour
Abstract:
Disc herniation is a common complication in society and one of the main reasons for referral to physical medicine and rehabilitation clinics. Despite the various methods proposed for treating this disease, there is still disagreement on their success, especially for non-surgical methods; the current study therefore aims to determine the effect of laser beam and magnet therapy on the treatment of intervertebral disc herniation. In a clinical trial, 80 patients with intervertebral disc herniation underwent a combined package of treatment including magnet, laser beam, PRP, and prolotherapy over 6 months. The average age of patients was 51.25 ± 10.7 years, with a range of 25 - 71 years. 30 men (37.5%) and 50 women (62.5%) took part in the study. The average weight of patients was 64.3 ± 7.2 kg, with a range of 49 - 79 kg. The most frequent level of disc herniation was L5 - S1, with 17 cases (21.3%). Disc herniation was severe in 30 cases before treatment, but this was reduced to 3 cases after treatment. This study indicates that combined treatment using non-invasive laser beam and magnet therapy is highly effective for discogenic diseases and mechanical pains of the spine.
Keywords: hallux, valgus, botulinum toxin a, pain
Procedia PDF Downloads 92
14927 Observed Changes in Constructed Precipitation at High Resolution in Southern Vietnam
Authors: Nguyen Tien Thanh, Günter Meon
Abstract:
Precipitation plays a key role in the water cycle, in defining local climatic conditions, and in ecosystems. It is also an important input parameter for water resources management and hydrologic models. With spatially continuous data, the certainty of discharge predictions or other environmental factors is unquestionably better than without. Such data are, however, not always readily available for a small basin, especially in the coastal region of Vietnam, due to the sparse network of meteorological stations (30 stations) along a coastline of 3,260 km. Furthermore, the available gridded precipitation datasets are not fine enough for application in hydrologic models. Under conditions of global warming, the application of spatial interpolation methods is crucial for climate change impact studies in order to obtain spatially continuous data. Although in recent research projects some methods can perform better than others, no method gives the best results for all cases. The objective of this paper, therefore, is to investigate different spatial interpolation methods for daily precipitation over a small basin (approximately 400 km2) located in the coastal region of Southern Vietnam and to find the most efficient interpolation method for this catchment. Five interpolation methods, consisting of Cressman, ordinary kriging, regression kriging, dual kriging, and inverse distance weighting, were applied to identify the best method for the area of study on the spatio-temporal scale (daily, 10 km x 10 km). A 30-year precipitation database was created and merged with available gridded datasets. Finally, observed changes in the constructed precipitation were analyzed. The results demonstrate that ordinary kriging interpolation is an effective approach for analyzing the daily precipitation. Mixed trends of increasing and decreasing monthly, seasonal, and annual precipitation were documented at significant levels.
Keywords: interpolation, precipitation, trend, vietnam
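As a flavor of one of the five compared methods, here is a minimal inverse distance weighting (IDW) sketch; the station coordinates, rainfall values, and power parameter p=2 are illustrative assumptions, not the study's data.

```python
# Minimal IDW sketch: estimate precipitation at grid points from stations.
# Coordinates, values, and p=2 are assumptions for demonstration.
import numpy as np

def idw(stations, values, grid_points, p=2.0):
    """Interpolate station precipitation onto grid points by IDW."""
    d = np.linalg.norm(grid_points[:, None, :] - stations[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)            # avoid division by zero at a station
    w = 1.0 / d ** p
    return (w * values).sum(axis=1) / w.sum(axis=1)

stations = np.array([[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]])   # km
rain_mm = np.array([12.0, 30.0, 18.0])                       # daily totals (mm)
grid = np.array([[2.0, 1.0], [7.0, 4.0]])
print(idw(stations, rain_mm, grid))    # estimated daily precipitation (mm)
```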
Procedia PDF Downloads 276
14926 Students’ and Clinical Supervisors’ Experiences of Occupational Therapy Practice Education: A Structured Critical Review
Authors: Hamad Alhamad, Catriona Khamisha, Emma Green, Yvonne Robb
Abstract:
Introduction: Practice education is a key component of occupational therapy education. This critical review aimed to explore students’ and clinical supervisors’ experiences of practice education, and to make recommendations for research. Method: The literature was systematically searched using five databases. Qualitative, quantitative and mixed methods studies were included. The Critical Appraisal Skills Programme checklist for qualitative studies and the Mixed Methods Assessment Tool for quantitative and mixed methods studies were used to assess study quality. Findings: Twenty-two studies with high quality scores were included: 16 qualitative, 3 quantitative and 3 mixed methods. Studies were conducted in Australia, Canada, USA and UK. During practice education, students learned professional skills, practical skills, clinical skills and problem-solving skills, and improved confidence and creativity. Supervisors had an opportunity to reflect on their practice and to gain experience of supervising students. However, clear objectives and expectations for students, and sufficient theoretical knowledge, preparation and resources for supervisors, were required. Conclusion: Practice education provides the different skills and experiences necessary to become competent professionals, but some areas of practice education need to improve. Studies in non-western countries are needed to explore the perspectives of students and clinical supervisors in different cultures, to ensure the practice education models adopted are relevant.
Keywords: occupational therapy, practice education, fieldwork, students, clinical supervisors
Procedia PDF Downloads 203
14925 Statistical and Land Planning Study of Tourist Arrivals in Greece during 2005-2016
Authors: Dimitra Alexiou
Abstract:
During the last 10 years, in spite of the economic crisis, the number of tourists arriving in Greece has increased, particularly during the tourist season from April to October. In this paper, the number of annual tourist arrivals is studied to explore their preferences with regard to the month of travel and the selected destinations, as well as the amount of money spent. The collected data are processed with statistical methods, yielding numerical and graphical results. From the computation of statistical parameters and the forecasting with exponential smoothing, useful conclusions are drawn that can be used by the Greek tourism authorities, as well as by tourist organizations, for planning purposes for the coming years. The results of this paper and the computed forecasts can also be used for decision making by private tourist enterprises that are investing in Greece. With regard to the statistical methods, simple exponential smoothing of time series data is employed. The search for the best forecast for 2017 and 2018 provides the value of the smoothing coefficient. Microsoft Excel is used for all statistical computations and graphics.
Keywords: tourism, statistical methods, exponential smoothing, land spatial planning, economy
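A minimal sketch of simple exponential smoothing with a grid search for the smoothing coefficient, mirroring the search for a best forecast described above; the arrival figures below are made up for illustration and are not the Greek statistics.

```python
# Simple exponential smoothing with a small alpha grid search; the annual
# arrival totals are hypothetical placeholders, not real data.
import numpy as np

arrivals = np.array([14.9, 15.0, 16.4, 17.9, 15.0, 22.0, 23.6, 24.8,
                     21.5, 23.6, 26.1, 24.8])   # hypothetical totals (millions)

def ses(series, alpha):
    """Return the smoothed level after each observation."""
    level, fitted = series[0], [series[0]]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
        fitted.append(level)
    return np.array(fitted)

# pick alpha minimizing one-step-ahead squared error
best = min(np.arange(0.05, 1.0, 0.05),
           key=lambda a: np.mean((arrivals[1:] - ses(arrivals, a)[:-1]) ** 2))
print(f"best alpha={best:.2f}, next-year forecast: {ses(arrivals, best)[-1]:.1f}")
```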
Procedia PDF Downloads 265
14924 Estimating Estimators: An Empirical Comparison of Non-Invasive Analysis Methods
Authors: Yan Torres, Fernanda Simoes, Francisco Petrucci-Fonseca, Freddie-Jeanne Richard
Abstract:
Non-invasive samples are an alternative to collecting genetic samples directly: they are collected without manipulation of the animal (e.g., scats, feathers, and hairs). Nevertheless, the use of non-invasive samples has some limitations. The main issue is degraded DNA, leading to poorer extraction efficiency and genotyping errors. Those errors delayed the widespread use of non-invasive genetic information for some years. Genotyping errors can be limited by using analysis methods that accommodate the errors and singularities of non-invasive samples. Genotype matching and population estimation algorithms can be highlighted as important analysis tools that have been adapted to deal with those errors. Despite this recent development of analysis methods, there is still a lack of empirical performance comparisons among them. A comparison of methods on datasets differing in size and structure can be useful for future studies, since non-invasive samples are a powerful tool for obtaining information, especially for endangered and rare populations. To compare the analysis methods, four different datasets obtained from the Dryad digital repository were used. Three matching algorithms (Cervus, Colony and Error Tolerant Likelihood Matching - ETLM) were used for matching genotypes and two for population estimation (Capwire and BayesN). The three matching algorithms showed different patterns of results. ETLM produced fewer unique individuals and recaptures. A similarity in the matched genotypes between Colony and Cervus was observed, which is not a surprise given the similarity of those methods in their pairwise likelihood and clustering algorithms. The matches from ETLM showed almost no similarity with the genotypes matched by the other methods. The distinct clustering system and error model of ETLM seem to lead to a more stringent selection, although the processing time and user interface of ETLM were the worst among the compared methods. The population estimators performed differently depending on the dataset, with a consensus between the different estimators for only one dataset. BayesN produced both higher and lower estimates than Capwire. BayesN does not consider the total number of recaptures as Capwire does, only the recapture events, which makes the estimator sensitive to data heterogeneity, meaning different capture rates between individuals. In these examples, tolerance for heterogeneity seems to be crucial for BayesN to work properly. Both methods are user-friendly and have reasonable processing times. An expanded analysis with simulated genotype data could clarify the sensitivity of the algorithms. The present comparison of the matching methods indicates that Colony seems the most appropriate for general use, considering a time/interface/robustness balance. The heterogeneity of the recaptures strongly affected the BayesN estimates, leading to over- and underestimation of population numbers. Capwire is therefore advisable for general use, since it performs better in a wide range of situations.
Keywords: algorithms, genetics, matching, population
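To make the matching idea concrete, here is a hedged sketch of error-tolerant genotype matching: two multilocus genotypes are declared a recapture when they mismatch at no more than a tolerance number of alleles. The loci, genotypes, and threshold are invented, and real tools such as Cervus, Colony, and ETLM use likelihood models rather than this raw count.

```python
# Hedged sketch only: naive allele-mismatch matching with a tolerance; real
# matching software uses likelihood and clustering models instead.
samples = {
    "scat_01": [(120, 124), (88, 90), (201, 205)],
    "scat_02": [(120, 124), (88, 92), (201, 205)],   # one allele differs
    "hair_07": [(118, 126), (90, 94), (199, 203)],
}

def mismatches(g1, g2):
    """Count allele mismatches across loci (alleles compared after sorting)."""
    return sum(x != y
               for a, b in zip(g1, g2)
               for x, y in zip(sorted(a), sorted(b)))

tol = 1   # genotyping-error tolerance in alleles (assumed)
ids = list(samples)
for i in range(len(ids)):
    for j in range(i + 1, len(ids)):
        m = mismatches(samples[ids[i]], samples[ids[j]])
        verdict = "same individual" if m <= tol else "different"
        print(f"{ids[i]} vs {ids[j]}: {m} mismatch(es) -> {verdict}")
```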
Procedia PDF Downloads 144
14923 Bacteriological Culture Methods and its Uses in Clinical Pathology
Authors: Prachi Choudhary, Jai Gopal Sharma
Abstract:
Microbial cultures determine the type of organism, its abundance in the tested sample, or both. Culturing is one of the primary diagnostic methods of microbiology: it is used to determine the cause of infectious disease by letting the agent multiply in a predetermined medium. Different bacterial species produce colonies that may be quite distinct in appearance. To culture any pathogen or microorganism, we should first know the types of media used in microbiology for culturing; subculturing is also done when mixed growth is seen in a culture. There are three types of culture media based on consistency, solid, semi-solid, and liquid (broth), which are further explained in the report. The Five I's approach is a method for locating, growing, observing, and characterizing microorganisms: inoculation, incubation, isolation, inspection, and identification. To identify bacteria, samples such as urine, sputum, or blood are cultured on suitable media; there are different methods of culturing, such as the pour plate method, the streak plate method, swabbing by needle, pipetting, inoculation by loop, and spreading by spreader. After 24 hours of incubation, bacterial growth is examined and an antibiotic susceptibility test is conducted to establish which antibiotics the bacteria are sensitive or resistant to and to help identify the bacteria. Antibiotic susceptibility testing is done by various methods, such as the dilution method, the disk diffusion method, and the E-test. Medicines are then given to patients according to antibiotic sensitivity and resistance.
Keywords: inoculation, incubation, isolation, antibiotic susceptibility test, characterizing
Procedia PDF Downloads 83
14922 Analysis of Non-Coding Genome in Streptococcus pneumoniae for Molecular Epidemiology Typing
Authors: Martynova Alina, Lyubov Buzoleva
Abstract:
Streptococcus pneumoniae is the causative agent of pneumonia and meningitis throughout the world. Having high genetic diversity, this microorganism can cause different clinical forms of pneumococcal infection, and it is difficult to diagnose microbiologically by routine methods. Epidemiological surveillance also requires more developed methods of molecular typing, because the current method of serotyping does not properly distinguish invasive from non-invasive isolates. The non-coding genome of bacteria seems to be an interesting source of highly distinguishable markers for discriminating subspecies of a bacterium as variable as Streptococcus pneumoniae. Technically, we propose a scheme for discriminating S. pneumoniae strains by amplification of a non-coding region (SP_1932) followed by restriction with two enzymes, AluI and MnlI. Aim: This research aimed to compare different typing methods and their application for molecular epidemiology purposes. Methods: we analyzed a population of 100 strains of S. pneumoniae isolated from different patients using molecular epidemiology typing methods, namely pulsed-field gel electrophoresis (PFGE), restriction fragment length polymorphism analysis (RFLP), and multilocus sequence typing (MLST), all compared with the classic method of serotyping. The discriminative power was estimated with Simpson's index (SI). Results: The most discriminative typing method was RFLP (SI=0.97; 42 genotypes were distinguished). PFGE was slightly less discriminative (SI=0.95; 35 genotypes were identified). MLST is still the best reference method (SI=1.0). The classic method of serotyping showed rather weak discriminative power (SI=0.93, 24 genotypes). In addition, the sensitivity of RFLP was 100% and its specificity was 97.09%. Conclusion: the most appropriate method for routine epidemiological surveillance is RFLP of the non-coding region of Streptococcus pneumoniae, followed by PFGE, though in some cases the results should be confirmed by MLST.
Keywords: molecular epidemiology typing, non-coding genome, Streptococcus pneumoniae, MLST
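The discriminative power scores quoted above can be reproduced in form with Simpson's index of diversity; the genotype counts in this sketch are invented so that the result lands near the reported SI=0.97, and are not the study's data.

```python
# Simpson's index of diversity as used to score typing methods; the genotype
# group sizes below are illustrative placeholders.
def simpson_index(counts):
    """Hunter-Gaston discriminatory index: 1 - sum n_j(n_j-1) / N(N-1)."""
    n = sum(counts)
    return 1.0 - sum(c * (c - 1) for c in counts) / (n * (n - 1))

# e.g. 100 strains resolved into genotype groups of these sizes by some method
rflp_counts = [10, 8, 6] + [4] * 10 + [2] * 10 + [1] * 16
assert sum(rflp_counts) == 100
print(f"SI = {simpson_index(rflp_counts):.3f}  ({len(rflp_counts)} genotypes)")
```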
Procedia PDF Downloads 400
14921 Multilabel Classification with Neural Network Ensemble Method
Authors: Sezin Ekşioğlu
Abstract:
Multilabel classification is important for several applications and is also a challenging research topic. It is a kind of supervised learning with binary targets, and the difference from binary classification is that multilabel problems have more than one class: an instance can belong to one class or to many classes. There is a wide range of applications for multilabel prediction, such as image labeling, text categorization, and gene functionality. Even when instances can be classified into many classes, they may not always be classified properly. There are many ensemble methods for classification; however, most researchers have been concerned with better multilabel methods in general, and few focus on both the efficiency of classifiers and pairwise label relationships at the same time in order to achieve better multilabel classification. In this paper, we work on modified ensemble methods that benefit from k-Nearest Neighbors and a neural network structure, to address these issues in a beneficial way and to get better results from multilabel classification. Publicly available datasets (yeast, emotion, scene, and birds) are used to demonstrate the efficiency of the developed algorithm, and the technique is measured by accuracy, F1 score, and Hamming loss metrics. Our algorithm beats benchmarks for each dataset on different metrics.
Keywords: multilabel, classification, neural network, KNN
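A hedged sketch of a kNN plus neural-network ensemble on multilabel data, evaluated with the same three metrics; the probability-averaging fusion rule and the synthetic dataset are assumptions, not the paper's exact algorithm or benchmarks.

```python
# Illustrative kNN + MLP ensemble for multilabel data; the averaging fusion
# and synthetic data are assumptions for demonstration.
import numpy as np
from sklearn.datasets import make_multilabel_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, f1_score, hamming_loss

X, Y = make_multilabel_classification(n_samples=600, n_classes=5, random_state=0)
Xtr, Xte, Ytr, Yte = train_test_split(X, Y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=7).fit(Xtr, Ytr)
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500,
                    random_state=0).fit(Xtr, Ytr)

# average the per-label positive-class probabilities of the two models
p_knn = np.column_stack([p[:, 1] for p in knn.predict_proba(Xte)])
p_mlp = mlp.predict_proba(Xte)          # already (n_samples, n_labels)
Yhat = ((p_knn + p_mlp) / 2 >= 0.5).astype(int)

print("subset accuracy:", accuracy_score(Yte, Yhat))
print("micro F1:", f1_score(Yte, Yhat, average="micro"))
print("hamming loss:", hamming_loss(Yte, Yhat))
```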
Procedia PDF Downloads 155
14920 Flicker Detection with Motion Tolerance for Embedded Camera
Authors: Jianrong Wu, Xuan Fu, Akihiro Higashi, Zhiming Tan
Abstract:
CMOS image sensors with a rolling shutter are used widely in the digital cameras embedded in mobile devices. The rolling shutter suffers from flicker artifacts caused by fluorescent lamps, which can easily be observed. In this paper, the characteristics of illumination flicker in the motion case were analyzed, and two efficient detection methods based on matching-fragment selection were proposed. According to the experimental results, our methods achieve as high as 100% accuracy in static scenes and at least 97% in motion scenes.
Keywords: illumination flicker, embedded camera, rolling shutter, detection
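Since the paper's matching-fragment method is not detailed here, the sketch below shows only the generic principle: rolling-shutter flicker from a mains-powered lamp appears as a periodic pattern in the per-row mean brightness, detectable as a spectral peak. The frame size, banding period, and detection threshold are all assumptions.

```python
# Generic illustration only, not the authors' method: detect flicker banding
# as a peak in the spectrum of the per-row mean brightness.
import numpy as np

rows, cols, row_period = 480, 640, 60          # banding repeats every 60 rows
frame = 100 + 20 * np.sin(2 * np.pi * np.arange(rows) / row_period)[:, None]
frame = frame + np.random.default_rng(0).normal(0, 2, (rows, cols))

profile = frame.mean(axis=1) - frame.mean()    # per-row brightness, zero-mean
spectrum = np.abs(np.fft.rfft(profile))
peak = spectrum[1:].argmax() + 1               # dominant non-DC frequency bin
print(f"dominant period ~ {rows / peak:.0f} rows; "
      f"flicker suspected: {spectrum[peak] > 5 * spectrum[1:].mean()}")
```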
Procedia PDF Downloads 422
14919 Impact Position Method Based on Distributed Structure Multi-Agent Coordination with JADE
Authors: YU Kaijun, Liang Dong, Zhang Yarong, Jin Zhenzhou, Yang Zhaobao
Abstract:
For the impact monitoring of distributed structures, the traditional positioning methods are based on time differences and include the four-point arc positioning method and the triangulation positioning method. In actual operation, both methods have errors. In this paper, the multi-agent blackboard coordination principle is used to combine the two methods. The fusion steps are: (1) the four-point arc locating agent calculates the initial point and records it to the blackboard module; (2) the triangulation agent gets its initial parameters by accessing the initial point; (3) the triangulation agent repeatedly accesses the blackboard module to update its initial parameters and also logs its calculated point to the blackboard; (4) when a subsequent calculated point and the previous calculated point agree within the allowable error, the whole coordination fusion process is finished. This paper presents a multi-agent collaboration method whose agent framework is JADE. The JADE platform consists of several agent containers, with agents running in each container. Because of the excellent management and debugging tools of JADE, it is very convenient to deal with complex data in a large structure. Finally, based on the data in JADE, the results show that the impact location method based on multi-agent coordination fusion can reduce the error of the two methods.
Keywords: impact monitoring, structural health monitoring (SHM), multi-agent system (MAS), blackboard coordination, JADE
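The four fusion steps translate directly into code. The sketch below renders the coordination logic in Python (JADE itself is a Java agent framework, so this is a language-agnostic illustration); the solver stubs and the tolerance value are placeholders, not the paper's implementation.

```python
# Sketch of the blackboard fusion loop; solver stubs and tolerance are assumed.
class Blackboard:
    def __init__(self):
        self.points = []
    def post(self, p):
        self.points.append(p)
    def latest(self):
        return self.points[-1]

def four_point_arc_solver():
    return (0.52, 0.31)                  # stub: initial impact estimate (m)

def triangulation_solver(seed):
    x, y = seed                          # stub: refine around the seed point
    return (0.9 * x + 0.05, 0.9 * y + 0.03)

board, tol = Blackboard(), 1e-3
board.post(four_point_arc_solver())      # step 1: arc agent posts initial point
while True:
    prev = board.latest()                # steps 2-3: triangulation agent reads
    new = triangulation_solver(prev)     # the board and posts refinements
    board.post(new)
    if max(abs(a - b) for a, b in zip(new, prev)) < tol:
        break                            # step 4: successive points agree
print("fused impact position:", board.latest())
```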
Procedia PDF Downloads 178
14918 Modification of Underwood's Equation to Calculate Minimum Reflux Ratio for Column with One Side Stream Upper Than Feed
Authors: S. Mousavian, A. Abedianpour, A. Khanmohammadi, S. Hematian, Gh. Eidi Veisi
Abstract:
Distillation is one of the most important and widely used separation methods in industrial practice. There are different ways to design a distillation column; one of them is the shortcut method, in which material balances and equilibrium relations are employed to calculate the number of trays in the column. Several approaches are classified as shortcut methods; one of them is the Fenske-Underwood-Gilliland method, in which the minimum reflux ratio is calculated by the Underwood equation. Underwood proposed an equation that is useful for a simple distillation column with one feed and one top and one bottom product. In this study, the Underwood method is extended to predict the minimum reflux ratio for a column with one side stream above the feed. The results of this model were compared with the McCabe-Thiele method, and they show that the proposed method is able to calculate the minimum reflux ratio with very small error.
Keywords: minimum reflux ratio, side stream, distillation, Underwood’s method
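For reference, the classical (unmodified) Underwood calculation that the paper extends looks like this; the ternary feed composition, relative volatilities, and saturated-liquid feed are illustrative values, not the study's case.

```python
# Classical Underwood calculation for a simple column: find the root theta
# between the key volatilities, then evaluate Rmin. Inputs are illustrative.
from scipy.optimize import brentq

alpha = [2.5, 1.0, 0.4]          # relative volatilities (heavy key = reference)
z_feed = [0.4, 0.3, 0.3]         # feed mole fractions
x_dist = [0.95, 0.05, 0.0]       # distillate mole fractions
q = 1.0                          # saturated liquid feed

def underwood(theta):
    return sum(a * z / (a - theta) for a, z in zip(alpha, z_feed)) - (1 - q)

# the relevant root lies between the volatilities of the two keys
theta = brentq(underwood, 1.0 + 1e-6, 2.5 - 1e-6)
r_min = sum(a * x / (a - theta) for a, x in zip(alpha, x_dist)) - 1
print(f"theta = {theta:.4f}, minimum reflux ratio = {r_min:.3f}")
```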
Procedia PDF Downloads 407
14917 Determination of Biological Efficiency Values of Some Pesticide Application Methods under Second Crop Maize Conditions
Authors: Ali Bolat, Ali Bayat, Mustafa Gullu
Abstract:
Maize can be cultivated under both main-crop and second-crop conditions in Turkey. The main pests of maize under second-crop conditions are Sesamia nonagrioides Lefebvre (Lepidoptera: Noctuidae) and Ostrinia nubilalis Hübner (Lepidoptera: Crambidae). Aerial spraying to control these two main maize pests was carried out until 2006 in Turkey, when it was banned due to environmental concerns such as drift of the sprayed pesticides and low biological efficiency. In this context, sprayers that can treat tall maize plants (> 175 cm) from the ground have come into use. However, the biological efficiency of these sprayers is unknown. Methods to increase the success of ground spraying were therefore tested in field experiments conducted on second-crop maize in 2008 and 2009. For this aim, six spraying methods (air-assisted spraying with TX cone jet nozzles, domestic cone nozzles, twinjet nozzles, air-induction nozzles, standard domestic cone nozzles, and tail booms) were used at two application rates (150 and 300 l/ha) with one sprayer. Biological efficacy was evaluated in each plot by counting the number of insect-damaged plants, the number of holes in stems, and the live larvae and pupae in the stems of selected plants. As a result, the highest biological efficacy (close to 70%) was obtained with the air-assisted spraying method at an application volume of 300 l/ha.
Keywords: air assisted sprayer, drift nozzles, biological efficiency, maize plant
Procedia PDF Downloads 214
14916 Comparison between XGBoost, LightGBM and CatBoost Using a Home Credit Dataset
Authors: Essam Al Daoud
Abstract:
Gradient boosting methods have proven to be a very important strategy: many successful machine learning solutions have been developed using XGBoost and its derivatives. The aim of this study is to investigate and compare the efficiency of three gradient boosting methods. The Home Credit dataset, which contains 219 features and 356,251 records, is used in this work. New features are generated, and several techniques are used to rank and select the best features. The implementation indicates that LightGBM is faster and more accurate than CatBoost and XGBoost across varying numbers of features and records.
Keywords: gradient boosting, XGBoost, LightGBM, CatBoost, home credit
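A minimal timing-and-accuracy comparison in the spirit of the study; it runs on a synthetic stand-in rather than the Home Credit data, with near-default hyperparameters, so the numbers are only illustrative.

```python
# Sketch of the comparison loop; synthetic data and default-ish settings.
import time
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score
from xgboost import XGBClassifier
from lightgbm import LGBMClassifier
from catboost import CatBoostClassifier

X, y = make_classification(n_samples=50_000, n_features=50, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

models = {
    "XGBoost": XGBClassifier(n_estimators=200, verbosity=0),
    "LightGBM": LGBMClassifier(n_estimators=200, verbose=-1),
    "CatBoost": CatBoostClassifier(n_estimators=200, verbose=0),
}
for name, model in models.items():
    t0 = time.perf_counter()
    model.fit(Xtr, ytr)
    auc = roc_auc_score(yte, model.predict_proba(Xte)[:, 1])
    print(f"{name:8s}  fit {time.perf_counter() - t0:6.1f}s  AUC {auc:.4f}")
```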
Procedia PDF Downloads 174
14915 Software Development for AASHTO and Ethiopian Roads Authority Flexible Pavement Design Methods
Authors: Amare Setegn Enyew, Bikila Teklu Wodajo
Abstract:
The primary aim of flexible pavement design is to ensure the development of economical and safe road infrastructure. However, failures can still occur due to improper or erroneous structural design. In Ethiopia, the design of flexible pavements relies on manual calculations and on selecting the pavement structure from a catalogue. The catalogue offers, in eight different charts, alternative structures for combinations of traffic and subgrade classes, as outlined in the Ethiopian Roads Authority (ERA) Pavement Design Manual 2001. Furthermore, design modification is allowed in accordance with the structural number principles outlined in the AASHTO 1993 Guide for Design of Pavement Structures. Nevertheless, the manual calculation and design process involves the use of nomographs, charts, tables, and formulas, which increases the likelihood of human errors and inaccuracies, and this may lead to unsafe or uneconomical road construction. To address this challenge, a software package called AASHERA has been developed for the AASHTO 1993 and ERA design methods, using the MATLAB language. The software accurately determines the required thicknesses of the flexible pavement surface, base, and subbase layers for the two methods. It also digitizes design inputs and references such as nomographs, charts, default values, and tables. Moreover, the software allows easier comparison of the two design methods in terms of results and construction cost. AASHERA's accuracy has been confirmed through comparisons with designs from handbooks and manuals. The software can help reduce the human errors, inaccuracies, and time consumption of the conventional manual design methods employed in Ethiopia. AASHERA, with its validated accuracy, proves to be an indispensable tool for flexible pavement structure designers.
Keywords: flexible pavement design, AASHTO 1993, ERA, MATLAB, AASHERA
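The core relation such software must solve is the AASHTO 1993 design equation for the structural number SN; the sketch below solves it numerically with illustrative inputs. The paper's AASHERA tool is written in MATLAB; Python is used here for brevity.

```python
# Solve the AASHTO 1993 flexible pavement design equation for SN.
# Traffic, reliability, and subgrade inputs are illustrative assumptions.
from math import log10
from scipy.optimize import brentq

W18 = 5e6        # design ESALs
Z_R = -1.282     # standard normal deviate for 90% reliability
S_0 = 0.45       # overall standard deviation
dPSI = 1.9       # design serviceability loss (4.2 -> 2.3)
M_R = 10_000     # subgrade resilient modulus (psi)

def aashto_residual(sn):
    return (Z_R * S_0 + 9.36 * log10(sn + 1) - 0.20
            + log10(dPSI / (4.2 - 1.5)) / (0.40 + 1094 / (sn + 1) ** 5.19)
            + 2.32 * log10(M_R) - 8.07) - log10(W18)

sn = brentq(aashto_residual, 1.0, 12.0)   # bracket the physically sensible range
print(f"required structural number SN = {sn:.2f}")
```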
Procedia PDF Downloads 63
14914 Virtual Customer Integration in Innovation Development: A Systematic Literature Review
Authors: Chau Nguyen Pham Minh
Abstract:
The aim of this study is to answer the following research question: what do we know about virtual customer integration in innovation development, based on existing empirical research? The paper is based on a systematic review of 136 articles published in the past 16 years. The analysis focuses on three areas: what forms of virtual customer integration (e.g., netnography, online co-creation, virtual experience) have been applied in innovation development; how virtual customer integration methods have been utilized effectively by firms; and what the influences of virtual customer integration on innovation development activities are. Through this detailed analysis, the study provides researchers with a broad understanding of virtual customer integration in innovation development. The study shows that practitioners and researchers increasingly pay attention to virtual customer integration methods in developing innovations, since those methods have clear advantages for interacting with customers in order to generate the best ideas for innovation development. Additionally, the findings indicate that netnography has been the most common method for integrating customers in idea generation, while virtual product experience has been mainly used in product testing. Moreover, the analysis also reveals the positive and negative influences of virtual customer integration on innovation development from both process and strategic perspectives. Most of the reviewed studies examined the phenomenon from the company’s perspective, to understand the process of applying virtual customer integration methods and their impacts; the customers’ perspective on participating in the virtual interaction, however, has been inadequately studied, which creates many potentially interesting research paths for future studies.
Keywords: innovation, virtual customer integration, co-creation, netnography, new product development
Procedia PDF Downloads 337
14913 Interior Design: Changing Values
Authors: Kika Ioannou Kazamia
Abstract:
This paper examines the action research cycle of the second phase of longitudinal research on sustainable interior design practices between two groups of stakeholders, designers and clients. During this phase of the action research, the second step of Lewin's change management model, the change stage, has been utilized to change values, approaches, and attitudes toward sustainable design practices among the participants. Affective domain learning theory is utilized to attach new values. Learning with the use of information technology, collaborative learning, and problem-based learning are the learning methods implemented toward the acquisition of the objectives. The learning methods and aims require the design of interventions with participants' involvement in activities that would lead to acknowledgment of the benefits of sustainable practices. The interventions are also steered to measure participants' judgments of the worth and relevance of ideas and experiences, and whether they accept or commit to a particular stance or action. The data collection methods used in this action research are observers' reports, participant questionnaires, and interviews. The data analyses use both quantitative and qualitative methods. The main benefit of the quantitative method was to provide the means to separate the many factors that obscured the main qualitative findings. The qualitative method allowed data to be categorized, the deductive approach to be adapted, and the data then examined for commonalities that could reflect relevant categories or themes. The results from the data indicate that during the second phase, the designer and client participants altered their behaviours.
Keywords: design, change, sustainability, learning, practices
Procedia PDF Downloads 80
14912 Implementation of ADETRAN Language Using Message Passing Interface
Authors: Akiyoshi Wakatani
Abstract:
This paper describes the Message Passing Interface (MPI) implementation of the ADETRAN language and its evaluation on SX-ACE supercomputers. The ADETRAN language includes the pdo statement, which specifies data distribution and parallel computations, and the pass statement, which specifies the redistribution of arrays. Two methods for the implementation of the pass statement are discussed, and a performance evaluation using the Splitting-Up CG method is presented. The effectiveness of the parallelization is evaluated, and the advantage of one-dimensional distribution is empirically confirmed by the results of the experiments.
Keywords: iterative methods, array redistribution, translator, distributed memory
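To illustrate what a pass-style redistribution involves, here is a hedged mpi4py sketch that converts a row-block distribution into a column-block distribution with MPI_Alltoall; ADETRAN's actual translation scheme is not reproduced here.

```python
# Hedged sketch: redistribute a row-block-distributed matrix to column blocks
# (run with e.g.: mpiexec -n 4 python redistribute.py). Sizes are illustrative.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N = 8                                    # global matrix size; assumes N % size == 0
nloc = N // size                         # rows (and later columns) per rank
# each rank starts with a contiguous block of rows of the global matrix A
local = np.arange(rank * nloc * N, (rank + 1) * nloc * N,
                  dtype="d").reshape(nloc, N)

# carve the row block into `size` column pieces, one destined for each rank
send = np.ascontiguousarray(local.reshape(nloc, size, nloc).swapaxes(0, 1))
recv = np.empty_like(send)
comm.Alltoall(send, recv)                # the actual redistribution step

col_block = recv.reshape(N, nloc)        # this rank now owns a full column block
if rank == 0:
    print("column block on rank 0:\n", col_block)
```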
Procedia PDF Downloads 271
14911 Cultural Embeddedness of E-Participation Methods in Hungary
Authors: Hajnalka Szarvas
Abstract:
The research examines the effectiveness of e-participation tools and methods from the point of view of their cultural fit with Hungarian community traditions. Participation can have very different meanings depending on the local cultural and historical traditions and experiences of a given society. Generally, when e-democracy or e-participation tools are discussed, most research deals with their technological side and novelties, but not much is said about the cultural and social context of the different platforms. From the perspective of their success, however, it is essential to look at the human factor too: the actual users, and how well a certain DMS or online platform fits the ways of thought and functioning of the society in question. The paper therefore explores to what extent different online platforms, such as Loomio, Democracy OS, Your Priorities, EVoks, Populus, miutcank.hu, Liquid Democracy, and Brain Bar Budapest Lab, are compatible with Hungarian mental structures and community traditions, the contents of the collective mind about community functioning. As a result, the influence of the cultural embeddedness of the logic behind e-participation development tools on the success of these methods will be clearly seen. Furthermore, the most crucial factors determining the efficiency of e-participation development tools in Hungary in general will be demonstrated.
Keywords: cultural embeddedness, e-participation, local community traditions, mental structures
Procedia PDF Downloads 304
14910 Protein Remote Homology Detection by Using Profile-Based Matrix Transformation Approaches
Authors: Bin Liu
Abstract:
As one of the most important tasks in protein sequence analysis, protein remote homology detection has been studied for decades. Currently, profile-based methods show state-of-the-art performance. The Position-Specific Frequency Matrix (PSFM) is a widely used profile. However, noise is introduced into the profiles by amino acids with low frequencies. In this study, we propose a method to remove this noise from the PSFM by removing the amino acids with low frequencies, called the Top Frequency Profile (TFP). Three matrix transformation methods, namely autocross covariance (ACC) transformation, tri-gram, and K-separated bigram (KSB), are performed on these profiles to convert them into fixed-length feature vectors. Combined with Support Vector Machines (SVMs), the predictors are constructed. Evaluated on two benchmark datasets, experimental results show that the proposed methods outperform other state-of-the-art predictors.
Keywords: protein remote homology detection, protein fold recognition, top frequency profile, support vector machines
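A hedged sketch of the profile pipeline: build a toy PSFM, keep only the top-k most frequent residues per position (a top frequency profile), then apply an autocross covariance transform to obtain a fixed-length vector. The profile values, k=4, and the maximum lag of 3 are assumptions, not the paper's settings.

```python
# Toy PSFM -> top frequency profile -> ACC feature vector (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
L, A = 60, 20                                # sequence length, alphabet size
psfm = rng.dirichlet(np.ones(A), size=L)     # toy position-specific frequencies

def top_frequency_profile(p, k=4):
    """Zero out all but the k largest frequencies at each position."""
    out = np.zeros_like(p)
    idx = np.argsort(p, axis=1)[:, -k:]
    np.put_along_axis(out, idx, np.take_along_axis(p, idx, axis=1), axis=1)
    return out

def acc(p, max_lag=3):
    """Autocross covariance: one value per (residue pair, lag)."""
    mu = p.mean(axis=0)
    feats = [((p[:-g, a] - mu[a]) * (p[g:, b] - mu[b])).mean()
             for g in range(1, max_lag + 1) for a in range(A) for b in range(A)]
    return np.array(feats)

vec = acc(top_frequency_profile(psfm))
print("fixed-length feature vector:", vec.shape)   # (max_lag * A * A,) = (1200,)
```

The resulting fixed-length vector is what would be fed to an SVM in the final classification step.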
Procedia PDF Downloads 125
14909 Learning Fashion Construction and Manufacturing Methods from the Past: Cultural History and Genealogy at the Middle Tennessee State University Historic Clothing Collection
Authors: Teresa B. King
Abstract:
In the millennial age, with more students desiring a fashion major yet fewer having sewing and manufacturing knowledge, the demands on academicians to educate adequately increase. While fashion museums have a prominent place in historical preservation, working collections of handmade or mass-manufactured apparel for apparel education are lacking at most universities in the United States, especially in the Southern region. Created in 1988, Middle Tennessee State University's historic clothing collection provides opportunities to study apparel construction methods throughout history, to compare and apply them to today's construction and manufacturing methods, and to learn the cyclical nature and importance of historic styles for current and upcoming fashion. In 2019, a class exercise experiment was implemented in which students researched their family genealogy using Ancestry.com, identified the oldest visual media (photographs, etc.) available, and analyzed the garment represented in said media. Each student then located a comparable garment in the historic collection and evaluated the construction methods of the ancestor's time period. A class 'fashion genealogy' tree was created and mounted for public viewing and education. Results of this exercise indicated that student learning increased, as the personal and familial connection triggered more interest in historical garments related to the student's own culture. Students better identified garments with regard to historical time period, fiber content, fabric, and construction methods, thus increasing learning and retention. Students also developed greater recognition of custom construction methods versus the current mass-manufacturing techniques that shape today's fashion industry. A longitudinal effort will continue with the growth of the historic collection and as students continue to utilize it.
Keywords: ancestry, clothing history, fashion history, genealogy, historic fashion museum collection
Procedia PDF Downloads 138
14908 Quantifying Product Impacts on Biodiversity: The Product Biodiversity Footprint
Authors: Leveque Benjamin, Rabaud Suzanne, Anest Hugo, Catalan Caroline, Neveux Guillaume
Abstract:
Human product consumption is one of the main drivers of biodiversity loss. However, few pertinent ecological indicators of product life cycle impacts on species and ecosystems have been built. Life cycle assessment (LCA) methodologies are well under way toward standardized methods to assess this impact, already taking partial account of three of the Millennium Ecosystem Assessment pressures (land use, pollution, climate change). Coupling LCA with ecological data and methods is an emerging challenge in developing a product biodiversity footprint. This approach was tested on three case studies from the food processing, textile, and cosmetics industries. It allowed, first, the environmental relevance of the Potentially Disappeared Fraction of species, the endpoint indicator typically used in life cycle analysis methods, to be improved, and second, new indicators on overexploitation and invasive species to be introduced. This type of footprint is a major step in helping companies identify their impacts on biodiversity and propose potential improvements.
Keywords: biodiversity, companies, footprint, life cycle assessment, products
Procedia PDF Downloads 327
14907 Risk Measure from Investment in Finance by Value at Risk
Authors: Mohammed El-Arbi Khalfallah, Mohamed Lakhdar Hadji
Abstract:
Managing and controlling risk is a research topic in the world of finance. When facing a risky situation, stakeholders need to make comparisons across positions and actions, and financial institutions must take particular measures of market and credit risk. In this work, we study a risk measure used in finance: Value at Risk (VaR), a tool for measuring an entity's risk exposure. We explain the concept of value at risk and the related average and tail measures, and describe the three methods for computing it: the parametric method, the historical method, and the numerical Monte Carlo method. Finally, we briefly describe the advantages and disadvantages of the three computation methods.
Keywords: average value at risk, conditional value at risk, tail value at risk, value at risk
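The three computation methods side by side on simulated daily returns; the portfolio value, confidence level, and normal return model are illustrative assumptions.

```python
# Parametric, historical, and Monte Carlo VaR on simulated returns.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, 1000)   # stand-in for historical daily returns
value, alpha = 1_000_000, 0.99             # portfolio value, confidence level

# 1) parametric (variance-covariance): assumes normally distributed returns
var_param = -(returns.mean() + norm.ppf(1 - alpha) * returns.std()) * value

# 2) historical: empirical quantile of past returns, no distribution assumed
var_hist = -np.quantile(returns, 1 - alpha) * value

# 3) Monte Carlo: simulate returns from a fitted model, then take the quantile
sims = rng.normal(returns.mean(), returns.std(), 100_000)
var_mc = -np.quantile(sims, 1 - alpha) * value

print(f"99% one-day VaR  parametric: {var_param:,.0f}  "
      f"historical: {var_hist:,.0f}  monte carlo: {var_mc:,.0f}")
```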
Procedia PDF Downloads 443
14906 Meet Automotive Software Safety and Security Standards Expectations More Quickly
Authors: Jean-François Pouilly
Abstract:
This study addresses the growing complexity of embedded systems and the critical need for secure, reliable software. Traditional cybersecurity testing methods, often conducted late in the development cycle, struggle to keep pace. This talk explores how formal methods, integrated with advanced analysis tools, empower C/C++ developers to: 1) proactively address vulnerabilities and bugs, using formal methods and abstract interpretation techniques to identify potential weaknesses early in the development process and reduce the reliance on penetration and fuzz testing in later stages; 2) streamline development by focusing on the bugs that matter; with close to no false positives and flaws caught earlier, the need for rework and retesting is minimized, leading to faster development cycles, improved efficiency, and cost savings; 3) enhance software dependability, since combining static analysis using abstract interpretation with full context sensitivity and hardware memory awareness allows a more comprehensive understanding of potential vulnerabilities, leading to more dependable and secure software. This approach aligns with industry best practices (ISO 26262 or ISO 21434) and empowers C/C++ developers to deliver robust, secure embedded systems that meet the demands of today's and tomorrow's applications. We will illustrate this approach with the TrustInSoft analyzer to show how it accelerates verification for complex cases, reduces user fatigue, and improves developer efficiency, cost-effectiveness, and software cybersecurity. In summary, integrating formal methods and sound analyzers enhances software reliability and cybersecurity, streamlining development in an increasingly complex environment.
Keywords: safety, cybersecurity, ISO 26262, ISO 21434, formal methods
Procedia PDF Downloads 22
14905 A Pedagogical Case Study on Consumer Decision Making Models: A Selection of Smart Phone Apps
Authors: Yong Bum Shin
Abstract:
This case focuses on the weighted additive difference, conjunctive, disjunctive, and elimination-by-aspects methodologies in consumer decision-making models, and on the simple additive weighting (SAW) approach in the multi-criteria decision-making (MCDM) area. Most decision-making models illustrate that the rank reversal phenomenon is unpreventable. This paper shows that rank reversal occurs in popular managerial methods such as weighted additive difference (WAD), the conjunctive method, the disjunctive method, and elimination by aspects (EBA), as well as in MCDM methods such as simple additive weighting (SAW), and finally presents the unified commensurate multiple (UCM) model, which successfully addresses these rank reversal problems of the most popular MCDM methods in the decision-making area.
Keywords: multiple criteria decision making, rank inconsistency, unified commensurate multiple, analytic hierarchy process
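A short demonstration of SAW and of rank reversal: adding a new alternative changes the column maxima used for normalization and flips the ranking of the originals. The decision matrix and weights below are made up, and max normalization is one common SAW choice, not the paper's specific setup.

```python
# SAW with max normalization, plus a rank-reversal demonstration.
import numpy as np

def saw(matrix, weights):
    """Simple additive weighting for benefit criteria (normalize by column max)."""
    scores = (matrix / matrix.max(axis=0) * weights).sum(axis=1)
    return scores.round(3), np.argsort(scores)[::-1]

w = np.array([0.5, 0.5])
abc = np.array([[10.0, 5.0],    # alternative A
                [ 7.0, 6.0],    # alternative B
                [ 8.0, 4.0]])   # alternative C
print(saw(abc, w))              # A ranks above B

abcd = np.vstack([abc, [20.0, 1.0]])   # add a new alternative D
print(saw(abcd, w))             # now B ranks above A: rank reversal
```

The reversal arises because rescaling a criterion's maximum shrinks every score in that column, hurting whichever alternative relied on it most; the UCM model named above is intended to avoid exactly this behavior.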
Procedia PDF Downloads 82
14904 Developing Learning in Organizations with Innovation Pedagogy Methods
Authors: T. Konst
Abstract:
Most jobs include training and communication tasks, but the people in these jobs often lack the pedagogical competences to plan, implement, and assess learning. This paper discusses how a learning approach called innovation pedagogy, developed in higher education, can be utilized for learning development in various organizations. The methods presented for implementing innovation pedagogy, such as process consultation and the train-the-trainer model, can provide added value for developing pedagogical know-how in organizations and thus support their internal learning and development.
Keywords: innovation pedagogy, learning, organizational development, process consultation
Procedia PDF Downloads 369
14903 Arabic Handwriting Recognition Using Local Approach
Authors: Mohammed Arif, Abdessalam Kifouche
Abstract:
Optical character recognition (OCR) plays a major role at the present time. It is capable of solving many serious problems and simplifying human activities. OCR dates back to the 70's; since then, many solutions have been proposed, but unfortunately they mostly supported only Latin languages. This work proposes a system for the recognition of off-line Arabic handwriting. The system is based on a structural segmentation method and uses support vector machines (SVM) in the classification phase. We present a state of the art of character segmentation methods, followed by an overview of the OCR area; we also address the normalization problems we went through. After a comparison between Arabic handwritten characters and the segmentation methods, we introduce our contribution, a segmentation algorithm.
Keywords: OCR, segmentation, Arabic characters, PAW, post-processing, SVM
Procedia PDF Downloads 74
14902 Research of Data Cleaning Methods Based on Dependency Rules
Authors: Yang Bao, Shi Wei Deng, WangQun Lin
Abstract:
This paper introduces the concept and principles of data cleaning, analyzes the types and causes of dirty data, and proposes several key steps of a typical cleaning process, putting forward a data cleaning framework with good scalability and versatility. For data with attribute dependency relations, it designs several violation-data discovery algorithms expressed by formal formulas, which can find inconsistent data across all target columns with condition-attribute dependencies, whether the data is structured (SQL) or unstructured (NoSQL), and gives six data cleaning methods based on these algorithms.
Keywords: data cleaning, dependency rules, violation data discovery, data repair
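A minimal sketch of dependency-rule violation discovery for a functional dependency X -> Y: any X-group containing more than one distinct Y value is inconsistent. The rule (zip -> city) and the records are illustrative, not from the paper.

```python
# Find rows violating a functional dependency X -> Y with pandas.
import pandas as pd

df = pd.DataFrame({
    "zip":  ["10001", "10001", "94103", "94103", "94103"],
    "city": ["New York", "New York", "San Francisco", "SF", "San Francisco"],
})

def fd_violations(df, lhs, rhs):
    """Return rows whose `lhs` group maps to more than one `rhs` value."""
    counts = df.groupby(lhs)[rhs].nunique()
    bad_keys = counts[counts > 1].index
    return df[df[lhs].isin(bad_keys)]

print(fd_violations(df, "zip", "city"))   # the 94103 rows violate zip -> city
```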
Procedia PDF Downloads 564