Search results for: feature selection methods
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 17713

15643 Stress Analysis of Buried Pipes from Soil and Traffic Loads

Authors: A. Mohamed, A. El-Hamalawi, M. Frost, A. Connell

Abstract:

Often design standards do not provide guidance or formulae for the calculation of stresses on buried pipelines caused by external loads. Frequently engineers rely on other methods and published sources of information to calculate such imposed stresses, and a variety of methods can be used. This paper reviews three current approaches to soil pipeline interaction modelling to predict stresses on buried pipelines subjected to soil overburden and traffic loading. The traditional approach is to use empirical stress formulas to calculate circumferential bending stresses on pipelines. The alternative approaches considered are the use of a finite element package and a proprietary stress analysis system (SURFLOAD), each to compute an estimate of the circumferential bending stress. The results from analysis using these methods are presented and compared to experimental results in terms of predicted and measured circumferential stresses. This study shows that the approach used to assess externally generated stress is important and can lead to an over-conservative analysis. Using FE analysis, either through SURFLOAD or a general FE package, to predict circumferential stress is the most accurate way to undertake stress analysis due to traffic and soil loads. Although conservative, classical empirical methods will continue to be applied to the analysis of buried pipelines; an opportunity exists, therefore, in many circumstances, to use applied numerical techniques made possible by advances in finite element analysis.
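One widely used building block of the empirical route is the classical Boussinesq elastic half-space solution for vertical stress under a surface point load. A minimal sketch (the 50 kN wheel load and 1.2 m depth below are hypothetical values, not taken from the paper):

```python
import math

def boussinesq_vertical_stress(P, z, r):
    """Vertical stress (Pa) at depth z (m) and horizontal offset r (m)
    below a surface point load P (N), Boussinesq half-space solution."""
    R = math.hypot(r, z)                       # radial distance to the load
    return (3.0 * P * z ** 3) / (2.0 * math.pi * R ** 5)

# Example: 50 kN wheel load, pipe crown 1.2 m directly below the wheel
P = 50e3
sigma = boussinesq_vertical_stress(P, z=1.2, r=0.0)
print(f"vertical stress at crown: {sigma / 1e3:.1f} kPa")
```

Directly beneath the load (r = 0) the expression reduces to 3P / (2*pi*z**2), and the stress decays rapidly with lateral offset, which is one reason empirical formulas built on such solutions tend toward conservatism.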

Keywords: buried pipelines, circumferential bending stress, finite element analysis, soil overburden, soil pipeline interaction analysis (SPIA), traffic loadings

Procedia PDF Downloads 429
15642 A State-Of-The-Art Review on Web Services Adaptation

Authors: M. Velasco, D. While, P. Raju, J. Krasniewicz, A. Amini, L. Hernandez-Munoz

Abstract:

Web service adaptation involves the creation of adapters that solve Web service incompatibilities known as mismatches. Since the importance of Web service adaptation is increasing because of the frequent implementation and use of online Web services, this paper presents a literature review of web services to investigate the main methods of adaptation, their theoretical underpinnings, and the metrics used to measure adapters' performance. Eighteen publications were reviewed independently by two researchers. We found that adaptation techniques are needed to solve different types of problems that may arise due to incompatibilities in Web service interfaces, including protocols, messages, data, and semantics, that affect the interoperability of the services. Although adapters are non-invasive methods that can improve Web service interoperability, and there are current approaches for service adaptation, there is not yet one solution that fits all types of mismatches. Our results also show that only a few research projects incorporate theoretical frameworks and that metrics to measure adapters' performance are very limited. We conclude that further research on software adaptation should improve current adaptation methods in different layers of service interoperability, and that an adaptation framework incorporating a theoretical underpinning and qualitative and quantitative performance measures needs to be created.
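A toy illustration of the adapter idea: a non-invasive wrapper reconciles a signature/data mismatch between two hypothetical service interfaces without modifying either side (all class names, methods, and units here are invented for illustration):

```python
class LegacyWeatherService:
    """Hypothetical provider exposing a Fahrenheit-based interface."""
    def get_temp_fahrenheit(self, city: str) -> float:
        return 68.0  # stub response

class CelsiusClient:
    """Hypothetical client written against a Celsius-based interface."""
    def __init__(self, service):
        self.service = service
    def report(self, city: str) -> str:
        return f"{city}: {self.service.get_temp_celsius(city):.1f} C"

class WeatherAdapter:
    """Non-invasive adapter: resolves the signature and data mismatch
    so the client and the legacy service interoperate unchanged."""
    def __init__(self, legacy: LegacyWeatherService):
        self.legacy = legacy
    def get_temp_celsius(self, city: str) -> float:
        return (self.legacy.get_temp_fahrenheit(city) - 32.0) * 5.0 / 9.0

client = CelsiusClient(WeatherAdapter(LegacyWeatherService()))
print(client.report("Lisbon"))  # -> Lisbon: 20.0 C
```

Real Web service mismatches (protocol ordering, message structure, semantics) are harder than this unit conversion, but the non-invasive wrapping pattern is the same.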

Keywords: Web Services Adapters, software adaptation, web services mismatches, web services interoperability

Procedia PDF Downloads 279
15641 Influence of JavaScript Programming on the Development of Web and Mobile Applications

Authors: Abdul Basit Kiani

Abstract:

Web technologies are growing rapidly in the current era. With the increasing development of the web, various novel web technologies have emerged for building web applications beyond static HTML. JavaScript is the language that enables dynamic web sites that actively interact with users. The JavaScript language supports the Model View Controller (MVC) architecture, which maintains readable code and clearly separates parts of the program code. Our research is focused on a comparison of popular JavaScript frameworks: AngularJS, Django, Node.js, and Laravel. These frameworks rely on MVC. In this paper, we discuss the merits and demerits of each framework, their influence on application speed, testing methods for JS applications, and methods to improve code security.

Keywords: JavaScript, React, Node.js, HTML, CSS

Procedia PDF Downloads 100
15640 Orthogonal Regression for Nonparametric Estimation of Errors-In-Variables Models

Authors: Anastasiia Yu. Timofeeva

Abstract:

Two new algorithms for nonparametric estimation of errors-in-variables models are proposed. The first algorithm is based on a penalized regression spline. The spline is represented as a piecewise-linear function, and for each linear portion an orthogonal regression is estimated. This algorithm is iterative. The second algorithm involves locally weighted regression estimation. When the independent variable is measured with error, such estimation is a complex nonlinear optimization problem. The simulation results have shown the advantage of the second algorithm under the assumption that the true smoothing parameter values are known. Nevertheless, the use of goodness-of-fit indices for smoothing parameter selection gives similar results, with an oversmoothing effect.
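The core step of the first algorithm, fitting an orthogonal (total least squares) regression to a single linear portion, has a closed form. A minimal illustration, not the author's implementation:

```python
import math

def orthogonal_regression(xs, ys):
    """Fit y = a + b*x by minimising *perpendicular* distances
    (total least squares), appropriate when x is also noisy."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    # Closed-form slope of the orthogonal-regression line
    b = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    a = my - b * mx
    return a, b

# Points lying exactly on y = 1 + 2x are recovered exactly
a, b = orthogonal_regression([0, 1, 2, 3], [1, 3, 5, 7])
print(a, b)  # -> 1.0 2.0
```

Unlike ordinary least squares, this slope estimate treats errors in x and y symmetrically, which is the point of the errors-in-variables setting.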

Keywords: grade point average, orthogonal regression, penalized regression spline, locally weighted regression

Procedia PDF Downloads 404
15639 Effects, Causes, and Prevention of Teen Dating Violence

Authors: Isabel Jones

Abstract:

As adolescence is a formative time, experiences during adolescence often affect the rest of one’s life. Therefore, dating, and specifically violence in dating, can have lasting effects on the rest of one’s life. In order to find sources, searches were conducted on PsycINFO, specifically EBSCO, and narrowed down under the criteria that the source contained information about adolescent rather than adult dating violence and focused on causes, effects, or prevention methods. This literature review examines research regarding the effects and causes of teen dating violence (TDV), and then what methods are effective in its prevention. This allows for a clear picture of how these prevention methods work and why they are important. Effects of TDV extend beyond the physical, including long-lasting psychological and sexual effects. These are caused by a number of factors, including learned behavior, inhibitory issues and substance abuse, and cultural factors. When both effects and causes are taken into account, the effectiveness of preventative measures such as school-based interventions, parental/adult monitoring, and the presence of positive family examples becomes clearer. This literature review may provide further awareness of this public health crisis and give the public a view of how adolescents are affected by TDV on their path from child to adult.

Keywords: adolescence, dating violence, risk factors, predictors, relationship

Procedia PDF Downloads 55
15638 Study on the Changes in Material Strength According to Changes in Forming Methods in Hot-Stamping Process

Authors: Yong-Jun Jeon, Hyung-Pil Park, Min-Jae Song, Baeg-Soon Cha

Abstract:

Following the recent trend of increased demand for lighter-weight car bodies to improve automobile safety and gas mileage, hot-stamping is a forming method that satisfies these conditions. Hot-stamping is a forming technique with advantages of excellent formability, good dimensional precision, and others, since it is a process in which steel plates are heated up to temperatures of at least approximately 900°C, after which forming is conducted in a die at room temperature, followed by rapid cooling. In addition, it allows for improvement in material strength through a quenching effect achieved by the simultaneous forming and rapid cooling of material at high temperatures. However, there is insufficient information on the changes in material strength according to changes in material temperature with regard to the material heating method and forming process in hot-stamping. Accordingly, this study aims to design a press die for a T-type product of a scale model of the center pillar and to understand the changes in material strength in relation to changes in forming methods of the hot-stamping process. Thus, in order to understand the changes in material strength due to the quenching effect in the hot-stamping process, material strength and forming precision were studied while varying the forming conditions and forming method. For the test method, material strength was observed using boron steel, which contains boron additives, heated up to 950°C, after which it was transferred to a die and cooled down to a material temperature of 400°C, followed by an air cooling process. During the forming and cooling process, experiments were conducted with forming parameters of two holding rates and three flange heating rates, and changes in material strength according to the forming method were observed by verifying forming strength and forming precision for each of the conditions.

Keywords: hot-stamping, formability, quenching, forming, press die, forming methods

Procedia PDF Downloads 453
15637 Weighted G2 Multi-Degree Reduction of Bezier Curves

Authors: Salisu Ibrahim, Abdalla Rababah

Abstract:

In this research, we use weighted G2 multi-degree reduction of a Bezier curve of degree n to a Bezier curve of degree m, m < n. Degree reduction is used to represent a given Bezier curve of degree n by a Bezier curve of degree m, m < n. Exact degree reduction is not possible, and degree reduction is an approximate process in nature. We derive a weighted degree reduction method that is geometrically continuous at the end points. Different norms are considered, and several error minimizations are given. The proposed methods produce error functions that are smaller than the errors of existing methods.
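A toy sketch of why exact degree reduction is impossible: reducing a quadratic Bezier curve to a line while keeping the end control points leaves a non-zero error function in the interior. (This uses plain endpoint interpolation, not the paper's weighted G2 construction.)

```python
def bezier2(p0, p1, p2, t):
    """Evaluate a quadratic Bezier curve at parameter t."""
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

def bezier1(q0, q1, t):
    """Evaluate a linear Bezier curve at parameter t."""
    return (1 - t) * q0 + t * q1

# Degree-2 curve with a bump; reduce to degree 1 by keeping the endpoints
p0, p1, p2 = 0.0, 1.0, 0.0
q0, q1 = p0, p2

# Sampled error function e(t) = |B2(t) - B1(t)|: zero at t = 0 and t = 1,
# non-zero in the interior, which is why degree reduction is approximate.
ts = [i / 10 for i in range(11)]
errors = [abs(bezier2(p0, p1, p2, t) - bezier1(q0, q1, t)) for t in ts]
print(max(errors))  # largest deviation occurs at t = 0.5
```

Weighted schemes such as the one proposed distribute and minimise this residual error under a chosen norm while enforcing geometric continuity at the ends.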

Keywords: Bezier curves, multiple degree reduction, geometric continuity, error function

Procedia PDF Downloads 471
15636 Assessing Diagnostic and Evaluation Tools for Use in Urban Immunisation Programming: A Critical Narrative Review and Proposed Framework

Authors: Tim Crocker-Buque, Sandra Mounier-Jack, Natasha Howard

Abstract:

Background: Due to both the increasing scale and speed of urbanisation, urban areas in low- and middle-income countries (LMICs) host increasingly large populations of under-immunised children, with the additional associated risks of rapid disease transmission in high-density living environments. Multiple interdependent factors are associated with these coverage disparities in urban areas, and most evidence comes from relatively few countries, predominantly India, Kenya, and Nigeria, with some from Pakistan, Iran, and Brazil. This study aimed to identify, describe, and assess the main tools used to measure or improve coverage of immunisation services in poor urban areas. Methods: Authors used a qualitative review design, including academic and non-academic literature, to identify tools used to improve coverage of public health interventions in urban areas. Authors selected and extracted sources that provided good examples of specific tools, or categories of tools, used in a context relevant to urban immunisation. Diagnostic tools (e.g., for data collection, analysis, and insight generation), programme tools (e.g., for investigating or improving ongoing programmes), and interventions (e.g., multi-component or stand-alone with evidence) were selected for inclusion to provide a range of types and availability of relevant tools. These were then prioritised using a decision-analysis framework, and a tool selection guide for programme managers was developed. Results: Authors reviewed tools used in urban immunisation contexts and tools designed for (i) non-immunisation and/or non-health interventions in urban areas, and (ii) immunisation in rural contexts that had relevance for urban areas (e.g., Reaching Every District/Child/Zone). Many approaches combined several tools and methods, which authors categorised as diagnostic, programme, and intervention.
The most common diagnostic tools were cross-sectional surveys, key informant interviews, focus group discussions, secondary analysis of routine data, and geographical mapping of outcomes, resources, and services. Programme tools involved multiple stages of data collection, analysis, insight generation, and intervention planning, and included guidance documents from the WHO (World Health Organisation), UNICEF (United Nations Children's Fund), USAID (United States Agency for International Development), and governments, as well as articles reporting on diagnostics, interventions, and/or evaluations to improve urban immunisation. Interventions involved service improvement, education, reminder/recall, incentives, outreach, and mass media, or were multi-component. The main gaps in existing tools were the lack of assessment of macro/policy-level factors, of exploration of effective immunisation communication channels, and of measures of in/out-migration. The proposed framework uses a problem tree approach to suggest tools to address five common challenges (i.e., identifying populations, understanding communities, issues with service access and use, improving services, and improving coverage) based on context and available data. Conclusion: This study identified many tools relevant to evaluating urban LMIC immunisation programmes, including significant crossover between tools. This was encouraging in terms of supporting the identification of common areas, but problematic as data volumes, instructions, and activities could overwhelm managers, and tools are not always applied to suitable contexts. Further research is needed on how best to combine tools and methods to suit local contexts. The authors' initial framework can be tested and developed further.

Keywords: health equity, immunisation, low and middle-income countries, poverty, urban health

Procedia PDF Downloads 131
15635 Incorporating Anomaly Detection in a Digital Twin Scenario Using Symbolic Regression

Authors: Manuel Alves, Angelica Reis, Armindo Lobo, Valdemar Leiras

Abstract:

In Industry 4.0, it is common to have a lot of sensor data. In this deluge of data, hints of possible problems are difficult to spot. The digital twin concept aims to help answer this problem, but it is mainly used as a monitoring tool to handle the visualisation of data. Failure detection is of paramount importance in any industry, and it consumes a lot of resources. Any improvement in this regard is of tangible value to the organisation. The aim of this paper is to add the ability to forecast test failures, curtailing detection times. To achieve this, several anomaly detection algorithms were compared with a symbolic regression approach. To this end, Isolation Forest, One-Class SVM, and an auto-encoder were explored. For symbolic regression, the PySR library was used. The first results show that this approach is valid and can be added to the tools available in this context as a low-resource anomaly detection method, since, after training, the only requirement is the calculation of a polynomial, a useful feature in the digital twin context.
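The "only a polynomial at inference time" advantage can be sketched as follows. The polynomial below is a hard-coded stand-in for a learned PySR expression, and the sensor readings are invented toy data:

```python
def model(x):
    """Stands in for a symbolic-regression result describing
    normal behaviour of sensor y as a function of sensor x."""
    return 2.0 * x + 1.0

def is_anomaly(x, y, threshold=0.5):
    """Flag a reading whose residual against the model is too large."""
    return abs(y - model(x)) > threshold

readings = [(0.0, 1.0), (1.0, 3.1), (2.0, 9.0)]   # (sensor x, sensor y)
flags = [is_anomaly(x, y) for x, y in readings]
print(flags)  # -> [False, False, True]
```

Evaluating one polynomial per reading costs almost nothing, which is why this approach suits resource-constrained digital twin deployments better than re-running a heavyweight detector.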

Keywords: anomaly detection, digital twin, industry 4.0, symbolic regression

Procedia PDF Downloads 108
15634 Your First Step to Understanding Research Ethics: Psychoneurolinguistic Approach

Authors: Sadeq Al Yaari, Ayman Al Yaari, Adham Al Yaari, Montaha Al Yaari, Aayah Al Yaari, Sajedah Al Yaari

Abstract:

Objective: This research aims at investigating research ethics in the field of science. Method: This is an exploratory study wherein the researchers attempted to cover the phenomenon at hand from all specialists’ viewpoints. Results: The discussion is based upon the findings that resulted from the analysis the researchers undertook. Concerning the prediction of results, the researcher first needs to seek highly qualified people in the field of research, as well as in the field of statistics, who share the philosophy of the research. Then s/he should make sure that s/he is adequately trained in the specific techniques, methods, and statistical programs used in the study. S/he should also continually analyze the data using the most current methods.

Keywords: research ethics, legal, rights, psychoneurolinguistics

Procedia PDF Downloads 27
15633 Statistical Wavelet Features, PCA, and SVM-Based Approach for EEG Signals Classification

Authors: R. K. Chaurasiya, N. D. Londhe, S. Ghosh

Abstract:

The study of the electrical signals produced by neural activities of the human brain is called electroencephalography (EEG). In this paper, we propose an automatic and efficient EEG signal classification approach. The proposed approach is used to classify the EEG signal into two classes: epileptic seizure or not. In the proposed approach, we start by extracting features, applying the Discrete Wavelet Transform (DWT) in order to decompose the EEG signals into sub-bands. These features, extracted from the detail and approximation coefficients of the DWT sub-bands, are used as input to Principal Component Analysis (PCA). The classification is based on reducing the feature dimension using PCA and deriving the support vectors using a Support Vector Machine (SVM). The experiments are performed on a real, standard dataset, and a very high level of classification accuracy is obtained.
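A minimal sketch of the feature-extraction stage using a one-level Haar DWT. The Haar wavelet and the particular sub-band statistics are illustrative choices; the abstract does not specify which wavelet family or feature set the authors used:

```python
import statistics

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform:
    returns (approximation, detail) coefficient lists."""
    approx = [(signal[i] + signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def band_features(coeffs):
    """Statistical features of one sub-band, of the kind fed to PCA."""
    return [statistics.mean(coeffs), statistics.pstdev(coeffs),
            max(coeffs), min(coeffs)]

eeg = [1.0, 3.0, 2.0, 2.0, 5.0, 1.0, 0.0, 4.0]   # toy EEG segment
approx, detail = haar_dwt(eeg)
features = band_features(approx) + band_features(detail)
print(len(features))  # -> 8
```

In the full pipeline, such per-sub-band feature vectors would be stacked across decomposition levels, reduced with PCA, and passed to an SVM classifier.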

Keywords: discrete wavelet transform, electroencephalogram, pattern recognition, principal component analysis, support vector machine

Procedia PDF Downloads 623
15632 The Need for Embodiment Perspectives and Somatic Methods in Social Work Curriculum: Lessons Learned from a Decade of Developing a Program to Support College Students Who Exited the State Foster Care System

Authors: Yvonne A. Unrau

Abstract:

Social work education is a competency-based curriculum that relies mostly on cognitive frameworks and problem-solving models. Absent from the curriculum are knowledge and skills that draw from an embodiment perspective, especially somatic practice methods. Embodiment broadly encompasses the understanding that biological, political, historical, and social factors impact human development via changes to the nervous system. In the past 20 years, research has well established that unresolved traumatic events, especially during childhood, negatively impact long-term health and well-being. Furthermore, traumatic stress compromises cognitive processing and activates reflexive actions such as ‘fight’ or ‘flight’, which are the focus of somatic methods. The main objective of this paper is to show how embodiment perspectives and somatic methods can enhance social work practice overall. Using an exploratory approach, the author shares a decade-long journey that involved creating an education-support program for college students who exited the state foster care system. Personal experience, program outcomes, and case study narratives revealed that ‘classical’ social work methods were insufficient to fully address the complex needs of college students who were living with complex traumatic stressors. The paper chronicles select case study scenarios and key program development milestones over a 10-year period to show the benefit of incorporating embodiment perspectives in social work practice. The lessons reveal an immediate need for social work curricula to include embodiment perspectives so that social workers may be equipped to respond competently to their many clients who live with unresolved trauma.

Keywords: social work practice, social work curriculum, embodiment, traumatic stress

Procedia PDF Downloads 114
15631 A Comparative Study of Multi-SOM Algorithms for Determining the Optimal Number of Clusters

Authors: Imèn Khanchouch, Malika Charrad, Mohamed Limam

Abstract:

The interpretation of the quality of clusters and the determination of the optimal number of clusters are still crucial problems in clustering. We focus in this paper on the multi-SOM clustering method, which overcomes the problem of extracting the number of clusters from the SOM map through the use of a clustering validity index. We then tested multi-SOM using real and artificial data sets with different evaluation criteria not used previously, such as the Davies-Bouldin index, Dunn index, and silhouette index. The developed multi-SOM algorithm is compared to the k-means and BIRCH methods. Results show that it is more efficient than classical clustering methods.
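One of the validity indices mentioned, the silhouette index, can be sketched in a few lines for 1-D points (an illustration of the index only, not the multi-SOM implementation):

```python
def silhouette(points, labels):
    """Mean silhouette coefficient over a small clustered data set
    (1-D points for simplicity; higher is better, max 1)."""
    def dist(a, b):
        return abs(a - b)
    scores = []
    for i, (p, lab) in enumerate(zip(points, labels)):
        # a: mean distance to other members of the same cluster
        own = [dist(p, q) for j, (q, l) in enumerate(zip(points, labels))
               if l == lab and j != i]
        a = sum(own) / len(own)
        # b: mean distance to the nearest other cluster
        b = min(
            sum(dist(p, q) for q, l in zip(points, labels) if l == other)
            / labels.count(other)
            for other in set(labels) if other != lab)
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

# Two well-separated 1-D clusters score close to 1
pts = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
labs = [0, 0, 0, 1, 1, 1]
print(round(silhouette(pts, labs), 3))
```

A validity index like this, computed for each candidate partition of the SOM map, is what lets multi-SOM pick the number of clusters automatically.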

Keywords: clustering, SOM, multi-SOM, DB index, Dunn index, silhouette index

Procedia PDF Downloads 589
15630 Effect of Fresh Concrete Curing Methods on Its Compressive Strength

Authors: Xianghe Dai, Dennis Lam, Therese Sheehan, Naveed Rehman, Jie Yang

Abstract:

Concrete is one of the most used construction materials; it may be mixed on site as fresh concrete and then placed in formwork to produce the desired shapes of structures. It has been recognized that the raw materials and mix proportion of concrete dominate the mechanical characteristics of hardened concrete, and that the curing method and environment applied to the concrete in the early stages of hardening will significantly influence concrete properties such as compressive strength, durability, and permeability. In construction practice, there are various curing methods to maintain the presence of mixing water throughout the early stages of concrete hardening. They are also beneficial to concrete in hot weather conditions, as they provide cooling and prevent the evaporation of water. Such methods include ponding or immersion, spraying or fogging, and saturated wet covering. There are also various curing methods that may be implemented to decrease the water lost from the concrete surface, such as covering the concrete with a layer of impervious paper, plastic sheeting, or membrane. In the concrete materials laboratory, accelerated strength gain methods supply the concrete with heat and additional moisture by applying live steam, heated coils, or electrically warmed pads. Currently, when determining the mechanical parameters of a concrete, the concrete is usually sampled from fresh concrete on site and then cured and tested in laboratories where standardized curing procedures are adopted. However, in engineering practice, curing procedures on construction sites after the placing of concrete might be very different from the laboratory criteria, including some standard curing procedures adopted in the laboratory that cannot be applied on site. Sometimes the contractor compromises the curing methods in order to reduce construction costs. Obviously, the difference between curing procedures adopted in the laboratory and those used on construction sites might over- or under-estimate the real concrete quality. This paper presents the effect of three typical curing methods (air curing, water immersion curing, and plastic film curing), and of maintaining concrete in steel moulds, on the compressive strength development of normal concrete. In this study, Portland cement with 30% fly ash was used, and different curing periods of 7 days, 28 days, and 60 days were applied. It was found that the highest compressive strength was observed in concrete samples to which 7-day water immersion curing was applied and in samples maintained in steel moulds up to the testing date. The research results implied that concrete used as infill in steel tubular members might develop a higher strength than predicted by design assumptions based on air curing methods. Wrapping concrete with plastic film as a curing method might delay the concrete strength development in the early stages. Water immersion curing for 7 days might significantly increase the concrete compressive strength.

Keywords: compressive strength, air curing, water immersion curing, plastic film curing, maintaining in steel mould, comparison

Procedia PDF Downloads 284
15629 Evaluation of Low-Global Warming Potential Refrigerants in Vapor Compression Heat Pumps

Authors: Hamed Jafargholi

Abstract:

Global warming presents an immense environmental risk, causing detrimental impacts on ecological systems and putting coastal areas at risk. Implementing efficient measures to minimize greenhouse gas emissions and the use of fossil fuels is essential to reducing global warming. Vapor compression heat pumps provide a practical method for harnessing energy from waste heat sources and reducing energy consumption. However, traditional working fluids used in these heat pumps generally have a significant global warming potential (GWP), which might cause severe greenhouse effects if they are released. The emphasis on low-GWP (below 150) refrigerants aims to further the development of vapor compression heat pumps. A classification system for vapor compression heat pumps is offered, with different boundaries based on the required heat temperature and advancements in heat pump technology. A heat pump can be classified as a low-temperature heat pump (LTHP), medium-temperature heat pump (MTHP), high-temperature heat pump (HTHP), or ultra-high-temperature heat pump (UHTHP). The HTHP/UHTHP border is 160 °C, and the MTHP/HTHP and LTHP/MTHP limits are 100 and 60 °C, respectively. The refrigerant is one of the most important parts of a vapor compression heat pump system. Presently, the main ways to choose a refrigerant are based on ozone depletion potential (ODP) and GWP, with GWP as low as possible and ODP equal to zero. Pure low-GWP refrigerants, such as natural refrigerants (R718 and R744), hydrocarbons (R290, R600), hydrofluorocarbons (R152a and R161), hydrofluoroolefins (R1234yf, R1234ze(E)), and hydrochlorofluoroolefins (R1233zd(E)), were selected as candidates for vapor compression heat pump systems based on these selection principles. The performance, characteristics, and potential uses of these low-GWP refrigerants in heat pump systems are investigated in this paper. As vapor compression heat pumps with pure low-GWP refrigerants become more common, more and more low-grade heat can be recovered, which means that energy consumption would decrease. The research outputs showed that the refrigerants R718 for UHTHP application, R1233zd(E) for HTHP application, R600, R152a, R161, and R1234ze(E) for MTHP application, and R744, R290, and R1234yf for LTHP application are appropriate. The selection of an appropriate refrigerant should, in fact, take into consideration both environmental and thermodynamic points of view. It might be argued that, depending on the situation, a trade-off between these two groups should always be considered. The environmental approach is now far stronger than it was previously, according to European Union regulations. This will promote sustainable energy consumption and social development in addition to assisting in the reduction of greenhouse gas emissions and the management of global warming.
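The temperature-based classification described above can be expressed directly in code; the treatment of exact boundary values as belonging to the higher class is an assumption, since the abstract does not say which side of each border is inclusive:

```python
def classify_heat_pump(sink_temp_c):
    """Classify a vapor compression heat pump by its required heat
    (sink) temperature, using the stated borders: LTHP/MTHP at 60 C,
    MTHP/HTHP at 100 C, HTHP/UHTHP at 160 C."""
    if sink_temp_c < 60:
        return "LTHP"
    elif sink_temp_c < 100:
        return "MTHP"
    elif sink_temp_c < 160:
        return "HTHP"
    return "UHTHP"

for t in (45, 80, 120, 200):
    print(t, classify_heat_pump(t))
```

Against this scheme, the reported pairings line up as expected: R744/R290/R1234yf serve sub-60 °C duties, R1233zd(E) the 100-160 °C band, and R718 (water) the band above 160 °C.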

Keywords: vapor compression, global warming potential, heat pumps, greenhouse

Procedia PDF Downloads 9
15628 Extraction and Characterization of Ethiopian Hibiscus macranthus Bast Fiber

Authors: Solomon Tilahun Desisa, Muktar Seid Hussen

Abstract:

Hibiscus macranthus is a plant of the family Malvaceae and genus Hibiscus that grows mainly in the western part of Ethiopia. Hibiscus macranthus is one of the most adaptable and abundant plants in the nation. It is used as an ornamental plant, often as a hedge or fence plant; as firewood after harvesting the stem together with the bark; and as a fiber for tying different kinds of things by forming ropes. However, Hibiscus macranthus plant fiber has not been commercially exploited or properly extracted. This work describes the possibility of mechanical and retting methods of Hibiscus macranthus fiber extraction, and the fiber's characterization. Hibiscus macranthus fiber is a bast fiber obtained naturally from the stem or stalk of the dicotyledonous plant, since it is a natural cellulose plant fiber. The fiber was characterized by studying its physical and chemical properties. The physical characteristics investigated were as follows: a length of 100-190 mm, a fineness of 1.0-1.2 Tex, a diameter under X100 microscopic view of 16-21 microns, a moisture content of 12.46%, and a dry tenacity of 48-57 cN/Tex along with a breaking extension of 0.9-1.6%. Hibiscus macranthus fiber productivity was observed to be 12-18% of the stem, of which more than 65% is primary long fibers. The fiber separation methods proved to decrease the non-cellulose content in the order of mechanical, water, and chemical methods. The color measurement also shows that the raw Hibiscus macranthus fiber has a natural golden color according to YID1925, and a paler look under both retting methods than with mechanical separation. Finally, it is suggested that Hibiscus macranthus fiber can be used for manufacturing natural and organic crop and coffee packages, as well as super absorbent, fine, and high-tenacity textile products.
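The fineness and tenacity figures reported above are related by simple textile definitions (Tex is grams per 1000 m of fiber; tenacity is breaking force per unit linear density). A sketch with a hypothetical sample chosen to fall inside the reported ranges; the function names and sample values are illustrative:

```python
def tex(mass_g, length_m):
    """Linear density in Tex: grams per 1000 m of fiber."""
    return mass_g / length_m * 1000.0

def tenacity_cn_per_tex(breaking_force_cn, fineness_tex):
    """Tenacity: breaking force (cN) normalised by linear density (Tex)."""
    return breaking_force_cn / fineness_tex

# Hypothetical sample: 0.11 mg weighed over a 100 mm test length
fineness = tex(0.00011, 0.1)                    # -> 1.1 Tex
strength = tenacity_cn_per_tex(57.2, fineness)  # -> 52.0 cN/Tex
print(round(fineness, 2), round(strength, 1))
```

Both computed values sit inside the 1.0-1.2 Tex and 48-57 cN/Tex ranges quoted for the fiber.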

Keywords: Hibiscus macranthus, bast fiber, extraction, characterization

Procedia PDF Downloads 197
15627 Internationalization Process Model for Construction Firms: Stages and Strategies

Authors: S. Ping Ho, R. Dahal

Abstract:

The global economy has drastically changed how firms operate and compete. Although the construction industry is ‘local’ by its nature, the internationalization of the construction industry has become an inevitable reality. As a result of global competition, staying domestic is no longer safe from competition; on the contrary, growing to become an MNE (multi-national enterprise) becomes one of the important strategies for a firm to survive in global competition. For successful entrance into competing markets, firms need to re-define their competitive advantages and re-identify the sources of those competitive advantages. A firm’s initiation of internationalization is not necessarily a result of strategic planning; it can also involve certain idiosyncratic events that pave the path leading to a firm’s internationalization. For example, a local firm’s incidental or unintentional collaboration with an MNE can become the initiating point of its internationalization process. However, because of the intensive competition in today’s global environment, many firms are compelled to initiate their internationalization as a strategic response to the competition. Understanding the process of internationalization and appropriately implementing strategies at its different stages leads construction firms to a successful internationalization journey. This study is carried out to develop a model of the internationalization process, which derives appropriate strategies that construction firms can implement at each stage. The proposed model integrates two major and complementary views of internationalization and expresses the dynamic process of internationalization in three stages: the pre-international (PRE) stage, the foreign direct investment (FDI) stage, and the multi-national enterprise (MNE) stage. The strategies implied in the proposed model are derived with a focus on capability building, market locations, and entry modes, based on the resource-based view: value, rareness, inimitability, and non-substitutability (VRIN). The proposed dynamic process model can benefit potential construction firms that are willing to expand their markets. Strategies for internationalization, such as core competence strategy, market selection, partner selection, and entry mode strategy, can be derived from the model. The internationalization process is expressed in two different forms. First, we discuss the construction internationalization process, identify the driving factor(s) of the process, and explain strategy formation in the process. Second, we define the stages of internationalization along the process and the corresponding strategies in each stage. The strategies may include how to exploit existing advantages for competition at the current stage and how to develop or explore additional advantages appropriate for the next stage. In particular, the additionally developed advantages will then be accumulated and drive forward the firm’s stage of internationalization, which will further determine the subsequent strategies, and so forth, spiraling up the stages to a higher degree of internationalization. However, the formation of additional strategies for the next stage does not happen automatically, and the strategy evolution is based on the firm’s dynamic capabilities.

Keywords: construction industry, dynamic capabilities, internationalization process, internationalization strategies, strategic management

Procedia PDF Downloads 52
15626 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) for a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed from causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can play a vital role in explanatory studies, i.e., in scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how predictive analytics can support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected frequency), among others. The size of a graph can be represented by its numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently.
For a demonstration, a total of 13,254 metabolic syndrome training observations were fed into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, associated, for example, with sociodemographics, habits, and activities. Some predictors, such as cancer examination, house type, and vaccination, were intentionally included to gain predictive analytics insights into variable selection. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules were then validated with an external testing dataset of 4,090 observations. The results, a form of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation; a set of rules (many estimated equations, from a statistical perspective) may instead imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will appear significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a subset of the observations, such as bootstrap resampling with an appropriate sample size.
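The co-occurrence ranking idea can be sketched in a few lines (an illustrative toy, not the proprietary platform's implementation; the item names below are hypothetical):

```python
from itertools import combinations
from collections import Counter

def cooccurrence_rank(transactions):
    """Rank item pairs by 'surprise': observed co-occurrence count
    vs. the count expected if the items occurred independently."""
    n = len(transactions)
    item_counts = Counter()
    pair_counts = Counter()
    for basket in transactions:
        items = sorted(set(basket))
        item_counts.update(items)
        pair_counts.update(combinations(items, 2))
    ranked = []
    for (a, b), obs in pair_counts.items():
        expected = item_counts[a] * item_counts[b] / n  # independence baseline
        ranked.append(((a, b), obs, obs / expected))
    return sorted(ranked, key=lambda r: r[2], reverse=True)  # most surprising first

# hypothetical baskets of co-occurring risk-factor indicators
data = [
    {"obesity", "hypertension", "low_activity"},
    {"obesity", "hypertension"},
    {"smoking", "low_activity"},
    {"obesity", "hypertension", "smoking"},
]
for pair, obs, surprise in cooccurrence_rank(data)[:3]:
    print(pair, obs, round(surprise, 2))
```

On this toy data the pair (hypertension, obesity) ranks highest because it co-occurs more often than independence would predict; the platform described above applies the same kind of ranking at scale.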

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 263
15625 Feasibility Study of Particle Image Velocimetry in the Muzzle Flow Fields during the Intermediate Ballistic Phase

Authors: Moumen Abdelhafidh, Stribu Bogdan, Laboureur Delphine, Gallant Johan, Hendrick Patrick

Abstract:

This study is part of an ongoing effort to improve the understanding of phenomena occurring during the intermediate ballistic phase, such as muzzle flows. A thorough comprehension of muzzle flow fields is essential for optimizing muzzle device and projectile design. This flow characterization has heretofore been almost entirely limited to local and intrusive measurement techniques, such as pressure measurements using pencil probes. Consequently, the body of quantitative experimental data is limited, as is the number of numerical codes validated in this field. The objective of the work presented here is to demonstrate the applicability of the Particle Image Velocimetry (PIV) technique in the challenging environment of the propellant flow of a .300 Blackout weapon to provide accurate velocity measurements. The key points of a successful PIV measurement are the selection of the particle tracers, their seeding technique, and their tracking characteristics. We experimentally investigated these points by evaluating the resistance, gas dispersion, laser light reflection, and response to a step change across the Mach disk for five different solid tracers using two seeding methods. To this end, an experimental setup was built, consisting of a PIV system, a combustion chamber pressure measurement, classical high-speed schlieren visualization, and an aerosol spectrometer. The latter is used to determine the particle size distribution in the muzzle flow. The experimental results demonstrated the ability of PIV to accurately resolve the salient features of the propellant flow, such as the under-expanded jet and vortex rings, as well as the instantaneous velocity field with maximum centreline velocities of more than 1000 m/s. Besides, unburned particles naturally present in the gas and solid ZrO₂ particles with a nominal size of 100 nm, when coated on the propellant powder, are suitable as tracers.
However, the TiO₂ particles intended to act as a tracer surprisingly not only melted but also functioned as a combustion accelerator, decreasing the number of particles in the propellant gas.

Keywords: intermediate ballistic, muzzle flow fields, particle image velocimetry, propellant gas, particle size distribution, under expanded jet, solid particle tracers

Procedia PDF Downloads 150
15624 Virucidal, Bactericidal and Fungicidal Efficiency of Dry Microfine Steam on Inanimate Surfaces

Authors: C. Recchia, M. Bourel, B. Recchia

Abstract:

Microorganisms (viruses, bacteria, fungi) are responsible for most communicable diseases threatening human health. For domestic use, chemical agents are often criticized because of their potential dangerousness, and natural solutions are needed. The “dry microfine steam” (DMS) technology was tested on a selection of common pathogens (SARS-CoV-2, enterovirus EV-71, human coronavirus 229E, E. coli, S. aureus, C. albicans), on different inanimate surfaces, for 5 to 10 seconds. The remaining pathogens were quantified, and the reduction rates ranged from 99.8% (S. aureus on plastic) to over 99.999%. DMS showed high efficacy in eliminating common microorganisms and could be seen as a natural alternative to chemical agents to improve domestic hygiene.

Keywords: steam, SARS-CoV-2, bactericidal, virucidal, fungicidal, sterilization

Procedia PDF Downloads 157
15623 Working Memory Growth from Kindergarten to First Grade: Considering Impulsivity, Parental Discipline Methods and Socioeconomic Status

Authors: Ayse Cobanoglu

Abstract:

Working memory can be defined as a workspace that holds and regulates active information in the mind. This study investigates individual changes in children's working memory from kindergarten to first grade. Its main purpose is to examine whether parental discipline methods and children's impulsive/overactive behaviors affect the initial status and growth rate of working memory, controlling for gender, minority status, and socioeconomic status (SES). A linear growth curve model is fitted to the first four waves of the Early Childhood Longitudinal Study-Kindergarten Cohort of 2011 (ECLS-K:2011) to analyze the individual growth of children's working memory longitudinally (N=3915). Results revealed significant variation among students' initial status in the kindergarten fall semester as well as in the growth rate during the first two years of schooling. While minority status, SES, and children's overactive/impulsive behaviors influenced initial status, only SES and minority status were significantly associated with the growth rate of working memory. Parental discipline methods such as giving a warning and ignoring the child's negative behavior are also negatively associated with initial working memory scores. Examining the growth rate, students with lower SES as well as minority students showed a faster growth pattern during the first two years of schooling. However, the findings on parental disciplinary methods and working memory growth rates were mixed. It can be concluded that schooling helps low-SES minority students develop their working memory.
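The logic of a linear growth curve analysis can be illustrated with a simple two-stage sketch (entirely synthetic data; the ECLS-K:2011 measures and the study's actual mixed-model estimation are not reproduced, and the data-generating parameters below are assumptions chosen only to mirror the reported pattern of lower-SES children starting lower but growing faster):

```python
import numpy as np

rng = np.random.default_rng(0)
n_children, n_waves = 200, 4
waves = np.arange(n_waves)  # kindergarten fall through first-grade spring

# hypothetical data-generating process: SES raises initial status (+3)
# but lowers the growth rate (-1)
ses = rng.normal(size=n_children)
intercepts = 50 + 3 * ses + rng.normal(0, 2, n_children)
slopes = 5 - 1 * ses + rng.normal(0, 0.5, n_children)
scores = intercepts[:, None] + slopes[:, None] * waves \
    + rng.normal(0, 1, (n_children, n_waves))

# stage 1: one OLS line per child (intercept = initial status, slope = growth rate)
fits = np.array([np.polyfit(waves, y, 1) for y in scores])  # columns: slope, intercept
est_slopes, est_intercepts = fits[:, 0], fits[:, 1]

# stage 2: regress each growth parameter on the child-level predictor
b_init = np.polyfit(ses, est_intercepts, 1)[0]
b_growth = np.polyfit(ses, est_slopes, 1)[0]
print(f"SES effect on initial status: {b_init:.2f}")
print(f"SES effect on growth rate:   {b_growth:.2f}")
```

A full growth curve model estimates both stages jointly with random effects (e.g., a mixed model), but this two-stage version shows the core idea: separate the between-child variation in initial status from the variation in growth rate, then predict each from covariates.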

Keywords: growth curve modeling, impulsive/overactive behaviors, parenting, working memory

Procedia PDF Downloads 123
15622 Machine Learning in Gravity Models: An Application to International Recycling Trade Flow

Authors: Shan Zhang, Peter Suechting

Abstract:

Predicting trade patterns is critical to decision-making in the public and private domains, especially in the current context of trade disputes among major economies. In the past, U.S. recycling relied heavily on strong demand for recyclable materials overseas. However, starting in 2017, a series of new recycling policies (bans and higher inspection standards) was enacted by multiple countries that had until then been the primary importers of recyclables from the U.S. As the global trade flow of recycling shifts, some new importers, mostly developing countries in South and Southeast Asia, have been overwhelmed by the sheer quantities of scrap materials they have received. As the leading exporter of recyclable materials, the U.S. now has a pressing need to build up its recycling industry domestically. With respect to the global trade in scrap materials used for recycling, the interest of this paper is in (1) predicting how the export of recyclable materials from the U.S. might vary over time, and (2) predicting how international trade flows for recyclables might change in the future. Focusing on three major recyclable materials with a history of trade, this study uses data-driven and machine learning (ML) algorithms---supervised (shrinkage and tree methods) and unsupervised (neural network method)---to decipher the international trade pattern of recycling. Forecasting the potential trade values of recyclables could help the importing countries to which those materials shift next to prepare related trade policies. Such policies can assist policymakers in minimizing negative environmental externalities and in finding the optimal amount of recyclables needed by each country. Such forecasts can also help exporting countries like the U.S. understand the importance of a healthy domestic recycling industry.
The preliminary results suggest that gravity models, combined with a particular selection of macroeconomic predictor variables, are appropriate predictors of the total export value of recyclables. With the inclusion of variables measuring aspects of political conditions (trade tariffs and bans), the predictions show that recyclable materials are shifting from more policy-restricted countries to less policy-restricted countries in the international recycling trade. Those countries also tend to have high manufacturing activity as a percentage of their GDP.
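A log-linearized gravity regression of the kind described can be sketched as follows (entirely synthetic data; the variable names, the policy-ban coefficient of -1.2, and all other parameters are illustrative assumptions, not the study's estimates):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
gdp_o = rng.lognormal(3, 1, n)   # exporter GDP
gdp_d = rng.lognormal(3, 1, n)   # importer GDP
dist = rng.lognormal(7, 0.5, n)  # bilateral distance
ban = rng.integers(0, 2, n)      # 1 = import restriction in place

# hypothetical generating process consistent with a gravity model:
# trade grows with both GDPs, shrinks with distance and with bans
trade = 0.5 * (gdp_o * gdp_d) / dist * np.exp(-1.2 * ban) \
    * rng.lognormal(0, 0.1, n)

# log-linearized gravity regression:
# ln T = a + b1 ln GDP_o + b2 ln GDP_d + b3 ln dist + b4 ban
X = np.column_stack([np.ones(n), np.log(gdp_o), np.log(gdp_d),
                     np.log(dist), ban])
coef, *_ = np.linalg.lstsq(X, np.log(trade), rcond=None)
print("estimated coefficients:", np.round(coef, 2))
```

The recovered coefficients approximate the generating values (b1 ≈ b2 ≈ 1, b3 ≈ -1, b4 ≈ -1.2), which is the sense in which a policy-restriction dummy in a gravity model can capture trade shifting away from restricted countries; the ML methods in the study replace the linear fit with shrinkage, tree, and neural network learners.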

Keywords: environmental economics, machine learning, recycling, international trade

Procedia PDF Downloads 159
15621 Topology Optimization of the Interior Structures of Beams under Various Load and Support Conditions with Solid Isotropic Material with Penalization Method

Authors: Omer Oral, Y. Emre Yilmaz

Abstract:

Topology optimization is an approach that optimizes material distribution within a given design space for given load and boundary conditions so as to meet performance goals. It uses restrictions such as boundary conditions, sets of loads, and constraints to maximize the performance of the system. It differs from size and shape optimization methods but retains some features of both. In this study, the interior structures of parts were optimized using the SIMP (Solid Isotropic Material with Penalization) method. The volume of the part was a pre-assigned parameter, and minimum deflection was the objective function. The basic idea behind the theory was reviewed, and different methods were discussed. The Rhinoceros 3D design tool was used with the Grasshopper and TopOpt plugins to create and optimize parts. A Grasshopper algorithm was designed and tested for different beams, sets of arbitrarily located forces, and support types such as pinned and fixed. Finally, 2.5D shapes were obtained and verified by observing the changes in the density function.
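The core of SIMP is its power-law interpolation between void and solid material; a minimal sketch of that interpolation (the penalization exponent p = 3 is the conventional choice in the SIMP literature, not a value stated in the abstract):

```python
def simp_modulus(rho, E0=1.0, Emin=1e-9, p=3.0):
    """SIMP interpolation: effective Young's modulus of an element
    with density rho in [0, 1]. With p > 1, intermediate densities
    yield disproportionately little stiffness per unit of material,
    so the optimizer is driven toward near-0/1 (void/solid) designs."""
    return Emin + rho**p * (E0 - Emin)

# at rho = 0.5, half the material buys only ~1/8 of the stiffness
for rho in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"rho={rho:.2f}  E={simp_modulus(rho):.4f}")
```

In a full SIMP loop this interpolation feeds the finite element stiffness assembly, and the densities are updated (e.g., by an optimality criteria scheme) subject to the prescribed volume fraction while minimizing compliance/deflection.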

Keywords: Grasshopper, lattice structure, microstructures, Rhinoceros, solid isotropic material with penalization method, TopOpt, topology optimization

Procedia PDF Downloads 119
15620 Estimating the Mean Parameter of the Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods

Authors: Autcha Araveeporn

Abstract:

This paper compares parameter estimation of the mean of a normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data; the Bayes estimator is derived from a prior distribution; and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After estimation, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes of 10, 20, 30, and 50. The results show that the ML and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator, computed with a prior mean of 1 and a prior variance of 12, showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.
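The three estimators can be sketched as follows (a minimal illustration on one simulated dataset with true mean 2 and variance 9; the paper's full simulation grid and hypothesis tests are not reproduced, and treating the data variance as known in the conjugate Bayes step is a simplifying assumption made here for a closed form):

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=2.0, scale=3.0, size=50)  # true mean 2, variance 9
n, xbar = len(data), data.mean()

# 1) Maximum likelihood: the sample average
mle = xbar

# 2) Conjugate Bayes (prior mean 1, prior variance 12, as in the paper;
#    sampling variance treated as known for a closed-form posterior)
mu0, tau2, sigma2 = 1.0, 12.0, 9.0
post_var = 1.0 / (1.0 / tau2 + n / sigma2)
bayes = post_var * (mu0 / tau2 + n * xbar / sigma2)

# 3) Gibbs sampler: alternate draws of mu | sigma2 and sigma2 | mu
mu, s2 = 0.0, 1.0
draws = []
for it in range(5000):
    v = 1.0 / (1.0 / tau2 + n / s2)
    mu = rng.normal(v * (mu0 / tau2 + n * xbar / s2), np.sqrt(v))
    # sigma2 | mu ~ Inverse-Gamma(n/2, sum((x - mu)^2)/2)
    s2 = 1.0 / rng.gamma(n / 2.0, 2.0 / np.sum((data - mu) ** 2))
    if it >= 1000:  # discard burn-in
        draws.append(mu)
mcmc = float(np.mean(draws))

print(f"ML: {mle:.3f}  Bayes: {bayes:.3f}  MCMC: {mcmc:.3f}")
```

With n = 50 all three estimates sit close together; the divergences the paper reports emerge at the small sample sizes (10, 20) and larger variances, where the prior pulls the Bayes estimate toward its mean of 1 and the MCMC chain has more posterior spread.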

Keywords: Bayes method, Markov chain Monte Carlo method, maximum likelihood method, normal distribution

Procedia PDF Downloads 347
15619 Evaluation of Traditional Methods in Construction and Their Effects on Reinforced-Concrete Building Behavior

Authors: E. H. N. Gashti, M. Zarrini, M. Irannezhad, J. R. Langroudi

Abstract:

Using ETABS software, this study analyzed 23 buildings to evaluate the effects of mistakes made during the construction phase on structural behavior. For the modelling, two different loadings were assumed: 1) the design loading and 2) the loading resulting from construction-phase mistakes. The results showed that traditional construction methods led to a significant increase in dead loads and consequently intensified the displacements and base shears of the buildings under seismic loads.

Keywords: reinforced-concrete buildings, construction mistakes, base-shear, displacements, failure

Procedia PDF Downloads 261
15618 A New Method for Winner Determination in Economic Resource Allocation in Cloud Computing Systems

Authors: Ebrahim Behrouzian Nejad, Rezvan Alipoor Sabzevari

Abstract:

Cloud computing systems are large-scale distributed systems that focus on large-scale resource sharing, cooperation among several organizations, and use in new applications. One of the main challenges in this realm is resource allocation, for which many different approaches exist. Among the common approaches are economic methods, and among these, auction-based methods are more prominent than fixed-price methods. The double combinatorial auction is one suitable mechanism for resource allocation in cloud computing. It comprises two phases: winner determination and resource allocation. This paper presents a new method to determine the winners in double combinatorial auction-based resource allocation using the Imperialist Competitive Algorithm (ICA). The experimental results show that with the proposed method the number of winning users is higher than with a genetic algorithm, whereas the number of winning providers is higher with the genetic algorithm.
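For small instances, the winner determination problem itself can be solved by brute force, which clarifies what heuristics such as ICA or genetic algorithms approximate at scale (a sketch with hypothetical bids; the paper's double-auction formulation also matches providers, which is omitted here for brevity):

```python
from itertools import combinations

def winner_determination(bids):
    """Brute-force winner determination for a combinatorial auction:
    choose a set of bids whose requested resource bundles are pairwise
    disjoint and whose total offered value is maximal. Exponential in
    the number of bids, hence the need for heuristics on real systems."""
    best_value, best_set = 0, ()
    for r in range(1, len(bids) + 1):
        for subset in combinations(range(len(bids)), r):
            bundles = [bids[i][0] for i in subset]
            if all(a.isdisjoint(b) for a, b in combinations(bundles, 2)):
                value = sum(bids[i][1] for i in subset)
                if value > best_value:
                    best_value, best_set = value, subset
    return best_value, best_set

# hypothetical bids: (requested resource bundle, offered price)
bids = [({"cpu1", "cpu2"}, 10), ({"cpu2", "cpu3"}, 8), ({"cpu3"}, 5)]
print(winner_determination(bids))  # bids 0 and 2 are compatible: value 15
```

Here bids 0 and 1 conflict on cpu2 and bids 1 and 2 conflict on cpu3, so the optimal allocation accepts bids 0 and 2 for a total of 15; a metaheuristic searches this same combinatorial space without enumerating every subset.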

Keywords: cloud computing, resource allocation, double auction, winner determination

Procedia PDF Downloads 352
15617 Selection of Endophytic Fungi Isolated from Date Palm: Halotolerant Producers of Secondary Metabolites

Authors: Fadila Mohamed Mahmoud, Derkaoui I., Krimi Z.

Abstract:

The date palm is a plant that adapts very well to difficult environmental conditions, in particular drought and saline stress, even at high temperatures. This adaptation is related to the biology of the plant and to the presence of endophytic microflora living inside its tissues. Fifteen endophytic fungi isolated from date palm were tested in vitro in the presence of various NaCl concentrations to select halotolerant isolates. The same endophytes were tested for their colonizing capacity, characterized by the production of secondary metabolites, more particularly enzymes (pectinases, proteases, and phosphorylases), as well as the production of antibiotics and growth hormones. Significant differences were observed between the isolates in the tests carried out.

Keywords: date palm, halotolerance, endophytes, secondary metabolites

Procedia PDF Downloads 509
15616 The Menu Planning Problem: A Systematic Literature Review

Authors: Dorra Kallel, Ines Kanoun, Diala Dhouib

Abstract:

This paper elaborates a Systematic Literature Review (SLR) to select the most outstanding studies addressing the Menu Planning Problem (MPP) and to classify them according to the following three criteria: the methods used, the types of patients, and the required constraints. First, a set of 4,165 studies was collected. After applying the SLR guidelines, this collection was filtered down to 13 studies using specific inclusion and exclusion criteria as well as an accurate analysis of each study. Second, the selected papers were investigated to answer the proposed research questions. Finally, a data synthesis and new perspectives for future work are presented in the closing section.

Keywords: Menu Planning Problem (MPP), Systematic Literature Review (SLR), classification, exact and approximate methods

Procedia PDF Downloads 264
15615 A Study of Rapid Replication of Square-Microlens Structures

Authors: Ting-Ting Wen, Jung-Ruey Tsai

Abstract:

This paper reports a method for the replication of micro-scale structures. Using an electromagnetic force-assisted imprinting system with a magnetic soft stamp bearing written square-microlens cavities, photopolymer square-microlens structures can be rapidly fabricated. Under proper processing conditions, polymeric square-microlens structures with a feature width of 100.3 µm and a height of 15.2 µm can be successfully fabricated across a large area. Scanning electron microscopy (SEM) and surface profiler observations confirm that the micro-scale polymer structures are produced without defects or distortion and with good pattern fidelity over a 60 × 60 mm² area. This technique shows great potential for the efficient replication of micro-scale structure arrays at room temperature with high productivity and low cost.

Keywords: square-microlens structures, electromagnetic force-assisted imprinting, magnetic soft stamp

Procedia PDF Downloads 318
15614 Genomics of Adaptation in the Sea

Authors: Agostinho Antunes

Abstract:

The completion of human genome sequencing in 2003 opened a new perspective on the importance of whole-genome sequencing projects, and currently multiple species are having their genomes completely sequenced, from simple organisms, such as bacteria, to more complex taxa, such as mammals. The voluminous sequencing data generated across multiple organisms provide a framework to better understand the genetic makeup of these and related species, allowing exploration of the genetic changes underlying the evolution of diverse phenotypic traits. Here, recent results from our group, retrieved from comparative evolutionary genomic analyses of selected marine animal species, are considered to exemplify how gene novelty and gene enhancement by positive selection may have been determinant in the success of adaptive radiations into diverse habitats and lifestyles.

Keywords: marine genomics, evolutionary bioinformatics, human genome sequencing, genomic analyses

Procedia PDF Downloads 598