Search results for: lean tools and techniques
8500 Modern Proteomics and the Application of Machine Learning Analyses in Proteomic Studies of Chronic Kidney Disease of Unknown Etiology
Authors: Dulanjali Ranasinghe, Isuru Supasan, Kaushalya Premachandra, Ranjan Dissanayake, Ajith Rajapaksha, Eustace Fernando
Abstract:
Proteomics studies of organisms are considered to be significantly more information-rich than their genomic counterparts, because the proteome represents the expressed state of all proteins of an organism at a given time. In modern top-down and bottom-up proteomics workflows, the primary analysis methods are gel-based methods, such as two-dimensional (2D) electrophoresis, and mass spectrometry-based methods. Machine learning (ML) and artificial intelligence (AI) have been used increasingly in modern biological data analyses. In particular, the fields of genomics, DNA sequencing, and bioinformatics have seen a growing use of ML and AI techniques in recent years. The use of these techniques in proteomics studies is only now beginning to materialise. Although there is a wealth of information in the scientific literature pertaining to proteomics workflows, no comprehensive review addresses the various aspects of the combined use of proteomics and machine learning. The objective of this review is to provide a comprehensive outlook on the application of machine learning to known proteomics workflows in order to extract more meaningful information that could be useful in a plethora of applications such as medicine, agriculture, and biotechnology.
Keywords: proteomics, machine learning, gel-based proteomics, mass spectrometry
Procedia PDF Downloads 154
8499 Fe-BTC Based Electrochemical Sensor for Anti-Psychotic and Anti-Migraine Drugs: Aripiprazole and Rizatriptan
Authors: Sachin Saxena, Manju Srivastava
Abstract:
The present study describes a stable, highly sensitive and selective analytical sensor. Fe-BTC was synthesized at room temperature using the noble iron-trimesate system. The high surface area of the as-synthesized Fe-BTC proved MOFs to be ideal modifiers for the glassy carbon electrode (GCE). Characterization techniques such as TGA, XRD, FT-IR, and BET analysis (BET surface area = 1125 m²/g) explained the electrocatalytic behaviour of Fe-BTC towards these two drugs. The material formed is cost-effective and exhibits higher catalytic activity towards the analyte systems. The synergism between the synthesized Fe-BTC and electroanalytical techniques helped in developing a highly sensitive analytical method for studying the redox fate of ARP and RZ, respectively. Cyclic voltammetry of the ferricyanide system showed a 132% enhancement in peak current at Fe-BTC/GCE compared with the bare GCE. The response characteristics of cyclic voltammetry (CV) and square wave voltammetry (SWV) revealed that ARP and RZ could be effectively accumulated at Fe-BTC/GCE. On the basis of the electrochemical measurements, electrode dynamics parameters have been evaluated. The present study opens up a new field of application for MOF-modified GCEs in drug sensing.
Keywords: MOFs, anti-psychotic, electrochemical sensor, anti-migraine drugs
Procedia PDF Downloads 171
8498 Research on Structural Changes in Plastic Deformation during Rolling and Crimping of Tubes
Authors: Hein Win Zaw
Abstract:
Today, advanced strategies for aircraft production technology demand higher performance while also meeting demanding process requirements and reducing production costs. Professionals working in these areas are therefore attempting to develop new materials, improve the manufacturability of designs, and create new technological processes, tools and equipment. This paper discusses research on structural changes caused by plastic deformation during rotary expansion and crimping of pipes. Pipelines experience high pressure and pulsating loads; for this reason, high demands are placed on the mechanical properties of the material, the quality of the external and internal surfaces, the preservation of the cross-sectional shape, and the minimum thickness of the pipe wall. In the manufacture of pipes, various operations are used: expansion, crimping, bending, etc. The processes of rotary expansion and crimping of pipes are the most widely used for various semi-finished products and connecting elements. With the use of high-strength, less ductile materials, these conventional techniques do not yield high-quality parts and have low economic efficiency; research in this field is therefore highly relevant. Rotary expansion and crimping of pipes are accompanied by inhomogeneous plastic deformation, which leads to structural changes in the material and causes deformation hardening, thereby changing the operational reliability of the product. Tube parts obtained by rotary expansion and crimping vary in form and are characterized by different diameters in different sections, formed as a result of inhomogeneous plastic deformation. The reliability of a coupling obtained by rotary expansion and crimping is determined by the structural arrangement of the material formed during the forming process; there is a maximum permissible value of deformation, the exceeding of which is unacceptable. The structural state of the material in this condition is determined by the technological mode of forming during rotary expansion and crimping. Considering the above, the objective of the present study is to investigate the structural changes at different levels of plastic deformation accompanying rotary expansion and crimping, and to analyse the stress concentrators at different scale levels responsible for the formation of the primary fracture zone.
Keywords: plastic deformation, rolling of tubes, crimping of tubes, structural changes
Procedia PDF Downloads 336
8497 Clustering of Association Rules of ISIS & Al-Qaeda Based on Similarity Measures
Authors: Tamanna Goyal, Divya Bansal, Sanjeev Sofat
Abstract:
For world-threatening terrorist attacks, early detection, distinction, and prediction are effective diagnostic techniques, and many data mining and statistical approaches exist for functionally accurate and precise analysis of terrorism data. The computational extraction of derived patterns is a non-trivial task that comprises specific domain discovery by means of sophisticated algorithm design and analysis. This paper proposes an approach for similarity extraction by obtaining the useful attributes from the available datasets of terrorist attacks, applying a feature selection technique based on statistical impurity measures, and then clustering on the basis of similarity measures. Based on the degree of participation of attributes in the rules, the associative dependencies between the attacks are analyzed. To compute the similarity among the discovered rules, a weighted similarity measure is applied. Finally, the rules are grouped using hierarchical clustering. We have applied the approach to an open-source dataset to determine the usability and efficiency of our technique, and a literature search was also carried out to support the efficiency and accuracy of our results.
Keywords: association rules, clustering, similarity measure, statistical approaches
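The abstract does not spell out the exact similarity formula; a minimal sketch of the general idea, a weighted attribute-overlap similarity between rules followed by agglomerative (hierarchical) clustering, might look like the following. The rule attribute sets, weights, and linkage choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical association rules, each described by the attributes it uses.
rules = [
    {"attack_type", "weapon", "region"},
    {"attack_type", "weapon", "target"},
    {"group", "region", "casualties"},
    {"group", "target", "casualties"},
]
# Illustrative weights reflecting each attribute's degree of participation in the rule set.
weights = {"attack_type": 0.9, "weapon": 0.7, "region": 0.6,
           "target": 0.5, "group": 0.8, "casualties": 0.4}

def weighted_similarity(a, b):
    """Weighted Jaccard-style similarity between two rules' attribute sets."""
    shared = sum(weights[x] for x in a & b)
    union = sum(weights[x] for x in a | b)
    return shared / union if union else 0.0

n = len(rules)
# Condensed distance vector (1 - similarity) expected by SciPy's linkage function.
dist = [1.0 - weighted_similarity(rules[i], rules[j])
        for i in range(n) for j in range(i + 1, n)]

Z = linkage(dist, method="average")              # agglomerative (hierarchical) clustering
labels = fcluster(Z, t=2, criterion="maxclust")  # cut the dendrogram into 2 groups
print(labels)
```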
Procedia PDF Downloads 323
8496 Sustainable Manufacturing of Concentrated Latex and Ribbed Smoked Sheets in Sri Lanka
Authors: Pasan Dunuwila, V. H. L. Rodrigo, Naohiro Goto
Abstract:
Sri Lanka is one of the largest natural rubber (NR) producers in the world, and the NR industry is a major foreign exchange earner. Among locally manufactured NR products, concentrated latex (CL) and ribbed smoked sheets (RSS) hold a significant position. Furthermore, these products are the foundation for many goods utilized by people all over the world (e.g., gloves, condoms, tires, etc.). Processing of CL and RSS consumes a significant amount of material, energy, and labour. Against this background, both manufacturing lines have been immensely challenged by waste, low productivity, lack of cost efficiency, rising production costs, and many environmental issues. To face these challenges, the adoption of sustainable manufacturing measures that use less energy, water, and materials, and that produce less waste, is imperative. However, these sectors lack comprehensive studies that shed light on such measures and thoroughly discuss their improvement potential from both environmental and economic points of view. Therefore, based on a study of three CL and three RSS mills in Sri Lanka, this study deploys sustainable manufacturing techniques and tools to uncover the underlying potential to improve performance in the CL and RSS processing sectors. The study comprises three steps: 1. quantification of average material waste, economic losses, and greenhouse gas (GHG) emissions via material flow analysis (MFA), material flow cost accounting (MFCA), and life cycle assessment (LCA) in each manufacturing process; 2. identification of improvement options with the help of Pareto and what-if analyses, field interviews, and the existing literature; and 3. validation of the identified improvement options via re-execution of MFA, MFCA, and LCA. With the help of this methodology, the economic and environmental hotspots and the degrees of improvement in both systems could be identified. Results highlighted that each process could be improved to produce less waste, monetary loss, manufacturing cost, and GHG emissions. In conclusion, the study's methodology and findings are believed to be beneficial for assuring sustainable growth not only in the Sri Lankan NR processing sector but also in NR and other industries in other developing countries.
Keywords: concentrated latex, natural rubber, ribbed smoked sheets, Sri Lanka
Procedia PDF Downloads 262
8495 Neural Networks Based Prediction of Long Term Rainfall: Nine Pilot Study Zones over the Mediterranean Basin
Authors: Racha El Kadiri, Mohamed Sultan, Henrique Momm, Zachary Blair, Rachel Schultz, Tamer Al-Bayoumi
Abstract:
The Mediterranean Basin is a very diverse region of nationalities and climate zones, with a strong dependence on agricultural activities. Predicting long-term rainfall (with a lead of 1 to 12 months) and future droughts could contribute to the sustainable management of water resources and economic activities. In this study, an integrated approach was adopted to construct predictive tools with lead times of 0 to 12 months to forecast rainfall amounts over nine subzones of the Mediterranean Basin. The following steps were conducted: (1) acquire, assess, and intercorrelate temporal remote sensing-based rainfall products (e.g., the CPC Merged Analysis of Precipitation [CMAP]) throughout the investigation period (1979 to 2016); (2) acquire and assess monthly values for all of the climatic indices influencing the regional and global climatic patterns (e.g., Northern Atlantic Oscillation [NOI], Southern Oscillation Index [SOI], and Tropical North Atlantic Index [TNA]); (3) delineate homogeneous climatic regions and select nine pilot study zones; (4) apply data mining methods (e.g., neural networks, principal component analyses) to extract relationships between the observed rainfall and the controlling factors (i.e., climatic indices with multiple lead-time periods); and (5) use the constructed predictive tools to forecast monthly rainfall and dry and wet periods. Preliminary results indicate that rainfall and dry/wet periods were successfully predicted with lead times of 0 to 12 months using the adopted methodology, and that the approach is more accurate in the southern Mediterranean region.
Keywords: rainfall, neural networks, climatic indices, Mediterranean
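As a rough illustration of steps (4) and (5), a minimal sketch of training a neural network on lagged climatic indices to forecast monthly rainfall at a fixed lead time might look like the following. The synthetic data, the three-index setup, the 12-month lag window, and the network size are assumptions for illustration; the authors' actual data and architecture are not specified here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_months = 456                      # roughly 1979-2016

# Synthetic monthly climatic indices standing in for SOI, TNA, and similar series.
indices = rng.standard_normal((n_months, 3))
# Synthetic rainfall that responds to the first index three months earlier.
rain = 50 + 10 * np.roll(indices[:, 0], 3) + rng.normal(0, 5, n_months)

lead, max_lag = 3, 12               # forecast lead time and lag window, in months
X, y = [], []
for t in range(max_lag - 1, n_months - lead):
    X.append(indices[t - max_lag + 1:t + 1].ravel())  # most recent 12 values of every index
    y.append(rain[t + lead])                          # rainfall 'lead' months ahead
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)
model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("R^2 on held-out months:", round(model.score(X_te, y_te), 3))
```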
Procedia PDF Downloads 316
8494 Natural Dyes in Schools. Development of Techniques From Early Childhood as a Tool for Art, Design and Sustainability
Authors: Luciana Marrone
Abstract:
Natural dyes are a great resource for today's artists and designers, providing endless possibilities for design and sustainability. This research and development project focuses on making these dyeing and painting methodologies reach the widest possible range of students. The main objective is to inform and train, free of charge, teachers and students from different academic institutions at different levels: kindergarten, primary, secondary, tertiary, and university. In the first instance, institutions from Argentina, Chile, Uruguay, Mexico, Spain, Italy, Colombia, Paraguay, Venezuela, Brazil, and Australia joined the project, reaching the grassroots of education from the very beginning. Natural dyes will thus become part of everyday life for more people, who can achieve their own colours for art, textiles, or any other application. Knowledge of these techniques and resources gives the student a fundamental, sustainable tool and opens endless possibilities even in places or homes with few economic resources. In this way, natural dyes are not only part of the world of designers but are also incorporated from the basics, becoming a resource applicable in different areas, even in places with few economic or development possibilities.
Keywords: art, education, natural dyes, sustainability, textile design
Procedia PDF Downloads 86
8493 Low Probability of Intercept (LPI) Signal Detection and Analysis Using Choi-Williams Distribution
Authors: V. S. S. Kumar, V. Ramya
Abstract:
In modern electronic warfare, the signal scenario is changing at a rapid pace with the introduction of Low Probability of Intercept (LPI) radars. On the modern battlefield, radar systems face serious threats from passive intercept receivers such as Electronic Attack (EA) systems and Anti-Radiation Missiles (ARMs). To perform the necessary target detection and tracking while simultaneously hiding from enemy attack, radar systems should be LPI. These LPI radars use a variety of complex signal modulation schemes together with pulse compression, aided by advances in radar signal processing, so that the radar performs target detection and tracking while remaining hidden from threats such as EA, thus posing a major challenge to ES/ELINT receivers. Today, an increasing number of LPI radars are being introduced into modern platforms and weapon systems, creating a requirement for armed forces to develop new techniques, strategies, and equipment to counter them. This paper presents various modulation techniques used in the generation of LPI signals and the development of time-frequency algorithms to analyse those signals.
Keywords: anti-radiation missiles, cross terms, electronic attack, electronic intelligence, electronic warfare, intercept receiver, low probability of intercept
Procedia PDF Downloads 476
8492 Embedded Hybrid Intuition: A Deep Learning and Fuzzy Logic Approach to Collective Creation and Computational Assisted Narratives
Authors: Roberto Cabezas H
Abstract:
The current work shows the methodology developed to create narrative lighting spaces for the multimedia performance piece 'cluster: the vanished paradise.' This empirical research focuses on exploring unconventional roles for machines in subjective creative processes, by delving into the semantics of data and machine intelligence algorithms in hybrid technological, creative contexts to expand epistemic domains through human-machine cooperation. The creative process in the scenic and performing arts is guided mostly by intuition; from that idea, we developed an approach to embed collective intuition in computational creative systems by joining the properties of Generative Adversarial Networks (GANs) and fuzzy clustering in a semi-supervised data creation and analysis pipeline. The model uses GANs to learn from phenomenological data (data generated from experience with lighting scenography) and algorithmic design data (data augmented by procedural design methods); fuzzy clustering is then applied to the data artificially created by the GANs to define narrative transitions built on a membership index. This process allowed for the creation of simple and complex spaces with expressive capabilities, using position and light intensity as the parameters to guide the narrative. Hybridization comes not only from the human-machine symbiosis but also from the integration of different techniques in the implementation of the aided design system. Machine intelligence tools as proposed in this work are well suited to redefine collaborative creation by learning to express and expand a conglomerate of ideas and a wide range of opinions for the creation of sensory experiences. We found GANs and fuzzy logic to be ideal tools for developing new computational models based on interaction, learning, emotion, and imagination to expand the traditional algorithmic model of computation.
Keywords: fuzzy clustering, generative adversarial networks, human-machine cooperation, hybrid collective data, multimedia performance
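A minimal sketch of the fuzzy-clustering half of this pipeline, plain fuzzy c-means applied to points described by stage position and light intensity, might look like the following. The synthetic data stand in for GAN outputs, and the cluster count and fuzzifier are illustrative assumptions rather than the production system's settings.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "lighting states": columns stand for stage position and light intensity
# (both scaled 0-1), playing the role of data a generative model might produce.
data = np.vstack([
    rng.normal([0.2, 0.8], 0.05, size=(50, 2)),
    rng.normal([0.7, 0.3], 0.05, size=(50, 2)),
    rng.normal([0.5, 0.6], 0.05, size=(50, 2)),
])

def fuzzy_c_means(x, c=3, m=2.0, iters=100, tol=1e-6):
    """Plain fuzzy c-means: returns cluster centres and the membership matrix."""
    u = rng.random((len(x), c))
    u /= u.sum(axis=1, keepdims=True)              # memberships of each point sum to 1
    for _ in range(iters):
        um = u ** m
        centres = (um.T @ x) / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(x[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        ratio = dist[:, :, None] / dist[:, None, :]
        new_u = 1.0 / (ratio ** (2.0 / (m - 1.0))).sum(axis=2)
        if np.abs(new_u - u).max() < tol:
            u = new_u
            break
        u = new_u
    return centres, u

centres, memberships = fuzzy_c_means(data, c=3)
# A point's membership vector can be read as a soft weight for "narrative transitions".
print(np.round(centres, 2))
print(np.round(memberships[:3], 2))
```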
Procedia PDF Downloads 144
8491 From Design, Experience and Play Framework to Common Design Thinking Tools: Using Serious Modern Board Games
Authors: Micael Sousa
Abstract:
Board games (BGs) are thriving as new designs emerge from the hobby community and reach greater audiences all around the world. Although digital games gather most of the attention in game studies and serious games research, the post-digital movement helps to explain why, in a world dominated by digital technologies, analog experiences are still unique and irreplaceable to users, allowing innovation in new hybrid environments. The new BG designs are part of these post-digital and hybrid movements because they result from the use of powerful digital tools that enable production and knowledge sharing about BGs and their unique face-to-face social experiences. These new BGs, defined as modern by many authors, provide innovative designs and unique game mechanics that are not yet fully explored by the main serious games (SG) approaches. Even the most established frameworks for SGs, understood as fun games implemented to achieve predefined goals, need more development, especially when considering modern BGs. Despite many anecdotal perceptions, researchers are only now starting to rediscover BGs and demonstrate their potential. They are proving that BGs are easy to adapt and to grasp by non-expert players in experimental approaches, with the possibility of straightforward adaptation to players' profiles and serious objectives even during gameplay. Although there are many design thinking (DT) models and practices, their relations with SG frameworks are also underdeveloped, mostly because this is a new research field lacking theoretical development and systematization of experimental practices. Using BGs as case studies promises to help develop these frameworks. Departing from the Design, Experience, and Play (DPE) framework and considering the Common Design Thinking Tools (CDST), this paper proposes a new experimental framework for the adaptation and development of modern BG design for DT: the Design, Experience, and Play for Think (DPET) experimental framework. This is done through the systematization of the DPE and CDST approaches applied in two case studies, where two different sequences of adapted BGs were employed to establish a collaborative DT process. The two sessions took place with different participants and in different contexts, also using different sequences of games for the same DT approach. The first session took place at the Faculty of Economics at the University of Coimbra in a training session on serious games for project development. The second session took place in the Casa do Impacto through The Great Village Design Jam light. Both sessions had the same duration and were designed to progressively achieve DT goals, using BGs as SGs in a collaborative process. The results from the sessions show that a sequence of BGs, when properly adapted to the DPET framework, can generate a viable and innovative process of collaborative DT that is productive, fun, and engaging. The proposed DPET framework intends to help establish how new SG solutions can be defined for new goals through flexible DT. Applications in other areas of research and development can also benefit from these findings.
Keywords: board games, design thinking, methodology, serious games
Procedia PDF Downloads 116
8490 Machine Learning Techniques to Predict Cyberbullying and Improve Social Work Interventions
Authors: Oscar E. Cariceo, Claudia V. Casal
Abstract:
Machine learning offers a set of techniques to promote social work interventions and can support practitioners' decisions by predicting new behaviors based on data produced by organizations, service agencies, users, clients, or individuals. Machine learning techniques include a set of generalizable algorithms that are data-driven, which means that rules and solutions are derived by examining data, based on the patterns present within any data set. In other words, the goal of machine learning is to teach computers through 'examples', training on data to test specific hypotheses and predict a certain outcome based on a current scenario, and to improve with that experience. Machine learning can be classified into two general categories depending on the nature of the problem to be tackled. First, supervised learning involves a dataset whose outputs are already known. Supervised learning problems are categorized into regression problems, which predict quantitative variables using a continuous function, and classification problems, which predict discrete qualitative variables. For social work research, machine learning generates predictions as a key element for improving social interventions on complex social issues by providing better inference from data and establishing more precise estimated effects, for example in services that seek to improve their outcomes. This paper presents the results of a classification algorithm to predict cyberbullying among adolescents. Data were retrieved from the National Polyvictimization Survey conducted by the government of Chile in 2017. A logistic regression model was created to predict whether an adolescent would experience cyberbullying based on the interaction of gender, age, grade, type of school, and self-esteem sentiments. The model can predict with an accuracy of 59.8% whether an adolescent will suffer cyberbullying. These results can help to promote programs to prevent cyberbullying at schools and improve evidence-based practice.
Keywords: cyberbullying, evidence-based practice, machine learning, social work research
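A minimal sketch of the kind of classification model described here, a logistic regression predicting a cyberbullying outcome from demographic and self-esteem predictors, might look like the following in scikit-learn. The survey data are not reproduced here, so the column names and synthetic values are illustrative assumptions, not the authors' dataset.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n = 1000

# Synthetic stand-in for survey responses; the real predictors would come from the
# National Polyvictimization Survey records.
df = pd.DataFrame({
    "gender": rng.integers(0, 2, n),          # hypothetical 0/1 coding
    "age": rng.integers(12, 18, n),
    "grade": rng.integers(7, 12, n),
    "public_school": rng.integers(0, 2, n),
    "self_esteem": rng.normal(0, 1, n),
})
# Hypothetical outcome loosely related to self-esteem, for illustration only.
p = 1 / (1 + np.exp(-(-0.5 - 0.8 * df["self_esteem"])))
df["cyberbullied"] = rng.binomial(1, p)

X = df.drop(columns="cyberbullied")
y = df["cyberbullied"]
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("Accuracy:", round(accuracy_score(y_te, model.predict(X_te)), 3))
```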
Procedia PDF Downloads 170
8489 The Effectiveness of Electronic Local Financial Management Information System (ELFMIS) in Mempawah Regency, West Borneo Province, Indonesia
Authors: Muhadam Labolo, Afdal R. Anwar, Sucia Miranti Sipisang
Abstract:
The Electronic Local Finance Management Information System (ELFMIS) is an integrated application used as a tool by local governments to improve the effectiveness of implementing regulations across the various areas of financial management. The 'Appropriate with Exceptions' opinion (WDP) issued by the Indonesia Audit Agency (BPK) for the Mempawah local government points to a financial management problem that must be addressed to avoid mistakes in decision-making. The use of ELFMIS by the Mempawah authority has not yet performed maximally. These problems became the basis for research measuring the effectiveness of ELFMIS in Mempawah regency. This research uses the indicator variables for measuring information system effectiveness proposed by Bodnar. The research is descriptive with an inductive approach. Data collection mixed qualitative and quantitative techniques, using questionnaires, interviews, and documentation. The obstacles in the Local Finance Board (LFB) to the application of ELFMIS include connectivity, the quality and quantity of human resources, the realization of financial resources, the absence of maintenance and other ELFMIS facilities, and the verification of financial information.
Keywords: effectiveness, E-LFMIS, finance, local government, system
Procedia PDF Downloads 220
8488 Application of Voltammetry as a Non-Destructive Tool to Quantify Cathodic Protection of Steel in Simulated Soil Solution
Authors: Mandlenkosi G. R. Mahlobo, Peter A. Olubambi
Abstract:
Cathodic protection (CP) has been widely considered a suitable technique for mitigating corrosion of steel structures buried in soil. Considerable effort has been made in developing techniques, in particular non-destructive techniques, for monitoring and quantifying the effectiveness of CP to ensure the sustainability and performance of buried steel structures. This study was aimed at using a specifically modified voltammetry approach as a non-destructive tool to monitor and quantify the effectiveness of CP of steel in simulated soil. Carbon steel was subjected to electrochemical tests in NS4 solution, used to simulate soil conditions, for four days before applying CP for a further 11 days. The specifically modified voltammetry technique was applied at various time intervals of the experiment to monitor the corrosion behaviour and thereby reflect CP effectiveness. The voltammetry results revealed that the application of CP reduced the corrosion rate from a peak value of 410 µm/yr to 8 µm/yr between days 5 and 14 of the experiments. Microstructural analysis of the steel surface performed using X-ray diffraction identified a calcareous deposit as the dominant phase protecting the surface from corrosion. It was deduced that the formation of calcareous deposits was linked with the effectiveness of the CP of steel.
Keywords: carbon steel, cathodic protection, NS4 solution, voltammetry, XRD
Procedia PDF Downloads 71
8487 Ground Improvement Using Deep Vibro Techniques at Madhepura E-Loco Project
Authors: A. Sekhar, N. Ramakrishna Raju
Abstract:
This paper presents the results of ground improvement using deep vibro techniques with a combination of sand and stone columns, performed on a highly liquefaction-susceptible site (70 to 80% sand strata and the balance silt) with low bearing capacity and high settlements, located in earthquake zone V (as per the IS code) at Madhepura, Bihar state, in the northern part of India. Initially, bored cast-in-situ/precast piles and stone/sand columns were envisaged. However, after detailed analysis to address both liquefaction and bearing capacity simultaneously, deep vibro techniques with a combination of sand and stone columns were found to be an excellent solution for the given site conditions, possibly for the first time in India. First, after a detailed soil investigation, a pre-treatment eCPT test was conducted to evaluate the potential depth of liquefaction and the densification of the silty sandy soils needed to improve the factor of safety against liquefaction. Trial tests were then carried out at the site using the deep vibro compaction technique with combined sand and stone columns, at different column spacings in a triangular pattern and with different timings during each lift of the vibro probe up to ground level. Different spacings and timings were tested to obtain the most effective combination for the vibro compaction technique, so as to achieve maximum and uniform densification of the saturated loose silty sandy soils across the entire treated area. Post-treatment eCPT tests and plate load tests were then conducted at all trial locations with different spacings and timings of sand and stone columns, to identify the combination that achieved the required factor of safety against liquefaction and the desired bearing capacity with reduced settlements for the construction of industrial structures. Reviewing these results showed that the ground layers were densified more than expected, with an improved factor of safety against liquefaction and good bearing capacities for the given settlements as per IS code provisions. The cost-effectiveness of using the deep vibro technique with sand columns alone (avoiding stone) was also worked out for lightly loaded single-storey structures, and the results were found satisfactory for supporting lightly loaded foundations. The most important aspect of this technique is mitigating liquefaction while simultaneously improving bearing capacity and reducing settlements to acceptable limits as per IS: 1904-1986, up to a depth of 19 m. To the best of our knowledge, this was executed for the first time in India.
Keywords: ground improvement, deep vibro techniques, liquefaction, bearing capacity, settlement
Procedia PDF Downloads 197
8486 Heuristic Classification of Hydrophone Recordings
Authors: Daniel M. Wolff, Patricia Gray, Rafael de la Parra Venegas
Abstract:
An unsupervised machine listening system is constructed and applied to a dataset of 17,195 30-second marine hydrophone recordings. The system is then heuristically supplemented with anecdotal listening, contextual recording information, and supervised learning techniques to reduce the number of false positives. Features for classification are assembled by extracting the following data from each of the audio files: the spectral centroid, root-mean-square values for each frequency band of a 10-octave filter bank, and mel-frequency cepstral coefficients in 5-second frames. In this way, both time- and frequency-domain information are contained in the features passed to a clustering algorithm. Classification is performed using the k-means algorithm followed by a k-nearest neighbors search. Different values of k are experimented with, in addition to different combinations of the available feature sets. The hypothesized class labels are 'primarily anthrophony' and 'primarily biophony', and the best class result conforming to the former label has 104 members after heuristic pruning. This demonstrates how a large audio dataset has been made more tractable with machine learning techniques, forming the foundation of a framework designed to acoustically monitor and gauge biological and anthropogenic activity in a marine environment.
Keywords: anthrophony, hydrophone, k-means, machine learning
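A minimal sketch of this kind of pipeline, frame-level audio features fed to k-means and then a nearest-neighbour search, might look like the following with librosa and scikit-learn. The synthetic clips, frame length, and cluster count are illustrative assumptions; the authors' exact 10-octave filter bank and feature combinations are not reproduced here.

```python
import numpy as np
import librosa
from sklearn.cluster import KMeans
from sklearn.neighbors import NearestNeighbors
from sklearn.preprocessing import StandardScaler

SR = 22050
FRAME_SECONDS = 5

def clip_features(y, sr=SR):
    """Average spectral centroid, RMS, and MFCCs over 5-second frames of one clip."""
    frame_len = sr * FRAME_SECONDS
    feats = []
    for start in range(0, len(y) - frame_len + 1, frame_len):
        frame = y[start:start + frame_len]
        centroid = librosa.feature.spectral_centroid(y=frame, sr=sr).mean()
        rms = librosa.feature.rms(y=frame).mean()      # crude stand-in for per-band RMS
        mfcc = librosa.feature.mfcc(y=frame, sr=sr, n_mfcc=13).mean(axis=1)
        feats.append(np.concatenate([[centroid, rms], mfcc]))
    return np.mean(feats, axis=0)

# Synthetic 30-second "recordings": broadband noise vs. a tonal hum, standing in for
# hydrophone clips that would normally be read with librosa.load(path, sr=SR).
rng = np.random.default_rng(0)
t = np.arange(30 * SR) / SR
clips = [rng.normal(0, 0.1, t.size),
         0.3 * np.sin(2 * np.pi * 120 * t) + rng.normal(0, 0.02, t.size),
         rng.normal(0, 0.1, t.size),
         0.3 * np.sin(2 * np.pi * 120 * t) + rng.normal(0, 0.02, t.size)]

X = StandardScaler().fit_transform([clip_features(c.astype(np.float32)) for c in clips])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("cluster labels:", kmeans.labels_)               # candidate anthrophony/biophony split

# A nearest-neighbour search lets a listener audit a cluster by pulling similar clips.
nn = NearestNeighbors(n_neighbors=2).fit(X)
_, neighbours = nn.kneighbors(X[:1])
print("clips most similar to clip 0:", neighbours[0])
```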
Procedia PDF Downloads 171
8485 Educational Data Mining: The Case of the Department of Mathematics and Computing in the Period 2009-2018
Authors: Mário Ernesto Sitoe, Orlando Zacarias
Abstract:
University education is influenced by several factors, ranging from the adoption of strategies to strengthen the whole process to the improvement of students' own academic performance. This work uses data mining techniques to develop a predictive model to identify students with a tendency towards evasion (dropout) and retention. To this end, a database of real student data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used. The data comprised 388 undergraduate students admitted between 2009 and 2014. The Weka tool was used for model building, using three different techniques, namely: k-nearest neighbors, random forest, and logistic regression. To allow for training on multiple train-test splits, a cross-validation approach was employed with a varying number of folds. To reduce bias and variance and improve the performance of the models, the ensemble methods of bagging and stacking were used. After comparing the results obtained by the three classifiers, logistic regression using bagging with seven folds obtained the best performance, showing results above 90% in all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
Keywords: evasion and retention, cross-validation, bagging, stacking
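The study itself used Weka; an equivalent sketch of the winning configuration, a bagged logistic regression evaluated with seven-fold cross-validation, might look like the following in a recent version of scikit-learn. The synthetic features below are placeholders for the admission and academic records, which are not public.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Placeholder for the 388-student dataset (admission scores, grades, and similar fields).
X, y = make_classification(n_samples=388, n_features=10, n_informative=6,
                           weights=[0.6, 0.4], random_state=0)

# Bagging ensemble whose base learner is logistic regression.
model = BaggingClassifier(
    estimator=LogisticRegression(max_iter=1000),
    n_estimators=25,
    random_state=0,
)

# Seven-fold cross-validation, mirroring the fold count reported in the study.
scores = cross_val_score(model, X, y, cv=7, scoring="accuracy")
print("Mean CV accuracy:", round(scores.mean(), 3))
```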
Procedia PDF Downloads 87
8484 Self-Marketing on Line Person-to-Person Social Media
Authors: Chih-Ping Chen
Abstract:
Today, technology does not in itself necessitate change; rather, social media has afforded a new arena and digital tools for users/individuals to be symbolized and marketed in meaningful exchanges of digital identities. We argue that these symbolic interactions may afford individuals the ability to create and present less restricted Line person-to-person (P2P) chats than would be possible in face-to-face communication. Individuals can select flexible influence strategies to market themselves, which enables them to create and present their digital identities and impressions in alternative ways within a dynamic sociocultural context. Therefore, this paper aims to explore the novel phenomenon of how individuals market themselves to manage their digital identities and impressions and to connect with other users through the symbolic interactions created by new digital tools (e.g., stickers). A netnographic approach was developed by applying a triangulated methodology consisting of user self-diary reports, in-depth interviews, and observations. In total, 20 participants (10 females and 10 males) of Taiwanese origin took part, with ages ranging from 20 to 47 years. The findings of this research showed that, on Line P2P social media, traditional cultural gender norms have shifted. Both male and female participants market their modern digital identities by adopting a combination of flexible influence tactics/strategies when using digital stickers. Some findings showed that their influence tactics/strategies often flouted Taiwanese cultural gender norms or skirted traditional rules to fit individual or P2P needs. Finally, these findings potentially contribute to the literature on consumer culture theory and symbolic interaction theory in the digital marketing and social media fields.
Keywords: consumer culture theory, digital sticker, self-marketing, impression, symbolic interaction
Procedia PDF Downloads 83
8483 An Empirical Investigation of Factors Influencing Construction Project Selection Processes within the Nigeria Public Sector
Authors: Emmanuel U. Unuafe, Oyegoke T. Bukoye, Sandhya Sastry, Yanqing Duan
Abstract:
Globally, there is increasing interest in project management due to a shortage in infrastructure service supply capability. Hence, it is of utmost importance that organisations understand that choosing one project over another carries an opportunity cost, tying up the organisation's resources. The need to devise constructive ways to bring direction, structure, and oversight to the process of project selection has led to the development of tools and techniques by researchers and practitioners. However, despite the development of various frameworks to assist in the appraisal and selection of government projects, failures are still being recorded with government projects. In developing countries, where frameworks are rarely used, the problems are compounded. To improve the situation, this study investigates the current practice of construction project selection processes within the Nigerian public sector in order to inform theories of decision making from the perspective of developing nations and project management practice. Unlike other research on construction projects in Nigeria, this research concentrates on the factors influencing the selection process within the Nigerian public sector, which has received limited study. The authors report the findings of semi-structured interviews with top management in the Nigerian public sector and draw conclusions in terms of extant decision-making theory and current practice. Preliminary results from the data analysis show that project selection decisions are made by groups, and that this forces sub-optimal decisions due to pressure on time, clashes of interest, the lack of a standardised framework for selecting projects, lack of accountability, and poor leadership. Moreover, because decision makers are usually drawn from different fields, religious beliefs, ethnic groups, and languages, an individual's choice of project is influenced more by experience and political precedence than by realistic investigation, as well as by his or her understanding of the desired outcome of the project; in other words, by the individual's ideology and level of fairness.
Keywords: factors influencing project selection, public sector construction project selection, projects portfolio selection, strategic decision-making
Procedia PDF Downloads 331
8482 Improving the Strength Characteristics of Soil Using Cotton Fibers
Authors: Bindhu Lal, Karnika Kochal
Abstract:
Clayey soil contains clay minerals with traces of metal oxides and organic matter, and it exhibits properties such as low drainage, high plasticity, and shrinkage. To overcome these issues, various soil reinforcement techniques are used to increase the stiffness, water tightness, and bearing capacity of the soil. Such techniques include cementation, bituminization, freezing, fiber inclusion, geosynthetics, nailing, etc. Reinforcement of soil with fibers has been a cost-effective solution to soil improvement problems. An experimental study was undertaken involving the inclusion of cotton waste fibers in clayey soil as reinforcement at different fiber contents (1%, 1.5%, 2%, and 2.5% by weight) and the analysis of their effects on the unconfined compressive strength of the soil. Two categories of soil were used, comprising natural clay and clay mixed with 5% sodium bentonite by weight. The soil specimens were subjected to Proctor compaction and unconfined compression tests. The results show that fiber inclusion has a strikingly positive impact on the compressive strength and axial strain at failure of the soil. Based on the favourable results obtained, compressive strength was found to be directly proportional to the fiber content, with the effect being more pronounced at lower water content.
Keywords: bentonite clay, clay, cotton fibers, unconfined compressive strength
Procedia PDF Downloads 181
8481 A Review of Benefit-Risk Assessment over the Product Lifecycle
Authors: M. Miljkovic, A. Urakpo, M. Simic-Koumoutsaris
Abstract:
Benefit-risk assessment (BRA) is a valuable tool that takes place at multiple stages during a medicine's lifecycle, and this assessment can be conducted in a variety of ways. The aim was to summarize current BRA methods used during approval decisions and in post-approval settings, and to identify possible future directions. Relevant reviews, recommendations, and guidelines published in the medical literature and by regulatory agencies over the past five years were examined. BRA implies the review of two dimensions: the dimension of benefits (determined mainly by therapeutic efficacy) and the dimension of risks (comprising the safety profile of a drug). Regulators, industry, and academia have developed various approaches, ranging from descriptive textual (qualitative) to decision-analytic (quantitative) models, to facilitate the BRA of medicines during the product lifecycle (from Phase I trials, through the authorization procedure and post-marketing surveillance, to health technology assessment for inclusion in public formularies). These approaches can be classified into the following categories: stepwise structured approaches (frameworks); measures for benefits and risks that are usually endpoint-specific (metrics); simulation techniques and meta-analysis (estimation techniques); and utility survey techniques to elicit stakeholders' preferences (utilities). All these approaches share two common goals: to assist the analysis and to improve the communication of decisions, but each is subject to its own specific strengths and limitations. Before using any method, its utility, complexity, the extent to which it is established, and the ease of interpreting its results should be considered. Despite widespread and long-standing use, BRA is subject to debate, suffers from a number of limitations, and is still under development. The use of formal, systematically structured approaches to BRA for regulatory decision-making, and of quantitative methods to support BRA during the product lifecycle, is a standard practice in medicine that is subject to continuous improvement and modernization, not only in methodology but also in cooperation between organizations.
Keywords: benefit-risk assessment, benefit-risk profile, product lifecycle, quantitative methods, structured approaches
Procedia PDF Downloads 160
8480 Teaching English for Specific Purposes to Business Students through Social Media
Authors: Candela Contero Urgal
Abstract:
Using realia to teach English for Specific Purposes (ESP) is a must, as ESP is designed to meet students' real needs in their professional life. Teachers are therefore expected to offer authentic materials and place students in authentic contexts where their learning outcomes can be highly meaningful. One way of engaging students is to use social networks as a way to bridge the gap between their everyday life and their ESP learning outcomes. Our study focuses on ESP, and particularly on Business English teaching, as the ongoing process of digitalization is leading firms to use social media to communicate with potential clients. The present paper is aimed at carrying out a case study in which different digital tools are employed to offer a collection of formats that businesses are currently using to internationalize and advertise their products and services. A secondary objective of our study is to progress the development of the multidisciplinary competencies students are to acquire during their degree. A two-phase study will be presented. The first phase will cover the analysis of course tasks completed by undergraduate students at the University of Cadiz (Spain) in the third year of the Degree in Business Management and Administration, comparing the results obtained during the years 2019 to 2021. The second phase will present a survey conducted among these students in 2021 and 2022 to verify their interest in learning new ways to digitalize and internationalize their future businesses. The findings will confirm students' interest in working with up-to-date realia in their Business English lessons, as a consequence of their strong belief in the need for authentic contexts and didactic resources. Despite the limitations social media can have as a means of teaching Business English, students will still find it highly beneficial, since it fosters their familiarisation with the digital tools they will need to use when they enter the labour market.
Keywords: English for specific purposes, business English, internationalization of higher education, foreign language teaching
Procedia PDF Downloads 117
8479 A Metallography Study of Secondary A226 Aluminium Alloy Used in Automotive Industries
Authors: Lenka Hurtalová, Eva Tillová, Mária Chalupová, Juraj Belan, Milan Uhríčik
Abstract:
The secondary alloy A226 is used for many automotive castings produced by mould casting and high-pressure die-casting. This alloy has excellent castability, good mechanical properties, and cost-effectiveness. The production of primary aluminium alloys is a heavy source of environmental pollution. The European Union calls for emission reductions and reduced energy consumption, and therefore for increased production of recycled (secondary) aluminium cast alloys. This contribution deals with the influence of recycling on the quality of castings made from A226 in the automotive industry. The properties of castings made from the secondary aluminium alloy were compared with the required properties of primary aluminium alloys. The effect of recycling on the microstructure was observed using a combination of different analytical techniques (light microscopy after black-and-white etching, scanning electron microscopy (SEM) after deep etching, and energy-dispersive X-ray analysis (EDX)). These techniques were used to identify the various structural parameters, which were then used to compare the secondary alloy microstructure with the primary alloy microstructure.
Keywords: A226 secondary aluminium alloy, deep etching, mechanical properties, recycling foundry aluminium alloy
Procedia PDF Downloads 546
8478 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth from a transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, Cellular Automata (CA) can accurately describe the complexity of tumor development. Tumor development prognosis can now be made, without making patients undergo annoying medical examinations or painful invasive procedures, if we develop appropriate CA-based software tools. In silico testing mainly refers to Computational Biology research studies applied to clinical actions in Medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro in labs or in vivo with living cells and organisms. These models aim to produce scientifically relevant results compared with traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic or stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the research community. The use of stochastic cellular automata (SCA), whose parallel programming implementations can yield high computational performance, is of much interest and should be explored up to its computational limits. There have been some optimization-based approaches to advancing multiparadigm models of tumor growth, which mainly pursue improved performance through guaranteed efficient memory accesses or by considering the dynamic evolution of the memory space (grids, trees, etc.) that holds crucial data in simulations. In our opinion, the optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework to develop new programming techniques to speed up simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up provided by specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is tested using implementations in the Java and C++ languages on two different platforms: an Intel Core i-X chipset and an HPC cluster at our university. The parallelization of the Poleszczuk and Enderling model (commonly used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and the overall parallelization technique presented here to solid tumors of specific affiliation, such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis or the growth inhibition induced by chemotaxis, as well as the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
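A minimal sketch of the general idea, a stochastic cellular automaton advanced in parallel by splitting the lattice into strips with halo rows, is shown below in Python with process-based workers. The growth rule here (an empty site adjacent to tumor is invaded with a fixed probability) is a deliberately simplified stand-in, not the Poleszczuk and Enderling model, and the grid size, probability, and worker count are illustrative assumptions.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

P_GROW = 0.25    # illustrative probability that an empty site adjacent to tumor is invaded
SIZE = 256
STEPS = 50
N_WORKERS = 4

def update_strip(args):
    """Advance one horizontal strip (with one halo row above and below) by one step."""
    strip, seed = args
    rng = np.random.default_rng(seed)
    up, down = strip[:-2, :], strip[2:, :]
    interior = strip[1:-1, :]
    left = np.roll(interior, 1, axis=1)
    right = np.roll(interior, -1, axis=1)
    neighbours = up + down + left + right            # occupied von Neumann neighbours
    grow = (interior == 0) & (neighbours > 0) & (rng.random(interior.shape) < P_GROW)
    return np.where(grow, 1, interior).astype(np.uint8)

def step(grid, executor, rng):
    """Split the lattice into strips, update them in parallel, and stitch the result."""
    padded = np.pad(grid, 1, mode="constant")
    bounds = np.array_split(np.arange(SIZE), N_WORKERS)
    jobs = [(padded[rows[0]:rows[-1] + 3, :], int(rng.integers(1 << 30))) for rows in bounds]
    strips = list(executor.map(update_strip, jobs))
    return np.vstack(strips)[:, 1:-1]                # drop the left/right padding columns

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    grid = np.zeros((SIZE, SIZE), dtype=np.uint8)
    grid[SIZE // 2, SIZE // 2] = 1                   # a single transformed cell in the centre
    with ProcessPoolExecutor(max_workers=N_WORKERS) as ex:
        for _ in range(STEPS):
            grid = step(grid, ex, rng)
    print("tumor cells after", STEPS, "steps:", int(grid.sum()))
```

Because the rule reads only a cell's neighbourhood and writes only to the cell itself, each strip can be updated independently given its halo rows, which is the synchronization pattern a thread-pool (executor) version would also rely on.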
Procedia PDF Downloads 245
8477 Reducing Crash Risk at Intersections with Safety Improvements
Authors: Upal Barua
Abstract:
Crash risk at intersections is a critical safety issue. This paper examines the effectiveness of removing an existing off-set at an intersection by realignment in reducing crashes. The Empirical Bayes method was applied to conduct a before-and-after study to assess the effect of this safety improvement. The Transportation Safety Improvement Program in the Austin Transportation Department completed several safety improvement projects at high-crash intersections with a view to reducing crashes. One of the common safety improvement techniques applied was the realignment of intersection approaches to remove an existing off-set. This paper illustrates how this safety improvement technique was applied at a high-crash intersection from inception to completion. It also highlights the significant crash reductions achieved by this technique, estimated with the Empirical Bayes method in a before-and-after study. The results showed that realignment of intersection approaches to remove an existing off-set can reduce crashes by 53%. The paper also features the state-of-the-art techniques applied in the planning, engineering, design, and construction of this safety improvement, the key factors driving its success, and the lessons learned in the process.
Keywords: crash risk, intersection, off-set, safety improvement technique, before-and-after study, empirical Bayes method
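The abstract does not give the calculation details; a minimal sketch of a standard (Hauer-style) Empirical Bayes before-and-after computation, which weights a safety-performance-function prediction against the observed before-period count and then derives an index of effectiveness, might look like the following. All crash counts, SPF values, and the overdispersion parameter are illustrative assumptions, not the Austin intersection's data.

```python
def empirical_bayes_evaluation(obs_before, obs_after, spf_before, spf_after, k):
    """
    Hauer-style Empirical Bayes before-and-after evaluation for one site.
    obs_*  : observed crash counts in the before/after periods
    spf_*  : crashes predicted by a safety performance function for the same periods
    k      : overdispersion parameter of the SPF's negative binomial model
    """
    w = 1.0 / (1.0 + k * spf_before)                    # EB weight on the SPF prediction
    expected_before = w * spf_before + (1.0 - w) * obs_before
    ratio = spf_after / spf_before                      # adjusts for duration/traffic changes
    expected_after = expected_before * ratio            # crashes expected had nothing changed
    var_expected_after = expected_after * ratio * (1.0 - w)

    # Index of effectiveness: odds ratio corrected for the variance of the EB estimate.
    theta = (obs_after / expected_after) / (1.0 + var_expected_after / expected_after ** 2)
    return 100.0 * (1.0 - theta)                        # percent crash reduction

# Illustrative numbers only (not the studied intersection's data).
print(round(empirical_bayes_evaluation(obs_before=30, obs_after=12,
                                       spf_before=22.0, spf_after=21.0, k=0.4), 1),
      "% estimated crash reduction")
```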
Procedia PDF Downloads 246
8476 Electrochemical and Theoretical Quantum Approaches on the Inhibition of C1018 Carbon Steel Corrosion in Acidic Medium Containing Chloride Using Newly Synthesized Phenolic Schiff Bases Compounds
Authors: Hany M. Abd El-Lateef
Abstract:
Two novel Schiff bases, 5-bromo-2-[(E)-(pyridin-3-ylimino) methyl] phenol (HBSAP) and 5-bromo-2-[(E)-(quinolin-8-ylimino) methyl] phenol (HBSAQ), have been synthesized. They have been characterized by elemental analysis and spectroscopic techniques (UV–Vis, IR and NMR). Moreover, the molecular structures of the HBSAP and HBSAQ compounds were determined by the single-crystal X-ray diffraction technique. The inhibition activity of HBSAP and HBSAQ for carbon steel in 3.5% NaCl + 0.1 M HCl, for both short and long immersion times and at different temperatures (20-50 °C), was investigated using electrochemistry and surface characterization. Potentiodynamic polarization shows that the inhibitor molecules are adsorbed mainly on the cathodic sites. The inhibition efficiency increases with increasing inhibitor concentration (92.8% at the optimal concentration of 10⁻³ M for HBSAQ). Adsorption of the inhibitors on the carbon steel surface was found to obey the Langmuir adsorption isotherm, with a physical/chemical nature of adsorption, as also shown by scanning electron microscopy. Further, the electronic structure calculations using quantum chemical methods were found to be in good agreement with the results of the experimental studies.
Keywords: carbon steel, Schiff bases, corrosion inhibition, SEM, electrochemical techniques
Procedia PDF Downloads 394
8475 Methods for Enhancing Ensemble Learning or Improving Classifiers of This Technique in the Analysis and Classification of Brain Signals
Authors: Seyed Mehdi Ghezi, Hesam Hasanpoor
Abstract:
This article explores enhancement methods for ensemble learning with the aim of improving the performance of classifiers in the analysis and classification of brain signals. The research approach in this field consists of two main parts, each with its own strengths and weaknesses; the choice of approach depends on the specific research question and the available resources. By combining these approaches and leveraging their respective strengths, researchers can enhance the accuracy and reliability of classification results and consequently advance our understanding of the brain and its functions. The first approach focuses on using machine learning methods to identify the best features among the vast array of features present in brain signals. The selection of features varies depending on the research objective, and different techniques have been employed for this purpose. For instance, the genetic algorithm has been used in some studies to identify the best features, while optimization methods have been used in others to identify the most influential features. Additionally, machine learning techniques have been applied to determine the most influential electrodes for classification. Ensemble learning plays a crucial role in identifying the best features that contribute to learning, thereby improving the overall results. The second approach concentrates on designing and implementing methods for selecting the best classifier, or on using meta-classifiers to enhance the final results of ensemble learning. In another part of the research, a single classifier is used instead of multiple classifiers, employing different sets of features to improve the results. The article provides an in-depth examination of each technique, highlighting its advantages and limitations. By integrating these techniques, researchers can enhance the performance of classifiers in the analysis and classification of brain signals, ultimately leading to improved accuracy and reliability and to a better understanding of the brain and its functions.
Keywords: ensemble learning, brain signals, classification, feature selection, machine learning, genetic algorithm, optimization methods, influential features, influential electrodes, meta-classifiers
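As an illustration of the second approach (combining base classifiers through a meta-classifier), a minimal stacking sketch in scikit-learn might look like the following. The synthetic features stand in for extracted brain-signal features, and the particular base learners and meta-learner are assumptions chosen for illustration.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for feature vectors extracted from brain-signal epochs.
X, y = make_classification(n_samples=500, n_features=30, n_informative=10, random_state=0)

base_learners = [
    ("knn", KNeighborsClassifier(n_neighbors=5)),
    ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
]
# The meta-classifier learns how to weight the base learners' predictions.
stack = StackingClassifier(estimators=base_learners,
                           final_estimator=LogisticRegression(max_iter=1000),
                           cv=5)

print("Stacked accuracy:",
      round(cross_val_score(stack, X, y, cv=5, scoring="accuracy").mean(), 3))
```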
Procedia PDF Downloads 80
8474 Materials for Sustainability
Authors: Qiuying Li
Abstract:
It is a shared opinion that sustainable development requires a system discontinuity, meaning that radical changes in the way we produce and consume are needed. Within this framework, there is an emerging understanding that an important contribution to this change can be directly linked to decisions taken in the design phase of products, services, and systems. Design schools therefore have to be able to provide design students with broad knowledge and effective Design for Sustainability tools, in order to enable a new generation of designers to play an active role in reorienting our consumption and production patterns.
Keywords: design for sustainability, services, systems, materials, ecomaterials
Procedia PDF Downloads 449
8473 Development of an Automatic Calibration Framework for Hydrologic Modelling Using Approximate Bayesian Computation
Authors: A. Chowdhury, P. Egodawatta, J. M. McGree, A. Goonetilleke
Abstract:
Hydrologic models are increasingly used as tools to predict stormwater quantity and quality from urban catchments. However, due to a range of practical issues, most models produce gross errors in simulating complex hydraulic and hydrologic systems, and the difficulty of finding a robust approach to model calibration is one of the main issues. Though automatic calibration techniques are available, they are rarely used in common commercial hydraulic and hydrologic modelling software, e.g., MIKE URBAN. This is partly due to the need for a large number of parameters and large datasets in the calibration process. To overcome this practical issue, a framework for automatic calibration of a hydrologic model was developed on the R platform and is presented in this paper. The model was developed based on the time-area conceptualization. Four calibration parameters (initial loss, reduction factor, time of concentration, and time lag) were considered as the primary parameter set. Using these parameters, automatic calibration was performed using Approximate Bayesian Computation (ABC). ABC is a simulation-based technique for performing Bayesian inference when the likelihood is intractable or computationally expensive to compute. To test its performance and usefulness, the technique was used to simulate three small catchments on the Gold Coast. For comparison, simulation outcomes for the same three catchments obtained with the commercial modelling software MIKE URBAN were used. The graphical comparison shows strong agreement: the MIKE URBAN results fall within the upper and lower 95% credible intervals of the posterior predictions obtained via ABC. Statistical validation of the posterior runoff predictions using the coefficient of determination (CD), root mean square error (RMSE), and maximum error (ME) was found to be reasonable for the three study catchments. The main benefit of using ABC over MIKE URBAN is that ABC provides a posterior distribution for the runoff prediction, so the associated uncertainty in the predictions can be obtained, whereas MIKE URBAN provides only a point estimate. Based on the results of the analysis, the developed ABC framework appears to perform well for automatic calibration.
Keywords: automatic calibration framework, approximate bayesian computation, hydrologic and hydraulic modelling, MIKE URBAN software, R platform
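The authors' framework was built on the R platform around a four-parameter time-area model; a minimal rejection-ABC sketch of the same idea in Python, with a simplified two-parameter runoff transform standing in for the hydrologic model, might look like the following. The priors, tolerance, and synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_runoff(rain, initial_loss, reduction_factor):
    """Toy rainfall-runoff transform: subtract an initial loss, scale what remains."""
    effective = np.clip(rain - initial_loss, 0.0, None)
    return reduction_factor * effective

# Synthetic "observed" hydrograph generated with known parameters (for illustration).
rain = rng.gamma(shape=2.0, scale=3.0, size=100)
observed = simulate_runoff(rain, initial_loss=1.5, reduction_factor=0.6) \
           + rng.normal(0, 0.2, size=100)

def distance(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))     # RMSE as the ABC summary distance

# Rejection ABC: sample from the priors, keep parameter sets whose simulated
# runoff is close enough to the observations.
n_draws, epsilon = 20000, 0.5
accepted = []
for _ in range(n_draws):
    il = rng.uniform(0.0, 5.0)      # prior on initial loss (mm)
    rf = rng.uniform(0.1, 1.0)      # prior on the runoff reduction factor
    if distance(simulate_runoff(rain, il, rf), observed) < epsilon:
        accepted.append((il, rf))

posterior = np.array(accepted)
print("accepted draws:", len(posterior))
print("posterior means (initial loss, reduction factor):", posterior.mean(axis=0).round(2))
```

The accepted draws approximate the posterior distribution of the calibration parameters, which is the source of the credible intervals mentioned in the abstract.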
Procedia PDF Downloads 310
8472 Laboratory Model Tests on Encased Group Columns
Authors: Kausar Ali
Abstract:
There are several ground treatment techniques that may meet the twin objectives of increasing bearing capacity while simultaneously reducing settlement, but the use of stone columns is one of the most suitable techniques for flexible structures, such as embankments and oil storage tanks, that can tolerate some settlement, and it is used worldwide. However, when stone columns in very soft soils are loaded, they undergo excessive settlement due to the low lateral confinement provided by the soft soil, leading to failure of the structure. The poor performance of stone columns under these conditions can be improved by encasing the columns with a suitable geosynthetic. In this study, the effect of reinforcement on the bearing capacity of composite soil has been investigated by conducting laboratory model tests on floating and end-bearing long stone columns with an l/d ratio of 12. The columns were reinforced by providing geosynthetic encasement over varying column lengths (the upper 25%, 50%, 75%, and 100% of the column length). In this study, a group of columns was used instead of a single column because, in the field, columns are always used in groups. The tests indicate that encasement over the full column length gives a higher failure stress than encasement over a partial column length for both floating and end-bearing long columns. The performance of the end-bearing columns was found to be much better than that of the floating columns.
Keywords: geosynthetic, ground improvement, soft clay, stone column
Procedia PDF Downloads 434
8471 Qualitative Data Summary of Piloted Observation Instrument for Designing Adaptations in Inclusive Settings
Authors: Rebecca Lynn
Abstract:
The successful inclusion of students with disabilities depends upon many factors, including collaboration between general and special education teachers to meet the student learning goals outlined in the Individualized Education Plan (IEP). However, Individualized Education Plans do not provide sufficient information on accommodations and modifications for the variety of general education contexts and content areas in which a student may participate. In addition, general and special education teachers lack observation skills and tools for gathering essential information about the strengths and needs of students with disabilities in relation to general education instruction and classrooms. More research and tools are needed for planning adaptations that increase access to content in general education classrooms. This paper discusses the outcomes of a qualitative field-based study of a structured observation instrument used for gathering information on student strengths and needs in relation to social, academic, and regulatory expectations during instruction in general education classrooms. The study explores the following questions: To what extent do the observation structure and instrument increase collaborative planning of adaptations in general education classrooms for students with disabilities? To what extent do the observation structure and instrument change pedagogical practices and collaboration in general education classrooms for fostering successful inclusion? A hypothesis of this study was that use of the instrument in the context of lessons and in collaborative debriefing would increase awareness and use of meaningful adaptations and lead to universal design in the planning of instruction. A finding of the study is a shift from viewing students with disabilities as passive participants to a more pedagogical inclusion, as teachers developed skills in observation and created content- and context-specific adaptations for students with disabilities in the general education classroom.
Keywords: adaptations, collaboration, inclusion, observations
Procedia PDF Downloads 128