Search results for: scientific models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8628

6918 On an Experimental Method for Investigating the Dynamic Parameters of Multi-Story Buildings at Vibrating Seismic Loadings

Authors: Shakir Mamedov, Tukezban Hasanova

Abstract:

Many scientific works by Azerbaijani scientists are devoted to the dynamic properties of various materials and structural elements under shock and wave loading. However, the experimental determination of the dynamic vibration parameters of structures and buildings has so far been largely approximate. The purpose of the present experimental research is to determine the vibration parameters of the objects under observation: in this case, a mock-up of a four-storey building and a sixteen-storey skeleton-type building with stiffening diaphragms built in Baku, under natural vibrating seismic loading.

Keywords: fluctuations, seismoreceivers, dynamic experiments, acceleration

Procedia PDF Downloads 398
6917 Assessing Local Authorities’ Interest in Addressing Urban Challenges through Nature Based Solutions in Romania

Authors: Athanasios A. Gavrilidis, Mihai R. Nita, Larissa N. Stoia, Diana A. Onose

Abstract:

Contemporary global environmental challenges must be addressed primarily at local levels. Cities are under continuous pressure, as they must ensure a high quality of life for their citizens and at the same time adapt to and address specific environmental issues. Innovative solutions using natural features or mimicking natural systems are endorsed by the scientific community as efficient approaches both for mitigating the effects of climate change and declining environmental quality and for maintaining high standards of living for urban dwellers. The aim of this study was to assess whether Romanian city authorities consider nature-based innovation as a solution to their planning, management, and environmental issues. Data were gathered by applying 140 questionnaires to urban authorities throughout the country. The questionnaire was designed to assess local policy makers' perspectives on the efficiency of nature-based innovations as a tool for addressing specific challenges. It also focused on extracting data about financing sources and the challenges they must overcome to adopt nature-based approaches. The results gathered from the municipalities participating in our study were statistically processed and revealed that Romanian city managers acknowledge the benefits of nature-based innovations, but investments in this sector are not among their top priorities. More than 90% of the selected cities agreed that in the last 10 years their major concern was expanding grey infrastructure (roads and public amenities) using traditional approaches. When asked how they would react if faced with different socio-economic and environmental challenges, local urban managers indicated investments in nature-based solutions as a priority only in the case of biodiversity loss and extreme weather, while for the other 14 proposed scenarios they would embrace the business-as-usual approach. Our study indicates that while new concepts of sustainable urban planning emerge within the scientific community, local authorities need more time to understand and implement them. Without the proper knowledge, personnel, policies, or dedicated budgets, local administrators will not embrace nature-based innovations as solutions to their challenges.

Keywords: nature based innovations, perception analysis, policy making, urban planning

Procedia PDF Downloads 174
6916 Bioinformatics High Performance Computation and Big Data

Authors: Javed Mohammed

Abstract:

Right now, biomedical infrastructure lags well behind the curve. Our healthcare system is dispersed and disjointed; medical records are a bit of a mess; and we do not yet have the capacity to store and process the enormous amounts of data coming our way from widespread whole-genome sequencing. And then there are privacy issues. Despite these infrastructure challenges, some researchers are plunging into biomedical Big Data now, in hopes of extracting new and actionable knowledge. They are delving into molecular-level data to discover biomarkers that help classify patients based on their response to existing treatments, and pushing their results out to physicians in novel and creative ways. Computer scientists and biomedical researchers are able to transform data into models and simulations that will enable scientists, for the first time, to gain a profound understanding of the deepest biological functions. Solving biological problems may require high-performance computing (HPC), due either to the massive parallel computation required to solve a particular problem or to algorithmic complexity that may range from difficult to intractable. Many problems involve seemingly well-behaved polynomial-time algorithms (such as all-to-all comparisons) but have massive computational requirements due to the large data sets that must be analyzed. High-throughput techniques for DNA sequencing and analysis of gene expression have led to exponential growth in the amount of publicly available genomic data. With the increased availability of genomic data, traditional database approaches are no longer sufficient for rapidly performing life science queries involving the fusion of data types. Computing systems are now so powerful that it is possible for researchers to consider modeling the folding of a protein or even the simulation of an entire human body. This research paper emphasizes computational biology's growing need for high-performance computing and Big Data. It illustrates how HPC is indispensable for meeting the scientific and engineering challenges of the twenty-first century, and how protein folding (the structure and function of proteins) and phylogeny reconstruction (the evolutionary history of a group of genes) can use HPC, which provides sufficient capability for evaluating or solving more limited but meaningful problem instances. This article also outlines solutions to optimization problems and the benefits Big Data brings to computational biology, and it surveys the current state of the art and future generations of HPC computing with Big Data in biology.

Keywords: high performance, big data, parallel computation, molecular data, computational biology

Procedia PDF Downloads 363
6915 Designing Product-Service-System Applied to Reusable Packaging Solutions: A Strategic Design Tool

Authors: Yuan Long, Fabrizio Ceschin, David Harrison

Abstract:

Environmental sustainability is threatened by excessive single-use plastic packaging waste, and current waste management fails to address this issue. This has prompted a search for alternatives that can curb packaging waste without reducing social needs. Reusable packaging represents a circular approach to closing the loop of consumption, in which packaging stays longer in the system to satisfy social needs. However, the implementation of reusable packaging is fragmented and lacks systematic approaches. The product-service system (PSS) is widely regarded as a sustainable business model innovation for embracing circular consumption. Applying PSS to reusable packaging solutions is therefore a promising way to address the packaging waste issue. This paper aims to fill the knowledge gap on applying PSS to reusable packaging solutions and to provide a strategic design tool that supports packaging professionals in designing such solutions. The methodology combines case studies and workshops to develop the design tool. The respondents are packaging professionals, including packaging consultants, NGO professionals, and entrepreneurs. The 57 cases collected show that 15 archetypal models operate in the market. Subsequently, a polarity diagram was developed to encompass those 15 archetypal models, and a total of 24 experts were invited to a workshop to evaluate the design tool. This research finally provides a strategic design tool to support packaging professionals in designing reusable packaging solutions. The tool supports understanding reusable packaging solutions, analyzing markets, identifying new opportunities, and generating new business models. The implication of this research is to provide insights for academics and businesses in terms of tackling single-use packaging waste, and it builds a foundation for further development of the reusable packaging solution tool.

Keywords: environmental sustainability, product-service system, reusable packaging, design tool

Procedia PDF Downloads 148
6914 Concurrent Engineering Challenges and Resolution Mechanisms from Quality Perspectives

Authors: Grmanesh Gidey Kahsay

Abstract:

In modern technical engineering applications, quality is defined in two ways. The first defines quality as the degree to which a product's or service's characteristics meet and satisfy pre-stated or fundamental needs (reliability, durability, serviceability). The second defines quality as a product or service free of any defects or deficiencies. The American Society for Quality (ASQ) describes quality as the pursuit of optimal solutions that ensure success and fulfilment of the product's or service's requirements and expectations. This article focuses on quality engineering tools in modern industrial applications. Quality engineering is a field of engineering that deals with the principles, techniques, models, and applications that guarantee the quality of a product or service. Covering all activities involved in analysing a product's design and development, quality engineering emphasizes ensuring that products and services are designed and developed to meet consumers' requirements. This article introduces quality tools such as quality systems, auditing, product design, and process control. The findings present ideas that aim to improve quality engineering proficiency and effectiveness by introducing essential quality techniques and tools in selected industries.

Keywords: essential quality tools, quality systems and models, quality management systems, quality assurance

Procedia PDF Downloads 152
6913 Empirical Modeling of Air Dried Rubberwood Drying System

Authors: S. Khamtree, T. Ratanawilai, C. Nuntadusit

Abstract:

Rubberwood is a crucial commercial timber in Southern Thailand. All processes in rubberwood production depend on the knowledge and expertise of the technicians, especially the drying process. This research aims to develop an empirical model of rubberwood drying kinetics. During the experiments, the temperature of the hot air and the average air flow velocity were kept at 80-100 °C and 1.75 m/s, respectively. Drying was considered complete when the moisture content of the samples fell below 12%. The drying kinetics were simulated with an empirical solver. The experimental results showed that the moisture content decreased as the drying temperature and time increased. The agreement between the empirical and experimental moisture ratios was tested with three statistical parameters, the coefficient of determination (R²), root mean square error (RMSE) and chi-square (χ²), to assess the accuracy of the fitted parameters. The experimental moisture ratio fitted the empirical models well. In particular, the Henderson and Pabis model showed a suitable level of agreement and provided an excellent estimate of moisture movement (R² = 0.9963) compared with the other models. The empirical results are therefore valid and can be implemented in future experiments.
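
As an illustration of the fitting procedure described above, the sketch below fits the Henderson and Pabis thin-layer model, MR = a·exp(-k·t), to a set of moisture-ratio measurements and reports R², RMSE and chi-square. The drying data, initial guesses and the reduced form of chi-square are assumptions for illustration only; the paper's actual data are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

# Henderson and Pabis thin-layer drying model: MR = a * exp(-k * t)
def henderson_pabis(t, a, k):
    return a * np.exp(-k * t)

# Hypothetical drying data: time (h) and experimental moisture ratio
t = np.array([0, 2, 4, 8, 12, 18, 24, 36, 48], dtype=float)
mr_exp = np.array([1.00, 0.85, 0.73, 0.55, 0.42, 0.30, 0.22, 0.12, 0.07])

params, _ = curve_fit(henderson_pabis, t, mr_exp, p0=[1.0, 0.05])
mr_pred = henderson_pabis(t, *params)

# Goodness-of-fit statistics named in the abstract: R2, RMSE and chi-square
ss_res = np.sum((mr_exp - mr_pred) ** 2)
ss_tot = np.sum((mr_exp - mr_exp.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
rmse = np.sqrt(ss_res / t.size)
chi2 = ss_res / (t.size - len(params))   # reduced chi-square commonly used in drying studies

print(f"a = {params[0]:.4f}, k = {params[1]:.4f} 1/h")
print(f"R2 = {r2:.4f}, RMSE = {rmse:.4f}, chi2 = {chi2:.5f}")
```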

Keywords: empirical models, rubberwood, moisture ratio, hot air drying

Procedia PDF Downloads 267
6912 Cognitive eTransformation Framework for Education Sector

Authors: A. Hol

Abstract:

The 21st century brought waves of business and industry eTransformations. The impact of this change is also being seen in education. To identify its extent, a scenario analysis methodology was utilised to assess business transformations across industry sectors ranging from craftsmanship, medicine, finance and manufacturing to the innovation and adoption of new technologies and business models. Firstly, scenarios were drafted based on the current eTransformation models and their dimensions. Following this, an eTransformation framework was utilised to derive the key eTransformation parameters: the essential characteristics that have enabled eTransformations across the sectors. The identified key parameters were then mapped to the transforming domain, education. This mapping assisted in deriving a cognitive eTransformation framework for the education sector. The framework highlights the importance of context and the notion that education today needs not only to deliver content to students but also to meet the dynamically changing demands of specific student and industry groups. Furthermore, it points out that supporting such processes requires specific technology, so that instant, on-demand and periodic feedback, as well as flexible, dynamically expanding study content, can be sought and received via multiple education media.

Keywords: education sector, business transformation, eTransformation model, cognitive model, cognitive systems, eTransformation

Procedia PDF Downloads 136
6911 A Dynamic Neural Network Model for Accurate Detection of Masked Faces

Authors: Oladapo Tolulope Ibitoye

Abstract:

Neural networks have become prominent and are widely used in machine learning. They are well suited to solving many day-to-day problems, at least to a certain extent. Neural networks are computing systems with several interconnected nodes. One of their numerous areas of application is object detection, an area that gained prominence during the coronavirus disease pandemic and the post-pandemic phases. According to experts, wearing a face mask in public slows the spread of the virus. This calls for the development of a reliable and effective model for detecting face masks on people's faces during compliance checks. Existing neural network models for face mask detection are characterized by their black-box nature and large dataset requirements, challenges that have compromised their performance. The proposed model uses a Faster R-CNN model with an Inception V3 backbone to reduce system complexity and dataset requirements. The model was trained and validated on a very small dataset, and evaluation results show an overall accuracy of 96% regardless of skin tone.
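
As a rough illustration of assembling a Faster R-CNN detector with a custom backbone, the sketch below follows torchvision's documented custom-backbone recipe. It is not the authors' implementation: MobileNetV2 stands in for the Inception V3 backbone (torchvision does not ship a ready-made Inception V3 detection backbone), and the class count, anchor sizes and image size are assumptions.

```python
import torch
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator

num_classes = 3   # background, face with mask, face without mask (assumed label set)

# torchvision's documented custom-backbone recipe; MobileNetV2 is used here only as a
# stand-in feature extractor for the Inception V3 backbone named in the abstract.
backbone = torchvision.models.mobilenet_v2(weights=None).features
backbone.out_channels = 1280

anchor_generator = AnchorGenerator(sizes=((32, 64, 128, 256, 512),),
                                   aspect_ratios=((0.5, 1.0, 2.0),))
roi_pooler = torchvision.ops.MultiScaleRoIAlign(featmap_names=["0"],
                                                output_size=7, sampling_ratio=2)

model = FasterRCNN(backbone, num_classes=num_classes,
                   rpn_anchor_generator=anchor_generator, box_roi_pool=roi_pooler)

model.eval()
with torch.no_grad():
    detections = model([torch.rand(3, 480, 640)])   # one dummy RGB image
print(detections[0].keys())   # 'boxes', 'labels', 'scores' for the detected faces
```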

Keywords: convolutional neural network, face detection, face mask, masked faces

Procedia PDF Downloads 68
6910 Fragile States as the Fertile Ground for Non-State Actors: Colombia and Somalia

Authors: Giorgi Goguadze, Jakub Zajączkowski

Abstract:

This paper provides an overview of the connection between fragile states and non-state actors, taking into account that fragile states range from weak to failing to failed. It discusses two countries: one weak (Colombia) and one already failed (Somalia). It seeks to understand what feeds malign non-state actors such as terrorist organizations, criminal entities and other cells in these countries, what threats they represent, and how to eliminate these dangers at both the national and international level. The paper is based mainly on a literature review and the authors' own perspective and does not claim to be a rigorous scientific study.

Keywords: fragile States, terrorism, tribalism, Somalia

Procedia PDF Downloads 367
6909 Use Cloud-Based Watson Deep Learning Platform to Train Models Faster and More Accurate

Authors: Susan Diamond

Abstract:

Machine learning workloads have traditionally been run in high-performance computing (HPC) environments, where users log in to dedicated machines and utilize the attached GPUs to run training jobs on huge datasets. Training large neural network models is very resource intensive, and even after exploiting parallelism and accelerators such as GPUs, a single training job can still take days. Consequently, the cost of hardware is a barrier to entry. Even when upfront cost is not a concern, the lead time to set up such an HPC environment can take months, from acquiring the hardware to configuring it with the right firmware and software. Furthermore, scalability is hard to achieve in a rigid traditional lab environment, which is therefore slow to react to dynamic change in the artificial intelligence industry. Watson Deep Learning as a Service is a cloud-based deep learning platform that mitigates the long lead time and high upfront investment in hardware. It enables robust and scalable sharing of resources among the teams in an organization and is designed for on-demand cloud environments. Providing a similar user experience in a multi-tenant cloud environment comes with its own unique challenges regarding fault tolerance, performance, and security. Watson Deep Learning as a Service tackles these challenges and presents a deep learning stack for cloud environments in a secure, scalable and fault-tolerant manner. It supports a wide range of deep learning frameworks such as TensorFlow, PyTorch, Caffe, Torch, Theano, and MXNet. These frameworks reduce the effort and skillset required to design, train, and use deep learning models. Deep Learning as a Service is used at IBM by AI researchers in areas including machine translation, computer vision, and healthcare.

Keywords: deep learning, machine learning, cognitive computing, model training

Procedia PDF Downloads 209
6908 Numerical Investigation of Cavitation on Different Venturi Shapes by Computational Fluid Dynamics

Authors: Sedat Yayla, Mehmet Oruc, Shakhwan Yaseen

Abstract:

Cavitation can severely damage machine parts such as pumps, propellers and impellers when the pressure in the fluid drops below the liquid's saturation pressure. To evaluate the influence of cavitation, two-dimensional computational fluid dynamics (CFD) venturi models with a variety of inlet pressures, throat lengths and vapour contents were applied in this research. Three vapour contents (0%, 5% and 10%), five inlet pressures (2, 4, 6, 8 and 10 atm) and two venturi models with different throat lengths (5, 10, 15 and 20 mm) were employed to determine the impact of each parameter on the cavitation number. A positive correlation was found between inlet pressure, vapour content and cavitation number. Furthermore, the throat velocity remains almost constant at inlet pressures of 6, 8 and 10 atm, whereas increasing the throat length results in a substantial increase in throat velocity at inlet pressures of 2 and 4 atm. Velocity and cavitation number were negatively correlated. The cavitation number varied between 0.092 and 0.495 depending on the throat velocity.
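
For reference, the cavitation number reported above is typically computed as σ = (p − p_v) / (½ ρ v²). The sketch below evaluates it for one operating point; the fluid properties, reference pressure choice and throat velocity are illustrative assumptions, not values from the study.

```python
def cavitation_number(p_ref_pa, p_vapor_pa, density_kg_m3, throat_velocity_m_s):
    """Cavitation number: sigma = (p_ref - p_v) / (0.5 * rho * v^2)."""
    return (p_ref_pa - p_vapor_pa) / (0.5 * density_kg_m3 * throat_velocity_m_s ** 2)

# Illustrative operating point: water at 25 degC, 2 atm reference pressure, 20 m/s throat velocity
sigma = cavitation_number(p_ref_pa=2 * 101325.0,
                          p_vapor_pa=3169.0,
                          density_kg_m3=997.0,
                          throat_velocity_m_s=20.0)
print(f"cavitation number = {sigma:.3f}")
```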

Keywords: cavitation number, computational fluid dynamics, mixture of fluid, two-phase flow, velocity of throat

Procedia PDF Downloads 400
6907 Simulation of the Visco-Elasto-Plastic Deformation Behaviour of Short Glass Fibre Reinforced Polyphthalamides

Authors: V. Keim, J. Spachtholz, J. Hammer

Abstract:

The importance of fibre reinforced plastics continually increases due to their excellent mechanical properties and low material and manufacturing costs combined with significant weight reduction. Today, components are usually designed and calculated numerically using finite element methods (FEM) to avoid expensive laboratory tests. These programs are based on material models that include material-specific deformation characteristics. In this research project, material models for short glass fibre reinforced plastics are presented to simulate the visco-elasto-plastic deformation behaviour. Prior to modelling, specimens of the material EMS Grivory HTV-5H1, consisting of a polyphthalamide matrix reinforced by 50 wt.% of short glass fibres, are characterized experimentally in terms of the highly time-dependent deformation behaviour of the matrix material. To minimize the experimental effort, the cyclic deformation behaviour under tensile and compressive loading (R = −1) is characterized by isothermal complex low cycle fatigue (CLCF) tests. By combining cycles at two strain amplitudes and strain rates spanning three orders of magnitude, together with relaxation intervals, into one experiment, the visco-elastic deformation is characterized. To identify visco-plastic deformation, monotonic tensile tests, either displacement-controlled or strain-controlled (CERT), are compared. All relevant modelling parameters for this complex superposition of simultaneously varying mechanical loadings are quantified by these experiments. Subsequently, two different material models are compared with respect to their accuracy in describing the visco-elasto-plastic deformation behaviour. First, an extended 12-parameter model (EVP-KV2) based on Chaboche is used to model cyclic visco-elasto-plasticity at two time scales. The parameters of the model, which assumes a complete separation of elastic and plastic deformation, are obtained by computational optimization using a genetic algorithm, an evolutionary algorithm driven by a fitness function. Second, the 12-parameter visco-elasto-plastic material model by Launay is used; it contains a different type of flow function, based on defining the visco-plastic deformation as a part of the overall deformation. The accuracy of the models is verified by corresponding experimental LCF testing.
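
To illustrate the parameter-identification step, the sketch below minimises a least-squares fitness function with SciPy's differential evolution, a related evolutionary optimiser rather than the genetic algorithm used in the paper. A simple three-parameter relaxation law and synthetic hold-period data stand in for the full 12-parameter Chaboche-type model.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical stress-relaxation data (time in s, stress in MPa) from a CLCF hold period
t = np.linspace(0.0, 100.0, 21)
stress_exp = 60.0 * np.exp(-t / 35.0) + 20.0 + np.random.default_rng(0).normal(0, 0.3, t.size)

# Simple three-parameter relaxation law standing in for the full 12-parameter model
def model(t, s_inf, s_0, tau):
    return s_inf + (s_0 - s_inf) * np.exp(-t / tau)

def fitness(params):
    s_inf, s_0, tau = params
    return np.sum((model(t, s_inf, s_0, tau) - stress_exp) ** 2)

bounds = [(0.0, 50.0), (50.0, 150.0), (1.0, 200.0)]   # search ranges for s_inf, s_0, tau
result = differential_evolution(fitness, bounds, seed=0)
print("identified parameters (s_inf, s_0, tau):", np.round(result.x, 2))
```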

Keywords: complex low cycle fatigue, material modelling, short glass fibre reinforced polyphthalamides, visco-elasto-plastic deformation

Procedia PDF Downloads 215
6906 Financial Liberalization, Exchange Rates and Demand for Money in Developing Economies: The Case of Nigeria, Ghana and Gambia

Authors: John Adebayo Oloyhede

Abstract:

This paper examines the effect of financial liberalization on the stability of the demand-for-money function and its implications for exchange rate behaviour in three African countries. As the demand-for-money function is regarded as one of the two main building blocks of most exchange rate determination models, the other being purchasing power parity, its stability is required for the monetary models of exchange rate determination to hold. To what extent has the liberalisation policy of these countries, for instance liberalised interest rates, affected the demand-for-money function, and what have been the consequences for the validity and relevance of floating exchange rate models? The study adopts the autoregressive instrumental variable (AIV) multiple regression technique and follows the Almon polynomial procedure with a zero-end constraint. Data for the period 1986 to 2011 were drawn from three developing African countries, namely Gambia, Ghana and Nigeria, which not only started liberalization and the floating system at almost the same time but also share similar yet diverse economic and financial structures. The findings show that the demand for money was a stable function of income and interest rates at home and abroad. Other factors, such as the exchange rate and foreign interest rates, exerted some significant effect on domestic money demand. The short-run and long-run elasticities with respect to income, interest rates, expected inflation and exchange rate expectations are not greater than zero. This evidence conforms to some extent to the expected behaviour of the domestic money demand function and underscores its ability to serve as a good building block, or assumption, of the monetary model of exchange rate determination. This will, therefore, assist the appropriate monetary authorities in the design and implementation of further financial liberalization policy packages in developing countries.

Keywords: financial liberalisation, exchange rates, demand for money, developing economies

Procedia PDF Downloads 372
6905 Unlocking the Health Benefits of Goat Meat

Authors: K. Makangali, G. Tokysheva, A. Shoman

Abstract:

Goat meat and goat meat products have garnered increasing attention within the realm of nutrition and health due to their potential to provide a myriad of benefits. This scientific article presents a comprehensive review of the health advantages associated with goat meat consumption and the products derived from it. The paper explores the nutritional content of goat meat, highlighting its favorable composition in terms of protein, essential minerals, and amino acids. It delves into the intricate balance of macronutrients, with lower fat and cholesterol levels compared to other meats, making goat meat a desirable choice for individuals seeking healthier dietary options.

Keywords: goat meat, amino acid, nutrition, meat products, meat

Procedia PDF Downloads 79
6904 Development of a Tesla Music Coil from Signal Processing

Authors: Samaniego Campoverde José Enrique, Rosero Muñoz Jorge Enrique, Luzcando Narea Lorena Elizabeth

Abstract:

This paper presents a practical and theoretical model for the operation of the Tesla coil using digital signal processing. The research is based on the analysis of ten scientific papers exploring the development and operation of the Tesla coil. Starting from a basic Tesla coil, several modifications were carried out with the aim of amplifying the digital signal by means of digital signal processing. To achieve this, a transistor amplifier and digital filters provided by MATLAB were used, chosen according to the characteristics of the signals in question.
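
As a loose analogue of the filtering stage described above, the sketch below applies a band-pass filter to a synthetic audio signal with SciPy rather than MATLAB. The sample rate, tone frequency and filter band are illustrative assumptions.

```python
import numpy as np
from scipy import signal

fs = 44100                                   # assumed sample rate (Hz)
t = np.linspace(0, 1.0, fs, endpoint=False)
# Hypothetical input: a 440 Hz tone plus broadband noise standing in for the audio source
x = np.sin(2 * np.pi * 440 * t) + 0.3 * np.random.default_rng(0).normal(size=fs)

# Band-pass "equalizer" stage, loosely analogous to the MATLAB filters mentioned above
sos = signal.butter(4, [300, 3000], btype="bandpass", fs=fs, output="sos")
y = signal.sosfilt(sos, x)

# Normalised drive signal that an amplifier/interrupter stage would then act on
y /= np.max(np.abs(y))
print(f"filtered signal: {y.size} samples, peak amplitude {np.max(np.abs(y)):.2f}")
```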

Keywords: tesla coil, digital signal process, equalizer, graphical environment

Procedia PDF Downloads 117
6903 Multi-Faceted Growth in Creative Industries

Authors: Sanja Pfeifer, Nataša Šarlija, Marina Jeger, Ana Bilandžić

Abstract:

The purpose of this study is to explore the different facets of growth among micro, small and medium-sized firms in Croatia and to analyze the differences between models designed for all micro, small and medium-sized firms and those for firms in creative industries. Three growth prediction models were designed and tested using growth in the company's sales, employment and assets as dependent variables. The key drivers of sales growth are prudent use of cash, industry affiliation and a higher share of intangible assets. Growth of assets depends on retained profits, internal and external sources of financing, as well as industry affiliation. Growth in employment is closely related to sources of financing, in particular debt, and it occurs less frequently than growth in sales and assets. The findings confirm the assumption that the growth strategies of small and medium-sized enterprises (SMEs) in creative industries differ in specific ways from those of SMEs in general. Interestingly, only 2.2% of growing enterprises achieve growth in employment, assets and sales simultaneously.

Keywords: creative industries, growth prediction model, growth determinants, growth measures

Procedia PDF Downloads 332
6902 Development of Anterior Lumbar Interbody Fusion (ALIF) Peek Cage Based on the Korean Lumbar Anatomical Information

Authors: Chang Soo Chon, Cheol Woong Ko, Han Sung Kim

Abstract:

The aim of this study is to develop an anterior lumbar interbody fusion (ALIF) PEEK cage suitable for Korean people. CT images were obtained from a Korean male (173 cm, 71 kg), and 3D Korean lumbar models were reconstructed from the CT images to investigate anatomical characteristics. Major design parameters of the ALIF PEEK cage were selected using morphological measurements from the Korean lumbar models. Through finite element analysis and mechanical tests, the developed ALIF PEEK cage prototype was compared with the Fidji cage (Zimmer Inc., USA), and the ALIF prototype showed similar or superior mechanical performance. Clinical validation of the ALIF PEEK cage prototype was also carried out to check for foreseeable problems in surgical operations. Finally, the convenience and stability of the prototype were considered clinically verified.

Keywords: inter-body anterior fusion, ALIF cage, PEEK, Korean lumbar, CT image, animal test

Procedia PDF Downloads 523
6901 Use of Artificial Intelligence and Two Object-Oriented Approaches (k-NN and SVM) for the Detection and Characterization of Wetlands in the Centre-Val de Loire Region, France

Authors: Bensaid A., Mostephaoui T., Nedjai R.

Abstract:

Nowadays, wetlands are the subject of contradictory debates in which scientific, political and administrative meanings collide. Indeed, given their multiple services (drinking water, irrigation, hydrological regulation, mineral, plant and animal resources...), wetlands concentrate many socio-economic and biodiversity issues. In some regions they can cover vast areas (>100 thousand ha) of the landscape, such as the Camargue in the south of France, inside the Rhone delta. The high biological productivity of wetlands, the strong natural selection pressures and the diversity of aquatic environments have produced many species of plants and animals that are found nowhere else. These environments are tremendous carbon sinks and biodiversity reserves; depending on their age, composition and surrounding environmental conditions, wetlands play an important role in global climate projections. Covering more than 3% of the earth's surface, wetlands have experienced a tremendous revival of interest since the beginning of the 1990s, which has resulted in a multiplication of inventories, scientific studies and management experiments. The wetlands of the Centre-Val de Loire region, with their particular geographical and physical characteristics, contain a large number of natural habitats that harbour great biological diversity. These wetlands are still influenced by human activities, especially agriculture, which affects their layout and functioning. From this perspective, decision-makers need to delimit spatial objects (natural habitats) in order to take action. Wetlands are no exception to this rule, even though it is a difficult exercise to delimit a type of environment whose main characteristic is to occupy the transition between aquatic and terrestrial environments. It is, however, possible to map wetlands with databases derived from the interpretation of photos and satellite images, such as the European Corine Land Cover database, which makes it possible to quantify and characterize the wetland types present at each location. Scientific studies have shown limitations when using high spatial resolution images (SPOT, Landsat, ASTER) for the identification and characterization of small wetlands (around 1 hectare), which generally represent spatially complex features; the use of very high spatial resolution images (<3 m) is therefore necessary to map both small and large areas. In addition, recent advances in artificial intelligence (AI) and deep learning methods for satellite image processing have shown much better performance than traditional processing based only on pixel structures. Our research work is based on spectral and textural analysis of very high resolution images (SPOT and IRC orthoimages) using two object-oriented approaches, the k-nearest neighbour (k-NN) approach and the Support Vector Machine (SVM) approach. The k-NN approach gave good results for the delineation of wetlands (wet marshes and moors, ponds, artificial wetlands, water body edges, mountain wetlands, river edges and brackish marshes), with a kappa index higher than 85%.
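
A minimal sketch of the object-oriented classification step is given below, comparing k-NN and SVM on synthetic per-object features and scoring them with Cohen's kappa. The feature set, class count and hyperparameters are assumptions; the study's actual image objects and training samples are not available here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.metrics import cohen_kappa_score

# Synthetic stand-in for per-object spectral/textural features (band statistics, texture, ...)
X, y = make_classification(n_samples=2000, n_features=8, n_informative=6,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf", C=10.0, gamma="scale"))]:
    clf.fit(X_tr, y_tr)
    kappa = cohen_kappa_score(y_te, clf.predict(X_te))
    print(f"{name}: kappa = {kappa:.3f}")
```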

Keywords: land development, GIS, sand dunes, segmentation, remote sensing

Procedia PDF Downloads 72
6900 The Best Prediction Data Mining Model for Breast Cancer Probability in Women Residents in Kabul

Authors: Mina Jafari, Kobra Hamraee, Saied Hossein Hosseini

Abstract:

The prediction of breast cancer is one of the challenges in medicine. In this paper, we collected 528 records of women living in Kabul, including demographic, lifestyle, diet and pregnancy data. There are many classification algorithms for breast cancer prediction, and we tried to find the best model with the most accurate results and the lowest error rate. We evaluated several common supervised data mining algorithms to find the best model for predicting breast cancer among Afghan women living in Kabul, using the mammography result as the target variable. To evaluate these algorithms, we used cross-validation, a well-established method for measuring model performance. After comparing the error rates and accuracies of three models, Decision Tree, Naive Bayes and Rule Induction, the Decision Tree, with an accuracy of 94.06% and an error rate of 15%, was found to be the best model for predicting breast cancer based on the health care records.
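
A minimal sketch of the cross-validated model comparison is shown below using scikit-learn. The public Wisconsin breast cancer dataset stands in for the Kabul records, and Rule Induction is omitted because it has no standard scikit-learn implementation; the fold count and model settings are assumptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

# Public Wisconsin data as a stand-in for the (unavailable) Kabul records
X, y = load_breast_cancer(return_X_y=True)

models = {
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.4f}, error rate = {1 - scores.mean():.4f}")
```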

Keywords: decision tree, breast cancer, probability, data mining

Procedia PDF Downloads 138
6899 Conduction Transfer Functions for the Calculation of Heat Demands in Heavyweight Facade Systems

Authors: Mergim Gasia, Bojan Milovanovica, Sanjin Gumbarevic

Abstract:

Better energy performance of the building envelope is one of the most important aspects of energy savings if the goals set by the European Union are to be achieved in the future. Dynamic heat transfer simulations are used for the calculation of building energy consumption because they give more realistic energy demands than stationary calculations, which do not take the building's thermal mass into account. Software used for these dynamic simulations employs methods based on analytical models, since numerical models are insufficient for longer periods. The analytical models used in this research fall into the category of conduction transfer functions (CTFs). The two methods for calculating CTFs covered by this research are the Laplace method and the state-space method. The literature review showed that the main disadvantage of these methods is that they are inadequate for heavyweight façade elements and for the shorter sampling times used in the calculation. The algorithms for both the Laplace and state-space methods are implemented in Mathematica, and the results are compared to the results from EnergyPlus and TRNSYS, since these software packages use similar algorithms for calculating a building's energy demand. This research aims to check the efficiency of the Laplace and state-space methods for calculating a building's energy demand for heavyweight building elements and shorter sampling times, and it also provides the means for improving the algorithms used by these methods. The finite difference method (FDM) is used as the reference for the boundary heat flux density. Even though dynamic heat transfer simulations are superior to calculations based on stationary boundary conditions, they have their limitations and will give unsatisfactory results if not used properly.
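
The sketch below shows the kind of explicit finite-difference reference calculation the FDM benchmark implies, for transient conduction through a single homogeneous heavyweight layer with prescribed surface temperatures. The material properties, layer thickness, grid and boundary temperatures are illustrative assumptions, not the paper's cases.

```python
import numpy as np

# Explicit 1D finite-difference reference for transient conduction through a wall layer.
# Hypothetical heavyweight layer: 0.25 m concrete, k = 1.6 W/mK, rho = 2300 kg/m3, c = 880 J/kgK
L, k, rho, c = 0.25, 1.6, 2300.0, 880.0
alpha = k / (rho * c)

nx = 51
dx = L / (nx - 1)
dt = 0.4 * dx ** 2 / alpha          # respects the explicit stability limit dt <= dx^2 / (2*alpha)
T = np.full(nx, 20.0)               # initial temperature field (degC)

T_in, T_out = 20.0, 0.0             # prescribed surface temperatures (Dirichlet boundaries)
t_end = 6 * 3600.0                  # simulate six hours
steps = int(t_end / dt)

for _ in range(steps):
    T[0], T[-1] = T_in, T_out
    T[1:-1] = T[1:-1] + alpha * dt / dx ** 2 * (T[2:] - 2 * T[1:-1] + T[:-2])

# Inner-surface heat flux density, the quantity used as the CTF reference in the abstract
q_inner = -k * (T[1] - T[0]) / dx
print(f"inner surface heat flux after 6 h: {q_inner:.2f} W/m2")
```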

Keywords: Laplace method, state-space method, conduction transfer functions, finite difference method

Procedia PDF Downloads 133
6898 The Patterns Designation by the Inspiration from Flower at Suan Sunandha Palace

Authors: Nawaporn Srisarankullawong

Abstract:

This research concerns creating designs inspired by the flowers that were once planted in Suan Sunandha Palace. The researcher studied the history of Suan Sunandha Palace and the flowers planted in the palace's garden in order to use this research to create new designs in the future. The objectives are as follows: 1. To study the shapes and patterns of the flowers in Suan Sunandha Palace in order to select a few of them as models for new designs. 2. To create flower designs based on the flowers of Suan Sunandha Palace, using current photographs of the flowers once planted inside the palace and using Adobe Illustrator and Adobe Photoshop to create the patterns and models. Results of the research: the researcher selected three types of flowers as pattern models, namely Allamanda, orchids and flamingo plant. The details of the flowers were simplified to create the pattern models; the three flowers yielded three pattern models, which were developed into six patterns using universal artistic techniques, so the patterns created are modern and can be used for further decoration.

Keywords: patterns design, Suan Sunandha Palace, pattern of the flowers, visual arts and design

Procedia PDF Downloads 374
6897 Modelling Operational Risk Using Extreme Value Theory and Skew t-Copulas via Bayesian Inference

Authors: Betty Johanna Garzon Rozo, Jonathan Crook, Fernando Moreira

Abstract:

Operational risk losses are heavy tailed and are likely to be asymmetric and extremely dependent among business lines/event types. We propose a new methodology to assess, in a multivariate way, the asymmetry and extreme dependence between severity distributions, and to calculate the capital for operational risk. This methodology simultaneously uses (i) several parametric distributions and an alternative mixed distribution (the lognormal for the body of losses and the Generalized Pareto Distribution for the tail) via extreme value theory using SAS®, (ii) the multivariate skew t-copula, applied for the first time to operational losses, and (iii) Bayesian theory to estimate new n-dimensional skew t-copula models via Markov chain Monte Carlo (MCMC) simulation. This paper analyses a new operational loss data set, SAS Global Operational Risk Data [SAS OpRisk], to model operational risk at international financial institutions. All the severity models are constructed in SAS® 9.2 using the procedures PROC SEVERITY and PROC NLMIXED. This paper focuses on describing this implementation.
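
A simplified sketch of the body-tail severity split described above is given below in Python (the paper itself uses SAS procedures): a lognormal fit for the body and a Generalized Pareto fit for exceedances over a high threshold, followed by a spliced high quantile. The simulated losses, threshold choice and quantile level are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Hypothetical operational-loss severities (heavy tailed), standing in for the SAS OpRisk data
losses = rng.lognormal(mean=10.0, sigma=2.0, size=5000)

threshold = np.quantile(losses, 0.95)       # body/tail split point
body, tail = losses[losses <= threshold], losses[losses > threshold]

# Body: lognormal fit; tail: Generalized Pareto fit to exceedances (peaks over threshold)
shape_ln, loc_ln, scale_ln = stats.lognorm.fit(body, floc=0)
shape_gpd, loc_gpd, scale_gpd = stats.genpareto.fit(tail - threshold, floc=0)

print(f"lognormal body: sigma = {shape_ln:.3f}, scale = {scale_ln:.1f}")
print(f"GPD tail: xi = {shape_gpd:.3f}, beta = {scale_gpd:.1f}")

# A simple high quantile (99.9%) from the spliced severity model
p = 0.999
p_tail = (p - 0.95) / 0.05                  # conditional probability within the tail
var_999 = threshold + stats.genpareto.ppf(p_tail, shape_gpd, loc=0, scale=scale_gpd)
print(f"99.9% severity quantile: {var_999:,.0f}")
```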

Keywords: operational risk, loss distribution approach, extreme value theory, copulas

Procedia PDF Downloads 603
6896 Explaining the Impact of Poverty Risk on Frailty Trajectories in Old Age Using Growth Curve Models

Authors: Erwin Stolz, Hannes Mayerl, Anja Waxenegger, Wolfgang Freidl

Abstract:

Research has often found poverty to be associated with adverse health outcomes, but it is unclear which mechanisms, or interplay of mechanisms, actually translate low economic resources into poor physical health. The goal of this study was to assess the impact of educational, material, psychosocial and behavioural factors in explaining the poverty-health association in old age. We analysed 28,360 observations from 11,390 community-dwelling respondents (65+) from the Survey of Health, Ageing and Retirement in Europe (SHARE, 2004-2013, 10 countries). We used multilevel growth curve models to assess the impact of combined income and asset poverty risk on old-age frailty index levels and trajectories. In total, 61.8% of the effect of poverty risk on frailty levels could be explained by direct and indirect effects, highlighting the role of material and particularly psychosocial factors, such as perceived control and social isolation. We suggest strengthening social policy and public health efforts in order to fight poverty and its deleterious effects from early age on, and broadening the scope of interventions with regard to psychosocial factors.
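
For readers unfamiliar with growth curve models, the sketch below fits a multilevel growth curve (random intercept and slope per respondent, fixed effects for time and poverty risk) with statsmodels. The simulated panel, effect sizes and variable names are assumptions standing in for the SHARE data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_persons, n_waves = 500, 3

# Hypothetical SHARE-like long-format panel: frailty index by wave and poverty risk
ids = np.repeat(np.arange(n_persons), n_waves)
wave = np.tile(np.arange(n_waves), n_persons)
poverty = np.repeat(rng.binomial(1, 0.25, n_persons), n_waves)
intercepts = rng.normal(0.15, 0.05, n_persons)[ids]
frailty = intercepts + 0.02 * wave + 0.05 * poverty + rng.normal(0, 0.03, ids.size)

df = pd.DataFrame({"id": ids, "wave": wave, "poverty": poverty, "frailty": frailty})

# Growth curve model: fixed effects for time and poverty risk, random intercept and slope per person
model = smf.mixedlm("frailty ~ wave + poverty", df, groups=df["id"], re_formula="~wave")
result = model.fit()
print(result.summary())
```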

Keywords: frailty, health inequality, old age, poverty

Procedia PDF Downloads 333
6895 Simulation of Flow through Dam Foundation by FEM and ANN Methods Case Study: Shahid Abbaspour Dam

Authors: Mehrdad Shahrbanozadeh, Gholam Abbas Barani, Saeed Shojaee

Abstract:

In this study, a finite element model (Seep3D) and an artificial neural network (ANN) model were developed to simulate flow through a dam foundation. The Seep3D model is capable of simulating three-dimensional flow through heterogeneous, anisotropic, saturated and unsaturated porous media. Flow through the Shahid Abbaspour dam foundation has been used as a case study. A finite element mesh with 24,960 triangular elements and 28,707 nodes was applied to model flow through the foundation of this dam, with the mesh made denser in the neighbourhood of the curtain screen. The ANN model developed for the Shahid Abbaspour dam is a feedforward four-layer network employing the sigmoid activation function and the back-propagation learning algorithm. The upstream and downstream water level elevations were used as input variables and the piezometric heads as target outputs in the ANN model. The two models were calibrated and verified using the Shahid Abbaspour dam's piezometric data. Results of the models were compared with piezometer measurements and are in good agreement. The model results also revealed that the ANN model performed as well as, and in some cases better than, the FEM.
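
A minimal sketch of the ANN component is shown below: a feed-forward network with sigmoid hidden units mapping upstream and downstream water levels to a piezometric head, as in the abstract. The synthetic records, network size and solver are assumptions; the actual Shahid Abbaspour piezometric data are not reproduced.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
n = 400

# Hypothetical records: upstream/downstream water levels (m) as inputs,
# piezometric head at one piezometer (m) as the target
upstream = rng.uniform(520.0, 540.0, n)
downstream = rng.uniform(480.0, 490.0, n)
head = 0.6 * upstream + 0.3 * downstream + rng.normal(0, 0.5, n)

X = np.column_stack([upstream, downstream])
X_tr, X_te, y_tr, y_te = train_test_split(X, head, test_size=0.25, random_state=0)

# Feed-forward network with sigmoid ('logistic') hidden units, as described in the abstract
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(10, 10), activation="logistic",
                                 solver="lbfgs", max_iter=5000, random_state=0))
ann.fit(X_tr, y_tr)
print(f"R2 on held-out data: {r2_score(y_te, ann.predict(X_te)):.3f}")
```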

Keywords: seepage, dam foundation, finite element method, neural network, seep 3D model

Procedia PDF Downloads 474
6894 Deep-Learning to Generation of Weights for Image Captioning Using Part-of-Speech Approach

Authors: Tiago do Carmo Nogueira, Cássio Dener Noronha Vinhal, Gélson da Cruz Júnior, Matheus Rudolfo Diedrich Ullmann

Abstract:

Generating automatic image descriptions in natural language is a challenging task. Image captioning describes an image in a consistent way by combining computer vision and natural language processing techniques. To accomplish this task, cutting-edge models use encoder-decoder structures: Convolutional Neural Networks (CNNs) extract the characteristics of the images, and Recurrent Neural Networks (RNNs) generate the descriptive sentences. However, cutting-edge approaches still suffer from generating incorrect captions and from error accumulation in the decoders. To address this problem, we propose a model based on the encoder-decoder structure that introduces a module generating weights according to each word's importance in forming the sentence, using part-of-speech (PoS) information. The results demonstrate that our model surpasses state-of-the-art models.
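
The toy sketch below shows one plausible way to inject PoS-based weights into a decoder step: each word embedding is scaled by a learned weight for its part-of-speech tag before entering a recurrent layer. This is an assumption-laden illustration of the idea, not the authors' architecture; the vocabulary size, tag set size and dimensions are arbitrary.

```python
import torch
import torch.nn as nn

class PosWeightedDecoderStep(nn.Module):
    """Toy sketch: scale each word embedding by a learned weight for its part-of-speech tag."""
    def __init__(self, vocab_size, n_pos_tags, embed_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.pos_weight = nn.Embedding(n_pos_tags, 1)     # one scalar weight per PoS tag
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, pos_tags, hidden=None):
        w = torch.sigmoid(self.pos_weight(pos_tags))       # (batch, seq, 1), importance in [0, 1]
        x = self.embed(tokens) * w                         # weight words by PoS importance
        output, hidden = self.gru(x, hidden)
        return self.out(output), hidden

# Hypothetical shapes: a batch of 2 captions, 5 tokens each
decoder = PosWeightedDecoderStep(vocab_size=1000, n_pos_tags=17, embed_dim=64, hidden_dim=128)
tokens = torch.randint(0, 1000, (2, 5))
pos_tags = torch.randint(0, 17, (2, 5))
logits, _ = decoder(tokens, pos_tags)
print(logits.shape)   # torch.Size([2, 5, 1000])
```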

Keywords: gated recurrent units, caption generation, convolutional neural network, part-of-speech

Procedia PDF Downloads 102
6893 Machine Learning-Driven Prediction of Cardiovascular Diseases: A Supervised Approach

Authors: Thota Sai Prakash, B. Yaswanth, Jhade Bhuvaneswar, Marreddy Divakar Reddy, Shyam Ji Gupta

Abstract:

Across the globe, there are a lot of chronic diseases, and heart disease stands out as one of the most perilous. Sadly, many lives are lost to this condition, even though early intervention could prevent such tragedies. However, identifying heart disease in its initial stages is not easy. To address this challenge, we propose an automated system aimed at predicting the presence of heart disease using advanced techniques. By doing so, we hope to empower individuals with the knowledge needed to take proactive measures against this potentially fatal illness. Our approach towards this problem involves meticulous data preprocessing and the development of predictive models utilizing classification algorithms such as Support Vector Machines (SVM), Decision Tree, and Random Forest. We assess the efficiency of every model based on metrics like accuracy, ensuring that we select the most reliable option. Additionally, we conduct thorough data analysis to reveal the importance of different attributes. Among the models considered, Random Forest emerges as the standout performer with an accuracy rate of 96.04% in our study.
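
A minimal sketch of the model comparison is given below with scikit-learn, using a synthetic stand-in for the clinical attributes since the study's dataset is not reproduced here; feature counts, split and hyperparameters are assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for a heart-disease dataset (13 clinical attributes, binary target)
X, y = make_classification(n_samples=1000, n_features=13, n_informative=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, stratify=y, random_state=0)

models = {
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(f"{name}: accuracy = {accuracy_score(y_te, model.predict(X_te)):.4f}")
```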

Keywords: support vector machines, decision tree, random forest

Procedia PDF Downloads 40
6892 Comparative Analysis of Predictive Models for Customer Churn Prediction in the Telecommunication Industry

Authors: Deepika Christopher, Garima Anand

Abstract:

To determine the best model for churn prediction in the telecom industry, this paper compares 11 machine learning algorithms, namely Logistic Regression, Support Vector Machine, Random Forest, Decision Tree, XGBoost, LightGBM, CatBoost, AdaBoost, Extra Trees, Deep Neural Network, and a Hybrid Model (MLPClassifier). It also aims to pinpoint the top three factors that lead to customer churn and conducts customer segmentation to identify vulnerable groups. According to the data, the Logistic Regression model performs the best, with an F1 score of 0.6215, 81.76% accuracy, 68.95% precision, and 56.57% recall. The top three attributes that cause churn are found to be tenure, Internet Service Fiber optic, and Internet Service DSL; the top three best-performing models in this article are Logistic Regression, Deep Neural Network, and AdaBoost. The K-means algorithm is applied to establish and analyze four different customer clusters. This study has effectively identified customers that are at risk of churn, and its results may be utilized to develop and execute strategies that lower customer attrition.
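
The sketch below illustrates the two steps highlighted above, churn prediction with logistic regression and K-means customer segmentation into four clusters, on a synthetic stand-in for the telecom records; the class balance, feature count and all settings are assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans
from sklearn.metrics import f1_score, accuracy_score, precision_score, recall_score

# Synthetic stand-in for telecom customer records (tenure, charges, service flags, ...)
X, y = make_classification(n_samples=3000, n_features=12, n_informative=6,
                           weights=[0.73, 0.27], random_state=0)  # churn as the minority class
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=0)

# Churn prediction with the best-performing model from the abstract (logistic regression)
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"accuracy={accuracy_score(y_te, pred):.3f}  precision={precision_score(y_te, pred):.3f}  "
      f"recall={recall_score(y_te, pred):.3f}  F1={f1_score(y_te, pred):.3f}")

# Customer segmentation into four clusters, as in the K-means step of the study
segments = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(StandardScaler().fit_transform(X))
print("customers per segment:", np.bincount(segments))
```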

Keywords: attrition, retention, predictive modeling, customer segmentation, telecommunications

Procedia PDF Downloads 57
6891 A Predictive MOC Solver for Water Hammer Waves Distribution in Network

Authors: A. Bayle, F. Plouraboué

Abstract:

Water distribution networks (WDNs) still suffer from a lack of knowledge about the prediction of fast pressure transient events, although the latter may considerably impact their durability. Accidental or planned operating activities indeed give rise to complex pressure interactions and may drastically modify local pressure values, generating leaks and, in rare cases, pipe breaks. In this context, a numerical predictive analysis is conducted to prevent such events and optimize network management. A home-made Python/FORTRAN 90 software suite has been developed that uses the method of characteristics (MOC) to solve the water-hammer equations. The solver is validated by direct comparison with theoretical results and experimental measurements in simple configurations and is afterwards extended to network analysis. The algorithm's most costly steps are designed for parallel computation. A varied set of boundary conditions and energy loss models is considered for the network simulations. The results are analysed in both the time and frequency domains and provide crucial information on the pressure distribution behaviour within the network.
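
A single-pipe MOC sketch is given below to illustrate the solver's core update: interior nodes are advanced from the C+ and C- characteristics, with a fixed-head reservoir upstream and an instantaneously closed valve downstream, and the peak valve head is compared with the Joukowsky estimate. The pipe data, friction factor and discretisation are illustrative assumptions, not the paper's network.

```python
import numpy as np

# Single pipe between an upstream reservoir and a downstream valve (instantaneous closure).
a, g = 1000.0, 9.81              # pressure-wave speed (m/s), gravity (m/s2)
L, D, f = 600.0, 0.5, 0.018      # pipe length (m), diameter (m), Darcy friction factor
A = np.pi * D ** 2 / 4
H0, Q0 = 150.0, 0.4              # reservoir head (m), initial steady discharge (m3/s)

n = 20                           # number of reaches
dx = L / n
dt = dx / a                      # Courant number of 1, as required by the basic MOC grid
B = a / (g * A)
R = f * dx / (2 * g * D * A ** 2)

H = H0 - R * Q0 * abs(Q0) * np.arange(n + 1)   # steady-state hydraulic grade line
Q = np.full(n + 1, Q0)
h_valve_max = H[-1]

for _ in range(400):             # march the transient after sudden valve closure
    Hn, Qn = H.copy(), Q.copy()
    # interior nodes: intersection of the C+ and C- characteristics
    Cp = H[:-2] + B * Q[:-2] - R * Q[:-2] * np.abs(Q[:-2])
    Cm = H[2:] - B * Q[2:] + R * Q[2:] * np.abs(Q[2:])
    Hn[1:-1] = 0.5 * (Cp + Cm)
    Qn[1:-1] = (Cp - Cm) / (2 * B)
    # upstream reservoir: head fixed, discharge from the C- characteristic
    Hn[0] = H0
    Qn[0] = (H0 - (H[1] - B * Q[1] + R * Q[1] * np.abs(Q[1]))) / B
    # downstream valve fully closed: zero discharge, head from the C+ characteristic
    Qn[-1] = 0.0
    Hn[-1] = H[-2] + B * Q[-2] - R * Q[-2] * np.abs(Q[-2])
    H, Q = Hn, Qn
    h_valve_max = max(h_valve_max, H[-1])

joukowsky = H0 + a * Q0 / (g * A)
print(f"peak head at the valve: {h_valve_max:.1f} m (Joukowsky estimate: {joukowsky:.1f} m)")
```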

Keywords: energetic losses models, method of characteristic, numerical predictive analysis, water distribution network, water hammer

Procedia PDF Downloads 232
6890 Referencing Anna: Findings From Eye-tracking During Dutch Pronoun Resolution

Authors: Robin Devillers, Chantal van Dijk

Abstract:

Children face ambiguities in everyday language use. Ambiguity in pronoun resolution can be particularly challenging for children, whereas adults can rapidly identify the antecedent of a pronoun. Two main factors underlie this process, namely the accessibility of the referent and the syntactic cues of the pronoun. Within about 200 ms, adults have integrated accessibility and syntactic constraints, relieving cognitive effort by considering contextual cues. As children are still developing their cognitive capacity, they are not yet able to simultaneously assess and integrate accessibility, contextual cues and syntactic information. As such, they may fail to identify the correct referent and fixate more on the competitor than adults do. In this study, Dutch while-clauses were used to investigate the interpretation of pronouns by children. The aim is to a) examine the extent to which 7-10 year old children are able to utilise discourse and syntactic information during online and offline sentence processing and b) analyse the contribution of individual factors, including age, working memory, condition and vocabulary. Adult and child participants are presented with filler items and while-clauses, the latter following a particular structure: ‘Anna and Sophie are sitting in the library. While Anna is reading a book, she is taking a sip of water.’ This sentence illustrates the ambiguous situation, as it is unclear whether ‘she’ refers to Anna or Sophie. In the unambiguous situation, either Anna or Sophie is replaced by a boy, such as ‘Peter’, so that the pronoun in the second sentence unambiguously refers to one of the characters due to its syntactic constraints. Children’s and adults’ responses were measured by means of a visual world paradigm. This paradigm consisted of two characters, of which one was the referent (the target) and the other the competitor. A sentence was presented and followed by a question that required the participant to choose which character was the referent. The paradigm thus yields an online (fixations) and an offline (accuracy) measure. The data will be analysed using Generalised Additive Mixed Models, which allow for a thorough estimation of the individual variables. These findings will contribute to the scientific literature in several ways. Firstly, the use of while-clauses has not been studied much, and their processing has not yet been characterised. Moreover, online pronoun resolution has not been investigated much in either children or adults, and this study will therefore contribute to the adult and child pronoun resolution literature. Lastly, pronoun resolution has not yet been studied in Dutch, and as such, this study adds Dutch to the languages in which it has been investigated.

Keywords: pronouns, online language processing, Dutch, eye-tracking, first language acquisition, language development

Procedia PDF Downloads 99
6889 The Musician as the Athlete: Psychological Response to Injury

Authors: Shulamit Sternin

Abstract:

Athletes experience injuries that can have both a physical and a psychological impact on the individual. In such instances, athletes are able to rely on the established field of sports psychology to facilitate holistic rehabilitation. Musicians, like athletes, rely on their bodies to perform and are also susceptible to injury. Due to the similar performative nature of succeeding as an athlete or a musician, these careers share many of the same primary psychological concerns, and it is therefore reasonable to expect that athletes and musicians may require similar rehabilitation post-injury. However, musicians face their own unique psychological challenges; understanding the needs of an injured athlete can serve as a foundation for understanding the injured musician, but it is not enough to fully rehabilitate one. The current research surrounding musicians and their injuries is primarily focused on the physiological aspects of injury and rehabilitation; the psychological aspects have not yet received adequate attention, resulting in poor rehabilitation of musicians post-injury. This review paper uses current models of psychological response to injury in athletes to draw parallels with the psychological response to injury in musicians. Databases such as Medline and PsycInfo were systematically searched using specific key words, such as psychological response, injury, athlete, and musician. Studies that focused on the post-injury psychology of either the musician or the athlete were included. Within the literature there is evidence of psychological responses unique to the musician that are not accounted for by current models of response in athletes. The models of psychological response to injury in athletes are therefore inadequate tools for application to the musician. Future directions for performing arts research that can fill the gaps in our understanding and modelling of musicians' response to injury are discussed. A better understanding of the psychological impact of injuries on musicians holds significant implications for health care practitioners working with injured musicians. Understanding the unique barriers musicians face post-injury, and how support for this population must be tailored to properly suit musicians' needs, will aid in more holistic rehabilitation and a higher likelihood of musicians returning to pre-injury performance levels.

Keywords: athlete, injury, musician, psychological response

Procedia PDF Downloads 205