Search results for: regression models drone
7210 Forecast of the Small Wind Turbines Sales with Replacement Purchases and with or without Account of Price Changes
Authors: V. Churkin, M. Lopatin
Abstract:
The purpose of the paper is to estimate the US small wind turbine market potential and forecast small wind turbine sales in the US. The forecasting method is based on the application of the Bass model and the generalized Bass model of innovation diffusion under replacement purchases. An exponential distribution is used for modeling replacement purchases; its single parameter is determined by the average lifetime of small wind turbines. The identification of the model parameters is based on nonlinear regression analysis of the annual sales statistics published by the American Wind Energy Association (AWEA) from 2001 to 2012. The estimated US average market potential of small wind turbines (for adoption purchases), without account of price changes, is 57080 (confidence interval from 49294 to 64866 at P = 0.95) under an average wind turbine lifetime of 15 years, and 62402 (confidence interval from 54154 to 70648 at P = 0.95) under an average lifetime of 20 years. In the first case the explained variance is 90.7%, while in the second it is 91.8%. The effect of wind turbine price changes on sales was estimated using the generalized Bass model. This required a price forecast; to obtain it, a polynomial regression function based on the Berkeley Lab statistics was used. The estimated US average market potential of small wind turbines (for adoption purchases) in that case is 42542 (confidence interval from 32863 to 52221 at P = 0.95) under an average lifetime of 15 years, and 47426 (confidence interval from 36092 to 58760 at P = 0.95) under an average lifetime of 20 years. In the first case the explained variance is 95.3%, while in the second it is also 95.3%. Keywords: bass model, generalized bass model, replacement purchases, sales forecasting of innovations, statistics of sales of small wind turbines in the United States
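To illustrate the kind of parameter identification described above, the following is a minimal sketch of fitting the Bass diffusion model to annual sales by nonlinear least squares; the sales figures are placeholders, not the AWEA statistics used in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def bass_cumulative(t, m, p, q):
    """Cumulative adoptions under the Bass diffusion model."""
    e = np.exp(-(p + q) * t)
    return m * (1.0 - e) / (1.0 + (q / p) * e)

# Hypothetical annual unit sales, 2001-2012 (placeholders, not the AWEA figures).
years = np.arange(2001, 2013)
sales = np.array([400, 600, 900, 1300, 1800, 2500, 3300, 4200, 5100, 5900, 6500, 6900], dtype=float)

t = years - years[0] + 1            # time since launch, in years
cum_sales = np.cumsum(sales)        # cumulative adoptions

# Nonlinear least squares: market potential m, innovation coefficient p, imitation coefficient q.
popt, pcov = curve_fit(bass_cumulative, t, cum_sales,
                       p0=[1.5 * cum_sales[-1], 0.01, 0.4], maxfev=10000)
m, p, q = popt
stderr = np.sqrt(np.diag(pcov))     # standard errors -> approximate confidence intervals
print(f"market potential m = {m:.0f} (+/- {1.96 * stderr[0]:.0f} at ~95%)")
print(f"p = {p:.4f}, q = {q:.4f}")
```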
Procedia PDF Downloads 348
7209 A Comprehensive Study of Spread Models of Wildland Fires
Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran
Abstract:
These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. Fire spread models provide insights into potential fire behavior, facilitating authorities to make informed decisions about evacuation activities, allocating resources for fire-fighting efforts, and planning for preventive actions. Wildfire spread models are also useful in post-wildfire mitigation strategies as they help in assessing the fire's severity, determining high-risk regions for post-fire dangers, and forecasting soil erosion trends. The analysis highlights the importance of customized modeling approaches for various circumstances and promotes our understanding of the way forest fires spread. Some of the known models in this field are Rothermel’s wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, cellular automata model, and others. The key characteristics that these models consider include weather (includes factors such as wind speed and direction), topography (includes factors like landscape elevation), and fuel availability (includes factors like types of vegetation) among other factors. The models discussed are physics-based, data-driven, or hybrid models, also utilizing ML techniques like attention-based neural networks to enhance the performance of the model. In order to lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey expands its scope to address the practical needs of numerous stakeholders. Access to enhanced early warning systems enables decision-makers to take prompt action. 
Emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts. Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling
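As a pointer to the class of models surveyed, below is a minimal probabilistic cellular-automaton sketch of fire spread; the grid size, spread probability, and wind bias are illustrative assumptions and do not correspond to any of the named models (Rothermel, FARSITE, etc.).

```python
import numpy as np

# Minimal probabilistic cellular automaton for fire spread (illustrative only).
# Cell states: 0 = unburnt fuel, 1 = burning, 2 = burnt.
rng = np.random.default_rng(0)
n, steps = 50, 60
p_base = 0.35                      # baseline spread probability to a neighbour
# Directional bias standing in for wind: spread is favoured in one row direction.
wind = {(-1, 0): 0.2, (1, 0): -0.2, (0, -1): 0.0, (0, 1): 0.0}
grid = np.zeros((n, n), dtype=int)
grid[n // 2, n // 2] = 1           # ignition point

for _ in range(steps):
    burning = np.argwhere(grid == 1)
    new = grid.copy()
    for r, c in burning:
        for (dr, dc), bias in wind.items():
            rr, cc = r + dr, c + dc
            if 0 <= rr < n and 0 <= cc < n and grid[rr, cc] == 0:
                if rng.random() < min(max(p_base + bias, 0.0), 1.0):
                    new[rr, cc] = 1
        new[r, c] = 2              # a burning cell burns out after one step
    grid = new

print("burnt area fraction:", (grid == 2).mean())
```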
Procedia PDF Downloads 81
7208 Models of Start-Ups Created in Cooperation with a State University
Authors: Roman Knizek, Denisa Knizkova, Ludmila Fridrichova
Abstract:
The academic environment in Central Europe has recently been transforming itself and is trying to link its research and development with the private sector. However, compared to Western countries, there is a lack of history and continuity because of the centrally controlled economy from the end of the Second World War until the early 1990s. There are two basic models of how to carry out technology transfer between the academic and the business world. The first is to develop something new and then find a suitable private sector partner; the second is to find a partner who has the basic idea and then develop something new in collaboration. This study, unlike some other ones, describes two specific cases that took place in cooperation with the Technical University of Liberec, Faculty of Textiles. As was said before, in one case, a product was first developed, and after that, an investor was sought, and in the other case, there was an investor who wanted a specific product and wanted to help with its development. The study describes the various advantages and disadvantages, including a practical example of the creation of a subsequent start-up.Keywords: start-up, state university, academic environment, licensing agreement
Procedia PDF Downloads 15
7207 Enhancing Signal Reception in a Mobile Radio Network Using Adaptive Beamforming Antenna Arrays Technology
Authors: Ugwu O. C., Mamah R. O., Awudu W. S.
Abstract:
This work is aimed at enhancing signal reception in a mobile radio network and minimizing outage probability using adaptive beamforming antenna arrays. In this research work, an empirical real-time drive measurement was done in a cellular network of Globalcom Nigeria Limited located at Ikeja, the headquarters of Lagos State, Nigeria, with reference base station number KJA 004. The empirical measurement includes Received Signal Strength and Bit Error Rate, which were recorded for exact prediction of the signal strength of the network at the time of carrying out this research work. The Received Signal Strength and Bit Error Rate were measured with a spectrum monitoring van with the help of a Ray Tracer at intervals of 100 meters up to 700 meters from the transmitting base station. The distance and angular location measurements from the reference network were done with the help of a Global Positioning System (GPS). The other equipment used were transmitting equipment measurement software (Temsoftware), laptops and log files, which showed received signal strength with distance from the base station. Results obtained from the real-time experiment showed an outage probability of about 11%, indicating that mobile radio networks are prone to signal failure, which can be minimized using an Adaptive Beamforming Antenna Array in terms of a significant reduction in Bit Error Rate, implying improved performance of the mobile radio network. In addition, this work did not only include experiments done through empirical measurement but also enhanced mathematical models that were developed and implemented as a reference model for accurate prediction. The proposed signal models were based on the analysis of continuous time and discrete space, and some other assumptions. These developed (proposed) enhanced models were validated using a MATLAB (version 7.6.3.35) program and compared with the conventional antenna for accuracy. These outage models were used to manage the blocked call experience in the mobile radio network. A 20% improvement was obtained when the adaptive beamforming antenna arrays were implemented on the wireless mobile radio network. Keywords: beamforming algorithm, adaptive beamforming, simulink, reception
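A minimal sketch of adaptive beamforming for a uniform linear array is given below; it uses a standard LMS weight update with assumed array geometry, signal angles, and step size, rather than the authors' MATLAB/Simulink models.

```python
import numpy as np

# LMS adaptive beamformer for an 8-element uniform linear array (illustrative sketch).
rng = np.random.default_rng(1)
M, d = 8, 0.5                                  # elements, spacing in wavelengths
theta_sig, theta_int = np.deg2rad(20), np.deg2rad(-40)

def steering(theta):
    """Array steering vector for a plane wave arriving from angle theta."""
    return np.exp(2j * np.pi * d * np.arange(M) * np.sin(theta))

N = 2000
s = np.exp(1j * 2 * np.pi * 0.01 * np.arange(N))          # desired narrowband signal
i = 0.7 * np.exp(1j * 2 * np.pi * 0.03 * np.arange(N))    # interfering signal
x = (np.outer(steering(theta_sig), s) + np.outer(steering(theta_int), i)
     + 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))))

w = np.zeros(M, dtype=complex)                 # adaptive weights
mu = 0.001                                     # LMS step size
for n in range(N):
    y = np.vdot(w, x[:, n])                    # array output w^H x
    e = s[n] - y                               # error against training reference
    w += mu * np.conj(e) * x[:, n]             # LMS weight update

print("error magnitude after adaptation:", abs(e))
```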
Procedia PDF Downloads 41
7206 An Agent-Based Model of Innovation Diffusion Using Heterogeneous Social Interaction and Preference
Authors: Jang kyun Cho, Jeong-dong Lee
Abstract:
The advent of the Internet, mobile communications, and social network services has stimulated social interactions among consumers, allowing people to affect one another’s innovation adoptions by exchanging information more frequently and more quickly. Previous diffusion models, such as the Bass model, however, face limitations in reflecting such recent phenomena in society. These models are weak in their ability to model interactions between agents; they model aggregated-level behaviors only. The agent-based model, which is an alternative to the aggregate model, is good for individual-level modeling, but it has not yet been grounded in an economic perspective on social interactions. This study assumes the presence of social utility from other consumers in the adoption of innovation and investigates the effect of individual interactions on innovation diffusion by developing a new model called the interaction-based diffusion model. By comparing this model with previous diffusion models, the study also examines how the proposed model explains innovation diffusion from the perspective of economics. In addition, the study recommends the use of a small-world network topology instead of cellular automata to describe innovation diffusion. This study develops a model based on individual preference and heterogeneous social interactions using a utility specification, which is expandable and, thus, able to encompass various issues in diffusion research, such as reservation price. Furthermore, the study proposes a new framework to forecast aggregated-level market demand from individual-level modeling. The model also exhibits a good fit to real market data. It is expected that the study will contribute to our understanding of the innovation diffusion process through its microeconomic theoretical approach. Keywords: innovation diffusion, agent based model, small-world network, demand forecasting
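A minimal sketch of the modeling idea, assuming a Watts-Strogatz small-world topology and a simple utility threshold for adoption; the parameters are illustrative, not the authors' specification.

```python
import numpy as np
import networkx as nx

# Threshold-style innovation diffusion on a small-world network (illustrative sketch).
rng = np.random.default_rng(42)
G = nx.watts_strogatz_graph(n=1000, k=6, p=0.1, seed=42)

intrinsic = rng.normal(0.0, 1.0, G.number_of_nodes())    # heterogeneous individual preference
social_weight = 0.8                                       # weight of social utility from adopters
threshold = 1.0                                           # adopt when total utility exceeds this

adopted = np.zeros(G.number_of_nodes(), dtype=bool)
adopted[rng.choice(G.number_of_nodes(), size=10, replace=False)] = True  # seed adopters

curve = []
for step in range(30):
    frac_adopted_neighbours = np.array([
        np.mean([adopted[v] for v in G.neighbors(u)]) if G.degree(u) else 0.0
        for u in G.nodes()
    ])
    utility = intrinsic + social_weight * frac_adopted_neighbours
    adopted |= utility > threshold
    curve.append(adopted.mean())

print("adoption curve (fraction adopted per step):")
print(np.round(curve, 3))
```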
Procedia PDF Downloads 341
7205 Analyzing Bridge Response to Wind Loads and Optimizing Design for Wind Resistance and Stability
Authors: Abdul Haq
Abstract:
The goal of this research is to better understand how wind loads affect bridges and develop strategies for designing bridges that are more stable and resistant to wind. The effect of wind on bridges is essential to their safety and functionality, especially in areas that are prone to high wind speeds or violent wind conditions. The study looks at the aerodynamic forces and vibrations caused by wind and how they affect bridge construction. Part of the research method involves first understanding the underlying ideas influencing wind flow near bridges. Computational fluid dynamics (CFD) simulations are used to model and forecast the aerodynamic behaviour of bridges under different wind conditions. These models incorporate several factors, such as wind directionality, wind speed, turbulence intensity, and the influence of nearby structures or topography. The results provide significant new insights into the loads and pressures that wind places on different bridge elements, such as decks, pylons, and connections. Following the determination of the wind loads, the structural response of bridges is assessed. By simulating their dynamic behavior under wind-induced forces, Finite Element Analysis (FEA) is used to model the bridge's component parts. This work contributes to the understanding of which areas are at risk of experiencing excessive stresses, vibrations, or oscillations due to wind excitations. Because the bridge has inherent modes and frequencies, the study considers both static and dynamic responses. Various strategies are examined to maximize the design of bridges to withstand wind. It is possible to alter the bridge's geometry, add aerodynamic components, add dampers or tuned mass dampers to lessen vibrations, and boost structural rigidity. Through an analysis of several design modifications and their effectiveness, the study aims to offer guidelines and recommendations for wind-resistant bridge design. In addition to the numerical simulations and analyses, there are experimental studies. In order to assess the computational models and validate the practicality of proposed design strategies, scaled bridge models are tested in a wind tunnel. These investigations help to improve numerical models and prediction precision by providing valuable information on wind-induced forces, pressures, and flow patterns. Using a combination of numerical models, actual testing, and long-term performance evaluation, the project aims to offer practical insights and recommendations for building wind-resistant bridges that are secure, long-lasting, and comfortable for users.Keywords: wind effects, aerodynamic forces, computational fluid dynamics, finite element analysis
Procedia PDF Downloads 66
7204 Association of Maternal Age, Ethnicity and BMI with Gestational Diabetes Prevalence in Multi-Racial Singapore
Authors: Nur Atiqah Adam, Mor Jack Ng, Bernard Chern, Kok Hian Tan
Abstract:
Introduction: Gestational diabetes (GDM) is a common pregnancy complication with short- and long-term health consequences for both mother and fetus. Factors such as family history of diabetes mellitus, maternal obesity, maternal age, ethnicity and parity have been reported to influence the risk of GDM. In a multi-racial country like Singapore, it is worthwhile to study the GDM prevalence of different ethnicities. We aim to investigate the influence of ethnicity on the prevalence of GDM in Singapore. This is important as it may help us to improve guidelines on GDM healthcare services according to significant risk factors unique to Singapore. Materials and Methods: Obstetric cohort data of 926 singleton deliveries in KK Women’s and Children’s Hospital (KKH) from 2011 to 2013 were obtained. Only patients aged 18 and above and without complicated pregnancies or chronic illnesses were targeted. Factors such as ethnicity, maternal age, parity and maternal body mass index (BMI) at booking visit were studied. A multivariable logistic regression model, adjusted for confounders, was used to determine which of these factors are significantly associated with an increased risk of GDM. Results: The overall GDM prevalence rate based on the WHO 1999 criteria and at-risk screening (race alone not a risk factor) was 8.86%. GDM rates were higher among women above 35 years old (15.96%), obese women (15.15%) and multiparous women (10.12%). Indians had a higher GDM rate (13.0%) compared to the Chinese (9.57%) and Malays (5.20%). However, using the multivariable logistic regression model, the variables significantly related to GDM rates were maternal age (p < 0.001) and maternal BMI at booking visit (p = 0.006). Conclusion: Maternal age (p < 0.001) and maternal booking BMI (p = 0.006) are the strongest risk factors for GDM. Ethnicity per se does not seem to have a significant influence on the prevalence of GDM in Singapore (p = 0.064). Hence we should tailor guidelines on GDM healthcare services according to maternal age and booking BMI rather than ethnicity. Keywords: ethnicity, gestational diabetes, healthcare, pregnancy
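A minimal sketch of the kind of multivariable logistic regression used to adjust for confounders, run on synthetic stand-in data (not the KKH cohort); variable names and effect sizes are placeholders.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative multivariable logistic regression for GDM risk (synthetic data, not the KKH cohort).
rng = np.random.default_rng(7)
n = 926
df = pd.DataFrame({
    "age": rng.normal(30, 5, n),
    "bmi": rng.normal(24, 4, n),
    "parity": rng.integers(0, 4, n),
    "ethnicity": rng.choice(["Chinese", "Malay", "Indian"], n),
})
logit = -8 + 0.08 * df["age"] + 0.12 * df["bmi"]          # assumed true effects for the toy data
df["gdm"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = smf.logit("gdm ~ age + bmi + parity + C(ethnicity)", data=df).fit(disp=False)
print(model.summary())                    # coefficients and p-values per covariate
print(np.exp(model.params))               # adjusted odds ratios
```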
Procedia PDF Downloads 226
7203 Global Positioning System Match Characteristics as a Predictor of Badminton Players’ Group Classification
Authors: Yahaya Abdullahi, Ben Coetzee, Linda Van Den Berg
Abstract:
The study aimed at establishing the global positioning system (GPS) determined singles match characteristics that act as predictors of successful and less-successful male singles badminton players’ group classification. Twenty-two (22) male singles players (aged: 23.39 ± 3.92 years; body stature: 177.11 ± 3.06 cm; body mass: 83.46 ± 14.59 kg) who represented 10 African countries participated in the study. Players were categorised as successful and less-successful players according to the results of five championships of the 2014/2015 season. GPS units (MinimaxX V4.0), Polar Heart Rate Transmitter Belts and digital video cameras were used to collect match data. GPS-related variables were corrected for match duration, and independent t-tests, a cluster analysis and a binary forward stepwise logistic regression were calculated. A Receiver Operating Characteristic (ROC) curve was used to determine the validity of the group classification model. High-intensity accelerations per second were identified as the only GPS-determined variable that showed a significant difference between groups. Furthermore, only high-intensity accelerations per second (p=0.03) and low-intensity efforts per second (p=0.04) were identified as significant predictors of group classification, with 76.88% of players classified back into their original groups by making use of the GPS-based logistic regression formula. The ROC showed a value of 0.87. The identification of the last-mentioned GPS-related variables for the attainment of badminton performances emphasizes the importance of using badminton drills and conditioning techniques to not only improve players’ physical fitness levels but also their abilities to accelerate at high intensities. Keywords: badminton, global positioning system, match analysis, inertial movement analysis, intensity, effort
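A minimal sketch of validating a two-group classification with a ROC curve and AUC, as described above; the data are synthetic stand-ins, not the GPS-derived match characteristics or the study's logistic regression formula.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

# Illustrative ROC validation of a two-group classification (synthetic stand-in data).
rng = np.random.default_rng(3)
n = 22
X = np.column_stack([
    rng.normal(0.8, 0.2, n),   # stand-in for high-intensity accelerations per second
    rng.normal(0.5, 0.1, n),   # stand-in for low-intensity efforts per second
])
y = (X[:, 0] > np.median(X[:, 0])).astype(int)   # successful vs. less-successful groups

clf = LogisticRegression().fit(X, y)
probs = clf.predict_proba(X)[:, 1]

auc = roc_auc_score(y, probs)                    # area under the ROC curve
reclassified = (clf.predict(X) == y).mean()      # share classified back into original group
fpr, tpr, thresholds = roc_curve(y, probs)       # points tracing the ROC curve
print(f"AUC = {auc:.2f}, re-classification accuracy = {reclassified:.1%}, "
      f"{len(thresholds)} ROC thresholds")
```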
Procedia PDF Downloads 191
7202 The Combination of the Mel Frequency Cepstral Coefficients (MFCC), Perceptual Linear Prediction (PLP), JITTER and SHIMMER Coefficients for the Improvement of Automatic Recognition System for Dysarthric Speech
Authors: Brahim-Fares Zaidi, Malika Boudraa, Sid-Ahmed Selouani
Abstract:
Our work aims to improve our Automatic Recognition System for Dysarthric Speech (ARSDS), based on Hidden Markov Models (HMM) and the Hidden Markov Model Toolkit (HTK), to help people who are sick with pronunciation problems. We applied two techniques of speech parameterization based on Mel Frequency Cepstral Coefficients (MFCCs) and Perceptual Linear Prediction (PLPs) and concatenated them with JITTER and SHIMMER coefficients in order to increase the recognition rate of dysarthric speech. For our tests, we used the NEMOURS database, which represents speakers with dysarthria and normal speakers. Keywords: hidden Markov model toolkit (HTK), hidden Markov models (HMM), Mel-frequency cepstral coefficients (MFCC), perceptual linear prediction (PLP’s)
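A minimal sketch of the feature-concatenation idea: frame-level MFCCs with utterance-level jitter and shimmer values appended to every frame. The audio path and perturbation values are placeholders, and the MFCC extraction shown (librosa) stands in for, but is not, the HTK front end used in the paper.

```python
import numpy as np
import librosa

# Illustrative feature concatenation: frame-level MFCCs plus utterance-level
# jitter/shimmer appended to every frame. Values below are placeholders (not
# NEMOURS data); in practice jitter/shimmer come from a voice-analysis step.
audio_path = "utterance.wav"                     # hypothetical file path
y, sr = librosa.load(audio_path, sr=16000)

mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13, hop_length=160)   # shape (13, n_frames)

jitter, shimmer = 0.012, 0.045                   # placeholder perturbation measures
extra = np.tile([[jitter], [shimmer]], (1, mfcc.shape[1]))           # shape (2, n_frames)

features = np.vstack([mfcc, extra])              # shape (15, n_frames), fed to the HMM front end
print(features.shape)
```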
Procedia PDF Downloads 161
7201 A Review of Research on Pre-training Technology for Natural Language Processing
Authors: Moquan Gong
Abstract:
In recent years, with the rapid development of deep learning, pre-training technology for natural language processing has made great progress. The field of natural language processing has long used word vector methods such as Word2Vec to encode text. These word vector methods can also be regarded as static pre-training techniques. However, this context-free text representation brings very limited improvement to subsequent natural language processing tasks and cannot solve the problem of word polysemy. ELMo proposed a context-sensitive text representation method that can effectively handle polysemy problems. Since then, pre-trained language models such as GPT and BERT have been proposed one after another. Among them, the BERT model has significantly improved performance on many typical downstream tasks, greatly promoting technological development in the field of natural language processing and ushering the field into the era of dynamic pre-training technology. Since then, a large number of pre-trained language models based on BERT and XLNet have continued to emerge, and pre-training has become an indispensable mainstream technology in the field of natural language processing. This article first gives an overview of pre-training technology and its development history, and introduces in detail the classic pre-training techniques in the field of natural language processing, including early static pre-training techniques and classic dynamic pre-training techniques; it then briefly reviews a series of derivative pre-training techniques, including improved models based on BERT and XLNet. On this basis, it analyzes the problems faced by current pre-training research and, finally, looks forward to the future development trends of pre-training technology. Keywords: natural language processing, pre-training, language model, word vectors
Procedia PDF Downloads 57
7200 An Exploratory Study on 'Sub-Region Life Circle' in Chinese Big Cities Based on Human High-Probability Daily Activity: Characteristic and Formation Mechanism as a Case of Wuhan
Authors: Zhuoran Shan, Li Wan, Xianchun Zhang
Abstract:
With an increasing trend of regionalization and polycentricity in contemporary Chinese big cities, the “sub-region life circle” has become an effective method for the rational organization of urban function and spatial structure. Using questionnaires, network big data, route inversion on internet maps, GIS spatial analysis and logistic regression, this article studies the characteristics and formation mechanism of the “sub-region life circle” based on human high-probability daily activity in Chinese big cities. Firstly, it shows that the “sub-region life circle” has become a new general spatial sphere of residents' high-probability daily activity and mobility in China. Unlike the former analysis of the whole metropolitan area or the micro community, the “sub-region life circle” has its own characteristics in geographical sphere, functional elements, spatial morphology and land distribution. Secondly, according to the analysis results with a Binary Logistic Regression Model, the research also shows that seven factors, including land-use mix degree and bus station density, impact the formation of the “sub-region life circle” most, and it then analyzes the critical index value of each factor. Finally, to establish a smarter “sub-region life circle”, this paper indicates that several strategies, including jobs-housing fit, service cohesion and space reconstruction, are the keys to its spatial organization optimization. This study expands the understanding of cities' inner sub-region spatial structure based on human daily activity and contributes to the theory of the “life circle” at the urban meso-scale. Keywords: sub-region life circle, characteristic, formation mechanism, human activity, spatial structure
Procedia PDF Downloads 300
7199 Information Communication Technology (ICT) Using Management in Nursing College under the Praboromarajchanok Institute
Authors: Suphaphon Udomluck, Pannathorn Chachvarat
Abstract:
Information Communication Technology (ICT) using management is essential for effective decision making in an organization. The Concerns-Based Adoption Model (CBAM) was employed as the conceptual framework. The purpose of the study was to assess the situation of Information Communication Technology (ICT) using management in the Colleges of Nursing under the Praboromarajchanok Institute. The samples were selected by multi-stage sampling from the 10 colleges of nursing that participated; they included directors, vice directors, heads of learning groups, teachers, system administrators and those responsible for ICT. The total number of participants was 280. The instruments used were questionnaires comprising four parts: general information, Information Communication Technology (ICT) using management, the Stages of Concern Questionnaire (SoC), and the Levels of Use (LoU) ICT Questionnaire, respectively. Reliability coefficients were tested; alpha coefficients were 0.967 for Information Communication Technology (ICT) using management, 0.884 for SoC and 0.945 for LoU. The data were analyzed by frequency, percentage, mean, standard deviation, Pearson Product Moment Correlation and Multiple Regression. The findings were as follows: the overall score of Information Communication Technology (ICT) using management was at a high level, and its components were administration, hardware, software, and peopleware. The overall score of the Stages of Concern (SoC) about ICT was at a high level, and the overall score of the Levels of Use (LoU) of ICT was at a moderate level. Information Communication Technology (ICT) using management had a positive relationship with the Stages of Concern (SoC) about ICT and the Levels of Use (LoU) of ICT (p < .01). The results of the Multiple Regression revealed that administration, hardware, software and peopleware could predict SoC of ICT (18.5%) and LoU of ICT (20.8%). The factor that significantly influenced SoC was peopleware. The factors that significantly influenced LoU of ICT were administration, hardware and peopleware. Keywords: information communication technology (ICT), management, the concerns-based adoption model (CBAM), stage of concern (SoC), the levels of use (LoU)
Procedia PDF Downloads 318
7198 Image-Based UAV Vertical Distance and Velocity Estimation Algorithm during the Vertical Landing Phase Using Low-Resolution Images
Authors: Seyed-Yaser Nabavi-Chashmi, Davood Asadi, Karim Ahmadi, Eren Demir
Abstract:
The landing phase of a UAV is very critical, as there are many uncertainties in this phase that can easily lead to a hard landing or even a crash. In this paper, the estimation of relative distance and velocity to the ground, one of the most important processes during the landing phase, is studied. Using accurate measurement sensors as an alternative approach can be very expensive for sensors like LIDAR, or subject to a limited operational range for sensors like ultrasonic sensors. Additionally, absolute positioning systems like GPS or an IMU cannot provide the distance to the ground independently. The focus of this paper is to determine whether we can measure the relative distance and velocity of the UAV and the ground in the landing phase using just low-resolution images taken by a monocular camera. The Lucas-Kanade feature detection technique is employed to extract the most suitable feature in a series of images taken during the UAV landing. Two different approaches based on Extended Kalman Filters (EKF) have been proposed, and their performance in estimating the relative distance and velocity is compared. The first approach uses the kinematics of the UAV as the process and the calculated optical flow as the measurement; on the other hand, the second approach uses the feature's projection on the camera plane (pixel position) as the measurement while employing both the kinematics of the UAV and the dynamics of variation of the projected point as the process to estimate both relative distance and relative velocity. To verify the results, a sequence of low-quality images taken by a camera moving on a specifically developed testbed has been used to compare the performance of the proposed algorithms. The case studies show that the quality of the images results in considerable noise, which reduces the performance of the first approach. On the other hand, using the projected feature position is much less sensitive to the noise and estimates the distance and velocity with relatively high accuracy. This approach can also be used to predict the future projected feature position, which can drastically decrease the computational workload, an important criterion for real-time applications. Keywords: altitude estimation, drone, image processing, trajectory planning
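A minimal sketch of the idea behind the second approach: an EKF whose state is relative height and vertical speed and whose measurement is the projected feature position (pixel) under a pinhole camera model. The dynamics, noise levels, and camera parameters are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# Minimal EKF sketch: state [height z, vertical speed vz]; measurement is the pixel
# coordinate of a ground feature at lateral offset X under a pinhole camera.
# All parameters (f_px, X, noise levels) are illustrative assumptions.
f_px, X = 500.0, 1.0           # focal length [pixels], feature lateral offset [m]
dt = 0.05
F = np.array([[1.0, dt], [0.0, 1.0]])          # constant-velocity process model
Q = np.diag([1e-4, 1e-3])                      # process noise covariance
R = np.array([[4.0]])                          # pixel measurement noise variance

x = np.array([10.0, 0.0])                      # initial guess: 10 m up, stationary
P = np.diag([4.0, 1.0])

rng = np.random.default_rng(0)
true_z, true_vz = 12.0, -0.8                   # simulated "truth" for the demo
for k in range(100):
    true_z += true_vz * dt
    u_meas = f_px * X / true_z + rng.normal(0, 2.0)   # noisy pixel observation

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q

    # Update with nonlinear measurement h(x) = f*X/z
    z_pred = f_px * X / x[0]
    H = np.array([[-f_px * X / x[0] ** 2, 0.0]])      # Jacobian of h at the prediction
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + (K @ np.array([u_meas - z_pred])).ravel()
    P = (np.eye(2) - K @ H) @ P

print(f"estimated height {x[0]:.2f} m, vertical speed {x[1]:.2f} m/s "
      f"(truth {true_z:.2f} m, {true_vz} m/s)")
```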
Procedia PDF Downloads 113
7197 An Alternative Richards’ Growth Model Based on Hyperbolic Sine Function
Authors: Samuel Oluwafemi Oyamakin, Angela Unna Chukwu
Abstract:
The Richards growth equation, being a generalized logistic growth equation, was improved upon by introducing an allometric parameter using the hyperbolic sine function. The integral solution to this was called the hyperbolic Richards growth model, having transformed the solution from a deterministic to a stochastic growth model. Its ability in model prediction was compared with the classical Richards growth model, an approach which mimicked the natural variability of height/diameter increment with respect to age and therefore provides more realistic height/diameter predictions, using the coefficient of determination (R2), Mean Absolute Error (MAE) and Mean Square Error (MSE) results. The Kolmogorov-Smirnov test and the Shapiro-Wilk test were also used to test the behavior of the error term for possible violations. The mean function of top height/Dbh over age predicted the observed values of top height/Dbh more closely under the hyperbolic Richards nonlinear growth model than under the classical Richards growth model. Keywords: height, diameter at breast height, DBH, hyperbolic sine function, Pinus caribaea, Richards' growth model
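The abstract does not reproduce the exact functional form; the sketch below assumes one plausible variant in which age enters the Richards curve through an extra inverse hyperbolic sine term weighted by an allometric parameter, fitted by nonlinear least squares to synthetic height-age data. The specific equation is an illustrative assumption, not the paper's model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Classical Richards curve and an assumed "hyperbolic" variant (illustrative form only).
def richards(t, A, b, k, m):
    return A / (1.0 + b * np.exp(-k * t)) ** (1.0 / m)

def hyperbolic_richards(t, A, b, k, m, theta):
    # Age enters through an extra asinh term weighted by an allometric parameter theta.
    return A / (1.0 + b * np.exp(-k * (t + theta * np.arcsinh(t)))) ** (1.0 / m)

# Synthetic height-age data standing in for the Pinus caribaea measurements.
rng = np.random.default_rng(5)
age = np.linspace(1, 25, 25)
height = richards(age, 30, 8, 0.25, 1.2) + rng.normal(0, 0.6, age.size)

for name, func, p0 in [("classical", richards, [35, 5, 0.2, 1.0]),
                       ("hyperbolic", hyperbolic_richards, [35, 5, 0.2, 1.0, 0.1])]:
    popt, _ = curve_fit(func, age, height, p0=p0, maxfev=20000)
    resid = height - func(age, *popt)
    r2 = 1 - resid.var() / height.var()
    print(f"{name}: R2 = {r2:.3f}, MSE = {np.mean(resid**2):.3f}, MAE = {np.mean(np.abs(resid)):.3f}")
```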
Procedia PDF Downloads 393
7196 Study on 3D FE Analysis on Normal and Osteoporosis Mouse Models Based on 3-Point Bending Tests
Authors: Tae-min Byun, Chang-soo Chon, Dong-hyun Seo, Han-sung Kim, Bum-mo Ahn, Hui-suk Yun, Cheolwoong Ko
Abstract:
In this study, a 3-point bending computational analysis of normal and osteoporosis mouse models was performed based on the Micro-CT image information of the femurs. The finite element analysis (FEA) found 1.68 N (normal group) and 1.39 N (osteoporosis group) for the average maximum force, and 4.32 N/mm (normal group) and 3.56 N/mm (osteoporosis group) for the average stiffness. In comparison with the 3-point bending test results, the maximum force and the stiffness differed by a factor of about 9.4 in the normal group and about 11.2 in the osteoporosis group. The difference between the analysis and the test was considerable, and this result indicated that the material properties applied to the computational analysis of this study need improvement. For the next study, the material properties of the mouse femur will be supplemented through additional computational analysis and testing. Keywords: 3-point bending test, mouse, osteoporosis, FEA
Procedia PDF Downloads 351
7195 The Role of Creative Entrepreneurship in the Development of Croatian Economy
Authors: Marko Kolakovic
Abstract:
Creative industries are an important sector of growth and development of knowledge economies. They have a positive impact on employment, economic growth, export and the quality of life in the areas where they are developed. Creative sectors include architecture, design, advertising, publishing, music, film, television and radio, video games, visual and performing arts and heritage. Following the positive trends of development of creative industries on the global and European level, this paper analyzes creative industries in general and specific characteristics of creative entrepreneurship. Special focus in this paper is put on the influence of the information communication technology on the development of new creative business models and protection of the intellectual property rights. One part of the paper is oriented on the analysis of the status of creative industries and creative entrepreneurship in Croatia. The main objective of the paper is by using the statistical analysis of creative industries in Croatia and information gained during the interviews with entrepreneurs, to make conclusions about potentials and development of creative industries in Croatia. Creative industries in Croatia are at the beginning of their development and growth strategy still does not exist at the national level. Statistical analysis pointed out that in 2015 creative enterprises made 9% of all enterprises in Croatia, employed 5,5% of employed people and their share in GDP was 4,01%. Croatian creative entrepreneurs are building competitive advantage using their creative resources and creating specific business models. The main obstacles they meet are lack of business experience and impossibility of focusing on the creative activities only. In their business, they use digital technologies and are focused on export. The conclusion is that creative industries in Croatia have development potential, but it is necessary to take adequate measures to use this potential in a right way.Keywords: creative entrepreneurship, knowledge economy, business models, intellectual property
Procedia PDF Downloads 208
7194 Reservoir Potential, Net Pay Zone and 3D Modeling of Cretaceous Clastic Reservoir in Eastern Sulieman Belt Pakistan
Authors: Hadayat Ullah, Pervez Khalid, Saad Ahmed Mashwani, Zaheer Abbasi, Mubashir Mehmood, Muhammad Jahangir, Ehsan ul Haq
Abstract:
The aim of the study is to explore subsurface structures through data that is acquired from the seismic survey to delineate the characteristics of the reservoir through petrophysical analysis. Ghazij Shale of Eocene age is regional seal rock in this field. In this research work, 3D property models of subsurface were prepared by applying Petrel software to identify various lithologies and reservoir fluids distribution throughout the field. The 3D static modeling shows a better distribution of the discrete and continuous properties in the field. This model helped to understand the reservoir properties and enhance production by selecting the best location for future drilling. A complete workflow is proposed for formation evaluation, electrofacies modeling, and structural interpretation of the subsurface geology. Based on the wireline logs, it is interpreted that the thickness of the Pab Sandstone varies from 250 m to 350 m in the entire study area. The sandstone is massive with high porosity and intercalated layers of shales. Faulted anticlinal structures are present in the study area, which are favorable for the accumulation of hydrocarbon. 3D structural models and various seismic attribute models were prepared to analyze the reservoir character of this clastic reservoir. Based on wireline logs and seismic data, clean sand, shaly sand, and shale are marked as dominant facies in the study area. However, clean sand facies are more favorable to act as a potential net pay zone.Keywords: cretaceous, pab sandstone, petrophysics, electrofacies, hydrocarbon
Procedia PDF Downloads 143
7193 Recommendations for Data Quality Filtering of Opportunistic Species Occurrence Data
Authors: Camille Van Eupen, Dirk Maes, Marc Herremans, Kristijn R. R. Swinnen, Ben Somers, Stijn Luca
Abstract:
In ecology, species distribution models are commonly implemented to study species-environment relationships. These models increasingly rely on opportunistic citizen science data when high-quality species records collected through standardized recording protocols are unavailable. While these opportunistic data are abundant, uncertainty is usually high, e.g., due to observer effects or a lack of metadata. Data quality filtering is often used to reduce these types of uncertainty in an attempt to increase the value of studies relying on opportunistic data. However, filtering should not be performed blindly. In this study, recommendations are built for data quality filtering of opportunistic species occurrence data that are used as input for species distribution models. Using an extensive database of 5.7 million citizen science records from 255 species in Flanders, the impact on model performance was quantified by applying three data quality filters, and these results were linked to species traits. More specifically, presence records were filtered based on record attributes that provide information on the observation process or post-entry data validation, and changes in the area under the receiver operating characteristic (AUC), sensitivity, and specificity were analyzed using the Maxent algorithm with and without filtering. Controlling for sample size enabled us to study the combined impact of data quality filtering, i.e., the simultaneous impact of an increase in data quality and a decrease in sample size. Further, the variation among species in their response to data quality filtering was explored by clustering species based on four traits often related to data quality: commonness, popularity, difficulty, and body size. Findings show that model performance is affected by i) the quality of the filtered data, ii) the proportional reduction in sample size caused by filtering and the remaining absolute sample size, and iii) a species ‘quality profile’, resulting from a species classification based on the four traits related to data quality. The findings resulted in recommendations on when and how to filter volunteer generated and opportunistically collected data. This study confirms that correctly processed citizen science data can make a valuable contribution to ecological research and species conservation.Keywords: citizen science, data quality filtering, species distribution models, trait profiles
Procedia PDF Downloads 203
7192 Estimation Atmospheric parameters for Weather Study and Forecast over Equatorial Regions Using Ground-Based Global Position System
Authors: Asmamaw Yehun, Tsegaye Kassa, Addisu Hunegnaw, Martin Vermeer
Abstract:
There are various models to estimate neutral atmospheric parameter values, such as in-situ measurements and reanalysis datasets from numerical models. Accurately estimated values of the atmospheric parameters are useful for weather forecasting, climate modeling and monitoring of climate change. Recently, Global Navigation Satellite System (GNSS) measurements have been applied for atmospheric sounding due to their robust data quality and wide horizontal and vertical coverage. Global Positioning System (GPS) solutions that include tropospheric parameters constitute a reliable set of data to be assimilated into climate models. The objective of this paper is to estimate the neutral atmospheric parameters, such as Wet Zenith Delay (WZD), Precipitable Water Vapour (PWV) and Total Zenith Delay (TZD), using observational data from 2012 to 2015 at six selected GPS stations in the equatorial regions, more precisely, the Ethiopian GPS stations. Based on the historical GPS-derived estimates of PWV, we forecasted the PWV from 2015 to 2030. During data processing and analysis, we applied the GAMIT-GLOBK software packages to estimate the atmospheric parameters. As a result, we found that the minimum annual averaged PWV is 9.72 mm for the IISC station and the maximum is 50.37 mm for the BJCO station. The minimum annual averaged WZD is 6 cm for IISC and the maximum is 31 cm for the BDMT station. In the long series of observations (from 2012 to 2015), we also found trends and cyclic patterns of WZD, PWV and TZD for all stations. Keywords: atmosphere, GNSS, neutral atmosphere, precipitable water vapour
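A minimal sketch of the standard conversion from wet zenith delay to precipitable water vapour via the dimensionless factor Pi, which depends on the weighted mean temperature Tm; the constants are the commonly quoted Bevis-style values, and the surface temperature and example ZWD are assumptions, not the authors' GAMIT-GLOBK processing chain.

```python
import numpy as np

# ZWD -> PWV conversion via the dimensionless factor Pi (Bevis-style constants).
k2_prime = 22.1       # K / hPa
k3 = 3.739e5          # K^2 / hPa
rho_w = 1000.0        # kg / m^3, density of liquid water
Rv = 461.5            # J / (kg K), specific gas constant of water vapour

Ts = 293.0                              # surface temperature [K] (assumed value)
Tm = 70.2 + 0.72 * Ts                   # Bevis approximation of the weighted mean temperature

# With the refractivity constants expressed per hPa and delays in the same length
# unit on both sides, the conversion factor works out to roughly 0.15-0.16:
Pi = 1.0e8 / (rho_w * Rv * (k3 / Tm + k2_prime))

zwd_mm = 150.0                          # wet zenith delay [mm] (e.g., from the GPS solution)
pwv_mm = Pi * zwd_mm
print(f"Pi = {Pi:.3f}  ->  PWV = {pwv_mm:.1f} mm for ZWD = {zwd_mm:.0f} mm")
```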
Procedia PDF Downloads 61
7191 The Feasibility and Usability of Antennas Silence Zone for Localization and Path Finding
Authors: S. Malebary, W. Xu
Abstract:
Antennas are important components that enable transmitting and receiving signals in mid-air (wirelessly). The radiation pattern of omni-directional (i.e., dipole) antennas reflects the variation of power radiated by an antenna as a function of direction when transmitting. As the performance of the antenna is the same in transmitting and receiving, it also reflects the sensitivity of the antenna in different directions when receiving. The main observation when dealing with omni-directional antennas, regardless of the application, is that they radiate power equally in all directions with reference to Equivalent Isotropically Radiated Power (EIRP). Disseminating radio frequency signals in an omni-directional manner forms a doughnut-shaped field with a cone in the middle of the elevation plane (when the antenna is mounted vertically). In this paper, we investigate the existence of this physical phenomenon, namely the silence cone zone (the zone where the radiated power is nulled). First, we overview antenna types and the properties that have the major impact on the shape of the electromagnetic field. Then we model various off-the-shelf dipoles in Matlab based on the antennas' features (dimensions, gain, operating frequency, etc.) and compare the resulting radiation patterns. After that, we validate the existence of the null zone in omni-directional antennas by conducting experiments and generating waveforms (using USRP1 and USRP2) at various frequencies using different types of antennas and gains, indoors and outdoors. We capture the generated waveforms around the antennas' null zone in the reactive, near, and far field with a spectrum analyzer mounted on a drone, using various off-the-shelf antennas. We analyze the captured signals in RF-Explorer and plot the impact on received power and signal amplitude inside and around the null zone. Finally, the evaluation and measurements confirm the existence of null zones in omni-directional antennas, and we plan to extend this work in the near future to investigate the usability of the null zone for various applications such as localization and path finding. Keywords: antennas, amplitude, field regions, frequency, FSPL, omni-directional, radiation pattern, RSSI, silence zone cone
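For reference, the textbook E-plane pattern of an ideal half-wave dipole already exhibits the nulls along the antenna axis that form the silence cone; below is a minimal sketch of that pattern, not the authors' MATLAB models or drone measurements.

```python
import numpy as np

# Normalized E-plane field pattern of an ideal half-wave dipole:
# F(theta) = |cos((pi/2) cos(theta)) / sin(theta)|.
# The nulls along the antenna axis (theta = 0 and 180 deg) form the "silence cone"
# for a vertically mounted dipole; this is the textbook pattern, not measured data.
def dipole_pattern(theta_rad):
    return np.abs(np.cos(0.5 * np.pi * np.cos(theta_rad)) / np.sin(theta_rad))

for deg in (0.1, 5, 15, 45, 90):
    val = dipole_pattern(np.deg2rad(deg))
    print(f"theta = {deg:5.1f} deg  ->  relative field = {val:.4f}")
# The field collapses toward zero as theta approaches the axis, i.e. the region a
# receiver directly above the antenna (such as a drone-mounted spectrum analyzer)
# would experience as a null zone.
```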
Procedia PDF Downloads 303
7190 Mechanistic Understanding of the Difference in two Strains Cholerae Causing Pathogens and Predicting Therapeutic Strategies for Cholera Patients Affected with new Strain Vibrio Cholerae El.tor. Using Constrain-based Modelling
Authors: Faiz Khan Mohammad, Saumya Ray Chaudhari, Raghunathan Rengaswamy, Swagatika Sahoo
Abstract:
Cholera, caused by the pathogenic gut bacterium Vibrio cholerae (VC), is a major health problem in developing countries. Different strains of VC exhibit variable responses subject to different extracellular media (Nag et al., Infect Immun, 2018). In this study, we present a new approach to model the variable VC responses in mono- and co-cultures, subject to a continuously changing growth medium, which is otherwise difficult via a simple FBA model. Nine VC strain models and seven E. coli (EC) models were assembled and considered. A continuously changing medium is modelled using a new iterative-based controlled medium technique (ITC). The medium is appropriately prefixed with the VC model secretome. As the flux through the biomass reaction increases, the bacterium secretes certain by-products. These products are added to the medium, either shifting its nutrient potential or blocking certain metabolic components of the model, effectively forming a controlled feedback loop. Different VC models were set up as a monoculture of VC in a glucose-enriched medium, and in co-culture with VC strains and EC. Constrained to the glucose-enriched medium, (i) the VC_Classical model resulted in a higher flux through the acidic secretome, suggesting a pH change of the medium and leading to a lowering of its biomass, which is in consonance with literature reports; (ii) when compared for the neutral secretome, the flux through acetoin exchange was higher in VC_El tor than in the classical models, suggesting El tor requires an acidic partner to lower its biomass; (iii) seven of the nine VC models predicted that 3-methyl-2-oxovaleric acid, myristic acid, folic acid, and acetate significantly affect the corresponding biomass reactions; (iv) V. parahaemolyticus and V. vulnificus were found to be phenotypically similar to the VC Classical strain across the nine VC strains. The work addresses the advantage of the ITC over regular flux balance analysis for modelling a varying growth medium. Future expansion to co-cultures potentiates the identification of novel interacting partners as effective cholera therapeutics. Keywords: cholera, vibrio cholera El. tor, vibrio cholera classical, acetate
Procedia PDF Downloads 162
7189 Biosorption of Fluoride from Aqueous Solutions by Tinospora Cordifolia Leaves
Authors: Srinivasulu Dasaiah, Kalyan Yakkala, Gangadhar Battala, Pavan Kumar Pindi, Ramakrishna Naidu Gurijala
Abstract:
Tinospora cordifolia leaf biomass was used for the removal of fluoride from aqueous solutions. A batch biosorption technique was applied, and the effects of pH, contact time, biosorbent dose and initial fluoride concentration were studied. Scanning Electron Microscopy (SEM) and Fourier Transform Infrared (FTIR) techniques were used to study the surface characteristics and the presence of chemical functional groups on the biosorbent. Biosorption isotherm models and kinetic models were applied to understand the sorption mechanism. Results revealed that pH, contact time, biosorbent dose and initial fluoride concentration had a significant effect on fluoride removal from aqueous solutions. The biosorbent developed from Tinospora cordifolia leaf biomass was found to be low-cost and could be used for the effective removal of fluoride in synthetic as well as real water samples. Keywords: biosorption, contact time, fluoride, isotherms
Procedia PDF Downloads 177
7188 Predictive Models of Ruin Probability in Retirement Withdrawal Strategies
Authors: Yuanjin Liu
Abstract:
Retirement withdrawal strategies are very important for minimizing the probability of ruin in retirement. The ruin probability is modeled as a function of initial withdrawal age, gender, asset allocation, inflation rate, and initial withdrawal rate. The ruin probability is obtained by simulation based on the 2019 period life table for Social Security, the IRS Required Minimum Distribution (RMD) worksheets, US historical bond and equity returns, and inflation rates. Several popular machine learning algorithms are built: the generalized additive model, random forest, support vector machine, extreme gradient boosting, and artificial neural network. Model validation and selection are based on test errors using hyperparameter tuning and a train-test split. The optimal model is recommended for retirees to monitor the ruin probability, and the optimal withdrawal strategy can be obtained based on the optimal predictive model. Keywords: ruin probability, retirement withdrawal strategies, predictive models, optimal model
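A minimal Monte Carlo sketch of estimating ruin probability for a fixed real withdrawal rate; the i.i.d. lognormal returns, fixed horizon, and allocation-blended return parameters are simplifying assumptions rather than the paper's life-table and historical-return setup.

```python
import numpy as np

# Monte Carlo sketch of retirement ruin probability under a fixed real withdrawal rate.
rng = np.random.default_rng(2024)

def ruin_probability(withdrawal_rate, equity_share, years=30, n_sims=20000):
    mu = equity_share * 0.07 + (1 - equity_share) * 0.03      # assumed real return parameters
    sigma = equity_share * 0.17 + (1 - equity_share) * 0.06
    wealth = np.ones(n_sims)                                   # wealth normalized to 1 at retirement
    ruined = np.zeros(n_sims, dtype=bool)
    for _ in range(years):
        wealth -= withdrawal_rate                # withdraw at the start of the year (real terms)
        ruined |= wealth <= 0
        wealth = np.where(ruined, 0.0, wealth)
        returns = rng.lognormal(mu - 0.5 * sigma**2, sigma, n_sims)
        wealth *= returns
    return ruined.mean()

for rate in (0.03, 0.04, 0.05):
    print(f"withdrawal {rate:.0%}, 60/40 allocation: ruin probability ~ {ruin_probability(rate, 0.6):.1%}")
```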
Procedia PDF Downloads 74
7187 Low-Cost Parking Lot Mapping and Localization for Home Zone Parking Pilot
Authors: Hongbo Zhang, Xinlu Tang, Jiangwei Li, Chi Yan
Abstract:
Home zone parking pilot (HPP) is a fast-growing segment in low-speed autonomous driving applications. It requires the car to automatically cruise around a parking lot and park itself in a range of up to 100 meters inside a recurrent home/office parking lot, which requires a precise parking lot mapping and localization solution. Although Lidar is ideal for SLAM, car OEMs favor a low-cost fish-eye camera based visual SLAM approach. Recent approaches have employed segmentation models to extract semantic features and improve mapping accuracy, but these AI models are memory-unfriendly and computationally expensive, making deployment on embedded ADAS systems difficult. To address this issue, we proposed a new method that utilizes object detection models to extract robust and accurate parking lot features. The proposed method could reduce computational costs while maintaining high accuracy. Once combined with the vehicle's wheel-pulse information, the system could construct maps and locate the vehicle in real time. This article discusses in detail (1) the fish-eye based Around View Monitoring (AVM) with transparent chassis images as the inputs, (2) an Object Detection (OD) based feature point extraction algorithm to generate a point cloud, (3) a low-computational parking lot mapping algorithm and (4) the real-time localization algorithm. Finally, we demonstrate the experimental results with an embedded ADAS system installed on a real car in an underground parking lot. Keywords: ADAS, home zone parking pilot, object detection, visual SLAM
Procedia PDF Downloads 67
7186 Mathematical Modeling of Bi-Substrate Enzymatic Reactions in the Presence of Different Types of Inhibitors
Authors: Rafayel Azizyan, Valeri Arakelyan, Aram Gevorgyan, Varduhi Balayan, Emil Gevorgyan
Abstract:
Currently, mathematical and computer modeling are widely used in biological studies to predict or assess the behavior of complex systems such as biological ones. This study deals with mathematical and computer modeling of bi-substrate enzymatic reactions, which play an important role in different biochemical pathways. The main objective of this study is to present the results of an in silico investigation of bi-substrate enzymatic reactions in the presence of uncompetitive inhibitors, as well as to describe the inhibition effects in detail. Four models of uncompetitive inhibition were designed using different software packages. In particular, uncompetitive inhibition of the first [ES1] and the second ([ES1S2]; [FS2]) enzyme-substrate complexes has been studied. The simulation, using the same kinetic parameters for all models, allowed the behavior of the reactions to be investigated and revealed some interesting aspects concerning the influence of the different cases of uncompetitive inhibition. It was also shown that uncompetitive inhibitors exhibit specific selectivity depending on the mechanism of the bi-substrate enzymatic reaction. Keywords: mathematical modeling, bi-substrate enzymatic reactions, reversible inhibition
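A minimal sketch of simulating one such case, an ordered bi-substrate mechanism with an uncompetitive inhibitor binding the ES1 complex, written as mass-action ODEs; the mechanism, rate constants, and concentrations are illustrative assumptions, not the paper's parameter set.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Mass-action ODE sketch of an ordered bi-substrate reaction with an uncompetitive
# inhibitor binding the ES1 complex (one of the four modelled cases).
k1, km1 = 1.0, 0.1      # E + S1 <-> ES1
k2, km2 = 1.0, 0.1      # ES1 + S2 <-> ES1S2
kcat = 0.5              # ES1S2 -> E + P
ki, kmi = 2.0, 0.05     # ES1 + I <-> ES1I (uncompetitive inhibition of ES1)

def rhs(t, y):
    E, S1, S2, ES1, ES1S2, ES1I, I, P = y
    v1 = k1 * E * S1 - km1 * ES1
    v2 = k2 * ES1 * S2 - km2 * ES1S2
    v3 = kcat * ES1S2
    vi = ki * ES1 * I - kmi * ES1I
    return [-v1 + v3, -v1, -v2, v1 - v2 - vi, v2 - v3, vi, -vi, v3]

y0 = [1.0, 10.0, 10.0, 0.0, 0.0, 0.0, 2.0, 0.0]   # [E, S1, S2, ES1, ES1S2, ES1I, I, P]
sol = solve_ivp(rhs, (0.0, 50.0), y0, dense_output=True, rtol=1e-8)

t = np.linspace(0, 50, 6)
print("product P(t):", np.round(sol.sol(t)[-1], 3))   # slower accumulation as [I] increases
```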
Procedia PDF Downloads 347
7185 Comparing Business Excellence Models Using Quantitative Methods: A First Step
Authors: Mohammed Alanazi, Dimitrios Tsagdis
Abstract:
Established Business Excellence Models (BEMs), like the Malcolm Baldrige National Quality Award (MBNQA) model and the European Foundation for Quality Management (EFQM) model, have been adopted by firms all over the world. They exist alongside more recent country-specific BEMs, e.g., the Australian, Canadian, Chinese, New Zealand, Singaporean, and Taiwanese quality awards, which, although not as widespread as the MBNQA and EFQM, nonetheless have strong national followings. Regardless of any differences in their following or prestige, the emergence and development of all BEMs have been shaped both by their local context (e.g., underlying socio-economic dynamics) and by global best practices. Besides such similarities, which render them objects (i.e., models) of the same class (i.e., BEMs), BEMs exhibit non-trivial differences in their criteria, relations, and emphases. Given the evolution of BEMs (e.g., the MBNQA underwent seven revisions since its inception in 1987, while the EFQM underwent five since 1993), it is unsurprising that comparative studies of their validity are few and far between. This poses challenges for practitioners and policy makers alike, as it is not always clear which BEM is to be preferred or fits a particular context better, especially in contexts that differ substantially from the original context of BEM development. This paper aims to fill this gap by presenting a research design and measurement model for comparing BEMs using quantitative methods (e.g., structural equations). Three BEMs are focused upon for illustration purposes: the MBNQA, the EFQM, and the King Abdul Aziz Quality Award (KAQA) model. They have been selected so as to reflect the two established and widely spread traditions as well as a more recent context-specific arrival promising a better fit. Keywords: Baldrige, business excellence, European Foundation for Quality Management, Structural Equation Model, total quality management
Procedia PDF Downloads 238
7184 Investigation of the Effect of Lecturers' Attributes on Students' Interest in Learning Statistic Ghanaian Tertiary Institutions
Authors: Samuel Asiedu-Addo, Jonathan Annan, Yarhands Dissou Arthur
Abstract:
The study aims to explore the relational effect of lecturers' personal attributes on students' interest in statistics. In this study, lecturers' personal attributes such as dynamism, communication strategies, rapport in the classroom, and applied knowledge during lectures were examined. An exploratory research design was used to establish the effect of lecturers' personal attributes on students' interest. Data were analyzed by means of confirmatory factor analysis and structural equation modeling (SEM) using the SmartPLS 3 program. The study recruited 376 students from the Faculty of Technical and Vocational Education of the University of Education Winneba Kumasi campus, Ghana Technology University College, and Kwame Nkrumah University of Science and Technology. The results revealed that the personal attributes of an effective lecturer, namely dynamism, rapport, communication and applied knowledge, contribute 52.9% in explaining students' interest in statistics. Our regression analysis and structural equation modeling confirm that lecturers' personal attributes contribute effectively, predicting 52.9% and 53.7% of students' interest, respectively. The paper concludes that the total effect of a lecturer's attributes on students' interest is moderate and significant. While a lecturer's communication and dynamism were found to contribute positively to students' interest, they were insignificant in predicting it. We further showed that a lecturer's personal attributes such as applied knowledge and rapport have a positive and significant effect on tertiary students' interest in statistics, whilst lecturers' communication and dynamism, though positively related, do not significantly affect students' interest in statistics. Keywords: student interest, effective teacher, personal attributes, regression and SEM
Procedia PDF Downloads 359
7183 Why and When to Teach Definitions: Necessary and Unnecessary Discontinuities Resulting from the Definition of Mathematical Concepts
Authors: Josephine Shamash, Stuart Smith
Abstract:
We examine reasons for introducing definitions in teaching mathematics in a number of different cases. We try to determine if, where, and when to provide a definition, and which definition to choose. We characterize different types of definitions and the different purposes we may have for formulating them, and detail examples of each type. Giving a definition at a certain stage can sometimes be detrimental to the development of the concept image. In such a case, it is advisable to delay the precise definition to a later stage. We describe two models, the 'successive approximation model' and the 'model of the extending definition', that fit such situations. Detailed examples that fit the different models are given based on material taken from a number of textbooks, along with an analysis of the way the concept is introduced and where and how its definition is given. Our conclusion, based on this analysis, is that some of the definitions given may cause discontinuities in the learning sequence and constitute obstacles and unnecessary cognitive conflicts in the formation of the concept definition. However, in other cases, the discontinuity in passing from definition to definition actually serves a didactic purpose, is unavoidable for the mathematical evolution of the concept image, and is essential for students to deepen their understanding. Keywords: concept image, mathematical definitions, mathematics education, mathematics teaching
Procedia PDF Downloads 129
7182 Risk of Androgen Deprivation Therapy-Induced Metabolic Syndrome-Related Complications for Prostate Cancer in Taiwan
Authors: Olivia Rachel Hwang, Yu-Hsuan Joni Shao
Abstract:
Androgen Deprivation Therapy (ADT) has been a primary treatment for patients with advanced prostate cancer. However, it is associated with numerous adverse effects related to Metabolic Syndrome (MetS), including hypertension, diabetes, hyperlipidaemia, heart diseases and ischemic strokes. Complications associated with ADT for prostate cancer in Taiwan, however, are not well documented. The purpose of this study is to utilize data from the NHIRD (National Health Insurance Research Database) to examine the trajectory of changes in MetS-related complications in men receiving ADT. The risks of developing complications after treatment were analyzed with a multivariate Cox regression model. Covariates included in the model were the complications before the diagnosis of prostate cancer, the age, and the year at cancer diagnosis. A total of 17268 patients from 1997-2013 were included in this study. The exclusion criteria were patients with any other type of cancer or with pre-existing MetS-related complications. Changes in MetS-related complications were observed between two treatment groups: 1) ADT (n=9042), and 2) non-ADT (n=8226). The ADT group appeared to have an increased risk of hypertension (hazard ratio 1.08, 95% confidence interval 1.03-1.13, P = 0.001) and hyperlipidemia (hazard ratio 1.09, 95% confidence interval 1.01-1.17, P = 0.02) when compared with the non-ADT group in the multivariate Cox regression analyses. For diabetes, heart diseases, and ischemic strokes, the ADT group appeared to have increased but not significant hazard ratios. In conclusion, ADT was associated with an increased risk of hypertension and hyperlipidemia in prostate cancer patients in Taiwan. The risks of hypertension and hyperlipidemia should be considered while deciding on ADT, especially for those with a known history of hypertension and hyperlipidemia. Keywords: androgen deprivation therapy, ADT, complications, metabolic syndrome, MetS, prostate cancer
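A minimal sketch of a multivariate Cox proportional hazards model of the kind described, using the lifelines package on synthetic stand-in data (not the NHIRD cohort); column names and effect sizes are placeholders.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Illustrative multivariate Cox model for time to a MetS-related complication.
rng = np.random.default_rng(11)
n = 2000
df = pd.DataFrame({
    "adt": rng.integers(0, 2, n),                       # 1 = received ADT
    "age_at_dx": rng.normal(70, 8, n),
    "prior_htn": rng.integers(0, 2, n),
    "year_of_dx": rng.integers(1997, 2014, n),
})
# Simulate exponential event times whose hazard depends on ADT and age (assumed effects).
hazard = 0.02 * np.exp(0.08 * df["adt"] + 0.03 * (df["age_at_dx"] - 70))
time_to_event = rng.exponential(1.0 / hazard)
censor_time = rng.uniform(1, 15, n)
df["duration"] = np.minimum(time_to_event, censor_time)
df["event"] = (time_to_event <= censor_time).astype(int)

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="event")
cph.print_summary()          # hazard ratios (exp(coef)) with 95% confidence intervals
```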
Procedia PDF Downloads 288
7181 Revealing the Risks of Obstructive Sleep Apnea
Authors: Oyuntsetseg Sandag, Lkhagvadorj Khosbayar, Naidansuren Tsendeekhuu, Densenbal Dansran, Bandi Solongo
Abstract:
Introduction: Obstructive sleep apnea (OSA) is a common disorder affecting at least 2% to 4% of the adult population. It is estimated that nearly 80% of men and 93% of women with moderate to severe sleep apnea are undiagnosed. A number of screening questionnaires and clinical screening models have been developed to help identify patients with OSA, and these are needed in clinical practice. Purpose of study: To determine the association between obstructive sleep apnea risk severity and risk factors. Material and Methods: A cross-sectional study included 114 patients presenting from the Central State 3rd Hospital and the Central State 1st Hospital. Patients who had obstructive sleep apnea (OSA) were selected for this study. The standard Stop-Bang questionnaire was obtained from all patients. According to the patients' responses to the Stop-Bang questionnaire, they were divided into low risk, intermediate risk, and high risk. Descriptive statistics were presented as mean ± standard deviation (SD). The questionnaire was evaluated on the likelihood ratio for a positive result and the likelihood ratio for a negative result using regression. Statistical analyses were performed utilizing SPSS 16. Results: 114 patients were included (mean age 48 ± 16 years, 57 male), divided into low risk 54 (47.4%), intermediate risk 33 (28.9%), and high risk 27 (23.7%). Across increasing risk groups, mean age (38 ± 13 vs. 54 ± 14 vs. 59 ± 10, p < 0.05), blood pressure (115 ± 18 vs. 133 ± 19 vs. 142 ± 21, p < 0.05), BMI (24 IQR 22; 26 vs. 24 IQR 22; 29 vs. 28 IQR 25; 34, p < 0.001) and neck circumference (35 ± 3.4 vs. 38 ± 4.7 vs. 41 ± 4.4, p < 0.05) increased significantly. Results from multiple logistic regression showed that age is a significant independent factor for OSA (odds ratio 1.07, 95% CI 1.02-1.23, p < 0.01). The predictive value of age for OSA was significant (AUC = 0.833, 95% CI 0.758-0.909, p < 0.001). Our study shows that the risk of OSA begins at 47 years of age (sensitivity 78.3%, specificity 74.1%). Conclusions: Most patients' responses indicated intermediate or high risk. Also, age, blood pressure, neck circumference and BMI increased as the risk of OSA increased. In particular, age is an independent factor with the highest significance for OSA: for each one-year increase in a patient's age, the likelihood of OSA increases by a factor of 1.1. Keywords: obstructive sleep apnea, Stop-Bang, BMI (Body Mass Index), blood pressure
Procedia PDF Downloads 310