Search results for: inclusive business models
8483 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications
Authors: H. Hruschka
Abstract:
This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely latent Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as a joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables; variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine, which then generates second-layer hidden variables. Finally, in the third layer, hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves: one half is used for estimation, the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. Performance of the topic models is disappointing, as the holdout log likelihood of the correlated topic model (which is better than latent Dirichlet allocation) is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net, whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret hidden variables discovered by binary factor analysis, the restricted Boltzmann machine and the deep belief net. Hidden variables, characterized by the product categories to which they are related, differ strongly between these three models. To derive managerial implications, we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications, as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better-performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research with appropriate extensions. Including predictors, especially marketing variables such as price, seems an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of purchases of product categories.
Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models
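The holdout-evaluation setup described here can be prototyped with off-the-shelf tools. Below is a minimal sketch, assuming synthetic binary basket data in place of the grocery data set; since the exact holdout log likelihood of an RBM is intractable, sklearn's pseudo-likelihood score is used as a stand-in for the paper's evaluation criterion.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# synthetic stand-in for the basket data: 9,835 baskets x 169 categories
baskets = (rng.random((9835, 169)) < 0.05).astype(float)
train, holdout = train_test_split(baskets, test_size=0.5, random_state=0)

# one visible layer (purchases) and one hidden layer; no within-layer links
rbm = BernoulliRBM(n_components=30, learning_rate=0.05,
                   n_iter=20, random_state=0)
rbm.fit(train)

# exact holdout log likelihood is intractable for an RBM; sklearn's
# score_samples returns a pseudo-likelihood proxy instead
print("holdout pseudo-log-likelihood:", rbm.score_samples(holdout).sum())
```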
Procedia PDF Downloads 199

8482 Business Intelligent to a Decision Support Tool for Green Entrepreneurship: Meso and Macro Regions
Authors: Anishur Rahman, Maria Areias, Diogo Simões, Ana Figueiredo, Filipa Figueiredo, João Nunes
Abstract:
The circular economy (CE) has gained increased awareness among academics, businesses, and decision-makers, as it stimulates resource circularity in production and consumption systems. A large body of work has explored the principles of CE, but scant attention has been paid to analysing how CE is evaluated, consented to, and enforced using economic metabolism data and a business intelligence framework. Economic metabolism involves the ongoing exchange of materials and energy within and across socio-economic systems and requires the assessment of vast amounts of data to provide quantitative analysis related to effective resource management. The present work focuses on regional flows in a pilot region of Portugal. By addressing this gap, this study aims to promote eco-innovation and sustainability in the Intermunicipal Communities of Região de Coimbra, Viseu Dão Lafões and Beiras e Serra da Estrela, using these data to find precise synergies in terms of material flows and give companies a competitive advantage in the form of valuable waste destinations, access to new resources and new markets, cost reduction and risk-sharing benefits. In our work, emphasis is placed on applying artificial intelligence (AI) and, more specifically, on implementing state-of-the-art deep learning algorithms, contributing to the construction of a business intelligence approach. With the emergence of new approaches generally grouped under the headings of AI and machine learning (ML), the methods for statistical analysis of complex and uncertain production systems are facing significant changes. Therefore, various definitions of AI and its differences from traditional statistics are presented; furthermore, ML is introduced to identify its place in data science, and the production problems that can be addressed with AI and ML, in topics such as big data analytics, are identified. A lifecycle-based approach is then taken to analyse the use of different methods in each phase, to identify the most useful technologies and unifying attributes of AI in manufacturing. Most macroeconomic metabolism models are directed mainly at the contexts of large metropolises, neglecting rural territories; within this project, a dynamic decision support model coupled with artificial intelligence tools and information platforms will therefore be developed, focused on the reality of these transition zones between the rural and the urban. Thus, a real decision support tool is under development, which will surpass the scientific developments carried out to date and will allow overcoming limitations related to the availability and reliability of data.
Keywords: circular economy, artificial intelligence, economic metabolisms, machine learning
Procedia PDF Downloads 73

8481 Elastoplastic and Ductile Damage Model Calibration of Steels for Bolt-Sphere Joints Used in China's Space Structure Construction
Authors: Huijuan Liu, Fukun Li, Hao Yuan
Abstract:
The bolted spherical node is a common type of joint in space steel structures. The bolt-sphere joint portion almost always controls the bearing capacity of the bolted spherical node. The investigation of the bearing performance and progressive failure in service often requires high-fidelity numerical models. This paper focuses on the constitutive models of bolt steel and sphere steel used in China's space structure construction. The elastoplastic model is determined from a standard tensile test and a calibrated Voce saturated hardening rule. Ductile damage is found to be dominant based on fractography analysis. The Rice-Tracey ductile fracture rule is then selected, and the model parameters are calibrated based on tensile tests of notched specimens. These calibrated material models can benefit research or engineering work in similar fields.
Keywords: bolt-sphere joint, steel, constitutive model, ductile damage, model calibration
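Calibrating the Voce rule amounts to a small nonlinear least-squares fit of the flow curve extracted from the tensile test. The sketch below assumes one common form of the Voce law, sigma(eps_p) = sigma_sat - (sigma_sat - sigma_0) * exp(-b * eps_p), and hypothetical stress-strain points; the paper's actual data and parameterization may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def voce(eps_p, sigma0, sigma_sat, b):
    """Voce saturated hardening: flow stress vs. equivalent plastic strain."""
    return sigma_sat - (sigma_sat - sigma0) * np.exp(-b * eps_p)

# hypothetical tensile-test points (MPa); real values come from the test
eps_p = np.array([0.0, 0.01, 0.02, 0.05, 0.10, 0.15])
stress = np.array([355.0, 420.0, 455.0, 505.0, 540.0, 550.0])

popt, _ = curve_fit(voce, eps_p, stress, p0=(350.0, 560.0, 30.0))
print("sigma0=%.1f MPa, sigma_sat=%.1f MPa, b=%.1f" % tuple(popt))
```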
Procedia PDF Downloads 137

8480 Adding Business Value in Enterprise Applications through Quality Matrices Using Agile
Authors: Afshan Saad, Muhammad Saad, Shah Muhammad Emaduddin
Abstract:
Nowadays the business environment is so fast-paced that continuous improvement has become a major factor in the survival of an enterprise. This holds for structural engineering, and even more so in the fast-paced world of information technology and software engineering. Agile methodologies, such as Scrum, include a dedicated step in the process that targets the improvement of the development process and software products. Pivotal to process improvement is gaining information that permits assessing the state of the process and its products. From this status information, one can plan actions for improvement and also evaluate the success of those actions. This study builds a model that measures the software quality of the development process. Software quality depends on the functional and structural quality of the software products; in addition, the quality of the development process is also vital to improving software quality. Functional quality covers adherence to user requirements, while structural quality addresses the structure of the software product's source code with reference to its maintainability. Process quality relates to the consistency and predictability of the development process. The software quality model is applied in a business setting by gathering the data for the software metrics in the model. To evaluate the software quality model, we analyse the data and present it to the people involved in the agile software development process. The results from the application and the user feedback suggest that the model enables a reasonable assessment of software quality and that it can be used to support the continuous improvement of the development process and software products.
Keywords: agile SDLC tools, agile software development, business value, enterprise applications, IBM, IBM Rational Team Concert, RTC, software quality, software metrics
Procedia PDF Downloads 174

8479 Modeling Core Flooding Experiments for CO₂ Geological Storage Applications
Authors: Avinoam Rabinovich
Abstract:
CO₂ geological storage is a proven technology for reducing anthropogenic carbon emissions, which is paramount for achieving the ambitious net-zero emissions goal. Core flooding experiments are an important step in any CO₂ storage project, allowing us to gain information on the flow of CO₂ and brine in the porous rock extracted from the reservoir. This information is important for understanding basic mechanisms related to CO₂ geological storage as well as for reservoir modeling, which is an integral part of a field project. In this work, a different method for constructing accurate models of CO₂-brine core flooding will be presented. Results for synthetic cases and real experiments will be shown and compared with numerical models to exhibit their predictive capabilities. Furthermore, the various mechanisms which impact the CO₂ distribution and trapping in the rock samples will be discussed, and examples from models and experiments will be provided. The new method entails solving an inverse problem to obtain a three-dimensional permeability distribution which, along with the relative permeability and capillary pressure functions, constitutes a model of the flow experiments. The model is more accurate when data from a number of experiments are combined to solve the inverse problem. This model can then be used to test various other injection flow rates and fluid fractions which have not been tested in experiments. The models can also be used to bridge the gap between small-scale capillary heterogeneity effects (sub-core and core scale) and large-scale (reservoir-scale) effects, known as the upscaling problem.
Keywords: CO₂ geological storage, residual trapping, capillary heterogeneity, core flooding, CO₂-brine flow
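To make the inverse-problem idea concrete, here is a deliberately simplified sketch: single-phase Darcy flow through a few serial core segments, with per-segment permeabilities recovered from noisy pressure-drop data at several injection rates. This is a toy stand-in assuming a 1D, single-phase forward model, not the paper's three-dimensional two-phase one.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy forward model: 1D core of serial segments, single-phase Darcy flow.
# Pressure drop across segment i at rate q: dp_i = q * mu * L / (k_i * A).
mu, A, L = 1e-3, 1e-3, 0.02                 # Pa.s, m^2, segment length (m)
MD = 9.869e-16                              # 1 millidarcy in m^2
k_true = np.array([50.0, 200.0, 80.0, 120.0]) * MD
rates = np.array([1.0, 2.0, 5.0]) * 1e-8    # injection rates (m^3/s)

def segment_dps(log_k, q):
    return q * mu * L / (np.exp(log_k) * A)

rng = np.random.default_rng(1)
observed = np.concatenate([segment_dps(np.log(k_true), q) for q in rates])
observed *= 1 + 0.02 * rng.standard_normal(observed.size)  # 2% noise

def residuals(log_k):
    pred = np.concatenate([segment_dps(log_k, q) for q in rates])
    return pred - observed

sol = least_squares(residuals, x0=np.log(np.full(4, 1e-13)))
print("recovered permeabilities (mD):", np.round(np.exp(sol.x) / MD, 1))
```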
Procedia PDF Downloads 70

8478 Developing a Third Degree of Freedom for Opinion Dynamics Models Using Scales
Authors: Dino Carpentras, Alejandro Dinkelberg, Michael Quayle
Abstract:
Opinion dynamics models use an agent-based modeling approach to model people's opinions. A model's properties are usually explored by testing the two 'degrees of freedom': the interaction rule and the network topology. The latter defines the connections, and thus the possible interactions, among agents. The interaction rule, instead, determines how agents select each other and update their own opinions. Here we show the existence of a third degree of freedom. This can be used for turning one model into another or for changing the model's output by up to 100% of its initial value. Opinion dynamics models represent the evolution of real-world opinions parsimoniously. Thus, it is fundamental to know how a real-world opinion (e.g., supporting a candidate) can be turned into a number. Specifically, we want to know if, by choosing a different opinion-to-number transformation, the model's dynamics would be preserved. This transformation is typically not addressed in the opinion dynamics literature. However, it has already been studied in psychometrics, a branch of psychology. In this field, real-world opinions are converted into numbers using abstract objects called 'scales.' These scales can be converted one into the other, in the same way as we convert meters to feet. Thus, in our work, we analyze how this scale transformation may affect opinion dynamics models. We perform our analysis both using mathematical modeling and validating it via agent-based simulations. To distinguish between scale transformation and measurement error, we first analyze the case of perfect scales (i.e., no error or noise). Here we show that a scale transformation may change the model's dynamics up to a qualitative level, meaning that a researcher may reach a totally different conclusion, even using the same dataset, just by slightly changing the way data are pre-processed. Indeed, we quantify that this effect may alter the model's output by 100%. By using two models from the standard literature, we show that a scale transformation can transform one model into the other. This transformation is exact, and it holds for every result. Lastly, we also test the case of using real-world data (i.e., finite precision). We perform this test using a 7-point Likert scale, showing how even a small scale change may result in different predictions or a different number of opinion clusters. Because of this, we think that scale transformation should be considered as a third degree of freedom for opinion dynamics. Indeed, its properties have a strong impact both on theoretical models and on their application to real-world data.
Keywords: degrees of freedom, empirical validation, opinion scale, opinion dynamics
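The scale effect is easy to reproduce with a standard bounded-confidence model. The sketch below, a minimal illustration rather than the authors' setup, runs Deffuant-style dynamics on the same opinions expressed on two scales related by a monotone transformation, and then counts the resulting opinion clusters.

```python
import numpy as np

rng = np.random.default_rng(0)

def deffuant(opinions, eps=0.2, mu=0.5, steps=100_000):
    """Bounded-confidence (Deffuant) dynamics on a fully mixed population."""
    x = opinions.copy()
    for _ in range(steps):
        i, j = rng.integers(0, x.size, 2)
        if abs(x[i] - x[j]) < eps:       # interact only within confidence eps
            shift = mu * (x[j] - x[i])
            x[i] += shift
            x[j] -= shift
    return x

def n_clusters(x, tol=0.05):
    xs = np.sort(x)
    return 1 + int(np.sum(np.diff(xs) > tol))

raw = rng.random(500)        # opinions expressed on one scale
rescaled = raw ** 2          # the same opinions under a monotone rescaling

print("clusters on raw scale:     ", n_clusters(deffuant(raw)))
print("clusters on rescaled scale:", n_clusters(deffuant(rescaled)))
```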
Procedia PDF Downloads 155

8477 Understanding the Role of Gas Hydrate Morphology on the Producibility of a Hydrate-Bearing Reservoir
Authors: David Lall, Vikram Vishal, P. G. Ranjith
Abstract:
Numerical modeling of gas production from hydrate-bearing reservoirs requires the solution of various thermal, hydrological, chemical, and mechanical phenomena in a coupled manner. Among the various reservoir properties that influence gas production estimates, the distribution of permeability across the domain is one of the most crucial parameters, since it determines both heat transfer and mass transfer. The aspect of permeability in hydrate-bearing reservoirs is particularly complex compared to conventional reservoirs, since it depends on the saturation of gas hydrates and hence is dynamic during production. The dependence of permeability on hydrate saturation is mathematically represented using permeability-reduction models, which are specific to the expected morphology of hydrate accumulations (such as grain-coating or pore-filling hydrates). In this study, we demonstrate the impact of various permeability-reduction models, and consequently of different morphologies of hydrate deposits, on the estimates of gas production using depressurization at the reservoir scale. We observe significant differences in produced water volumes and cumulative mass of produced gas between the models, thereby highlighting the uncertainty in production behavior arising from the ambiguity in the prevalent gas hydrate morphology.
Keywords: gas hydrate morphology, multi-scale modeling, THMC, fluid flow in porous media
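Permeability-reduction models of the kind discussed here are often written as simple functions of hydrate saturation. The sketch below uses a Masuda-type power law, k/k0 = (1 - S_h)^N, with purely illustrative exponents standing in for different morphologies; the specific models compared in the study are not given in the abstract.

```python
import numpy as np

def k_reduction(s_h, N):
    """Masuda-type power law: k/k0 = (1 - S_h)^N.
    The exponent N encodes the assumed hydrate morphology."""
    return (1.0 - s_h) ** N

s_h = np.linspace(0.0, 0.8, 9)   # hydrate saturation
# illustrative exponents only: pore-filling hydrate blocks pore throats
# more aggressively than grain-coating hydrate, so it gets the larger N
for label, N in [("grain-coating (N=3) ", 3), ("pore-filling  (N=10)", 10)]:
    print(label, np.round(k_reduction(s_h, N), 4))
```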
Procedia PDF Downloads 220

8476 The Modern Era in the Cricket World: How Far Have We Really Come?
Authors: Habib Noorbhai
Abstract:
History of Cricket: Cricket has a known history spanning from the 16th century to the present, with international matches having been played since 1844. The game of cricket arrived in Australia as soon as colonization began in 1788. Cricketers started playing on turf wickets in the late 1800s, and dimensions for both the boundary and the pitch later became standardized. As the years evolved, cricket bats and balls, protective equipment, playing surfaces and the three formats of the game adapted to the playing conditions and laws of cricket. Business of Cricket: In the early 2000s, the shortest version of the game (T20) was introduced in order to attract crowds to stadiums and television viewers for broadcasting rights. One could argue whether this was merely a business venture or a platform for enhancing the performance of cricketers. Between the 16th and 20th centuries, cricket was a common sport played for passion and pure enjoyment. Industries saw potential in diversified business ventures in the game (as well as in other sports played globally), and cricket subsequently became a career for players, administrators and coaches, the media, health professionals, managers and the corporate world. Pros and Cons of Cricket Developments: At present, the game has gained significantly from the use of technology, sports sciences and varied mechanisms to optimize performance and to forecast frameworks for injury prevention in cricket players. Unfortunately, these had not been available in the earlier times of cricket, and it would prove interesting to observe how the greats of the game would have benefited from such developments. Cricketers in the 21st century are faced with many overwhelming commitments. One of these is playing cricket for 11 months in a year, making it more than 250 days away from home and their families. As the demands of player contracts increase, the supply of commitment and performances from players increases. Way Forward and Future Implications: The questions are: Are such disadvantages contributing to the overload and injury risks of players? How far have we really come in the cricketing world, or has everything since the game's inception become institutionalized within a business model? These are the fundamental questions which need to be addressed, and legislation, policies and ethical considerations need to be drafted and implemented. These will ensure that there is an equilibrium of effective transitions and management of not only the players, but also the credibility of the wonderful game.
Keywords: enterprising business of cricket, technology, legislation, credibility
Procedia PDF Downloads 448

8475 Hybrid Direct Numerical Simulation and Large Eddy Simulating Wall Models Approach for the Analysis of Turbulence Entropy
Authors: Samuel Ahamefula
Abstract:
Turbulent motion is a highly nonlinear and complex phenomenon, and its modelling is still very challenging. In this study, we developed a hybrid computational approach to accurately simulate the fluid turbulence phenomenon. The focus is on the coupling and transitioning between Direct Numerical Simulation (DNS) and Large Eddy Simulation with Wall Models (LES-WM) regions. In the framework, high-order, high-fidelity fluid dynamical methods are utilized to simulate the unsteady compressible Navier-Stokes equations in the Eulerian format on unstructured moving grids. The coupling and transitioning of DNS and LES-WM are conducted through a linearly staggered Dirichlet-Neumann coupling scheme. The framework is verified and validated based on, respectively, the ability of DNS to capture the full range of turbulent scales with accurate results, and the efficiency of LES-WM in simulating the near-wall turbulent boundary layer using wall models.
Keywords: computational methods, turbulence modelling, turbulence entropy, Navier-Stokes equations
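The staggered Dirichlet-Neumann idea can be illustrated on a far simpler problem than the compressible Navier-Stokes equations. The sketch below is a toy stand-in, not the paper's scheme: two subdomains of a 1D Poisson problem are coupled by solving the left side with a Dirichlet interface value, feeding its flux into a Neumann condition on the right side, and relaxing the interface value between sweeps.

```python
import numpy as np

# Toy problem: -u'' = f on [0,1], u(0) = u(1) = 0, f = 1,
# exact solution u(x) = x(1-x)/2. Left half plays the "DNS" subdomain,
# right half the "LES-WM" subdomain; they meet at the interface x = 0.5.
n = 50                        # interior points in the left subdomain
h = 0.5 / (n + 1)             # grid spacing (same in both halves)
f, theta, g = 1.0, 0.5, 0.0   # forcing, relaxation factor, interface guess

def laplacian(m):
    return (np.diag(2.0 * np.ones(m)) - np.diag(np.ones(m - 1), 1)
            - np.diag(np.ones(m - 1), -1)) / h**2

for it in range(100):
    # Dirichlet step (left): u(0) = 0, u(0.5) = g
    A, b = laplacian(n), f * np.ones(n)
    b[-1] += g / h**2
    uL = np.linalg.solve(A, b)
    flux = (g - uL[-1]) / h            # one-sided u'(0.5) from the left
    # Neumann step (right): u'(0.5) = flux, u(1) = 0; unknown 0 is the
    # interface value itself, closed with a ghost-point discretization
    m = n + 1
    A, b = laplacian(m), f * np.ones(m)
    A[0, 0], A[0, 1] = 2.0 / h**2, -2.0 / h**2
    b[0] = f - 2.0 * flux / h
    uR = np.linalg.solve(A, b)
    g_new = theta * uR[0] + (1.0 - theta) * g    # staggered relaxation
    if abs(g_new - g) < 1e-12:
        break
    g = g_new

print(f"interface value {g:.6f} vs exact 0.125000 after {it + 1} sweeps")
```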
Procedia PDF Downloads 101

8474 The Cost of Solar-Centric Renewable Portfolio
Authors: Timothy J. Considine, Edward J. M. Manderson
Abstract:
This paper develops an econometric forecasting system of energy demand coupled with engineering-economic models of energy supply. The framework is used to quantify the impact of state-level renewable portfolio standards (RPSs), achieved predominantly with solar generation, on electricity rates, electricity consumption, and environmental quality. We perform the analysis using Arizona's RPS as a case study. We forecast energy demand in Arizona out to 2035 and find that by this time the state will require an additional 35 million MWh of electricity generation. If Arizona implements its RPS when supplying this electricity demand, we find there will be a substantial increase in electricity rates (relative to a business-as-usual scenario of reliance on gas-fired generation). Extending the current regime of tax credits can greatly reduce this increase, at the taxpayers' expense. We find that by 2025 Arizona's RPS will implicitly abate carbon dioxide emissions at a cost between $101 and $135 per metric ton, and by 2035 abatement costs are between $64 and $112 per metric ton (depending on the future evolution of natural gas prices).
Keywords: electricity demand, renewable portfolio standard, solar, carbon dioxide
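The implicit abatement cost quoted above is, in essence, a cost difference divided by avoided emissions. A back-of-the-envelope sketch with hypothetical placeholder numbers (not the paper's data):

```python
# Implicit CO2 abatement cost: extra cost of the RPS pathway divided by
# the emissions it avoids, relative to gas-fired business-as-usual.
# All numbers below are hypothetical placeholders, not the paper's data.
rps_cost_usd = 4.2e9    # present-value system cost with the solar RPS
bau_cost_usd = 3.0e9    # present-value cost of gas-fired generation
avoided_tons = 11.0e6   # metric tons of CO2 abated over the horizon

abatement_cost = (rps_cost_usd - bau_cost_usd) / avoided_tons
print(f"implicit abatement cost: ${abatement_cost:.0f} per metric ton")
```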
Procedia PDF Downloads 485

8473 Comparison of Spiking Neuron Models in Terms of Biological Neuron Behaviours
Authors: Fikret Yalcinkaya, Hamza Unsal
Abstract:
To understand how neurons work, it is necessary to combine experimental studies in neural science with numerical simulations of neuron models in a computer environment. In this regard, the simplicity and applicability of spiking neuron modeling functions have been of great interest in computational and numerical neuroscience in recent years. Spiking neuron models can be classified by the various neuronal behaviors they exhibit, such as spiking and bursting. These classifications are important for researchers working on theoretical neuroscience. In this paper, three different spiking neuron models, Izhikevich, Adaptive Exponential Integrate-and-Fire (AEIF) and Hindmarsh-Rose (HR), which are based on systems of first-order differential equations, are discussed and compared. First, the physical meanings, derivations, and differential equations of each model are provided and simulated in the MATLAB environment. Then, by selecting appropriate parameters, the models were visually examined in the MATLAB environment, with the aim of demonstrating which model can simulate well-known biological neuron behaviours such as tonic spiking, tonic bursting, mixed-mode firing, spike frequency adaptation, resonator and integrator. As a result, the Izhikevich model has been shown to produce regular spiking, chattering, intrinsically bursting, thalamo-cortical, low-threshold spiking and resonator behaviours. The Adaptive Exponential Integrate-and-Fire model has been able to produce firing patterns such as regular spiking, adaptive spiking, initial bursting, regular bursting, delayed spiking, delayed regular bursting, transient spiking and irregular spiking. The Hindmarsh-Rose model showed three different dynamic neuron behaviours: spiking, bursting and chaotic firing. From these results, the Izhikevich cell model may be preferred due to its ability to reflect the true behavior of the nerve cell, its ability to produce different types of spikes, and its suitability for use in larger-scale brain models. The most important reason for choosing the Adaptive Exponential Integrate-and-Fire model is that it can create rich firing patterns with fewer parameters. The chaotic behaviour of the Hindmarsh-Rose neuron model, like that of some chaotic systems, is thought to be usable in many scientific and engineering applications such as physics, secure communication and signal processing.
Keywords: Izhikevich, adaptive exponential integrate-and-fire, Hindmarsh-Rose, biological neuron behaviours, spiking neuron models
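As an illustration of how compact these models are, here is a minimal Euler-integration sketch of the Izhikevich (2003) model with its canonical regular-spiking parameters; the step size and input current are illustrative choices, not values taken from the paper.

```python
import numpy as np

# Izhikevich model: v' = 0.04 v^2 + 5 v + 140 - u + I,
# u' = a (b v - u); spike reset: v <- c, u <- u + d when v >= 30 mV.
def izhikevich(a, b, c, d, I=10.0, T=500.0, dt=0.25):
    steps = int(T / dt)
    v, u = -65.0, b * -65.0
    spikes, trace = [], np.empty(steps)
    for k in range(steps):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                  # spike: record time and reset
            spikes.append(k * dt)
            v, u = c, u + d
        trace[k] = v
    return np.array(spikes), trace

# canonical "regular spiking" parameter set
spike_times, v_trace = izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0)
print(f"{spike_times.size} spikes in 500 ms of tonic input")
```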
Procedia PDF Downloads 181

8472 Aggregate Production Planning Framework in a Multi-Product Factory: A Case Study
Authors: Ignatio Madanhire, Charles Mbohwa
Abstract:
This study looks at the best model of the aggregate planning activity in an industrial entity and uses the trial-and-error method on spreadsheets to solve aggregate production planning problems. A linear programming model is also introduced to optimize the aggregate production planning problem. Application of the models in a furniture production firm is evaluated to demonstrate that practical and beneficial solutions can be obtained from the models. Finally, some benchmarking against other furniture manufacturing industries was undertaken to assess the relevance and level of use of such models in other furniture firms.
Keywords: aggregate production planning, trial and error, linear programming, furniture industry
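A linear programming formulation of aggregate planning can be prototyped in a few lines. The sketch below assumes a deliberately small two-period instance with made-up demands, capacity and costs, not the case firm's data:

```python
from scipy.optimize import linprog

# Two-period toy: choose production P_t and end inventory I_t to meet
# demand at minimum production + holding cost, under a capacity limit.
# x = [P1, P2, I1, I2]; demand d = [100, 150]; capacity 130 per period.
c = [20.0, 20.0, 2.0, 2.0]             # unit production and holding costs
A_eq = [[1, 0, -1, 0],                 # P1 - I1 = d1
        [0, 1, 1, -1]]                 # P2 + I1 - I2 = d2
b_eq = [100.0, 150.0]
bounds = [(0, 130), (0, 130), (0, None), (0, None)]

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("plan [P1, P2, I1, I2]:", res.x.round(1), " cost:", res.fun)
```

With these numbers the optimizer must pre-build 20 units in period 1 (P1 = 120, I1 = 20) because period-2 demand exceeds capacity, which is exactly the kind of trade-off the spreadsheet trial-and-error method explores by hand.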
Procedia PDF Downloads 556

8471 Strategy, Intellectual Capital Disclosure, Competition, and Market Performance
Authors: Agnes Utari Widyaningdyah
Abstract:
This study investigates the relationship between strategy, intellectual capital (IC) disclosure, and the firm's performance, considering business competition as a moderating variable. The sample comprises secondary-sector manufacturing firms in the Jakarta Stock Industrial Classification, because this group represents knowledge-intensive firms according to the OECD (Organisation for Economic Co-operation and Development) criteria. Using path analysis, this study reveals that there is a significant influence of strategy on IC disclosure. Firms with a differentiation strategy tend to withhold their strategic information, IC included, for fear of losing their competitive advantage. The results also indicate that firms are more likely to withhold information about IC if they perceive that current or potential competition is strong. However, firms should consider that IC disclosure is a positive signal to investors.
Keywords: strategy, IC disclosure, market performance, business competition
Procedia PDF Downloads 298

8470 Machine Learning Techniques for Estimating Ground Motion Parameters
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this study is to evaluate the advantages and disadvantages of various machine learning techniques in forecasting ground-motion intensity measures given source characteristics, source-to-site distance, and local site conditions. Intensity measures such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudo-spectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Estimating these variables for future earthquake events is a key step in seismic hazard assessment and potentially subsequent risk assessment of different types of structures. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates potential benefits from employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forests, and Support Vector Machines. The algorithms are adjusted to quantify event-to-event and site-to-site variability of the ground motions by implementing them as random effects in the proposed models to reduce the aleatory uncertainty. All the algorithms are trained using a selected database of 4,528 ground motions, including 376 seismic events with magnitudes 3 to 5.8, recorded over a hypocentral distance range of 4 to 500 km in Oklahoma, Kansas, and Texas since 2005. The choice of this database stems from the recent increase in the seismicity rate of these states, attributed to petroleum production and wastewater disposal activities, which necessitates further investigation of the ground motion models developed for these states. Accuracy of the models in predicting intensity measures, generalization capability of the models for future data, as well as usability of the models, are discussed in the evaluation process. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method, and in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when only limited data are available.
Keywords: artificial neural network, ground-motion models, machine learning, random forest, support vector machine
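A Random Forest ground-motion model of the kind compared here can be set up in a few lines. The sketch below uses synthetic stand-ins for the predictors and a toy attenuation relation in place of the Oklahoma/Kansas/Texas database, and omits the random-effects adjustment described in the abstract:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 4000
mag = rng.uniform(3.0, 5.8, n)        # moment magnitude
r_hyp = rng.uniform(4.0, 500.0, n)    # hypocentral distance (km)
vs30 = rng.uniform(200.0, 900.0, n)   # site-stiffness proxy (m/s)
# toy attenuation relation + noise standing in for observed ln(PGA)
ln_pga = (1.2 * mag - 1.6 * np.log(r_hyp) - 0.002 * vs30
          + rng.normal(0.0, 0.5, n))

X = np.column_stack([mag, np.log(r_hyp), vs30])
X_tr, X_te, y_tr, y_te = train_test_split(X, ln_pga, random_state=0)

rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(X_tr, y_tr)
print("holdout R^2:", round(rf.score(X_te, y_te), 3))
```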
Procedia PDF Downloads 123

8469 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models
Authors: I. V. Pinto, M. R. Sooriyarachchi
Abstract:
It can frequently be observed that the data arising in our environment have a hierarchical or nested structure. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature, and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of orders 1 and 2 (MQL1, MQL2) and penalized quasi-likelihood of orders 1 and 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary-response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19), with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted using the four methods to a real-life dataset, and the performance of the test was compared for each model.
Keywords: goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, penalized quasi-likelihood, power, quasi-likelihood, type-I error
Procedia PDF Downloads 142

8468 Towards a Framework for Embedded Weight Comparison Algorithm with Business Intelligence in the Plantation Domain
Authors: M. Pushparani, A. Sagaya
Abstract:
Embedded systems have emerged as important elements in various domains, with extensive applications in the automotive, commercial, consumer, healthcare and transportation markets, as there is an emphasis on intelligent devices. On the other hand, Business Intelligence (BI) has also been used extensively in a range of applications, especially in the agriculture domain, which is the area of this research. The aim of this research is to create a framework for an Embedded Weight Comparison Algorithm with Business Intelligence (EWCA-BI). The weight comparison algorithm will be embedded within the plantation management system and the weighbridge system. This algorithm will be used to estimate the weight at the site, which will be compared with the actual weight at the plantation. The algorithm will be used to build the necessary alerts when there is a discrepancy in the weight, thus enabling better decision making. In the current practice, data are collected from various locations in various forms. It is a challenge to consolidate the data to obtain timely and accurate information for effective decision making. Adding to this, unstable network connections lead to difficulty in getting timely and accurate information. To overcome these challenges, embedding is done on a portable device that carries the embedded weight comparison algorithm, to assist in data capture and to synchronize data at various locations, overcoming the network shortcomings at collection points. The EWCA-BI will provide real-time information at any given point in time, thus enabling low-latency BI reports that will provide crucial information for efficient operational decision making. This research has high potential for bringing embedded systems into the agriculture industry. EWCA-BI will provide BI reports with accurate information from uncompromised data using an embedded system, and will provide alerts, therefore enabling effective operational management decision-making at the site.
Keywords: embedded business intelligence, weight comparison algorithm, oil palm plantation, embedded systems
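The core of the weight comparison step is a simple tolerance check between the field estimate and the weighbridge reading. A minimal sketch, with a hypothetical record type and a 3% tolerance chosen purely for illustration:

```python
from dataclasses import dataclass

@dataclass
class WeightRecord:
    consignment_id: str
    estimated_kg: float    # field estimate captured on the portable device
    weighbridge_kg: float  # actual weight recorded at the weighbridge

def check_discrepancy(rec: WeightRecord, tolerance: float = 0.03):
    """Flag a consignment when the relative discrepancy between the
    estimated and actual weight exceeds the tolerance (default 3%)."""
    gap = abs(rec.estimated_kg - rec.weighbridge_kg) / rec.weighbridge_kg
    if gap > tolerance:
        return f"ALERT {rec.consignment_id}: {gap:.1%} discrepancy"
    return None

alert = check_discrepancy(WeightRecord("FFB-0417", 10_450.0, 9_820.0))
print(alert or "within tolerance")
```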
Procedia PDF Downloads 286

8467 Using Machine Learning to Classify Different Body Parts and Determine Healthiness
Authors: Zachary Pan
Abstract:
Our general mission is to solve the problem of classifying images into different body part types and deciding whether each of them is healthy or not. However, for now, we will determine healthiness for only one-sixth of the body parts, specifically the chest: we will detect pneumonia in X-ray scans of those chest images. With this type of AI, doctors can use it as a second opinion when they are taking CT or X-ray scans of their patients. Another advantage of using this machine learning classifier is that it has no human weaknesses like fatigue. The overall approach to this problem is to split it into two parts: first, classify the image, then determine if it is healthy. In order to classify the image into a specific body part class, the body parts dataset must be split into test and training sets. We can then use many models, like neural networks or logistic regression models, and fit them using the training set. Now, using the test set, we can obtain a realistic accuracy the models will have on images in the real world, since these testing images have never been seen by the models before. In order to increase this testing accuracy, we can also apply many complex algorithms to the models, like multiplicative weight update. For the second part of the problem, to determine if the body part is healthy, we can have another dataset consisting of healthy and non-healthy images of the specific body part and once again split it into test and training sets. We then use another neural network to train on those training set images and use the testing set to figure out its accuracy. We will do this process only for the chest images. A major conclusion reached is that convolutional neural networks are the most reliable and accurate at image classification. In classifying the images, the logistic regression model, the neural network, neural networks with multiplicative weight update, neural networks with the black box algorithm, and the convolutional neural network achieved 96.83 percent, 97.33 percent, 97.83 percent, 96.67 percent, and 98.83 percent accuracy, respectively. On the other hand, the overall accuracy of the model that determines if the images are healthy or not is around 78.37 percent.
Keywords: body part, healthcare, machine learning, neural networks
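A convolutional classifier of the kind that scored highest here can be sketched in Keras. The architecture below is an assumption (the abstract does not give layer sizes); the input shape, number of classes and training call are illustrative:

```python
import tensorflow as tf

# A small CNN for grayscale scans resized to 128x128, classifying six
# body-part classes; layer sizes here are illustrative choices.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 1)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(6, activation="softmax"),  # one unit per body part
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
# model.fit(train_images, train_labels, epochs=10,
#           validation_data=(test_images, test_labels))  # with the split above
```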
Procedia PDF Downloads 104

8466 Review of Hydrologic Applications of Conceptual Models for Precipitation-Runoff Process
Authors: Oluwatosin Olofintoye, Josiah Adeyemo, Gbemileke Shomade
Abstract:
The relationship between rainfall and runoff is an important issue in surface water hydrology; therefore, the understanding and development of accurate rainfall-runoff models and their applications in water resources planning, management and operation are of paramount importance in hydrological studies. This paper reviews some of the previous work on modeling of the rainfall-runoff process. The hydrologic applications of conceptual models and artificial neural networks (ANNs) for precipitation-runoff process modeling were studied. Gradient-based training methods, such as error back-propagation (BP), and evolutionary algorithms (EAs) are discussed in relation to the training of artificial neural networks, and it is shown that the application of EAs to artificial neural network training could be an alternative to other training methods. Therefore, further research to exploit the abundant expert knowledge in the area of artificial intelligence for the solution of hydrologic and water resources planning and management problems is needed.
Keywords: artificial intelligence, artificial neural networks, evolutionary algorithms, gradient training method, rainfall-runoff model
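To make the EA-as-trainer idea concrete, here is a minimal sketch: a tiny one-hidden-layer network fitted to hypothetical rainfall-runoff pairs by a (mu + lambda) evolution strategy instead of back-propagation. The data, architecture and EA settings are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
# hypothetical rainfall-runoff pairs standing in for a real catchment record
rain = rng.uniform(0.0, 50.0, 200)
runoff = 0.6 * np.maximum(rain - 5.0, 0.0) + rng.normal(0.0, 1.0, 200)

N_H = 8                      # hidden units
DIM = 3 * N_H + 1            # input weights, hidden biases, output weights, bias

def predict(w, x):
    W1, b1 = w[:N_H], w[N_H:2 * N_H]
    W2, b2 = w[2 * N_H:3 * N_H], w[-1]
    h = np.tanh(np.outer(x / 50.0, W1) + b1)   # scaled input, hidden layer
    return h @ W2 + b2

def mse(w):
    return np.mean((predict(w, rain) - runoff) ** 2)

# (mu + lambda) evolution strategy in place of gradient descent
pop = rng.normal(0.0, 1.0, (30, DIM))
for gen in range(300):
    fitness = np.array([mse(ind) for ind in pop])
    parents = pop[np.argsort(fitness)[:10]]            # truncation selection
    children = (parents[rng.integers(0, 10, 20)]
                + rng.normal(0.0, 0.1, (20, DIM)))     # Gaussian mutation
    pop = np.vstack([parents, children])

best = min(pop, key=mse)
print("best training MSE:", round(float(mse(best)), 3))
```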
Procedia PDF Downloads 454

8465 The Effect of Symmetry on the Perception of Happiness and Boredom in Design Products
Authors: Michele Sinico
Abstract:
The present research investigates the effect of symmetry on the perception of happiness and boredom in design products. Three experiments were carried out in order to assess the degree of visual expressive value in different models of bookcases, wall clocks, and chairs. Sixty participants directly indicated the degree of happiness and boredom using 7-point rating scales. The findings show that the participants acknowledged a different value of expressive quality in the different product models. The results also show that symmetry is not a significant constraint for an emotional design project.
Keywords: product experience, emotional design, symmetry, expressive qualities
Procedia PDF Downloads 147

8464 Airliner-UAV Flight Formation in Climb Regime
Authors: Pavel Zikmund, Robert Popela
Abstract:
Extreme formation is a theoretical concept of self-sustained flight in which a large airliner is followed by a small UAV glider flying in the airliner's wake vortex. The paper presents results of a climb analysis, with the goal of lifting the gliding UAV to the airliner's cruise altitude. Wake vortex models, the UAV's drag polar and basic parameters, and the airliner's climb profile are introduced first. Then, the flight performance of the UAV in the wake vortex is evaluated by analytical methods. The time history of the optimal distance between the airliner and the UAV during the climb is determined. The results are encouraging; therefore, the available UAV drag margin for electricity generation is determined for different vortex models.
Keywords: flight in formation, self-sustained flight, UAV, wake vortex
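The abstract does not specify which vortex models were used; a common engineering choice for this kind of analysis is the Lamb-Oseen profile. A minimal sketch with hypothetical airliner values, purely to show the shape of the induced velocity field a trailing UAV would exploit:

```python
import numpy as np

def lamb_oseen_vtheta(r, gamma, r_c):
    """Tangential velocity of a Lamb-Oseen vortex with core radius r_c:
    v(r) = Gamma / (2 pi r) * (1 - exp(-(r / r_c)^2))."""
    return gamma / (2.0 * np.pi * r) * (1.0 - np.exp(-((r / r_c) ** 2)))

# illustrative values only: circulation and core radius are hypothetical
gamma, r_c = 400.0, 4.0          # m^2/s, m
for r in [2.0, 5.0, 10.0, 20.0, 40.0]:
    v = lamb_oseen_vtheta(r, gamma, r_c)
    print(f"r = {r:5.1f} m  tangential velocity = {v:5.2f} m/s")
```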
Procedia PDF Downloads 441

8463 Department of Social Development/Japan International Cooperation Agency's Journey from South African Community to Southern African Region
Authors: Daisuke Sagiya, Ren Kamioka
Abstract:
South Africa ratified the United Nations Convention on the Rights of Persons with Disabilities (UNCRPD) on 30th November 2007. In line with this, the Department of Social Development (DSD) revised the White Paper on the Rights of Persons with Disabilities (WPRPD), and the Cabinet approved it on 9th December 2015. The South African government is striving towards the elimination of poverty and inequality in line with the UNCRPD and WPRPD. However, there are minimal programmes and services provided to persons with disabilities in rural communities. In order to address current discriminative practices, disunity and limited self-representation in rural communities, DSD, in cooperation with the Japan International Cooperation Agency (JICA), is implementing the 'Project for the Promotion of Empowerment of Persons with Disabilities and Disability Mainstreaming' from May 2016 to May 2020. The project targets rural communities as the project sites, namely 1) Collins Chabane municipality, Vhembe district, Limpopo and 2) Maluti-a-Phofung municipality, Thabo Mofutsanyana district, Free State. The project aims at developing good practices on Community-Based Inclusive Development (CBID) at the project sites, which will be documented as a guideline and applied in other provinces in South Africa and in neighbouring countries (Lesotho, Swaziland, Botswana, Namibia, Zimbabwe, and Mozambique). In cooperation with provincial and district DSD and local government, the project is currently implementing various community activities, for example, the establishment of Self-Help Groups (SHGs) of persons with disabilities and peer counselling in the villages, and it will conduct Disability Equality Training (DET) and accessibility workshops in order to enhance CBID at the project sites. In order to universalise good practices on CBID, the authors explain lessons learned from the project by utilising theories of disability and development studies and community psychology, such as the social model of disability, the twin-track approach, empowerment theory, sense of community, and the helper therapy principle. The authors conclude that in order to realise the social participation of persons with disabilities in rural communities, CBID is a strong tool, and persons with disabilities must play central roles in all spheres of CBID activities.
Keywords: community-based inclusive development, disability mainstreaming, empowerment of persons with disabilities, self-help group
Procedia PDF Downloads 240

8462 Why Trust Matters for Women Entrepreneurs: Insights from Malaysia
Authors: Suraini Mohd Rhouse, Noor Lela Ahmad, Nek Kamal Yeop Yunus, Rosfizah Md Taib
Abstract:
This article aims to explore the importance of trust to women entrepreneurs. In particular, the research uses a social constructionist lens to examine the ways in which women entrepreneurs construct trust in relation to their various stakeholders. Semi-structured interviews were used to gather the data. The findings suggest that women highlight the importance of trust in order to establish customer satisfaction, which can further develop into customer loyalty. In addition, trust in employees is seen as vital for building their commitment to the business organization. Women also see the trust dimension in terms of their relationships with financial providers as key to gaining approval for financial resources. This article contributes to the literature on the value of trust in women's business environments.
Keywords: qualitative, social constructionist, trust, women entrepreneurship
Procedia PDF Downloads 560

8461 The Role of Employee Incentives in Financing from Customers
Authors: Mengyu Lu, Yongsheng Guo
Abstract:
This study investigates how employee incentives affect employee performance in financing from customers. The study followed a grounded theory approach, with data collected through 29 interviews. Main themes and categories were identified through the coding process. The study found that causal conditions, including financial barriers, informal finance, business location, customer base and customer relationships, influenced the adoption of customer finance in the case of SMEs. The SMEs build and maintain long-term relationships with customers through personal communications. The SMEs engage and motivate employees in customer communications and business financing strategy through financial incentive programmes, including bonuses, salary rises and rewards, and through non-financial incentives, including training opportunities, extra holiday leave, and flexible working hours. Employee performance was measured through financing contribution and job contribution. As a consequence, customers are well served by employees and get a better customer experience. SMEs can gain benefits such as employee engagement, employee satisfaction and sustainable financing sources. This study gives insight into employee incentives for improving employee performance in customer finance and has implications for human capital theories. Suggestions are provided to decision-makers in businesses, as incentive programmes improve employee performance, which eventually contributes to overall business performance.
Keywords: SMEs, financing from customers, employee incentives, performance-based measurement
Procedia PDF Downloads 56

8460 The Role of Businesses in Peacebuilding in Nigeria: A Stakeholder Approach
Authors: Jamila Mohammed Makarfi, Yontem Sonmez
Abstract:
Developing countries like Nigeria have recently been affected by conflicts characterized by violence, high levels of risk and insecurity, resulting in loss of lives and livelihoods, displacement of communities, degradation of health, educational and social infrastructure, as well as economic underdevelopment. The Nigerian government's response to most of these conflicts has mainly been reactionary, in the form of military deployments, rather than precautionary in preventing or addressing the root causes of the conflicts. Several studies have shown that at various points of a conflict, conflict regions can benefit from resources and expertise available outside the government, mainly from the private sector, through mechanisms such as corporate social responsibility (CSR). The main aim of this study is to examine the role of businesses in peacebuilding in Northern Nigeria through CSR over the last decade. The expected contributions will answer research questions such as the key business motivations to engage in peacebuilding, as well as the degree of influence exerted by various stakeholder groups on the business decision to engage. The methodology of the study adopts a multiple case study of over 120 businesses of various sizes, ranging from small and medium to large scale. A mixed method enabled the collection of quantitative and qualitative primary data to augment the secondary data. The results indicated that the most important business motivations to engage in peacebuilding were the negative effects of the conflict on economic stability, as well as stakeholder-driven motives. On the other hand, out of the 12 identified stakeholders, micro-, small- and medium-scale enterprises (MSMEs) considered the chief executive officer's interest to be the most important factor, while large companies rated government and community pressure as the highest. Overall, the foreign stakeholders scored low on the influence chart for all business types.
Keywords: conflict, corporate social responsibility, peacebuilding, stakeholder
Procedia PDF Downloads 221

8459 Optimal Diversification and Bank Value Maximization
Authors: Chien-Chih Lin
Abstract:
This study argues that the optimal diversification for the maximization of bank value is asymmetrical; it depends on the business cycle. During times of expansion, systematic risks are relatively low, and hence there is only a slight effect from raising them with a diversified portfolio. Consequently, the benefit of reducing individual risks dominates any loss from raising systematic risks, leading to a higher value for a bank holding a diversified portfolio of assets. By contrast, in times of recession, systematic risks are relatively high, and it is more likely that the loss from raising systematic risks surpasses the benefit of reducing individual risks through portfolio diversification. Consequently, more diversification leads to lower bank values. Finally, some empirical evidence from banks in Taiwan is provided.
Keywords: diversification, default probability, systemic risk, banking, business cycle
Procedia PDF Downloads 437

8458 Problem Gambling in the Conceptualization of Health Professionals: A Qualitative Analysis of the Discourses Produced by Psychologists, Psychiatrists and General Practitioners
Authors: T. Marinaci, C. Venuleo
Abstract:
Different conceptualizations of disease affect patient care, yet little is known about how health professionals conceptualize gambling problems; this study aims to address this gap. It explores how health professionals conceptualize the gambling problem, addiction and the goals of the recovery process. In-depth, semi-structured, open-ended interviews were conducted with Italian psychologists, psychiatrists, general practitioners, and support staff (N = 114), working within health centres for the treatment of addiction (public health services or therapeutic communities) or medical offices. A lexical correspondence analysis (LCA) was applied to the verbatim transcripts. The LCA allowed the identification of two main factorial dimensions, which organize similarity and dissimilarity in the discourses of the interviewees. The first dimension, labelled 'Models of relationship with the problem', concerns two different models of relationship with the health problem: one related to the request for help and the process of taking charge, and the other related to the identification of the psychopathology underlying the disorder. The second dimension, labelled 'Organisers of the intervention', reflects the dialectic between two ways of addressing the problem: on the one hand, the intervention is organized around the gambling dynamics and their immediate life consequences (whatever the request of the user is); on the other hand, it is organized around the procedures and tools which characterize the health service (whatever the user's problem is, and despite the specificity of the user's request). The results highlight how, despite the differences, the respondents share a central assumption: understanding the gambling problem implies reference to the gambler's identity, more than, for instance, to the relational, social, cultural or political context in which the gambler lives. A passive stance is attributed to the user, who does not play any role in the definition of the goal of the intervention. The results will be discussed to highlight the relationship between professional models and users' ways of understanding and dealing with problems related to gambling.
Keywords: cultural models, health professionals, intervention models, problem gambling
Procedia PDF Downloads 154

8457 Probing Syntax Information in Word Representations with Deep Metric Learning
Authors: Bowen Ding, Yihao Kuang
Abstract:
In recent years, with the development of large-scale pre-trained language models, building vector representations of text through deep neural network models has become standard practice for natural language processing tasks. From the performance on downstream tasks, we know that the text representations constructed by these models contain linguistic information, but its encoding mode and extent are unclear. In this work, a structural probe is proposed to detect whether the vector representation produced by a deep neural network embeds a syntax tree. The probe is trained with a deep metric learning method, so that the distance between word vectors in the metric space it defines encodes the distance between words on the syntax tree, and the norm of a word vector encodes the depth of the word in the syntax tree. The experimental results on ELMo and BERT show that the syntax tree is encoded in their parameters and in the word representations they produce.
Keywords: deep metric learning, syntax tree probing, natural language processing, word representations
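The distance part of such a probe is a small learned linear map. Below is a minimal sketch in the style of Hewitt-and-Manning structural probes, with random tensors standing in for one sentence's contextual vectors and gold tree distances; the dimensions and loss are illustrative choices rather than the paper's exact configuration.

```python
import torch

# Structural probe: learn B so that ||B(h_i - h_j)||^2 approximates
# the syntax-tree distance between words i and j.
torch.manual_seed(0)
hidden_dim, probe_rank, n_words = 768, 64, 12

B = torch.randn(probe_rank, hidden_dim, requires_grad=True)
opt = torch.optim.Adam([B], lr=1e-3)

# stand-ins for one sentence: contextual vectors + gold tree distances
h = torch.randn(n_words, hidden_dim)
tree_dist = torch.randint(1, 8, (n_words, n_words)).float()
tree_dist = (tree_dist + tree_dist.T) / 2
tree_dist.fill_diagonal_(0)

for step in range(500):
    diffs = h.unsqueeze(1) - h.unsqueeze(0)    # (n, n, d) pairwise diffs
    pred = (diffs @ B.T).pow(2).sum(-1)        # predicted squared distances
    loss = (pred - tree_dist).abs().mean()     # L1 probe loss
    opt.zero_grad(); loss.backward(); opt.step()

print("final probe loss:", loss.item())
```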
Procedia PDF Downloads 68

8456 An Enhanced Framework for Regional Tourism Sustainable Adaptation to Climate Change
Authors: Joseph M. Njoroge
Abstract:
The need for urgent adaptation has prompted tourism stakeholders and the research community to develop generic adaptation frameworks for national, regional and/or local tourism destinations. Such frameworks have been proposed to guide the tourism industry in the adaptation process, with the aim of reducing the tourism industry's vulnerability and enhancing its ability to cope with climate-associated externalities. However, research shows that current approaches are far from sustainable, since the adaptation options sought are usually closely associated with development needs ('business as usual'), where the implications of adaptation for social justice and environmental integrity are often neglected. Based on this view, there is a need to look at adaptation beyond addressing vulnerability and resilience, to include the need for adaptation to enhance social justice and environmental integrity. This paper reviews the existing adaptation frameworks/models and evaluates their suitability for enhancing sustainable adaptation for regional tourist destinations. It is noted that existing frameworks contradict the basic 'principles of sustainable adaptation'. Further, attempts are made to propose a Sustainable Regional Tourism Adaptation Framework (SRTAF) to assist regional tourism stakeholders in achieving sustainable adaptation.
Keywords: sustainable adaptation, sustainability principles, sustainability portfolio, regional tourism
Procedia PDF Downloads 397

8455 Prediction of Bodyweight of Cattle by Artificial Neural Networks Using Digital Images
Authors: Yalçın Bozkurt
Abstract:
Prediction models were developed for the accurate prediction of bodyweight (BW) from digital images of beef cattle body dimensions using artificial neural networks (ANNs). For this purpose, the animal data were collected at a private slaughterhouse, and the digital images and weights of each live animal were taken just before slaughter. Body dimensions such as digital wither height (DJWH), digital body length (DJBL), digital body depth (DJBD), digital hip width (DJHW), digital hip height (DJHH) and digital pin bone length (DJPL) were determined from the images, using data with 1,069 observations for each trait. Prediction models were then developed by ANN. The digital body measurements were analysed by ANN for bodyweight prediction, and the R² values of DJBL, DJWH, DJHW, DJBD, DJHH and DJPL were approximately 94.32, 91.31, 80.70, 83.61, 89.45 and 70.56%, respectively. It can be concluded that in management situations where BW cannot be measured, it can be predicted accurately by measuring DJBL and DJWH alone, or together with DJBD and even DJHH, and that different models may be needed to predict BW under different feeding and environmental conditions and for different breeds.
Keywords: artificial neural networks, bodyweight, cattle, digital body measurements
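An ANN regression of this kind is straightforward to prototype. The sketch below assumes synthetic stand-ins for the six digital measurements and a toy linear weight relation, since the slaughterhouse data are not public; the network size is an illustrative choice.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1069
# synthetic stand-ins for the six digital measurements (cm)
X = rng.normal(loc=[150, 160, 75, 55, 150, 45], scale=8.0, size=(n, 6))
# toy relation + noise standing in for the true bodyweight (kg)
bw = 1.8 * X[:, 0] + 2.1 * X[:, 1] + 1.5 * X[:, 2] + rng.normal(0, 15, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, bw, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8),
                                 max_iter=5000, random_state=0))
ann.fit(X_tr, y_tr)
print("holdout R^2:", round(ann.score(X_te, y_te), 3))
```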
Procedia PDF Downloads 372

8454 Forecasting Equity Premium Out-of-Sample with Sophisticated Regression Training Techniques
Authors: Jonathan Iworiso
Abstract:
Forecasting the equity premium out-of-sample is a major concern for researchers in finance and emerging markets. The quest for a superior model that can forecast the equity premium with significant economic gains has resulted in several controversies over the choice of variables and suitable techniques among scholars. This research focuses mainly on the application of regression training (RT) techniques to forecast the monthly equity premium out-of-sample recursively, with an expanding-window method. A broad category of sophisticated regression models that control model complexity was employed. The RT models, which include Ridge, Forward-Backward (FOBA) Ridge, Least Absolute Shrinkage and Selection Operator (LASSO), Relaxed LASSO, Elastic Net, and Least Angle Regression, were trained and used to forecast the equity premium out-of-sample. In this study, the empirical investigation of the RT models demonstrates significant evidence of equity premium predictability, both statistically and economically, relative to the benchmark historical average, delivering significant utility gains. The forecasts provide meaningful economic information on mean-variance portfolio investment for investors who time the market to earn future gains at minimal risk. Thus, the forecasting models appear to benefit an investor in a market setting who optimally reallocates a monthly portfolio between equities and risk-free treasury bills using equity premium forecasts at minimal risk.
Keywords: regression training, out-of-sample forecasts, expanding window, statistical predictability, economic significance, utility gains
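The recursive expanding-window procedure against the historical-average benchmark can be sketched in a few lines. The data below are synthetic (a weak signal buried in noise, as is typical of this literature), and Ridge stands in for the full set of RT models:

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
T, k = 360, 10                     # 30 years of months, 10 predictors
X = rng.standard_normal((T, k))
# toy equity premium: weak signal plus heavy noise
y = 0.05 * X[:, 0] + rng.standard_normal(T)

preds_model, preds_hist = [], []
for t in range(120, T):            # recursive expanding window
    model = Ridge(alpha=10.0).fit(X[:t], y[:t])
    preds_model.append(model.predict(X[t:t + 1])[0])
    preds_hist.append(y[:t].mean())     # historical-average benchmark

actual = y[120:]
sse = lambda p: np.sum((actual - np.array(p)) ** 2)
r2_oos = 1.0 - sse(preds_model) / sse(preds_hist)
print(f"out-of-sample R^2 vs historical average: {r2_oos:.4f}")
```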
Procedia PDF Downloads 107