Search results for: inverse models of data envelopment analysis
44396 The Visualization of Hydrological and Hydraulic Models Based on the Platform of Autodesk Civil 3D
Authors: Xiyue Wang, Shaoning Yan
Abstract:
Cities in China today face an increasingly serious river ecological crisis accompanying urbanization: waterlogging on account of the fragmented urban natural hydrological system, and the limited ecological function of the hydrological system caused by the destruction of the water system and the waterfront ecological environment. Additionally, the eco-hydrological processes of rivers are affected by various environmental factors, which are more complex in the urban context. Therefore, efficient hydrological monitoring and analysis tools, and accurate, visual hydrological and hydraulic models, are becoming an ever more important basis for decision-makers and an important way for landscape architects to solve urban hydrological problems and formulate sustainable, forward-looking schemes. The study mainly introduces the river and flood analysis model based on the platform of Autodesk Civil 3D. Taking the Luanhe River in Qian'an City of Hebei Province as an example, 3D models of the landform, river, embankment, shoal, pond, underground stream, and other land features were first built, with which water transfer simulation analysis, river floodplain analysis, and river ecology analysis were carried out; ultimately, real-time visualized simulation and analysis of rivers under various hypothetical scenarios were realized. Through the establishment of a digital hydrological and hydraulic model, hydraulic data can be accurately and intuitively simulated, which provides a basis for designing a rational water system and a benign urban ecological system. However, the hydrological and hydraulic model based on Autodesk Civil 3D has its limitations: interaction between the model and other data and software is unfavorable, and the huge amount of 3D data and the lack of basic data restrict its accuracy and application range. Still, the model provides a convenient and intelligent tool for urban planning and monitoring, and a solid basis for further urban research and design.
Keywords: visualization, hydrological and hydraulic model, Autodesk Civil 3D, urban river
Procedia PDF Downloads 297
44395 Cessna Citation X Business Aircraft Stability Analysis Using Linear Fractional Representation LFRs Model
Authors: Yamina Boughari, Ruxandra Mihaela Botez, Florian Theel, Georges Ghazi
Abstract:
Clearance of the flight control laws of a civil aircraft is a long and expensive process in the aerospace industry. Thousands of flight combinations in terms of speeds, altitudes, gross weights, centers of gravity, and angles of attack have to be investigated and proved to be safe. Nonetheless, with this method a worst-case flight condition can easily be missed, and missing it could lead to a critical situation. Analyzing every case is impossible because of the infinite number of conditions contained within the flight envelope, and attempting to do so would require more time and therefore more design cost. Therefore, in industry, the technique of meshing the flight envelope is commonly used: for each point of the flight envelope, simulation of the associated model verifies whether the specifications are satisfied. In order to perform fast, comprehensive, and effective analysis, parameter-varying models were developed by incorporating variations, or uncertainties, into the nominal models; these Linear Fractional Representation (LFR) models can describe the aircraft dynamics while taking uncertainties over the flight envelope into account. In this paper, the LFR models are developed using speed and altitude as varying parameters; they were built from several flight conditions expressed in terms of speeds and altitudes. This method has gained great interest among aeronautical companies, which see a promising future for it in modeling, and particularly in the design and certification of control laws. This research paper focuses on the open-loop stability analysis of the Cessna Citation X. The data are provided by a Level D Research Aircraft Flight Simulator, corresponding to the highest flight dynamics certification level; this simulator was developed by CAE Inc., and its development was based on the research requirements of the LARCASE laboratory. The acquired data were used to develop a linear model of the airplane in its longitudinal and lateral motions, and further to create the LFR models for 12 XCG/weight conditions, and thus the whole flight envelope, using a friendly Graphical User Interface developed during this study. The LFR models are then analyzed using an interval analysis method based on a Lyapunov function, as well as the 'stability and robustness analysis' toolbox. The results are presented as graphs, so they offer good readability and are easily exploitable. The weakness of this method lies in its relatively long computation time, about four hours for the entire flight envelope.
Keywords: flight control clearance, LFR, stability analysis, robustness analysis
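The flight-envelope mesh idea can be sketched in a few lines: grid the speed/altitude envelope, evaluate a parameter-dependent state matrix at each point, and flag any point whose eigenvalues leave the open left half-plane. The A(V, h) below is a hypothetical toy short-period model, not the Cessna Citation X data.

```python
# Minimal flight-envelope mesh sketch: check open-loop stability on a grid.
import numpy as np

def A(V, h):
    # Hypothetical parameter-varying short-period dynamics [alpha, q];
    # coefficients are illustrative, not identified from flight data.
    rho = 1.225 * np.exp(-h / 8500.0)            # crude air-density model
    z_a = -0.7 * rho * V
    m_a = -2.0 * rho * V**2 / 1e3
    m_q = -0.9 * rho * V
    return np.array([[z_a, 1.0], [m_a, m_q]])

unstable = []
for V in np.linspace(120.0, 250.0, 14):          # true airspeed, m/s
    for h in np.linspace(1000.0, 12000.0, 12):   # altitude, m
        if np.real(np.linalg.eigvals(A(V, h))).max() >= 0.0:
            unstable.append((V, h))

print(f"{len(unstable)} grid points failed the open-loop stability check")
```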
Procedia PDF Downloads 352
44394 Analysis of Spatial and Temporal Data Using Remote Sensing Technology
Authors: Kapil Pandey, Vishnu Goyal
Abstract:
Spatial and temporal data analysis is well known in the field of satellite image processing. When spatial data are correlated with time, time-series analysis gives significant results in change-detection studies. In this paper, GIS and remote sensing techniques have been used to detect change using time-series satellite imagery of Uttarakhand state for the years 1990-2010. Natural vegetation, urban area, forest cover, etc. were chosen as the main land use classes to study. Land use/land cover classes for the different years were prepared using satellite images. The maximum likelihood supervised classification technique was adopted in this work; finally, a land use change index was generated, and graphical models were used to present the changes.
Keywords: GIS, landuse/landcover, spatial and temporal data, remote sensing
Procedia PDF Downloads 433
44393 Evaluation of Newly Synthesized Steroid Derivatives Using In silico Molecular Descriptors and Chemometric Techniques
Authors: Milica Ž. Karadžić, Lidija R. Jevrić, Sanja Podunavac-Kuzmanović, Strahinja Z. Kovačević, Anamarija I. Mandić, Katarina Penov-Gaši, Andrea R. Nikolić, Aleksandar M. Oklješa
Abstract:
This study considered the selection of in silico molecular descriptors and models for describing newly synthesized steroid derivatives and characterizing them using chemometric techniques. Multiple linear regression (MLR) models were established and yielded the best molecular descriptors for quantitative structure-retention relationship (QSRR) modeling of the retention of the investigated molecules. The MLR models were free of multicollinearity among the selected molecular descriptors, according to the variance inflation factor (VIF) values. The molecular descriptors used were ranked with the generalized pair correlation method (GPCM); with this method, significant differences between independent variables can be detected even when their correlations with the dependent variable are nearly equal. The generated MLR models were statistically validated and cross-validated, and the best models were kept. The models were ranked using the sum of ranking differences (SRD) method, by which the most consistent QSRR model can be found and similarity or dissimilarity between the models can be detected. In this study, SRD was performed using the average values of the experimentally observed data as a gold standard. Chemometric analysis was conducted in order to characterize the newly synthesized steroid derivatives for further investigation of their potential biological activity and further synthesis. This article is based upon work from COST Action CM1105, supported by COST (European Cooperation in Science and Technology).
Keywords: generalized pair correlation method, molecular descriptors, regression analysis, steroids, sum of ranking differences
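A minimal sketch, on synthetic data, of the multicollinearity screen described above: compute VIF values for a candidate descriptor set and fit the MLR model only if they pass a common cutoff (the cutoff, data, and descriptor count are illustrative assumptions).

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(0)
n = 30
X = rng.normal(size=(n, 3))                       # three in silico descriptors
y = 1.5 * X[:, 0] - 0.8 * X[:, 2] + rng.normal(scale=0.3, size=n)  # retention

Xc = sm.add_constant(X)
vifs = [variance_inflation_factor(Xc, i) for i in range(1, Xc.shape[1])]
print("VIFs:", np.round(vifs, 2))                 # VIF < 10 (often < 5) is a usual rule

if max(vifs) < 10:                                # descriptor set is multicollinearity-free
    model = sm.OLS(y, Xc).fit()
    print(model.summary().tables[1])              # descriptor coefficients
```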
Procedia PDF Downloads 347
44392 Data-driven Decision-Making in Digital Entrepreneurship
Authors: Abeba Nigussie Turi, Xiangming Samuel Li
Abstract:
Data-driven business models are more typical for established businesses than for early-stage startups striving to penetrate a market. This paper provides an extensive discussion of the principles of data analytics for early-stage digital entrepreneurial businesses. We develop a data-driven decision-making (DDDM) framework that applies to startups prone to multifaceted barriers in the form of poor data access and technical and financial constraints, to name a few. The startup DDDM framework proposed in this paper is novel in that it encompasses startup data analytics enablers and metrics aligned with startups' business models, ranging from customer-centric product development to servitization, which is the future of modern digital entrepreneurship.
Keywords: startup data analytics, data-driven decision-making, data acquisition, data generation, digital entrepreneurship
Procedia PDF Downloads 328
44391 On Stochastic Models for Fine-Scale Rainfall Based on Doubly Stochastic Poisson Processes
Authors: Nadarajah I. Ramesh
Abstract:
Much of the research on stochastic point process models for rainfall has focused on Poisson cluster models constructed from either the Neyman-Scott or Bartlett-Lewis processes. The doubly stochastic Poisson process provides a rich class of point process models, especially for fine-scale rainfall modelling. This paper provides an account of recent developments on this topic and presents results based on some of the fine-scale rainfall models constructed from this class of stochastic point processes. Amongst the literature on stochastic models for rainfall, greater emphasis has been placed on modelling rainfall data recorded at hourly or daily aggregation levels. Stochastic models for sub-hourly rainfall are equally important, as there is a need to reproduce rainfall time series at fine temporal resolutions in some hydrological applications. For example, the study of climate change impacts on hydrology and water management initiatives requires the availability of data at fine temporal resolutions. One approach to generating such rainfall data relies on the combination of an hourly stochastic rainfall simulator with a disaggregator making use of downscaling techniques. Recent work on this topic adopted a different approach by developing specialist stochastic point process models for fine-scale rainfall, aimed at generating synthetic precipitation time series directly from the proposed stochastic model. One strand of this approach focused on developing a class of doubly stochastic Poisson process (DSPP) models for fine-scale rainfall to analyse data collected in the form of rainfall bucket-tip time series. In this context, the arrival pattern of rain gauge bucket-tip times N(t) is viewed as a DSPP whose rate of occurrence varies according to an unobserved finite-state irreducible Markov process X(t). Since the likelihood function of this process can be obtained by conditioning on the underlying Markov process X(t), the models were fitted with maximum likelihood methods. The proposed models were applied directly to the raw data collected by tipping-bucket rain gauges, thus avoiding the need to convert tip times to rainfall depths prior to fitting the models. One advantage of this approach is that the use of maximum likelihood methods enables a more straightforward estimation of parameter uncertainty and comparison of sub-models of interest. Another strand of this approach employed the DSPP model for the arrivals of rain cells and attached a pulse or a cluster of pulses to each rain cell. Different mechanisms for the pattern of the pulse process were used to construct variants of this model. We present the results of these models when fitted to hourly and sub-hourly rainfall data. The results of our analysis suggest that the proposed class of stochastic models is capable of reproducing the fine-scale structure of the rainfall process, and hence provides a useful tool in hydrological modelling.
Keywords: fine-scale rainfall, maximum likelihood, point process, stochastic model
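A minimal sketch of the core object described above: a Poisson process whose rate switches with an unobserved two-state Markov chain (a Markov-modulated Poisson process). All rates below are illustrative assumptions, not fitted values.

```python
import numpy as np

rng = np.random.default_rng(1)
Q = np.array([[-0.2, 0.2],                  # generator of the hidden chain X(t)
              [0.5, -0.5]])                 # (rates per hour)
lam = np.array([0.1, 6.0])                  # bucket-tip rates: dry vs. rainy state

t, T, state, tips = 0.0, 240.0, 0, []
while t < T:
    sojourn = rng.exponential(1.0 / -Q[state, state])      # time spent in state
    span = min(sojourn, T - t)
    # Poisson-many tips placed uniformly over the sojourn (order-statistics property)
    n = rng.poisson(lam[state] * span)
    tips.extend(t + np.sort(rng.uniform(0, span, n)))
    t += sojourn
    state = 1 - state                       # two states: simply switch

print(f"{len(tips)} bucket tips simulated over {T} h")
```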
Procedia PDF Downloads 278
44390 Causes for the Precession of the Perihelion in the Planetary Orbits
Authors: Kwan U. Kim, Jin Sim, Ryong Jin Jang, Sung Duk Kim
Abstract:
It was Leverrier who first discovered the precession of the perihelion in the planetary orbits, and Einstein who first explained the astronomical phenomenon. The amount of the precession of the perihelion in Einstein's theory of gravitation has been explained by means of the inverse fourth power force (inverse third power potential) introduced into the theory of gravitation through the Schwarzschild metric. However, this methodology has a serious shortcoming: it cannot explain the cause of the precession of the perihelion in the planetary orbits. According to our study, six methods can account for the amount of the precession discovered by Leverrier without identifying its cause. Therefore, the problem of what causes the perihelion to precess in the planetary orbits must be solved for physics, because it is a profound scientific problem bearing on a basic experiment in the construction of a relativistic theory of gravitation. Our analysis of the problem indicates that Einstein's explanation of the planetary orbits is an artifact of the numerical expressions obtained from a fictitious gravitation introduced into the theory of gravitation and from a wrong definition of proper time. The problem of the precession of the perihelion appears to have been solved already by means of the general theory of relativity, but, in essence, the cause of the astronomical phenomenon has not yet been successfully explained for astronomy. The right solution to the problem comes from a generalized theory of gravitation. Therefore, in this paper, it is shown, by means of the Schwarzschild field and the physical quantities of the relativistic Lagrangian reflected in it, that fictitious gravitation is not the main factor that can cause the perihelion to precess in the planetary orbits. In addition, it is shown that the main factor that can cause the perihelion to precess in the planetary orbits is the inverse third power force really existing in the relativistic region of the Solar system.
Keywords: inverse third power force, precession of the perihelion, fictitious gravitation, planetary orbits
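For reference, the standard textbook quantities this debate revolves around are the Schwarzschild effective potential, whose 1/r³ term supplies the extra (inverse fourth power) force component, and the resulting per-orbit perihelion advance:

```latex
V_{\text{eff}}(r) = -\frac{GMm}{r} + \frac{L^{2}}{2mr^{2}} - \frac{GML^{2}}{c^{2}mr^{3}},
\qquad
\Delta\varphi = \frac{6\pi GM}{c^{2}\,a\,(1-e^{2})}
```

Here L is the orbital angular momentum, a the semi-major axis, and e the eccentricity; for Mercury the second expression evaluates to the observed roughly 43 arcseconds per century.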
Procedia PDF Downloads 10
44389 Observational Study Reveals Inverse Relationship: Rising PM₂.₅ Concentrations Linked to Decreasing Muon Flux
Authors: Yashas Mattur, Jensen Coonradt
Abstract:
Muon flux, the rate of muons reaching Earth from the atmosphere, is impacted by various factors such as air pressure, temperature, and humidity. However, the influence of concentrations of PM₂.₅ (particulate matter with diameters of 2.5 μm or smaller) on muon detection rates remains unexplored. During the summer of 2023, smoke from Canadian wildfires (containing significant amounts of particulate matter) blew over regions of the northern US, introducing huge fluctuations in PM₂.₅ concentrations and thus inspiring our experiment to investigate the correlation between PM₂.₅ concentrations and muon rates. To investigate this correlation, muon detection rates were measured and analyzed alongside PM₂.₅ concentration data over periods of both light and heavy smoke. Other confounding variables, including temperature, humidity, and atmospheric pressure, were also considered. The results reveal a statistically significant inverse correlation between muon flux and PM₂.₅ concentrations, indicating that particulate matter has an impact on the rate of muons reaching the Earth's surface.
Keywords: muon flux, atmospheric effects on muons, PM₂.₅, airborne particulate matter
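A minimal sketch, on synthetic numbers, of one way to test such a relationship while holding the stated confounders fixed: regress muon counts on PM₂.₅ together with pressure, temperature, and humidity, and read off the sign and significance of the PM₂.₅ coefficient. All values below are made up for illustration, not the study's measurements.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 200
pressure = rng.normal(1013, 5, n)            # hPa
temp = rng.normal(20, 4, n)                  # deg C
humidity = rng.normal(60, 10, n)             # %
pm25 = rng.gamma(2.0, 15.0, n)               # ug/m3, wildfire-like spread
muons = (500 - 0.8 * pm25 - 0.5 * (pressure - 1013)
         + rng.normal(0, 5, n))              # counts/hour, built-in inverse link

X = sm.add_constant(np.column_stack([pm25, pressure, temp, humidity]))
fit = sm.OLS(muons, X).fit()
print("PM2.5 coefficient:", fit.params[1], "p-value:", fit.pvalues[1])
```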
Procedia PDF Downloads 73
44388 Chemometric QSRR Evaluation of Behavior of s-Triazine Pesticides in Liquid Chromatography
Authors: Lidija R. Jevrić, Sanja O. Podunavac-Kuzmanović, Strahinja Z. Kovačević
Abstract:
This study considers the selection of the most suitable in silico molecular descriptors for characterizing s-triazine pesticides. Suitable descriptors among topological, geometrical, and physicochemical ones are used to establish quantitative structure-retention relationship (QSRR) models. The models were obtained using linear regression (LR) and multiple linear regression (MLR) analysis. In this paper, MLR models were established while avoiding multicollinearity among the selected molecular descriptors. The statistical quality of the established models was evaluated by standard and cross-validation statistical parameters. For detecting similarity or dissimilarity among the investigated s-triazine pesticides and for their classification, principal component analysis (PCA) and hierarchical cluster analysis (HCA) were used and gave similar groupings. This study is financially supported by COST Action TD1305.
Keywords: chemometrics, classification analysis, molecular descriptors, pesticides, regression analysis
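A minimal sketch, with synthetic descriptor data, of the PCA-plus-HCA pairing described above: standardize the descriptor matrix, project it for inspection, and cut a Ward dendrogram into groups. The data and the two-group cut are illustrative assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
# 15 molecules x 5 descriptors, built as two blocks so the grouping is visible
X = np.vstack([rng.normal(0, 1, (8, 5)), rng.normal(4, 1, (7, 5))])
Xs = StandardScaler().fit_transform(X)

scores = PCA(n_components=2).fit_transform(Xs)         # projection for inspection
labels = fcluster(linkage(Xs, method="ward"), t=2, criterion="maxclust")
print("HCA groups:", labels)                           # should recover the two blocks
print("PC1/PC2 of first molecule:", scores[0])
```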
Procedia PDF Downloads 392
44387 Aggregation Scheduling Algorithms in Wireless Sensor Networks
Authors: Min Kyung An
Abstract:
In wireless sensor networks, which consist of tiny wireless sensor nodes with limited battery power, one of the most fundamental applications is data aggregation, which collects nearby environmental readings and aggregates the data to a designated destination called a sink node. Important issues concerning data aggregation are time efficiency and energy consumption, given the nodes' limited energy, and therefore the related problem, named Minimum Latency Aggregation Scheduling (MLAS), has been the focus of many researchers. Its objective is to compute a minimum-latency schedule, that is, a schedule with the minimum number of timeslots, such that the sink node can receive the aggregated data from all the other nodes without any collision or interference. For this problem, two interference models have been adopted: the graph model and the more realistic physical interference model known as the Signal-to-Interference-plus-Noise Ratio (SINR) model, with different power models (uniform power and non-uniform power, with or without power control) and different antenna models (omni-directional and directional). In this survey article, as the problem has been proven NP-hard, we present and compare several state-of-the-art approximation algorithms in the various models, using latency as the performance measure.
Keywords: data aggregation, convergecast, gathering, approximation, interference, omni-directional, directional
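A minimal sketch of a greedy scheduler under the graph interference model described above: a node may transmit to its parent only after all its children have transmitted, and two concurrent sender-receiver pairs must not collide at either receiver. This is an illustrative heuristic, not one of the surveyed approximation algorithms.

```python
def greedy_mlas(parent, adj):
    """Greedily pack collision-free sender->parent transmissions into timeslots."""
    pending = {v for v in parent if parent[v] is not None}
    done, schedule = set(), []
    while pending:
        busy, slot = set(), []
        for v in sorted(pending):
            children = [u for u in parent if parent[u] == v]
            if any(u not in done for u in children):
                continue                      # children not aggregated yet
            p = parent[v]
            if p in busy or v in busy:
                continue                      # node already sending/receiving
            # collision check: no other sender adjacent to a receiver in this slot
            if any(s in adj[p] for s, _ in slot) or any(v in adj[r] for _, r in slot):
                continue
            slot.append((v, p))
            busy.update([v, p])
        for v, _ in slot:
            pending.discard(v)
            done.add(v)
        schedule.append(slot)
    return schedule

# Toy aggregation tree rooted at sink 0.
parent = {0: None, 1: 0, 2: 0, 3: 1, 4: 2}
adj = {0: {1, 2}, 1: {0, 3}, 2: {0, 4}, 3: {1}, 4: {2}}
for t, slot in enumerate(greedy_mlas(parent, adj), 1):
    print(f"slot {t}: {slot}")
```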
Procedia PDF Downloads 229
44386 Harmonising the Circular Economy: An Analysis of 160 Papers
Authors: M. Novak, J. Dufourmount, D. Wildi, A. Sutherland, L. Sosa, J. Zimmer, E. Szabo
Abstract:
The circular economy has grounded itself amongst scholars and practitioners operating across governments and enterprises. The aim of this paper is to augment the circular economy concept by identifying common core and enabling circular business models. To this end, we have analysed over 150 papers on circular activities and identified eight clusters of business models and enablers. We have mapped and harmonised the most prominent frameworks conceptualising the circular economy. Our findings indicate that circular economy core business models include regenerative activities in addition to reduce, reuse, and recycle activities. We further find enabling activities in design, digital technologies, knowledge development and sharing, multistakeholder collaborations, and extended corporate responsibility initiatives in various forms. We critically contrast the application of these business models across the European and African contexts. Overall, we find that seemingly varied circular economy definitions distill into the same conceptual business models. We hope to contribute towards the coherence of the circular economy concept and the continuous development of practical guidance for selecting and implementing circular strategies.
Keywords: circular economy, content analysis, business models, definitions, enablers, frameworks
Procedia PDF Downloads 224
44385 Hidden Markov Movement Modelling with Irregular Data
Authors: Victoria Goodall, Paul Fatti, Norman Owen-Smith
Abstract:
Hidden Markov Models have become popular for the analysis of animal tracking data. These models are being used to model the movements of a variety of species in many areas around the world. A common assumption of the model is that the observations have regular time steps. In many ecological studies, this will not be the case. The objective of the research is to modify the movement model to allow for irregularly spaced locations and to investigate the effect on the inferences which can be made about the latent states. A modification of the likelihood function to allow for these irregularly spaced locations is investigated, without using interpolation or averaging the movement rate. The suitability of the modification is investigated using GPS tracking data for lion (Panthera leo) in South Africa, with many observations obtained during the night and few observations during the day. Many nocturnal predator tracking studies are set up in this way, obtaining many locations at night, when the animal is most active and difficult to observe, and few during the day, when the animal is expected to rest and is potentially easier to observe. Modifying the likelihood function allows the popular Hidden Markov Model framework to be used to model these irregularly spaced locations, making use of all the observed data.
Keywords: hidden Markov models, irregular observations, animal movement modelling, nocturnal predator
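One standard way to realize such a modification (an assumption here, not necessarily the authors' exact formulation) is to drive the hidden states with a continuous-time Markov chain, so each gap Δt gets its own transition matrix expm(QΔt) inside the forward recursion:

```python
import numpy as np
from scipy.linalg import expm
from scipy.stats import gamma

Q = np.array([[-0.5, 0.5],                    # generator: resting <-> moving
              [0.3, -0.3]])                   # (rates per hour, illustrative)
emis = [gamma(a=1.2, scale=0.1), gamma(a=3.0, scale=0.8)]   # step-length models

def log_likelihood(steps, dts, Q, emis, pi0=np.array([0.5, 0.5])):
    """Scaled forward algorithm with time-varying transitions P(dt) = expm(Q dt)."""
    alpha = pi0 * np.array([e.pdf(steps[0]) for e in emis])
    ll = np.log(alpha.sum())
    alpha /= alpha.sum()
    for step, dt in zip(steps[1:], dts):
        P = expm(Q * dt)                      # irregular-gap transition matrix
        alpha = (alpha @ P) * np.array([e.pdf(step) for e in emis])
        ll += np.log(alpha.sum())
        alpha /= alpha.sum()
    return ll

steps = [0.05, 0.1, 2.4, 3.1, 0.2]            # km moved between fixes
dts = [0.5, 0.5, 6.0, 0.5]                    # hours: dense at night, sparse by day
print("log-likelihood:", log_likelihood(steps, dts, Q, emis))
```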
Procedia PDF Downloads 244
44384 Exploring the Role of Data Mining in Crime Classification: A Systematic Literature Review
Authors: Faisal Muhibuddin, Ani Dijah Rahajoe
Abstract:
This in-depth exploration, through a systematic literature review, scrutinizes the nuanced role of data mining in the classification of criminal activities. The research focuses on investigating various methodological aspects and recent developments in leveraging data mining techniques to enhance the effectiveness and precision of crime categorization. Commencing with an exposition of the foundational concepts of crime classification and its evolutionary dynamics, this study details the paradigm shift from conventional methods towards approaches supported by data mining, addressing the challenges and complexities inherent in the modern crime landscape. Specifically, the research delves into various data mining techniques, including K-means clustering, Naïve Bayes, K-nearest neighbour, and other clustering methods. A comprehensive review of the strengths and limitations of each technique provides insights into their respective contributions to improving crime classification models. The integration of diverse data sources takes centre stage in this research: a detailed analysis explores how the amalgamation of structured data (such as criminal records) and unstructured data (such as social media) can offer a holistic understanding of crime, enriching classification models with more profound insights. Furthermore, the study explores the temporal implications of crime classification, emphasizing the significance of considering temporal factors to comprehend long-term trends and seasonality. The availability of real-time data is also elucidated as a crucial element in enhancing responsiveness and accuracy in crime classification.
Keywords: data mining, classification algorithm, naïve Bayes, k-means clustering, k-nearest neighbor, crime, data analysis, systematic literature review
Procedia PDF Downloads 64
44383 The Mediation Role of Loneliness in the Relationship between Interpersonal Trust and Empathy
Authors: Ghazal Doostmohammadi, Susan Rahimzadeh
Abstract:
Aim: This research aimed to investigate the relationship between empathy and interpersonal trust and to examine the mediating role of loneliness between them in both genders. Methods: With a correlational descriptive design, 192 university students (130 female and 62 male) responded to the "empathy quotient," "loneliness," and "interpersonal trust" questionnaires, which were designed and validated by experts in the field. Data were analysed using Pearson correlation and path analysis, a statistical technique that uses standard linear regression equations to determine the degree of conformity of a theoretical causal model with reality. Results: The data analysis showed no significant correlation between interpersonal trust and either loneliness (t=0.169) or empathy (t=0.186), while there was a significant negative correlation (t=0.359) between empathy and loneliness; that is, an inverse relationship between empathy and loneliness. The path analysis confirmed the research hypothesis about the mediating role of loneliness between empathy and interpersonal trust, but gender did not play a role in this relationship. Conclusion: Clinical professionals and education trainers should pay more attention to interpersonal trust as a basic need and try to recreate and shape it to prevent social breakdown; on the other hand, training in self-disclosure (especially in men), expression of feelings, and courage should be given double importance to prevent the consequences of loneliness.
Keywords: empathy, loneliness, interpersonal trust, gender
Procedia PDF Downloads 84
44382 Application of Signature Verification Models for Document Recognition
Authors: Boris M. Fedorov, Liudmila P. Goncharenko, Sergey A. Sybachin, Natalia A. Mamedova, Ekaterina V. Makarenkova, Saule Rakhimova
Abstract:
In modern economic conditions, the question of whether a signature on digital documents can be correctly recognized, in order to verify an expression of will or confirm a certain operation, is highly relevant. Additional processing complexity lies in the dynamic variability of each individual's signature, as well as in the way the information is processed, because a signature constitutes biometric data. The article discusses the use of artificial intelligence models to improve the quality of signature confirmation in document recognition. Several possible options for using such models are analyzed. The results of the study show that it is possible to correctly determine the authenticity of a signature even with small samples.
Keywords: signature recognition, biometric data, artificial intelligence, neural networks
Procedia PDF Downloads 148
44381 Neural Network Models for Actual Cost and Actual Duration Estimation in Construction Projects: Findings from Greece
Authors: Panagiotis Karadimos, Leonidas Anthopoulos
Abstract:
Predicting the actual cost and duration of construction projects is a continuing problem for the construction sector. This paper addresses the problem with modern methods and data available from past public construction projects. 39 bridge projects constructed in Greece, with a similar type of available data, were examined. Considering each project's attributes together with the actual cost and the actual duration, correlation analysis was performed and the most appropriate predictive project variables were defined. Additionally, the most efficient subgroup of variables was selected using the attribute selection function of the WEKA application. The selected variables were then used as input neurons for neural network models, which were constructed with the FANN Tool application. The optimum neural network model for predicting the actual cost produced a mean squared error of 3.84886e-05 and was based on the budgeted cost and the quantity of deck concrete. The optimum neural network model for predicting the actual duration produced a mean squared error of 5.89463e-05 and was also based on the budgeted cost and the quantity of deck concrete.
Keywords: actual cost and duration, attribute selection, bridge construction, neural networks, predicting models, FANN Tool, WEKA
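A minimal sketch, on made-up bridge data, of the two-input network the abstract describes: budgeted cost and deck-concrete quantity predicting actual cost. Sklearn's MLP stands in for the FANN Tool here; the data are synthetic, not the 39 Greek projects.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(4)
budget = rng.uniform(1.0, 10.0, 39)          # budgeted cost, M EUR (synthetic)
concrete = rng.uniform(200, 2000, 39)        # deck concrete, m3 (synthetic)
actual = budget * 1.12 + concrete / 4000 + rng.normal(0, 0.2, 39)

X = np.column_stack([budget, concrete])      # two input neurons
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                   random_state=0))
model.fit(X, actual)
print("training MSE:", mean_squared_error(actual, model.predict(X)))
```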
Procedia PDF Downloads 134
44380 Development of a Multi-Factorial Instrument for Accident Analysis Based on Systemic Methods
Authors: C. V. Pietreanu, S. E. Zaharia, C. Dinu
Abstract:
The present research is built on three major pillars. It commences with some considerations on accident investigation methods, pointing out both the defining aspects of, and the differences between, linear and non-linear analysis. The traditional linear focus of accident analysis describes accidents as a sequence of events, while the latest systemic models outline interdependencies between different factors and define the evolution of processes relative to a specific (normal) situation. Linear and non-linear accident analysis methods have specific limitations, so the second point of interest is the drawbacks of systemic models, which become a starting point for developing new directions for identifying risks or data closer to the cause of incidents/accidents. Since communication represents a critical issue in the interaction of the human factor, and breakdowns in different communication procedures have been proven to underlie many of these problems, the third pillar elaborates a new error-modeling instrument suitable for risk assessment/accident analysis.
Keywords: accident analysis, multi-factorial error modeling, risk, systemic methods
Procedia PDF Downloads 208
44379 Joint Modeling of Longitudinal and Time-To-Event Data with Latent Variable
Authors: Xinyuan Y. Song, Kai Kang
Abstract:
Joint models for analyzing longitudinal and survival data are widely used to investigate the relationship between a failure time process and time-variant predictors. A common assumption of conventional joint models in the survival analysis literature is that all predictors are observable. However, this assumption may not always hold, because unobservable traits, namely latent variables, which are indirectly observable and should be measured through multiple observed variables, are commonly encountered in medical, behavioral, and financial research settings. In this study, a joint modeling approach that deals with this feature is proposed. The proposed model comprises three parts. The first part is a dynamic factor analysis model for characterizing latent variables through multiple observed indicators over time. The second part is a random coefficient trajectory model for describing the individual trajectories of latent variables. The third part is a proportional hazards model for examining the effects of time-invariant predictors and the longitudinal trajectories of time-variant latent risk factors on the hazards of interest. A Bayesian approach coupled with a Markov chain Monte Carlo algorithm is used to perform statistical inference. An application of the proposed joint model to a study of the Alzheimer's Disease Neuroimaging Initiative is presented.
Keywords: Bayesian analysis, joint model, longitudinal data, time-to-event data
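In one plausible notation (assumed here, not the authors' exact specification), the three parts can be written as:

```latex
% Assumed notation for the three-part joint model:
\begin{aligned}
  y_{ij}(t) &= \lambda_j\,\xi_i(t) + \epsilon_{ij}(t)
      && \text{(dynamic factor model for observed indicators)} \\
  \xi_i(t)  &= b_{i0} + b_{i1}\,t
      && \text{(random-coefficient latent trajectory)} \\
  h_i(t)    &= h_0(t)\exp\{\beta^{\top}x_i + \gamma\,\xi_i(t)\}
      && \text{(proportional hazards with latent risk factor)}
\end{aligned}
```

Here ξᵢ(t) is the latent trait of subject i, λⱼ the loading of indicator j, xᵢ the time-invariant predictors, and h₀(t) the baseline hazard.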
Procedia PDF Downloads 143
44378 Statistical Models and Time Series Forecasting on Crime Data in Nepal
Authors: Dila Ram Bhandari
Abstract:
Throughout the 20th century, new governments were created in which identities such as ethnicity, religion, language, caste, community, and tribe played a part in the development of constitutions and the legal systems of victim and criminal justice. South Asian nations have recently been plagued by acute issues of extremism, poverty, environmental degradation, cybercrime, human rights violations, and crimes against, and victimization of, both individuals and groups. Every day, massive numbers of crimes are committed, and these frequent crimes have made the lives of common citizens restless. Crime is one of the major threats to society and to civilization, and a bone of contention that can create societal disturbance. Old-style crime-solving practices are unable to live up to the requirements of the current crime situation. Crime analysis is one of the most important activities of the majority of intelligence and law enforcement organizations all over the world. Unlike the Central Asia or Asia-Pacific regions, South Asia lacks a regional coordination mechanism to facilitate criminal intelligence sharing and operational coordination related to organized crime, including illicit drug trafficking and money laundering. There have been numerous conversations in recent years about using data mining technology to combat crime and terrorism; the Data Detective program from the software company Sentient, for example, uses data mining techniques to support the police (Sentient, 2017). The goals of this internship were to test several predictive model solutions and choose the most effective and promising one. First, extensive literature reviews on data mining, crime analysis, and crime data mining were conducted. Sentient offered a 7-year archive of crime statistics that was aggregated daily to produce a univariate dataset; moreover, a daily incidence-type aggregation was performed to produce a multivariate dataset. Each solution's forecast period lasted seven days. The experiments were split into two main groups: statistical models and neural network models. For the crime data, neural networks fared better than statistical models. This study gives a general review of the applied statistical and neural network models; a comparative analysis of all the models on a comparable dataset provides a detailed picture of each model's performance on the available data and its generalizability. The studies demonstrated that, in comparison to other models, Gated Recurrent Units (GRU) produced better predictions. The crime records for 2005-2019 were collected from the Nepal Police headquarters and analysed in R. In conclusion, a gated recurrent unit implementation could help police predict crime; hence, time series analysis using GRU could be a prospective additional feature in Data Detective.
Keywords: time series analysis, forecasting, ARIMA, machine learning
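A minimal sketch of a GRU forecaster on a univariate daily crime series (synthetic here, since the Nepal data are not available): windows of the past 14 days predict the next day.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(5)
days = np.arange(2000)
series = 50 + 10 * np.sin(2 * np.pi * days / 365) + rng.poisson(5, 2000)

win = 14
X = np.stack([series[i:i + win] for i in range(len(series) - win)])[..., None]
y = series[win:]                              # next-day targets

model = tf.keras.Sequential([
    tf.keras.layers.GRU(32, input_shape=(win, 1)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print("next-day forecast:", float(model.predict(X[-1:], verbose=0)[0, 0]))
```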
Procedia PDF Downloads 164
44377 Modelling and Simulation of Diffusion Effect on the Glycol Dehydration Unit of a Natural Gas Plant
Authors: M. Wigwe, J. G Akpa, E. N Wami
Abstract:
Mathematical models of the absorber of a glycol dehydration facility were developed using the principles of conservation of mass and energy. Models were developed that predict the variation of the water content of the gas (in mole fraction) and the variation of the gas and liquid temperatures across the packing height. These models contain contributions from bulk and diffusion flows. The effect of diffusion on the processes occurring in the absorber was studied in this work. The models were validated using the initial conditions in the plant data from the Company W TEG unit in Nigeria. The results obtained showed that the effect of diffusion was noticeable between z=0 and z=0.004 m. A deviation from plant data of 0% was observed for the gas water content at a residence time of 20 seconds, at z=0.004 m. Similarly, deviations of 1.584% and 2.844% were observed for the gas and TEG temperatures.
Keywords: separations, absorption, simulation, dehydration, water content, triethylene glycol
Procedia PDF Downloads 499
44376 Online Learning for Modern Business Models: Theoretical Considerations and Algorithms
Authors: Marian Sorin Ionescu, Olivia Negoita, Cosmin Dobrin
Abstract:
This scientific communication reports and discusses learning models adaptable to modern business problems, as well as models specific to digital concepts and paradigms. In the PAC (probably approximately correct) learning approach, the learning process begins by receiving a batch of learning examples, which is used to acquire a hypothesis; once the learning process is complete, this hypothesis is used for the prediction of new operational examples. For complex business models, many models should be introduced and evaluated to estimate the induced results, so that the totality of the results can be used to develop a predictive rule that anticipates the choice of new models. By contrast, in online learning processes there is no separation between the learning (training) phase and the prediction phase. Every time a business model is approached, it is first treated as a test example for which a prediction is made; only after part of the business model has been chosen does the label with the logical value "true," judged correct from the point of view of the business decision, become known. Some of the business models are then used as learning (training) examples, which helps to improve the prediction mechanisms for future business models.
Keywords: machine learning, business models, convex analysis, online learning
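A minimal sketch of that predict-then-update protocol, using online gradient descent on logistic loss over synthetic examples. The loss, learning rate, and data are illustrative choices, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(6)
w, eta, mistakes = np.zeros(3), 0.1, 0

for t in range(500):
    x = rng.normal(size=3)                   # a new example arrives
    y = 1.0 if x @ np.array([1.0, -2.0, 0.5]) > 0 else -1.0  # hidden truth
    p = 1.0 / (1.0 + np.exp(-w @ x))         # predict BEFORE seeing the label
    mistakes += (p > 0.5) != (y > 0)
    # label revealed only now; gradient step on logistic loss
    w += eta * y * x / (1.0 + np.exp(y * (w @ x)))

print(f"online mistakes: {mistakes}/500")
```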
Procedia PDF Downloads 140
44375 A Review on Water Models of Surface Water Environment
Authors: Shahbaz G. Hassan
Abstract:
Water quality models are very important for predicting changes in surface water quality for environmental management. The aim of this paper is to give an overview of water quality models and to provide directions for selecting a model in a specific situation. Water quality models include one kind based on a mechanistic approach, while other models simulate water quality without considering a mechanism. Mechanistic models can be widely applied and have capabilities for long-term simulation, at the cost of high complexity; therefore, more space is devoted to explaining the principles and application experience of mechanistic models. Mechanistic models make certain assumptions about rivers, lakes, and estuaries, which limit their application range; this paper introduces the principles and applications of water quality models for these three scenarios. Empirical models, on the other hand, are easier to compute and are not limited by geographical conditions, but they cannot be used with confidence to simulate long-term changes. This paper divides the empirical models into two broad categories according to their mathematical algorithms: models based on artificial intelligence and models based on statistical methods.
Keywords: empirical models, mathematical, statistical, water quality
Procedia PDF Downloads 264
44374 Evaluation of a Piecewise Linear Mixed-Effects Model in the Analysis of Randomized Cross-over Trial
Authors: Moses Mwangi, Geert Verbeke, Geert Molenberghs
Abstract:
Cross-over designs are commonly used in randomized clinical trials to estimate the efficacy of a new treatment with respect to a reference treatment (placebo or standard). The main advantage of the cross-over design over the conventional parallel design is its flexibility: every subject becomes his or her own control, thereby reducing confounding effects. Jones & Kenward discuss in detail the more recent developments in the analysis of cross-over trials. We revisit the simple piecewise linear mixed-effects model, proposed by Mwangi et al. (in press), in its first application to the analysis of cross-over trials. We compared the performance of the proposed piecewise linear mixed-effects model with two commonly cited statistical models, namely (1) the Grizzle model and (2) the Jones & Kenward model, in estimating the treatment effect in the analysis of a randomized cross-over trial. We estimated two performance measures (mean square error (MSE) and coverage probability) for the three methods, using data simulated from the proposed piecewise linear mixed-effects model. The piecewise linear mixed-effects model yielded the lowest MSE estimates compared to the Grizzle and Jones & Kenward models for both small (Nobs=20) and large (Nobs=600) sample sizes. Its coverage probabilities were the highest compared to the Grizzle and Jones & Kenward models for both small and large sample sizes. A piecewise linear mixed-effects model is thus a better estimator of the treatment effect than its two competitors (the Grizzle and Jones & Kenward models) in the analysis of cross-over trials. The data-generating mechanism used in this paper captures two time periods for a simple 2-treatment x 2-period cross-over design; its application is extendable to more complex cross-over designs with multiple treatments and periods. In addition, it is important to note that, even for single-response models, adding more random effects increases the complexity of the model and may thus make it difficult or impossible to fit in some cases.
Keywords: evaluation, Grizzle model, Jones & Kenward model, performance measures, simulation
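A minimal sketch of the performance-measure computation described above: simulate a 2x2 cross-over trial repeatedly, estimate the treatment effect from within-subject period differences (a simpler estimator than the paper's mixed-effects fits, used here only to illustrate MSE and coverage), and record MSE and 95% CI coverage.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
true_eff, n_per_seq, reps = 1.0, 10, 2000    # Nobs = 20 per simulated trial
hits, sq_err = 0, 0.0

for _ in range(reps):
    subj = rng.normal(0, 1, 2 * n_per_seq)   # random subject effects
    # sequence AB: treatment in period 1; sequence BA: treatment in period 2
    d_ab = ((subj[:n_per_seq] + true_eff + rng.normal(0, 0.5, n_per_seq))
            - (subj[:n_per_seq] + rng.normal(0, 0.5, n_per_seq)))
    d_ba = ((subj[n_per_seq:] + rng.normal(0, 0.5, n_per_seq))
            - (subj[n_per_seq:] + true_eff + rng.normal(0, 0.5, n_per_seq)))
    est = (d_ab.mean() - d_ba.mean()) / 2    # cancels period effects
    se = 0.5 * np.sqrt(d_ab.var(ddof=1) / n_per_seq + d_ba.var(ddof=1) / n_per_seq)
    tcrit = stats.t.ppf(0.975, 2 * n_per_seq - 2)
    hits += abs(est - true_eff) <= tcrit * se
    sq_err += (est - true_eff) ** 2

print(f"MSE: {sq_err / reps:.4f}, 95% CI coverage: {hits / reps:.3f}")
```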
Procedia PDF Downloads 122
44373 Cluster Analysis of Retailers' Benefits from Their Cooperation with Manufacturers: Business Models Perspective
Authors: M. K. Witek-Hajduk, T. M. Napiórkowski
Abstract:
A number of studies have discussed the benefits of retailer-manufacturer cooperation and coopetition. However, only a few publications focus on the benefits of cooperation and coopetition between retailers and their suppliers of durable consumer goods, especially in the context of the business models of the cooperating partners. This paper provides a clustering approach that segments retailers selling consumer durables according to the benefits they obtain from their cooperation with key manufacturers, and differentiates said retailers in terms of the business models of the cooperating partners. For the purpose of the study, a survey (using the CATI method) collected data on 603 consumer durables retailers present on the Polish market. Retailers are clustered with both hierarchical and non-hierarchical methods. Five distinctive groups of consumer durables retailers are identified, based on the studied benefits, using a two-stage clustering approach. The clusters are then characterized with a set of exogenous variables, key among which are the business models employed by the retailer and its partnering key manufacturer. The paper finds that a combination of a medium-sized retailer classified as an Integrator, with chiefly domestic capital, and a manufacturer categorized as a Market Player yields the highest benefits. On the other side of the spectrum is the medium-sized Distributor retailer with solely domestic capital; in this case, the business model of the cooperating manufacturer appears to be irrelevant. This paper is one of the first empirical studies using cluster analysis on primary data to define the types of cooperation between consumer durables retailers and manufacturers, their key suppliers. The analysis integrates the perspectives of both retailers' and manufacturers' business models and matches them with individual and joint benefits.
Keywords: benefits of cooperation, business model, cluster analysis, retailer-manufacturer cooperation
Procedia PDF Downloads 256
44372 Probabilistic Models to Evaluate Seismic Liquefaction In Gravelly Soil Using Dynamic Penetration Test and Shear Wave Velocity
Authors: Nima Pirhadi, Shao Yong Bo, Xusheng Wan, Jianguo Lu, Jilei Hu
Abstract:
Gravels and gravelly soils are often assumed to be non-liquefiable because of their high conductivity and small modulus. However, the occurrence of liquefaction in some historical earthquakes, especially the recent 2008 Wenchuan (Mw 7.9), 2014 Cephalonia, Greece (Mw 6.1), and 2016 Kaikoura, New Zealand (Mw 7.8) events, has prompted essential consideration of risk assessment and hazard analysis for seismic gravelly soil liquefaction. Due to the limitations of sampling and laboratory testing for this type of soil, in situ tests and site exploration of case histories are the most accepted procedures. Of all in situ tests, the dynamic penetration test (DPT), well known as the Chinese dynamic penetration test, and the shear wave velocity (Vs) test have demonstrated high performance in evaluating seismic gravelly soil liquefaction. However, the lack of a sufficient number of case histories is an essential limitation on developing new models. This study first investigates recent earthquakes that caused liquefaction in gravelly soils in order to collect new data. It then adds these data to the dataset available in the literature to extend it, and finally develops new models for assessing seismic gravelly soil liquefaction. To validate the presented models, their results are compared to those of other available models. The results show the reasonable performance of the proposed models and the critical effect of gravel content (GC, %) on the assessment.
Keywords: liquefaction, gravel, dynamic penetration test, shear wave velocity
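A minimal sketch, on synthetic case histories, of one common probabilistic form for such models: a logistic regression mapping DPT blow count, Vs, and gravel content to a probability of liquefaction. The coefficients and data below are illustrative, not the paper's fitted model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(8)
n = 120
dpt = rng.uniform(2, 15, n)                  # DPT blow count
vs = rng.uniform(150, 400, n)                # shear wave velocity, m/s
gc = rng.uniform(10, 70, n)                  # gravel content, %
logit = 4.0 - 0.3 * dpt - 0.01 * vs + 0.02 * gc   # assumed generating rule
liq = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))  # 1 = liquefied case history

X = np.column_stack([dpt, vs, gc])
clf = LogisticRegression(max_iter=1000).fit(X, liq)
p = clf.predict_proba([[6.0, 220.0, 45.0]])[0, 1]
print(f"P(liquefaction) at the hypothetical new site: {p:.2f}")
```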
Procedia PDF Downloads 201
44371 Microwave Tomography: The Analytical Treatment for Detecting Malignant Tumor Inside Human Body
Authors: Muhammad Hassan Khalil, Xu Jiadong
Abstract:
Early detection through screening is the best tool, short of a perfect treatment, against malignant tumors inside the breast. By detecting cancer in its early stages, it can be recognized and treated before it has the opportunity to spread and become potentially dangerous. Microwave tomography is a new imaging method based on the contrast in the dielectric properties of materials. The mathematical theory of microwave tomography involves solving an inverse problem for Maxwell's equations. In this paper, we present a designed antenna for breast cancer detection, to be used in a microwave tomography configuration.
Keywords: microwave imaging, inverse scattering, breast cancer, malignant tumor detection
Procedia PDF Downloads 371
44370 Implementing Biogas Technology in Rural Areas of Limpopo: Analysis of Gawula, Mopani District in South Africa
Authors: Thilivhali E. Rasimphi, David Tinarwo
Abstract:
Access to energy is crucial for poverty alleviation, economic growth, education, and agricultural improvement. The best renewable energy source is one which is locally available, affordable, and can easily be used and managed by local communities. The use of renewable energy technology has the potential to alleviate many of the problems currently facing rural areas, and biogas technology has become an important part of addressing energy poverty. This study therefore examines the performance of digesters in Gawula village and identifies the factors contributing to the adoption and use of the technology. Data were collected using an open-ended questionnaire administered to biogas users. To evaluate the performance of the digesters, the non-parametric data envelopment analysis (DEA) technique was used, and to identify the key factors affecting adoption, a logit model was applied. The critical barriers to biogas development in the area appear to be a poor institutional framework, poor infrastructure, and a lack of technical support and of user training on maintenance and operation; as such, the implemented plants have failed to make the desired impact, and most digesters were abandoned. To create awareness among rural communities, government involvement is key, and there is a need for national programs. Biogas technology does what few other renewable energy technologies do, which is to integrate waste management and energy. This creates a substantial opportunity for biogas generation and penetration, that is, a promising pathway towards achieving sustainable development through biogas technology.
Keywords: domestic biogas technology, economic, sustainable, social, rural development
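A minimal sketch of the non-parametric DEA technique named above: an input-oriented CCR efficiency score for each digester (decision-making unit) from the envelopment linear program, solved with scipy. The inputs, output, and data are illustrative assumptions, not the Gawula survey data.

```python
import numpy as np
from scipy.optimize import linprog

X = np.array([[4.0, 2.0], [6.0, 3.0], [5.0, 5.0]])   # inputs: dung (t), labour (h)
Y = np.array([[10.0], [12.0], [9.0]])                # output: biogas (m3)
n = X.shape[0]

for k in range(n):
    # variables z = [theta, lambda_1..lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    A_in = np.c_[-X[k], X.T]                 # sum_j lam_j x_ij - theta x_ik <= 0
    A_out = np.c_[np.zeros(Y.shape[1]), -Y.T]  # -sum_j lam_j y_rj <= -y_rk
    b_ub = np.r_[np.zeros(X.shape[1]), -Y[k]]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1))
    print(f"DMU {k}: efficiency = {res.x[0]:.3f}")
```

On this toy data, DMU 0 comes out fully efficient (theta = 1) and the other two units get scores below 1, read as the proportional input reduction that would bring them onto the efficient frontier.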
Procedia PDF Downloads 140
44369 Characterizing the Spatially Distributed Differences in the Operational Performance of Solar Power Plants Considering Input Volatility: Evidence from China
Authors: Bai-Chen Xie, Xian-Peng Chen
Abstract:
China has become the world's largest energy producer and consumer, and its development of renewable energy is of great significance to global energy governance and the fight against climate change. The rapid growth of solar power in China could help achieve its ambitious carbon peak and carbon neutrality targets early. However, the non-technical costs of solar power in China are much higher than international levels, meaning that inefficiencies are rooted in poor management and improper policy design, and that efficiency distortions have become a serious challenge to the sustainable development of the renewable energy industry. Unlike fossil energy generation technologies, the output of solar power is closely related to the volatile solar resource, and the spatial unevenness of solar resource distribution leads to potential spatial differences in efficiency. It is necessary to develop an efficiency evaluation method that considers the volatility of solar resources and to explore the mechanisms by which natural geography and the social environment influence the spatial distribution of efficiency, so as to uncover the root causes of managerial inefficiency. The study sets solar resources as stochastic inputs, introduces a chance-constrained data envelopment analysis model combined with the directional distance function, and measures the solar resource utilization efficiency of 222 solar power plants in representative photovoltaic bases in northwestern China. Through meta-frontier analysis, we measured the characteristics of different power plant clusters and compared the differences among groups, discussed the mechanisms by which environmental factors influence inefficiency, and performed statistical tests using the system generalized method of moments. Rational siting of power plants is a systematic project that requires careful consideration of the full utilization of solar resources, low transmission costs, and guaranteed power consumption. Suitable temperature, precipitation, and wind speed can improve the working performance of photovoltaic modules; a reasonable terrain inclination can reduce land cost; and proximity to cities strongly guarantees the consumption of electricity. The density of electricity demand and of high-tech industries is more important than resource abundance, because they trigger the clustering of power plants, resulting in good demonstration and competition effects. To ensure renewable energy consumption, increased support for rural grids and the encouragement of direct trading between generators and neighboring users will provide solutions. The study provides proposals for improving the full life-cycle operational activities of solar power plants in China, in order to reduce high non-technical costs and improve competitiveness against fossil energy sources.
Keywords: solar power plants, environmental factors, data envelopment analysis, efficiency evaluation
Procedia PDF Downloads 90
44368 Soil-Structure Interaction Models for the Reinforced Foundation System – A State-of-the-Art Review
Authors: Ashwini V. Chavan, Sukhanand S. Bhosale
Abstract:
The challenges of a weak soil subgrade are often resolved either by stabilizing the soil or by reinforcing it. However, it is also common practice to reinforce the granular fill placed over a weak soil stratum to improve its load-settlement behavior. The inclusion of reinforcement in the engineered granular fill provided a new impetus for the development of enhanced soil-structure interaction (SSI) models, also known as mechanical foundation models or lumped parameter models. Several researchers have been working in this direction to understand the mechanism of granular fill-reinforcement interaction and the response of weak soil under the application of load. These models have been developed by extending available SSI models such as the Winkler, Pasternak, Hetenyi, and Kerr models, and they help to visualize the load-settlement behavior of a physical system through 1-D and 2-D analysis, considering a beam or a plate resting on the foundation, respectively. Based on the literature survey, these models are categorized as the 'Reinforced Pasternak Model,' the 'Double Beam Model,' the 'Reinforced Timoshenko Beam Model,' and the 'Reinforced Kerr Model.' The present work reviews the past 30+ years of research in the field of SSI models for reinforced foundation systems, presenting the conceptual development of these models systematically and discussing their limitations. Special effort is made to tabulate the parameters and their significance in load-settlement analysis, which may be helpful in future studies for comparing and enhancing the results and findings of physical models.
Keywords: geosynthetics, mathematical modeling, reinforced foundation, soil-structure interaction, ground improvement, soft soil
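For orientation, the 1-D beam-on-foundation forms of the two base models most often extended are the standard results below, with w the settlement, k the subgrade modulus, G_p the Pasternak shear-layer stiffness, EI the beam flexural rigidity, and q the applied load:

```latex
\text{Winkler:}\quad EI\,\frac{d^{4}w}{dx^{4}} + k\,w = q(x)
\qquad
\text{Pasternak:}\quad EI\,\frac{d^{4}w}{dx^{4}} - G_{p}\,\frac{d^{2}w}{dx^{2}} + k\,w = q(x)
```

The reinforced variants reviewed in the paper add further layers (membrane or beam elements for the reinforcement) to these governing equations.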
Procedia PDF Downloads 123
44367 Modeling Waiting and Service Time for Patients: A Case Study of Matawale Health Centre, Zomba, Malawi
Authors: Moses Aron, Elias Mwakilama, Jimmy Namangale
Abstract:
Spending long hours in queues for a basic service remains a common challenge in most developing countries, including Malawi. In the health sector in particular, the Out-Patient Department (OPD) experiences long queues, which puts the lives of patients at risk. However, by using queuing analysis to understand the nature of the problems and the efficiency of service systems, such problems can be abated. Depending on the kind of service, the literature proposes different possible queuing models. However, rather than using the generalized models assumed in the literature, real case-study data can give a deeper understanding of the particular problem and of how the model can vary from one day to another and from one case to another. As such, this study uses data obtained from one urban HC for BP, Pediatric, and General OPD cases to investigate the average queuing time for patients within the system. It seeks to identify the proper queuing model by investigating the distribution functions of patients' arrival times, inter-arrival times, waiting times, and service times. Compared with the standard values set by WHO, the study found that patients at this HC spend more time waiting than being served. On model investigation, different days presented different models, ranging from an assumed M/M/1 and M/M/2 to M/Er/2. Through sensitivity analysis, the commonly assumed M/M/1 model generally failed to fit the data, whereas an M/Er/2 model fitted well. An M/Er/3 model seemed good in terms of resource utilization, suggesting a need to increase medical personnel at this HC; however, an M/Er/4 model was shown to cause more idleness of human resources.
Keywords: health care, out-patient department, queuing model, sensitivity analysis
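A minimal sketch of the kind of arithmetic behind such comparisons: closed-form Erlang-C waiting times for an M/M/c clinic, with illustrative arrival and service rates (the HC's real data would replace them; Erlang service times, as in M/Er/c, would need simulation or matrix-analytic methods instead).

```python
import math

def mmc_wait(lam, mu, c):
    """Mean waiting time in queue (Wq) and in system (W) for M/M/c."""
    a = lam / mu                              # offered load
    if a >= c:
        raise ValueError("unstable: need lam < c * mu")
    p_sum = sum(a**k / math.factorial(k) for k in range(c))
    tail = a**c / math.factorial(c) * c / (c - a)
    erlang_c = tail / (p_sum + tail)          # probability an arrival waits
    wq = erlang_c / (c * mu - lam)
    return wq, wq + 1.0 / mu

lam, mu = 18.0, 10.0                          # patients/h arriving; served per clinician/h
for c in (2, 3, 4):
    wq, w = mmc_wait(lam, mu, c)
    print(f"c={c}: wait {60*wq:5.1f} min, total {60*w:5.1f} min in system")
```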
Procedia PDF Downloads 435