Search results for: cloud service models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10219

4699 Social Entrepreneurship and Inclusive Growth

Authors: Sudheer Gupta

Abstract:

Approximately 4 billion citizens of the world live on the equivalent of less than $8 a day. This segment constitutes a $5 trillion global market that remains under-served. Multinational corporations have historically tended to focus their innovation efforts on the upper segments of the economic pyramid. The academic literature has also been dominated by theories and frameworks of innovation that are valid when applied to developed markets and consumer segments but fail to adequately account for the challenges and realities of new product and service creation for the poor. Theories of entrepreneurship developed in the context of developed markets similarly ignore the challenges and realities of operating in developing economies, which can be characterized by missing institutions, missing markets, information and infrastructural challenges, and resource constraints. Social entrepreneurs working in such contexts develop solutions differently. In this talk, we summarize lessons learned from a long-term research project involving data collection from a broad range of social entrepreneurs in developing countries working toward solutions to alleviate poverty, together with grounded theory-building efforts. We aim to develop a better understanding of the involvement of consumers, producers, and other stakeholders, thus laying the foundation for a robust theory of innovation and entrepreneurship for the poor.

Keywords: poverty alleviation, social enterprise, social innovation, development

Procedia PDF Downloads 382
4698 Drying Modeling of Banana Using Cellular Automata

Authors: M. Fathi, Z. Farhaninejad, M. Shahedi, M. Sadeghi

Abstract:

Drying is one of the oldest preservation methods for food and agricultural products. Appropriate control of the operation can be achieved through modeling. The limitations of continuum models for complex boundary conditions and irregular geometries have led to the emergence of novel discrete methods such as cellular automata, which provide a platform for fast predictions using rule-based mathematics. In this research, a one-dimensional cellular automaton (CA) was used to simulate thin-layer drying of banana. Banana slices were dried in a convective air dryer, and experimental data were recorded for validating the final model. The model was programmed in MATLAB and run for 70,000 iterations with a von Neumann neighborhood. The validation results showed good agreement between experimental and predicted data (R = 0.99). Cellular automata are capable of reproducing the expected drying pattern and have powerful potential for solving physical problems with reasonable accuracy and low computational cost.
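The rule-based scheme described above can be illustrated with a minimal one-dimensional CA sketch (a hypothetical simplification, not the authors' MATLAB implementation): each cell exchanges moisture with its von Neumann neighbors (left and right in 1-D), and the two surface cells additionally lose moisture to the drying air. The rate constants are illustrative, not fitted values.

```python
import numpy as np

def dry_step(m, k_diff=0.2, k_evap=0.05):
    """One CA update: neighbor moisture exchange plus surface evaporation.

    m      -- 1-D array of cell moisture contents
    k_diff -- fraction of the moisture difference exchanged per step (assumed)
    k_evap -- fraction of surface moisture lost to the drying air (assumed)
    """
    new = m.copy()
    # von Neumann neighborhood in 1-D: each cell sees its left/right neighbors
    new[1:-1] += k_diff * (m[:-2] - 2 * m[1:-1] + m[2:])
    # the two surface cells exchange with one neighbor and evaporate
    new[0] += k_diff * (m[1] - m[0]) - k_evap * m[0]
    new[-1] += k_diff * (m[-2] - m[-1]) - k_evap * m[-1]
    return new

moisture = np.full(50, 0.8)      # uniform initial moisture content
history = [moisture.mean()]
for _ in range(2000):
    moisture = dry_step(moisture)
    history.append(moisture.mean())
```

The mean moisture decays monotonically toward zero, reproducing the familiar thin-layer drying curve; calibrating the two rate constants against experimental data would play the role of the validation step described in the abstract.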

Keywords: banana, cellular automata, drying, modeling

Procedia PDF Downloads 418
4697 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups

Authors: Naushad Mamode Khan

Abstract:

The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs), as it handles equi-, over-, and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify, which restricts likelihood-based estimation. The joint generalized quasi-likelihood approach (GQL-I) was considered instead, but it is computationally intensive and may even fail to estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments and is shown to yield estimates as efficient as GQL-I while being far more computationally stable.
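The estimating-equation machinery behind GQL methods can be sketched for the simplest case of a log-linear count mean with a Poisson working variance (a generic illustration of the score-equation iteration, not the authors' GQL-III for longitudinal CMP data):

```python
import numpy as np

def gql_poisson(X, y, n_iter=50):
    """Quasi-likelihood (score-equation) estimation for mu = exp(X @ beta)
    with working variance V = diag(mu).  Newton-type update:
        beta <- beta + (D' V^-1 D)^-1  D' V^-1 (y - mu)
    where D = d(mu)/d(beta).  Generic GQL machinery, not GQL-III itself.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        D = X * mu[:, None]           # derivative of the mean w.r.t. beta
        Vinv = 1.0 / mu               # inverse of the diagonal working variance
        score = D.T @ (Vinv * (y - mu))
        info = D.T @ (Vinv[:, None] * D)
        beta = beta + np.linalg.solve(info, score)
    return beta

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(500), rng.normal(size=500)])
true_beta = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ true_beta))
beta_hat = gql_poisson(X, y)
```

In the longitudinal CMP setting discussed in the paper, the diagonal working variance is replaced by the (possibly ill-conditioned) joint covariance structure, which is exactly where the computational difficulties of GQL-I arise.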

Keywords: longitudinal, com-Poisson, ill-conditioned, INAR(1), GLMs, GQL

Procedia PDF Downloads 339
4696 Clustering of Panels and Shade Diffusion Techniques for Partially Shaded PV Arrays: A Review

Authors: Shahida Khatoon, Mohd. Faisal Jalil, Vaishali Gautam

Abstract:

The photovoltaic (PV) generated power depends mainly on environmental factors. The PV array's lifetime and the overall system's effectiveness are reduced under partial shading conditions. Reconfiguring the electrical connections between solar modules is a viable strategy for minimizing these power losses through shade diffusion. This article comprehensively evaluates various PV array clustering/reconfiguration models, covering both static and dynamic reconfiguration techniques for extracting maximum power under mismatch conditions. The paper explores and analyzes current breakthroughs in solar PV performance improvement strategies that merit further investigation. Altogether, researchers and academicians working in the field of dedicated solar power generation will benefit from this review.

Keywords: static reconfiguration, dynamic reconfiguration, photovoltaic array, partial shading, CTC configuration

Procedia PDF Downloads 90
4695 An Overview of the New Era in Food Science and Technology

Authors: Raana Babadi Fathipour

Abstract:

The strict requirements of scientific journals, which demand that experimental data be shown to be (in)significant from a statistical point of view, have driven a steep increase in the use and development of statistical software. The use of mathematical and statistical methods, including chemometrics and many other statistical methods and algorithms, in food science and technology has increased steeply in the last 20 years. The computational tools available can be used not only to run statistical analyses such as univariate and bivariate tests, as well as multivariate calibration and the development of complex models, but also to run simulations of different scenarios given a set of inputs, or simply to make predictions for particular data sets or conditions. A quick search in the most reputable scientific databases (PubMed, ScienceDirect, Scopus) shows that statistical methods have gained enormous ground in many areas.

Keywords: food science, food technology, food safety, computational tools

Procedia PDF Downloads 47
4694 Assessing Customer Relationship Management Practice in the Case of Wegagen Bank of Ethiopia

Authors: Rina G/Micheal

Abstract:

The objective of this study is to examine the practice of CRM application in Wegagen Bank. A quantitative approach with a descriptive design was used. Both primary and secondary sources were used to gather data based on six dimensions of CRM (customer acquisition, customer response, customer knowledge, customer information system, customer value evaluation, and customer information process). The study investigates customers' and employees' perceptions of the CRM practices of selected Wegagen Bank branches in Addis Ababa. Data were collected from a sample of 109 respondents, selected through purposive sampling from tier 1 branches of Wegagen Bank (Teklhaymanot, Beklobet, Gofa, Bole, and Meskel Square). The results show that the practice of CRM application in Wegagen Bank is at an average level: the customer knowledge dimension scores highest, while customer information process practices are insufficient. It is therefore suggested that Wegagen Bank keep working on customer knowledge and, for the customer information process, adopt a system that makes it easier for customers to do business with the bank through updated technologies. The bank should also use a computer system for recording customers' requests and the services rendered in order to beat the stiff competition and achieve its goals.

Keywords: CRM, customer acquisition, customer response, customer knowledge, customer information system, customer value evaluation, customer information process

Procedia PDF Downloads 8
4693 Analysis of the Relationship between Length of Hospital Stay and Economic Loss for the Top Ten Productive-Age Diseases among Inpatients of Inche Abdul Moeis Hospital, Samarinda, Indonesia

Authors: Tri Murti Tugiman, Awalyya Fasha

Abstract:

This research aims to analyze the magnitude of the economic losses incurred as a result of suffering from one of the ten most common productive-age diseases at Inche Abdul Moeis Hospital, Samarinda. This was a descriptive survey based on secondary data analysis. The population for the analysis of economic losses comprised all inpatients suffering from the 10 most common productive-age diseases at IA Moeis Hospital in 2011. Sampling was performed using stratified random sampling, with a sample of 77 people. The results indicate that the direct costs incurred by the community to obtain medical services at IA Moeis Hospital amount to IDR 74,437,520, the indirect costs incurred during treatment amount to IDR 10,562,000, and the loss due to sick leave amounts to IDR 5,377,800, so the total economic loss of obtaining medical services at IA Moeis Hospital is IDR 90,377,320. The total number of hospitalization days across all respondents was 171. The study suggests that this economic loss could be reduced by adopting a clean and healthy lifestyle along with obtaining health insurance.
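The three cost components reported in the abstract sum exactly to the stated total, which can be verified directly:

```python
direct_costs = 74_437_520    # IDR, direct cost of medical services
indirect_costs = 10_562_000  # IDR, indirect costs incurred during treatment
sick_leave_loss = 5_377_800  # IDR, loss due to sick leave

# total economic loss of obtaining medical services at IA Moeis Hospital
total = direct_costs + indirect_costs + sick_leave_loss
print(total)  # 90377320, matching the reported IDR 90,377,320
```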

Keywords: hospitalization, economic loss, productive-age diseases, secondary data analysis

Procedia PDF Downloads 461
4692 Explanatory Conceptual Model of the Effect of Architectural Form on Structure in Building Aesthetics

Authors: Fatemeh Nejati, Farah Habib, Sayeh Goudarzi

Abstract:

Architecture and structure have always been closely interrelated and should be integrated into a unified, coherent, and beautiful whole, yet in the contemporary era, structure and architecture often proceed separately. The purpose of architecture is the art of creating form, space, and order in the service of humans, while the goal of the structural engineer is the transfer of loads through the structure. This research examines the relationship between architectural form and structure from its inception to the present day. By identifying the main components of structural design in interaction with architectural form, it takes an effective step toward professional training and offers solutions to practitioners. After reviewing the evolution of structural and architectural coordination in various historical periods, as well as how structural form has been arrived at in different times and places, the required components are identified and tested in order to formulate the final theory. The research indicates that architectural form and structure have an aesthetic link influenced by a number of identifiable components, and that this link has followed a regular order throughout history. The research methodology is analytical and comparative, using analytical and matrix diagrams, with library research and interviews as data-gathering tools.

Keywords: architecture, structural form, structural and architectural coordination, effective components, aesthetics

Procedia PDF Downloads 196
4691 Casting Lots for Candidature in General Elections: An Unnamed Political System

Authors: Talib Jan Abasindhi

Abstract:

Democracy has reached almost every nook and corner of the globe and is well embedded in the political systems of the majority of countries in the world. Political parties, with their manifestos and programs, are educating people for better democracy, good governance, and service delivery in many countries around the globe. Although democracy in Pakistan is itself at an infant stage, there is a region consisting of three districts (administrative units) in the north of the country, bounded by the Himalayan and Karakoram mountain ranges, where democracy is still unknown to a wide section of the population. Political parties are now struggling to take root in the area while disrupting the traditional and tribal electoral system prevailing in the region. This paper sheds light on the interesting practice of casting lots for nomination as candidates in general and local bodies' elections in the Kohistani region of Pakistan. People of wisdom and knowledge in the modern world deem societies where such practices are found to be uncivilized, and it is hard to believe in today's world, yet this practice has been common in the Kohistani region for many decades, and for many reasons. Through this paper, we not only describe the process and procedure of casting lots in elections in democratic Pakistan but also discuss its underlying reasons and offer suggestions for eliminating this menace.

Keywords: democracy, casting lots, governance, Kohistani region

Procedia PDF Downloads 53
4690 Simulation Model of Induction Heating in COMSOL Multiphysics

Authors: K. Djellabi, M. E. H. Latreche

Abstract:

The induction heating phenomenon depends on various factors, making the problem highly nonlinear. Mathematical analysis of this problem is in most cases very difficult and is reduced to simple cases. Knowledge of induction heating systems is also generated in production environments, but these trial-and-error procedures are long and expensive. Numerical models of the induction heating problem are an approach to reduce the abovementioned drawbacks. This paper deals with a simulation model of the induction heating problem, created in COMSOL Multiphysics. We present results of numerical simulations of the induction heating process in cylindrical workpieces, in an inductor with four coils. The modeling of the induction heating process was carried out with COMSOL Multiphysics Version 4.2a; for the study, we present the temperature distributions.

Keywords: induction heating, electromagnetic field, inductor, numerical simulation, finite element

Procedia PDF Downloads 297
4689 Hydrological-Economic Modeling of Two Hydrographic Basins of the Coast of Peru

Authors: Julio Jesus Salazar, Manuel Andres Jesus De Lama

Abstract:

There are very few models that serve to analyze the use of water in the socio-economic process. On the supply side, the joint use of groundwater has been considered in addition to the simple limits on the availability of surface water. In addition, we have worked on waterlogging and the effects on water quality (mainly salinity). In this paper, a 'complex' water economy is examined; one in which demands grow differentially not only within but also between sectors, and one in which there are limited opportunities to increase consumptive use. In particular, high-value growth, the growth of the production of irrigated crops of high value within the basins of the case study, together with the rapidly growing urban areas, provides a rich context to examine the general problem of water management at the basin level. At the same time, the long-term aridity of nature has made the eco-environment in the basins located on the coast of Peru very vulnerable, and the exploitation and immediate use of water resources have further deteriorated the situation. The presented methodology is the optimization with embedded simulation. The wide basin simulation of flow and water balances and crop growth are embedded with the optimization of water allocation, reservoir operation, and irrigation scheduling. The modeling framework is developed from a network of river basins that includes multiple nodes of origin (reservoirs, aquifers, water courses, etc.) and multiple demand sites along the river, including places of consumptive use for agricultural, municipal and industrial, and uses of running water on the coast of Peru. The economic benefits associated with water use are evaluated for different demand management instruments, including water rights, based on the production and benefit functions of water use in the urban agricultural and industrial sectors. 
This work represents a new effort to analyze the use of water at the regional level and to evaluate the modernization of the integrated management of water resources and socio-economic territorial development in Peru. It will also allow the establishment of policies to improve the implementation of integrated water resources management and development. The input-output analysis is essential to present a theory about the production process, which is based on a particular type of production function. This work also presents the Computable General Equilibrium (CGE) version of the economic model for water resource policy analysis, which was specifically designed for analyzing large-scale water management. As the platform for CGE simulation, GEMPACK, a flexible system for solving CGE models, is used for formulating and solving the CGE model through the percentage-change approach. GEMPACK automates the process of translating the model specification into a model solution program.

Keywords: water economy, simulation, modeling, integration

Procedia PDF Downloads 133
4688 Continuous and Discontinuous Modeling of Wellbore Instability in Anisotropic Rocks

Authors: C. Deangeli, P. Obentaku Obenebot, O. Omwanghe

Abstract:

The study focuses on the analysis of wellbore instability in rock masses affected by weakness planes. In such rocks, failure can occur in the rock matrix and/or along the weakness planes, depending on the mud weight gradient. In this case, the simple Kirsch solution coupled with a failure criterion cannot supply a suitable scenario for borehole instabilities. Two different numerical approaches have been used to investigate the onset of local failure at the wall of a borehole. For each approach, the influence of the inclination of the weakness planes has been investigated by considering joint sets at 0°, 35°, and 90° to the horizontal. The first set of models was carried out with FLAC 2D (Fast Lagrangian Analysis of Continua), treating the rock material as a continuous medium, with a Mohr-Coulomb criterion for the rock matrix and the ubiquitous-joint model to account for the presence of the weakness planes. In this model, yield may occur in the solid, along the weak plane, or both, depending on the stress state, the orientation of the weak plane, and the material properties of the solid and weak plane. The second set of models was performed with PFC2D (Particle Flow Code). This code is based on the Discrete Element Method and treats the rock material as an assembly of grains bonded by cement-like materials, with pore spaces. The presence of weakness planes is simulated by degrading the bonds between grains along given directions. In general, the results of the two approaches are in agreement. However, the discrete approach seems to capture more complex phenomena related to local failure, in the form of grain detachment at the wall of the borehole. In fact, the presence of weakness planes in the discontinuous medium leads to local instability along the weak planes even under conditions not predicted by the continuous solution. 
In general, slip failure locations and directions do not follow the conventional wellbore breakout direction but depend upon the internal friction angle and the orientation of the bedding planes. When the weakness planes are at 0° or 90°, the behaviour is similar to that of a continuous rock material, but borehole instability is more severe when the weakness planes are inclined at an angle between 0° and 90° to the horizontal. In conclusion, the results of the numerical simulations show that the prediction of local failure at the wall of the wellbore cannot disregard the presence of weakness planes and, consequently, the higher mud weight required for stability at any specific inclination of the joints. Although the discrete approach can only simulate smaller areas, because of the large number of particles required to generate the rock material, it seems to capture more correctly the occurrence of failure at the microscale and, eventually, the propagation of the failed zone to a larger portion of rock around the wellbore.

Keywords: continuous-discontinuous, numerical modelling, weakness planes, wellbore, FLAC 2D

Procedia PDF Downloads 487
4687 Naphtha Catalytic Reforming: Modeling and Simulation of the Unit

Authors: Leal Leonardo, Pires Carlos Augusto de Moraes, Casiraghi Magela

Abstract:

In this work, the modeling and simulation of the catalytic reforming process were carried out comprehensively, considering all the equipment that influences the operation's performance. A semi-regenerative reformer was considered, with four reactors in series interspersed with four furnaces, two heat exchangers, one product separator, and one recycle compressor. A simplified reaction system was adopted, involving only ten chemical compounds related through five reactions. The process considered was applied to aromatics production (benzene, toluene, and xylene). The models developed for the various pieces of equipment were interconnected in a simulator consisting of a computer program written in FORTRAN 77. The simulation of the global model representing the reforming unit achieved results compatible with those in the literature. It was then possible to study the effects of operational variables on product concentrations and on the performance of the unit's equipment.

Keywords: catalytic reforming, modeling, simulation, petrochemical engineering

Procedia PDF Downloads 489
4686 Estimating Big Five Personality Expressions with a Tiered Information Framework

Authors: Laura Kahn, Paul Rodrigues, Onur Savas, Shannon Hahn

Abstract:

An empirical understanding of an individual's personality expression can have a profound impact on organizations seeking to strengthen team performance and improve employee retention, since a team's personality composition can affect overall performance. Creating a tiered information framework that leverages proxies for a user's social context together with lexical and linguistic content provides insight into location-specific personality expression. We leverage the layered framework to examine domain-specific, psychological, and lexical cues within social media posts, and we apply DistilBERT natural language transfer-learning models to real-world data to examine the Big Five personality expressions of people in Science, Technology, Engineering, and Math (STEM) fields.

Keywords: big five, personality expression, social media analysis, workforce development

Procedia PDF Downloads 122
4685 Cubic Trigonometric B-Spline Approach to Numerical Solution of Wave Equation

Authors: Shazalina Mat Zin, Ahmad Abd. Majid, Ahmad Izani Md. Ismail, Muhammad Abbas

Abstract:

The generalized wave equation models various problems in the sciences and engineering. In this paper, a new three-time-level implicit approach based on the cubic trigonometric B-spline is developed for the approximate solution of the wave equation. The usual finite difference approach is used to discretize the time derivative, while the cubic trigonometric B-spline is applied as an interpolating function in the space dimension. Von Neumann stability analysis is used to analyze the proposed method. Two problems are discussed to exhibit the feasibility and capability of the method. The absolute errors and maximum error are computed to assess the performance of the proposed method. The results were found to be in good agreement with known solutions and with existing schemes in the literature.
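The three-time-level structure of such schemes can be illustrated with the classical explicit finite-difference solver for u_tt = c² u_xx (a standard baseline for comparison, not the cubic trigonometric B-spline collocation scheme of the paper):

```python
import numpy as np

# Solve u_tt = c^2 u_xx on [0, 1] with fixed ends, using the classical
# three-time-level explicit scheme:
#   u^{n+1}_i = 2 u^n_i - u^{n-1}_i + r^2 (u^n_{i+1} - 2 u^n_i + u^n_{i-1})
# where r = c*dt/dx must satisfy r <= 1 for stability (CFL condition).
c, nx, nt = 1.0, 101, 200
dx = 1.0 / (nx - 1)
dt = 0.004                       # r = c*dt/dx = 0.4 < 1
r2 = (c * dt / dx) ** 2

x = np.linspace(0.0, 1.0, nx)
u_prev = np.sin(np.pi * x)       # initial displacement u(x, 0)
# special first step for zero initial velocity (Taylor expansion in time)
u_curr = u_prev.copy()
u_curr[1:-1] = u_prev[1:-1] + 0.5 * r2 * (u_prev[2:] - 2*u_prev[1:-1] + u_prev[:-2])

for _ in range(nt):
    u_next = np.zeros_like(u_curr)        # boundaries stay clamped at zero
    u_next[1:-1] = (2*u_curr[1:-1] - u_prev[1:-1]
                    + r2 * (u_curr[2:] - 2*u_curr[1:-1] + u_curr[:-2]))
    u_prev, u_curr = u_curr, u_next

# exact solution: u(x, t) = sin(pi x) cos(pi c t)
t_final = (nt + 1) * dt
err = np.max(np.abs(u_curr - np.sin(np.pi * x) * np.cos(np.pi * c * t_final)))
```

The paper's implicit B-spline approach replaces the spatial difference stencil with collocation on a cubic trigonometric B-spline basis, but the three-time-level update and the von Neumann stability analysis apply to the same structure.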

Keywords: collocation method, cubic trigonometric B-spline, finite difference, wave equation

Procedia PDF Downloads 518
4684 Comparison of Equivalent Linear and Non-Linear Site Response Model Performance in Kathmandu Valley

Authors: Sajana Suwal, Ganesh R. Nhemafuki

Abstract:

Evaluation of ground response under earthquake shaking is crucial in geotechnical earthquake engineering. Damage due to seismic excitation is mainly correlated to local geological and geotechnical conditions. It is evident from past earthquakes (e.g., 1906 San Francisco, USA; 1923 Kanto, Japan) that the local geology has a strong influence on the amplitude and duration of ground motions. Since then, significant studies have been conducted on ground motion amplification, revealing the importance of the influence of local geology. Observations from damaging earthquakes (e.g., Niigata and San Francisco, 1964; Irpinia, 1980; Mexico, 1985; Kobe, 1995; L'Aquila, 2009) divulged that the non-uniform damage pattern, particularly in soft fluvio-lacustrine deposits, is due to local amplification of seismic ground motion. Non-uniform damage patterns were also observed in the Kathmandu Valley during the 1934 Bihar-Nepal earthquake and the recent 2015 Gorkha earthquake, seemingly due to the modification of earthquake ground motion parameters. In this study, site effects resulting from the amplification of soft soil in Kathmandu are presented. A large amount of subsoil data was collected and used to define an appropriate subsoil model for the Kathmandu Valley. A comparative study of one-dimensional total-stress equivalent linear and non-linear site response is performed using four strong ground motions for six sites in the Kathmandu Valley. In general, one-dimensional (1D) site-response analysis involves exciting a soil profile with the horizontal component of a ground motion and calculating the response at individual soil layers. In the present study, both equivalent linear and non-linear site response analyses were conducted using the computer program DEEPSOIL. The results show no significant deviation between the equivalent linear and non-linear site response models until the maximum strain reaches 0.06-0.1%. 
Overall, the results clearly show that the non-linear site response model performs better than the equivalent linear model. However, part of the deviation between the two models results from other influencing factors, such as the assumptions made in 1D site response analysis and the lack of accurate values for the shear wave velocity and non-linear properties of the soil deposit. The results are also presented in terms of amplification factors, which are predicted to be around four times higher in the non-linear analysis than in the equivalent linear analysis. Hence, the non-linear behavior of the soil underscores the urgent need to study the dynamic characteristics of the soft soil deposit so that site-specific design spectra can be developed for the Kathmandu Valley, enabling structures resilient to future damaging earthquakes.

Keywords: deep soil, equivalent linear analysis, non-linear analysis, site response

Procedia PDF Downloads 280
4683 Distorted Document Images Dataset for Text Detection and Recognition

Authors: Ilia Zharikov, Philipp Nikitin, Ilia Vasiliev, Vladimir Dokholyan

Abstract:

With the increasing popularity of document analysis and recognition systems, text detection (TD) and optical character recognition (OCR) in document images have become challenging tasks. However, to the best of our knowledge, no publicly available datasets for these particular problems exist. In this paper, we introduce the Distorted Document Images dataset (DDI-100) and provide a detailed analysis of DDI-100 in its current state. To create the dataset, we collected 7,000 unique document pages and extended the set by applying different types of distortions and geometric transformations. In total, DDI-100 contains more than 100,000 document images together with binary text masks and text and character locations in terms of bounding boxes. We also present an analysis of several state-of-the-art TD and OCR approaches on the presented dataset. Lastly, we demonstrate the usefulness of DDI-100 for improving the accuracy and stability of the considered TD and OCR models.

Keywords: document analysis, open dataset, optical character recognition, text detection

Procedia PDF Downloads 149
4682 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression

Authors: Anne M. Denton, Rahul Gomes, David W. Franzen

Abstract:

High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic to the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. 
The relevant length scale is taken to be half of the window size of the window over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data that was constructed to have defined length scales and added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 windows in ESRI ArcMap for each resolution. In summary, the proposed approach extracts the slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than existing techniques.
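The additive-aggregation idea can be sketched in one dimension (an illustrative simplification of the 2D plane fits described above; names and parameters are ours): the sufficient statistics of a least-squares line fit are additive, so each doubling of the window size is a single merge of neighboring blocks, and every level yields a slope and residual variance from which the minimum-variance scale can be selected.

```python
import numpy as np

def scale_adaptive_slope(x, z, levels=6):
    """1-D sketch of scale-adaptive window regression.

    Line-fit sufficient statistics (n, Sx, Sz, Sxx, Sxz, Szz) are additive,
    so doubling the window size needs only one merge pass, giving the
    logarithmic scaling described in the abstract.  Returns the slope of the
    leftmost window at the window size with minimal residual variance.
    """
    # per-point sufficient statistics (window size 1)
    stats = np.stack([np.ones_like(x), x, z, x * x, x * z, z * z])
    best = None
    for _ in range(levels):
        if stats.shape[1] < 2:
            break
        # merge neighboring blocks: one pass per doubling of the window size
        stats = stats[:, 0::2] + stats[:, 1::2]
        n, Sx, Sz, Sxx, Sxz, Szz = stats
        slope = (n * Sxz - Sx * Sz) / (n * Sxx - Sx ** 2)
        intercept = (Sz - slope * Sx) / n
        # residual sum of squares from the same sums (no second data pass)
        rss = (Szz + slope ** 2 * Sxx + intercept ** 2 * n
               - 2 * slope * Sxz - 2 * intercept * Sz
               + 2 * slope * intercept * Sx)
        if n[0] >= 4:                    # need more than 2 points for a variance
            var = rss / (n - 2)
            if best is None or var[0] < best[1]:
                best = (slope[0], var[0])
    return best[0]

x = np.arange(64, dtype=float)
slope_hat = scale_adaptive_slope(x, 2.0 * x + 1.0)   # noiseless line, slope 2
```

On noisy data with curvature, larger windows reduce the variance only until they begin to span the curvature, so the minimum-variance scale adapts to the local landform, mirroring the behavior of the 2D algorithm.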

Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression

Procedia PDF Downloads 110
4681 Using Combination of Sets of Features of Molecules for Aqueous Solubility Prediction: A Random Forest Model

Authors: Muhammet Baldan, Emel Timuçin

Abstract:

Generally, absorption and bioavailability increase with solubility; it is therefore crucial to predict solubility in drug discovery applications. Molecular descriptors and molecular properties are traditionally used for the prediction of water solubility. Several key descriptor sets are used for this purpose, namely Dragon descriptors, Morgan descriptors, MACCS keys, etc., and each has different prediction capabilities, with success varying between data sets. Structural features are another commonly used source of information for solubility prediction. However, few if any studies combine three or more sets of properties or descriptors to produce a more powerful prediction model. Unlike existing models, we used a combination of these feature sets in a random forest machine learning model for improved solubility prediction, thereby contributing to drug discovery systems.
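The combination step itself amounts to concatenating the fixed-length vector from each descriptor family into one feature vector per molecule before training. A minimal sketch (descriptor names and values are illustrative placeholders, not data from the study):

```python
# hypothetical descriptor blocks for one molecule; values are illustrative
morgan_bits = [0, 1, 1, 0]        # e.g., a few folded Morgan fingerprint bits
maccs_bits = [1, 0, 1]            # e.g., a few MACCS key bits
properties = [180.2, 1.3, 2.0]    # e.g., mol. weight, logP, H-bond donors

def combine(*feature_sets):
    # concatenate all descriptor families into a single feature vector
    combined = []
    for fs in feature_sets:
        combined.extend(fs)
    return combined

features = combine(morgan_bits, maccs_bits, properties)
print(len(features))  # → 10
```

The combined vectors would then be fed to a random forest regressor; the point of combining families is that the forest's feature selection can draw on informative features across all of them.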

Keywords: solubility, random forest, molecular descriptors, MACCS keys

Procedia PDF Downloads 17
4680 Managers’ Mobile Information Behavior in an Openness Paradigm Era

Authors: Abd Latif Abdul Rahman, Zuraidah Arif, Muhammad Faizal Iylia, Mohd Ghazali, Asmadi Mohammed Ghazali

Abstract:

Mobile information is a significant access point for human information activities. Theories and models of human information behavior have developed over several decades but have not yet considered the role of the user's computing device in digital information interactions. This paper reviews the literature leading to a conceptual framework for a study of managers' mobile information behavior. Based on the literature review, the dimensions of mobile information behavior are identified, namely information needs, information access, information retrieval, and information use. The study is significant for understanding the nature of managers' behavior in searching, retrieving, and using information via mobile devices. It would also suggest the kinds of mobile applications organizations can provide for their staff to improve their services.

Keywords: mobile information behavior, information behavior, mobile information, mobile devices

Procedia PDF Downloads 323
4679 Mathematical Modeling and Optimization of Burnishing Parameters for 15NiCr6 Steel

Authors: Tarek Litim, Ouahiba Taamallah

Abstract:

The present paper investigates the effect of burnishing on the surface integrity of a component made of 15NiCr6 steel. A statistical study based on regression and Taguchi's design allowed the development of mathematical models to predict the output responses as a function of the technological parameters studied. Response surface methodology (RSM) showed the simultaneous influence of the burnishing parameters and allowed the optimal processing parameters to be identified. ANOVA of the results validated the prediction model with determination coefficients of R=90.60% and 92.41% for roughness and hardness, respectively. Furthermore, a multi-objective optimization identified a regime characterized by P=10 kgf, i=3 passes, and f=0.074 mm/rev, which favours minimum roughness and maximum hardness. This result was validated by desirability values of D=0.99 and 0.95 for roughness and hardness, respectively.
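The desirability-based multi-objective optimization can be sketched as follows (a generic Derringer-style formulation with illustrative bounds and response values, not the study's data): each response is mapped to a desirability d in [0, 1], smaller-the-better for roughness and larger-the-better for hardness, and the geometric mean D is maximized over candidate regimes.

```python
def d_minimize(y, low, high):
    # smaller-the-better desirability: 1 at/below low, 0 at/above high
    if y <= low:
        return 1.0
    if y >= high:
        return 0.0
    return (high - y) / (high - low)

def d_maximize(y, low, high):
    # larger-the-better desirability: 0 at/below low, 1 at/above high
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return (y - low) / (high - low)

def overall_desirability(ds):
    # geometric mean of the individual desirabilities
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# illustrative responses predicted for one candidate burnishing regime
roughness = 0.4   # um, to be minimized; acceptable range 0.2..1.0
hardness = 320.0  # HV, to be maximized; acceptable range 250..340

D = overall_desirability([
    d_minimize(roughness, 0.2, 1.0),
    d_maximize(hardness, 250.0, 340.0),
])
print(round(D, 3))  # → 0.764
```

In practice D is evaluated on the RSM prediction models over the (P, i, f) design space, and the regime with the highest D is reported.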

Keywords: 15NiCr6 steel, burnishing, surface integrity, Taguchi, RSM, ANOVA

Procedia PDF Downloads 177
4678 Forecasting Issues in Energy Markets within a Reg-ARIMA Framework

Authors: Ilaria Lucrezia Amerise

Abstract:

Electricity markets throughout the world have undergone substantial changes. Accurate, reliable, clear, and comprehensible modeling and forecasting of different variables (loads and prices in the first instance) have become increasingly important. In this paper, we describe the current state of the art, focusing on reg-SARMA methods, which have proven flexible enough to accommodate electricity price/load behavior satisfactorily. More specifically, we discuss: 1) the dichotomy between point and interval forecasts; 2) the difficult choice between stochastic predictors (e.g., climatic variation) and deterministic predictors (e.g., calendar variables); 3) the choice between modelling a single aggregate time series and creating separate, potentially different models of sub-series. The point we wish to emphasize is that prices and loads require different approaches that appear irreconcilable, yet must be reconciled to serve the interests and activities of energy companies.

Keywords: interval forecasts, time series, electricity prices, reg-SARIMA methods

Procedia PDF Downloads 116
4677 Secure Optical Communication System Using Quantum Cryptography

Authors: Ehab AbdulRazzaq Hussein

Abstract:

Quantum cryptography (QC) is an emerging technology for secure key distribution using single-photon transmissions. In contrast to classical cryptographic schemes, the security of QC schemes is guaranteed by the fundamental laws of nature. Their security stems from the impossibility of distinguishing non-orthogonal quantum states with certainty. A potential eavesdropper introduces errors into the transmissions, which can later be discovered by the legitimate participants of the communication. In this paper, a modeling approach is proposed for the BB84 QC protocol using polarization coding. A single-photon source is assumed in the designed models; thus, Eve cannot use a beam-splitting strategy to eavesdrop on the quantum channel. The only eavesdropping strategy available to Eve is intercept/resend. After the quantum transmission phase of the protocol, the quantum bit error rate (QBER) is estimated and compared with a threshold value; if it is above this value, the procedure must be aborted and repeated later.
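The intercept/resend attack and the resulting QBER can be simulated directly (a minimal classical sketch of the BB84 statistics, not the paper's model): Eve measures each photon in a random basis and resends her result; after basis sifting, theory predicts an error rate of about 25% when every photon is intercepted.

```python
import random

random.seed(42)
N = 20000  # number of transmitted photons

def measure(bit, prep_basis, meas_basis):
    # matching bases recover the encoded bit; mismatched bases give a random bit
    return bit if prep_basis == meas_basis else random.randint(0, 1)

errors = matched = 0
for _ in range(N):
    bit = random.randint(0, 1)
    alice_basis = random.randint(0, 1)   # 0: rectilinear, 1: diagonal
    eve_basis = random.randint(0, 1)
    bob_basis = random.randint(0, 1)

    eve_bit = measure(bit, alice_basis, eve_basis)    # Eve intercepts...
    bob_bit = measure(eve_bit, eve_basis, bob_basis)  # ...and resends

    if alice_basis == bob_basis:  # sifting: keep only matching-basis rounds
        matched += 1
        if bob_bit != bit:
            errors += 1

qber = errors / matched
print(round(qber, 3))  # theory: 0.25 when every round is intercepted
```

Comparing the estimated QBER against a threshold well below 25% is what lets the legitimate parties detect this attack and abort.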

Keywords: security, key distribution, cryptography, quantum protocols, quantum cryptography (QC), quantum key distribution (QKD)

Procedia PDF Downloads 384
4676 Probing Mechanical Mechanism of Three-Hinge Formation on a Growing Brain: A Numerical and Experimental Study

Authors: Mir Jalil Razavi, Tianming Liu, Xianqiao Wang

Abstract:

Cortical folding, characterized by convex gyri and concave sulci, is intrinsically related to the brain's functional organization. Understanding the mechanism behind the brain's convoluted patterns can provide useful clues to normal and pathological brain function. During development, the cerebral cortex experiences a noticeable expansion in volume and surface area, accompanied by extensive tissue folding, which may be attributed to many possible factors. Despite decades of endeavor, the fundamental mechanism and key regulators of this crucial process remain incompletely understood. Therefore, as a small step toward unraveling the mystery of brain folding, we present a mechanical model of 3-hinge formation in the growing brain, which has not been addressed before. A 3-hinge is defined as a gyral region where three gyral crests (hinge-lines) join. How and why the brain preferentially develops 3-hinges has not been well explained. We therefore offer a theoretical and computational explanation of the mechanism of 3-hinge formation in a growing brain and validate it with experimental observations. In the theoretical approach, the dynamic behavior of brain tissue is examined and described with the aid of a large-strain, nonlinear constitutive model. The derived constitutive model is used in the computational model to define material behavior. Since the theoretical approach cannot predict the evolution of complex cortical convolutions after instability, nonlinear finite element models are employed to study 3-hinge formation and the secondary morphological folds of the developing brain. Three-dimensional (3D) finite element analyses of a multi-layer soft tissue model that mimics a small piece of the brain are performed to investigate the fundamental mechanism of consistent hinge formation in cortical folding.
Results show that after a certain amount of cortical growth, the mechanical model becomes unstable and, through the formation of creases, enters a new configuration with lower strain energy. With further growth, the shallow creases develop into convoluted patterns and then into 3-hinge patterns. Simulation results for 3-hinges show good agreement with experimental observations from macaque, chimpanzee, and human brain images. These results have great potential to reveal fundamental principles of brain architecture and to produce a unified theoretical framework that convincingly explains the intrinsic relationship between cortical folding and 3-hinge formation. This fundamental understanding would potentially shed new light on the diagnosis of brain disorders such as schizophrenia, autism, lissencephaly, and polymicrogyria.

Keywords: brain, cortical folding, finite element, three hinge

Procedia PDF Downloads 218
4675 A Comparison of Neural Network and DOE-Regression Analysis for Predicting Resource Consumption of Manufacturing Processes

Authors: Frank Kuebler, Rolf Steinhilper

Abstract:

Artificial neural networks (ANN) and Design of Experiments (DOE) based regression analysis (RA) are widely used for modeling complex systems. Both methodologies are commonly applied in process and quality control of manufacturing processes. Because resource efficiency has become a critical concern for manufacturing companies, these models need to be extended to predict the resource consumption of manufacturing processes. This paper describes an approach using neural networks as well as DOE-based regression analysis to predict the resource consumption of manufacturing processes, and compares the achievable results based on an industrial case study of a turning process.

Keywords: artificial neural network, design of experiments, regression analysis, resource efficiency, manufacturing process

Procedia PDF Downloads 506
4674 Solutions of Fractional Reaction-Diffusion Equations Used to Model the Growth and Spreading of Biological Species

Authors: Kamel Al-Khaled

Abstract:

Reaction-diffusion equations are commonly used in population biology to model the spread of biological species. In this paper, we propose a fractional reaction-diffusion equation in which the classical second-derivative diffusion term is replaced by a fractional derivative of order less than two. Using the symbolic computation system Mathematica, the Adomian decomposition method, developed for fractional differential equations, is directly extended to derive explicit and numerical solutions of space-fractional reaction-diffusion equations. The fractional derivative is described in the Caputo sense. Finally, the recent appearance of fractional reaction-diffusion equations as models in fields such as cell biology, chemistry, physics, and finance makes it important to apply the results reported here to numerical examples.
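For concreteness, the space-fractional model described can be written (notation assumed here, following the standard Caputo convention; the reaction term f is unspecified in the abstract) as:

```latex
\frac{\partial u}{\partial t} = d \, \frac{\partial^{\alpha} u}{\partial x^{\alpha}} + f(u),
\qquad 1 < \alpha \le 2,
```

where the space-fractional derivative is understood in the Caputo sense,

```latex
{}^{C}D_{x}^{\alpha} u(x) = \frac{1}{\Gamma(n-\alpha)}
\int_{0}^{x} \frac{u^{(n)}(s)}{(x-s)^{\alpha+1-n}} \, ds,
\qquad n-1 < \alpha \le n,
```

with n = 2 for the diffusion term above, and the classical reaction-diffusion equation recovered at α = 2. A typical population-biology choice would be a Fisher-type reaction term f(u) = r u(1 - u).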

Keywords: fractional partial differential equations, reaction-diffusion equations, Adomian decomposition, biological species

Procedia PDF Downloads 354
4673 A Systematic Review of Situational Awareness and Cognitive Load Measurement in Driving

Authors: Aly Elshafei, Daniela Romano

Abstract:

With the development of autonomous vehicles, a human-machine interaction (HMI) system is needed for a safe transition of control when a takeover request (TOR) is required. An important part of the HMI system is the ability to monitor the level of situational awareness (SA) of any driver in real time, in different scenarios, and without any pre-calibration. The purpose of this systematic review is to present state-of-the-art machine learning models used to measure SA and to investigate the limitations of each type of sensor, the gaps, and the sensors and computational models best suited to driving applications. To the authors' best knowledge, this is the first literature review identifying online and offline classification methods used to measure SA, explaining which measurements are subject- or session-specific, and how many classifications can be performed with each classification model. This information can be very useful for researchers measuring SA in identifying the most suitable model for different applications.

Keywords: situational awareness, autonomous driving, gaze metrics, EEG, ECG

Procedia PDF Downloads 103
4672 Automated 3D Segmentation System for Detecting Tumor and Its Heterogeneity in Patients with High Grade Ovarian Epithelial Cancer

Authors: Dimitrios Binas, Marianna Konidari, Charis Bourgioti, Lia Angela Moulopoulou, Theodore Economopoulos, George Matsopoulos

Abstract:

High-grade ovarian epithelial cancer (OEC) is a fatal gynecological cancer, and the poor prognosis of this entity is closely related to considerable intratumoral genetic heterogeneity. By examining imaging data, it is possible to assess the heterogeneity of tumorous tissue. This study proposes a methodology for aligning, segmenting, and finally visualizing information from various magnetic resonance imaging series in order to construct 3D heterogeneity maps of the same tumor in OEC patients. The proposed system may be used as an adjunct digital tool by health professionals for personalized medicine, as it allows an easy visual assessment of the heterogeneity of the examined tumor.

Keywords: image segmentation, ovarian epithelial cancer, quantitative characteristics, image registration, tumor visualization

Procedia PDF Downloads 190
4671 Technology, Organizational and Environmental Determinants of Business Intelligence Systems Adoption in Croatian SME: A Case Study of Medium-Sized Enterprise

Authors: Ana-Marija Stjepić, Luka Sušac, Dalia Suša Vugec

Abstract:

In the last few years, examples from the scientific literature and business practice have shown that the adoption of technological innovations increases enterprise performance. Recently, in the field of information technology innovation, business intelligence systems (BISs) have drawn significant attention in scientific circles. BISs can be understood as a form of technological innovation that can bring certain benefits to the organizations adopting them. The aim of this paper is therefore twofold: (1) to define the determinants of successful BIS adoption in small and medium enterprises, and thus contribute to this neglected research area, and (2) to present the current state of BIS adoption in small and medium-sized companies. To this end, the determinants are defined and classified into three dimensions according to the Technology-Organization-Environment (TOE) theoretical framework, which describes the impact of each dimension on the adoption of technological innovations. Moreover, the paper presents a case study of BIS adoption in practice within an organization from the tertiary (service) sector. Based on the results of the study, guidelines for more efficient, faster, and easier BIS adoption are presented.

Keywords: adoption, business intelligence, business intelligence systems, case study, TOE framework

Procedia PDF Downloads 126
4670 An Artificial Intelligence Framework to Forecast Air Quality

Authors: Richard Ren

Abstract:

Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and will cost world economies $2.6 trillion by 2060 due to sick days, healthcare costs, and reduced productivity. In the United States alone, 60,000 premature deaths are caused by poor air quality. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (e.g., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to mitigate the inaccuracies, weaknesses, and biases of any one individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of this framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall 92% model accuracy. The combined model predicts more accurately than any of the individual models and reliably forecasts season-based variations in air quality levels. Top air quality predictor variables were identified through the measurement of mean decrease in accuracy.
This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework leveraging multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions in the form of a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and allowing governments to implement effective pollution control measures.
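The model-averaging step described above can be sketched as follows (a hypothetical illustration; the stand-in models and feature values are invented, not the study's trained models):

```python
def ensemble_predict(models, features):
    # average the numeric predictions of the top-performing models
    predictions = [model(features) for model in models]
    return sum(predictions) / len(predictions)

# three stand-in models predicting an air quality index
# from hypothetical features (temperature C, wind m/s, yesterday's PM2.5)
model_a = lambda f: 50 + 2.0 * f[2] - 1.0 * f[1]
model_b = lambda f: 48 + 2.2 * f[2] - 0.8 * f[1]
model_c = lambda f: 52 + 1.8 * f[2] - 1.2 * f[1]

features = (24.0, 5.0, 12.0)
aqi = ensemble_predict([model_a, model_b, model_c], features)
print(round(aqi, 1))  # → 69.0
```

Averaging smooths out individual model biases, which is why the combined prediction can beat every member of the ensemble.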

Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms

Procedia PDF Downloads 102