Search results for: sales process ARIMA models
A Pedagogical Case Study on Consumer Decision Making Models: A Selection of Smart Phone Apps
Authors: Yong Bum Shin
Abstract:
This case focuses on the Weighted Additive Difference, Conjunctive, Disjunctive, and Elimination by Aspects methodologies in consumer decision-making models, and on the Simple Additive Weighting (SAW) approach in the multi-criteria decision-making (MCDM) area. Most decision-making models illustrate that the rank reversal phenomenon is unpreventable. This paper shows that rank reversal occurs in popular managerial methods such as Weighted Additive Difference (WAD), the Conjunctive Method, the Disjunctive Method, and Elimination by Aspects (EBA), as well as in MCDM methods such as Simple Additive Weighting (SAW), and finally presents the Unified Commensurate Multiple (UCM) model, which successfully addresses these rank reversal problems in the most popular MCDM methods in the decision-making area.
Keywords: multiple criteria decision making, rank inconsistency, unified commensurate multiple, analytic hierarchy process
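As an illustrative aside (not taken from the paper), the following minimal Python sketch reproduces the rank-reversal effect in SAW with invented scores for two hypothetical apps on two benefit criteria: adding a third alternative changes the column maxima used for normalization and flips the ranking of the first two.

```python
# Minimal illustration of rank reversal in Simple Additive Weighting (SAW).
# Scores and weights are invented for demonstration, not taken from the paper.
import numpy as np

def saw_scores(matrix, weights):
    """Normalize each benefit criterion by its column maximum, then take the
    weighted sum. Rankings therefore depend on the alternative set itself."""
    normalized = matrix / matrix.max(axis=0)
    return normalized @ weights

weights = np.array([0.5, 0.5])
original = np.array([[8.0, 5.0],    # app A
                     [6.0, 7.0]])   # app B

extended = np.vstack([original, [1.0, 14.0]])  # add app C (strong on criterion 2 only)

print("Scores (A, B):   ", saw_scores(original, weights))        # B ranks above A
print("Scores (A, B, C):", saw_scores(extended, weights)[:2])    # now A ranks above B
```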
Impact of Artificial Intelligence Technologies on Information-Seeking Behaviors and the Need for a New Information Seeking Model
Authors: Mohammed Nasser Al-Suqri
Abstract:
Existing information-seeking models were proposed more than two decades ago, prior to the digital information era and Artificial Intelligence (AI) technologies. The lack of current information-seeking models within Library and Information Studies (LIS) has resulted in fewer advancements in teaching students about information-seeking behaviors and in the design of library tools and services. To better address these concerns, this study aims to propose a state-of-the-art model focusing on the information-seeking behavior of library users in the Sultanate of Oman. The study aims to develop, design, and contextualize a real-time, user-centric information-seeking model capable of enhancing information needs and information usage while incorporating critical insights for digital library practices. A further aim is to establish a far-sighted frame of reference covering AI while synthesizing digital resources and information for optimizing information-seeking behavior. The study is empirically designed around a mixed-method process flow, technical surveys, in-depth interviews, focus group evaluations, and stakeholder investigations. The data pool consists of users and specialist LIS staff at 4 public libraries and 26 academic libraries in Oman. The designed research model is expected to support LIS by providing multi-dimensional insights with AI integration for redefining the information-seeking process and developing a technology-rich model.
Keywords: artificial intelligence, information seeking, information behavior, information seeking models, libraries, Sultanate of Oman
A Large Language Model-Driven Method for Automated Building Energy Model Generation
Authors: Yake Zhang, Peng Xu
Abstract:
The development of building energy models (BEM) required for architectural design and analysis is a time-consuming and complex process, demanding a deep understanding and proficient use of simulation software. To streamline the generation of complex building energy models, this study proposes an automated method for generating building energy models using a large language model and a BEM library, aimed at improving the efficiency of model generation. The method leverages a large language model to parse user-specified requirements for target building models, extracting key features such as building location, window-to-wall ratio, and thermal performance of the building envelope. The BEM library is used to retrieve energy models that match the target building's characteristics, serving as reference information for the large language model to enhance the accuracy and relevance of the generated model and allowing the creation of a building energy model that adapts to the user's modeling requirements. This study enables the automatic creation of building energy models from natural language inputs, reducing the professional expertise required for model development while significantly decreasing the time and complexity of manual configuration. In summary, this study provides an efficient and intelligent solution for building energy analysis and simulation, demonstrating the potential of large language models in the field of building simulation and performance modeling.
Keywords: artificial intelligence, building energy modelling, building simulation, large language model
Characterising the Dynamic Friction in the Staking of Plain Spherical Bearings
Authors: Jacob Hatherell, Jason Matthews, Arnaud Marmier
Abstract:
Anvil staking is a cold-forming process used in the assembly of plain spherical bearings into a rod-end housing. This process ensures that the bearing outer lip conforms to the chamfer in the matching rod end to produce a lightweight mechanical joint with sufficient strength to meet the push-out load requirement of the assembly. Finite Element (FE) analysis is used extensively to predict the behaviour of metal flow in cold-forming processes to support industrial manufacturing and product development. Ongoing research aims to validate FE models across a wide range of bearing and rod-end geometries by systematically isolating and understanding the uncertainties caused by variations in material properties, load-dependent friction coefficients, and strain-rate sensitivity. The improved confidence in these models aims to eliminate the costly and time-consuming experimental trials required when introducing new bearing designs. Previous literature has shown that friction coefficients do not remain constant during cold-forming operations; however, the understanding of this phenomenon varies significantly and is rarely implemented in FE models. In this paper, a new approach to evaluating the normal contact pressure versus friction coefficient relationship is outlined, using friction calibration charts generated via iterative FE models and ring compression tests. Compared to previous research, this new approach greatly improves the prediction of the formed geometry and the forming load during the staking operation. The paper also aims to standardise the FE approach to modelling ring compression tests and determining friction calibration charts.
Keywords: anvil staking, finite element analysis, friction coefficient, spherical plain bearing, ring compression tests
Enhancing Model Interoperability and Reuse by Designing and Developing a Unified Metamodel Standard
Authors: Arash Gharibi
Abstract:
Mankind has always used models to solve problems. Essentially, models are simplified versions of reality, and the need for them stems from having to deal with complexity; many processes or phenomena are too complex to be described completely. A fundamental model requirement is therefore that it contains the characteristic features that are essential in the context of the problem to be solved or described. Models are used in virtually every scientific domain to deal with various problems. During recent decades, the number of models has increased exponentially. Publication of models as part of original research has traditionally been in scientific periodicals, series, monographs, agency reports, national journals, and laboratory reports. This makes it difficult for interested groups and communities to stay informed about the state of the art. During the modeling process, many important decisions are made which impact the final form of the model. Without a record of these considerations, the final model remains ill-defined and open to varying interpretations. Unfortunately, the details of these considerations are often lost, or where information about a model does exist, it is likely to be written intuitively in different layouts and at different levels of detail. To overcome these issues, different domains have attempted to implement their own approaches to preserving model information in the form of model documentation. The most frequently cited model documentation approaches show that they are domain-specific, are not applicable to existing models, and do not allow evolutionary flexibility or intrinsic corrections and improvements. These issues all stem from the lack of a unified standard for model documentation. As a way forward, this research proposes a new standard for capturing and managing model information in a unified way so that interoperability and reusability of models become possible. This standard will also be evolutionary, meaning members of the modeling realm could contribute to its ongoing development and improvement. In this paper, three of the most common metamodels are reviewed and, according to the pros and cons of each, a new metamodel is proposed.
Keywords: metamodel, modeling, interoperability, reuse
Importance of Hardware Systems and Circuits in Secure Software Development Life Cycle
Authors: Mir Shahriar Emami
Abstract:
Although it is impossible to ensure that a software system is completely secure, developing an acceptably secure software system on a suitable platform is not unreachable. In this paper, we analyze software development life cycle (SDLC) models from the point of view of hardware systems and circuits. To date, SDLC models have paid attention to software security only from the software perspective. In this paper, we present new features for SDLC stages that emphasize the role of hardware systems and circuits in developing secure software throughout the development stages, a point that has not been considered previously in SDLC models.
Keywords: SDLC, SSDLC, software security, software process engineering, hardware systems and circuits security
Optimization of the Fabrication Process for Particleboards Made from Oil Palm Fronds Blended with Empty Fruit Bunch Using Response Surface Methodology
Authors: Ghazi Faisal Najmuldeen, Wahida Amat-Fadzil, Zulkafli Hassan, Jinan B. Al-Dabbagh
Abstract:
The objective of this study was to evaluate the optimum fabrication process variables for producing particleboards from oil palm frond (OPF) particles and empty fruit bunch fiber (EFB). Response surface methodology was employed to analyse the effect of hot press temperature (150–190°C), press time (3–7 minutes), and EFB blending ratio (0–40%) on the particleboards' modulus of rupture, modulus of elasticity, internal bonding, water absorption, and thickness swelling. A Box-Behnken experimental design was carried out to develop statistical models used for the optimisation of the fabrication process variables. All factors were found to have a statistically significant effect on particleboard properties. The statistical analysis indicated that all models showed a significant fit with experimental results. The optimum particleboard properties were obtained at the optimal fabrication process conditions: press temperature 186°C, press time 5.7 min, and EFB/OPF ratio 30.4%. Incorporating oil palm fronds and empty fruit bunch to produce particleboards improved the particleboard properties. The OPF–EFB particleboards fabricated at the optimized conditions satisfied the ANSI A208.1–1999 specification for general purpose particleboards.
Keywords: empty fruit bunch fiber, oil palm fronds, particleboards, response surface methodology
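For readers unfamiliar with the response-surface step, the sketch below shows how an already-fitted second-order model in coded factors could be maximized within the experimental region. The coefficients are placeholders, not those reported in the study; mapping coded factors back to physical units (e.g., 150–190°C) is a separate linear rescaling.

```python
# Sketch of the response-surface optimisation step, assuming a second-order
# model has already been fitted to the Box-Behnken data. Coefficients are
# placeholders, not the study's values.
import numpy as np
from scipy.optimize import minimize

def modulus_of_rupture(x):
    """Assumed quadratic response surface in coded factors:
    x = [press temperature, press time, EFB ratio], each scaled to [-1, 1]."""
    b0, b = 14.0, np.array([1.2, 0.8, 0.5])          # intercept, linear terms
    B = np.array([[-1.0, 0.2, 0.1],                  # quadratic/interaction terms
                  [ 0.2, -0.6, 0.0],
                  [ 0.1,  0.0, -0.9]])
    return b0 + b @ x + x @ B @ x

# Maximise the response inside the experimental region (coded bounds [-1, 1]).
result = minimize(lambda x: -modulus_of_rupture(x), x0=np.zeros(3),
                  bounds=[(-1, 1)] * 3, method="L-BFGS-B")
print("Optimal coded factors:", result.x)
print("Predicted modulus of rupture:", -result.fun)
```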
Dividend Policy, Overconfidence and Moral Hazard
Authors: Richard Fairchild, Abdullah Al-Ghazali, Yilmaz Guney
Abstract:
This study analyses the relationship between managerial overconfidence, dividends, and firm value by developing theoretical models that examine the conditions under which the effects of managerial overconfidence on dividends and firm value may be positive or negative. Furthermore, the models incorporate moral hazard, in terms of managerial effort shirking, and the potential for the manager to choose negative-NPV projects due to private benefits. Our models demonstrate that overconfidence can lead to higher dividends (when the manager is overconfident about his current ability) or lower dividends (when the manager is overconfident about his future ability). The models also demonstrate that higher overconfidence may result in an increase or a decrease in firm value. Numerical examples are illustrated for both models, which interestingly support the models' propositions.
Keywords: behavioural corporate finance, dividend policy, overconfidence, moral hazard
Natural Gas Production Forecasts Using Diffusion Models
Authors: Md. Abud Darda
Abstract:
Different options for natural gas production in wide geographic areas may be described through diffusion-of-innovation models. This type of modeling approach provides an indirect estimate of the ultimately recoverable resource (URR), captures the quantitative effects of observed strategic interventions, and allows ex-ante assessments of future scenarios over time. In order to ensure a sustainable energy policy, it is important to forecast the availability of this natural resource. Considering a finite life cycle, this paper investigates the natural gas production of Myanmar and Algeria, two important natural gas providers in the world energy market. A number of homogeneous and heterogeneous diffusion models, with convenient extensions, have been used. Model validation has also been performed in terms of prediction capability.
Keywords: diffusion models, energy forecast, natural gas, nonlinear production
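A minimal sketch of the underlying idea, assuming a simple logistic diffusion curve and synthetic cumulative-production data (the study itself uses a range of homogeneous and heterogeneous diffusion models):

```python
# Estimating the ultimately recoverable resource (URR) by fitting a logistic
# diffusion curve to cumulative production. The data below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def logistic_cumulative(t, urr, k, t0):
    """Cumulative production under simple logistic diffusion."""
    return urr / (1.0 + np.exp(-k * (t - t0)))

years = np.arange(2000, 2020)
true = logistic_cumulative(years, urr=500.0, k=0.35, t0=2012)
observed = true + np.random.default_rng(0).normal(0.0, 5.0, size=years.size)

params, _ = curve_fit(logistic_cumulative, years, observed,
                      p0=[400.0, 0.2, 2010.0])
urr_hat, k_hat, t0_hat = params
print(f"Estimated URR: {urr_hat:.1f}, growth rate: {k_hat:.2f}, midpoint: {t0_hat:.1f}")
```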
Influence of Radio Frequency Identification Technology at Cost of Supply Chain as a Driver for the Generation of Competitive Advantage
Authors: Mona Baniahmadi, Saied Haghanifar
Abstract:
Radio Frequency Identification (RFID) is regarded as a promising technology for the optimization of supply chain processes, since it improves manufacturing and retail operations ranging from demand forecasting and planning to inventory management and distribution. This study aims to explain the RFID technology, how it can concretely be used for supply chain management, and how it can help improve it in the case of Hejrat Company, which is located in Iran and distributes medical drugs and cosmetics. The study uses statistical analysis to estimate the expected benefits of an integrated RFID system on the supply chain, where competitive advantage increases as the cost factor decreases. It investigates how the cost of the storage process, labor cost, the cost of missing goods, inventory management optimization, on-time delivery, order cost, lost sales, and supply process optimization affect the performance of the integrated RFID supply chain with regard to cost factors and provide a competitive advantage.
Keywords: cost, competitive advantage, radio frequency identification, supply chain
An Analytical Survey of Construction Changes: Gaps and Opportunities
Authors: Ehsan Eshtehardian, Saeed Khodaverdi
Abstract:
This paper surveys the studies on construction change and reveals some potential future works. A full-scale investigation of the change literature, including change definitions, types, causes and effects, and change management systems, is carried out to explore coming change trends. The critical works in each section are selected in order to construct a true timeline of construction changes. The findings show that the leap from best-practice guides in the late 1990s and generic process models in the early 2000s to very advanced modeling environments in the mid-2000s and early 2010s has left gaps, as well as opportunities, for change researchers to develop models that are easier to apply. Another finding is that there is a compelling similarity between change and risk prediction models. Therefore, integrating these two concepts, specifically from a proactive management point of view, may lead to a synergy and help project teams avoid rework. The findings also show that exploiting cause-effect relationship models in order to facilitate dispute resolution seems to be an interesting field for future work.
Keywords: construction change, change management systems, dispute resolutions, change literature
Evaluating Traffic Congestion Using the Bayesian Dirichlet Process Mixture of Generalized Linear Models
Authors: Ren Moses, Emmanuel Kidando, Eren Ozguven, Yassir Abdelrazig
Abstract:
This study applied traffic speed and occupancy to develop clustering models that identify different traffic conditions. In particular, these models are based on the Dirichlet Process Mixture of Generalized Linear regression (DML) and change-point regression (CR). The model frameworks were implemented using 2015 historical traffic data aggregated at a 15-minute interval from an Interstate 295 freeway in Jacksonville, Florida. Using the deviance information criterion (DIC) to identify the appropriate number of mixture components, three traffic states were identified: free-flow, transitional, and congested conditions. Results of the DML revealed that traffic occupancy is statistically significant in influencing the reduction of traffic speed in each of the identified states. The influence on the free-flow and congested states was estimated to be higher than on the transitional flow condition in both the evening and morning peak periods. Estimation of the critical speed threshold using CR revealed that 47 mph and 48 mph are the speed thresholds for the congested and transitional traffic conditions during the morning peak hours and evening peak hours, respectively. Free-flow speed thresholds for the morning and evening peak hours were estimated at 64 mph and 66 mph, respectively. The proposed approaches will facilitate accurate detection and prediction of traffic congestion for developing effective countermeasures.
Keywords: traffic congestion, multistate speed distribution, traffic occupancy, Dirichlet process mixtures of generalized linear model, Bayesian change-point detection
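As a simplified analogue (not the DML regression used in the study), the sketch below clusters synthetic speed-occupancy observations with a Dirichlet-process Gaussian mixture from scikit-learn to illustrate the state-identification idea:

```python
# Simplified analogue of the traffic-state clustering: a Dirichlet-process
# Gaussian mixture over (speed, occupancy) observations. The paper couples the
# DP prior with generalized linear regression; this sketch only shows the
# state-identification idea on synthetic 15-minute data.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(1)
free_flow    = np.column_stack([rng.normal(66, 3, 300), rng.normal(8, 2, 300)])
transitional = np.column_stack([rng.normal(52, 4, 150), rng.normal(18, 4, 150)])
congested    = np.column_stack([rng.normal(35, 5, 150), rng.normal(35, 6, 150)])
data = np.vstack([free_flow, transitional, congested])  # [speed mph, occupancy %]

model = BayesianGaussianMixture(
    n_components=6,                                   # upper bound on states
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(data)

labels = model.predict(data)
for state in np.unique(labels):
    speed, occ = data[labels == state].mean(axis=0)
    print(f"state {state}: mean speed {speed:.1f} mph, mean occupancy {occ:.1f}%")
```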
System Identification and Quantitative Feedback Theory Design of a Lathe Spindle
Authors: M. Khairudin
Abstract:
This paper investigates system identification and quantitative feedback theory (QFT) design for the robust control of a lathe spindle. The dynamics of the lathe spindle are uncertain and time-varying due to variation of the cutting depth during the cutting process. System identification was used to obtain a dynamic model of the lathe spindle. In this work, real-time system identification is used to construct a linear model of the system from the nonlinear system. These linear models and their uncertainty bounds can then be used for controller synthesis. The real-time nonlinear system identification process yields a set of linear models of the lathe spindle that represents the operating ranges of the dynamic system. With a selected input signal, the output response data are acquired and nonlinear system identification is performed using Matlab to obtain a linear model of the system. Practical design steps are presented in which the QFT-based conditions are formulated to obtain a compensator and pre-filter to control the lathe spindle. The performance of the proposed controller is evaluated in terms of the velocity responses of the lathe machine spindle, incorporating cutting depth in the cutting process.
Keywords: lathe spindle, QFT, robust control, system identification
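A minimal sketch of the linear system-identification step, assuming a discrete ARX model structure and simulated input-output data in place of real spindle measurements:

```python
# Sketch of the linear system-identification step: fitting a discrete ARX model
# y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] to measured input/output data.
# The "measurements" here are simulated; real spindle velocity data would be
# acquired from the machine as described in the abstract.
import numpy as np

rng = np.random.default_rng(2)
u = rng.standard_normal(500)                    # excitation input
y = np.zeros(500)
for k in range(2, 500):                         # true system used to generate data
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.5 * u[k-1] + 0.01 * rng.standard_normal()

# Least-squares ARX fit: regress y[k] on [y[k-1], y[k-2], u[k-1]].
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("Identified [a1, a2, b1]:", np.round(theta, 3))   # expected close to [1.5, -0.7, 0.5]
```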
Numerical Modelling of Immiscible Fluids Flow in Oil Reservoir Rocks during Enhanced Oil Recovery Processes
Authors: Zahreddine Hafsi, Manoranjan Mishra, Sami Elaoud
Abstract:
Ensuring the maximum recovery rate of oil from reservoir rocks is a challenging task that requires preliminary numerical analysis of the different techniques used to enhance the recovery process. After conventional oil recovery processes, and in order to retrieve the oil left behind after the primary recovery phase, water flooding is one of several techniques used for enhanced oil recovery (EOR). In this research work, EOR via water flooding is numerically modeled, and the hydrodynamic instabilities resulting from immiscible oil-water flow in reservoir rocks are investigated. An oil reservoir is a porous medium consisting of many fractures of tiny dimensions. For modeling purposes, the oil reservoir is considered as a collection of capillary tubes, which provides useful insights into how fluids behave in the reservoir pore spaces. Equations governing oil-water flow in oil reservoir rocks are developed and numerically solved following a finite element scheme. Numerical results are obtained using Comsol Multiphysics software. The two-phase Darcy module of COMSOL Multiphysics allows modelling of the imbibition process by the injection of water (as the wetting phase) into an oil reservoir. The Van Genuchten, Brooks-Corey, and Leverett models were considered as retention models, the resulting flow configurations are compared, and the governing parameters are discussed. For the considered retention models it was found that the onset of instabilities, namely the fingering phenomenon, is highly dependent on the capillary pressure as well as on the boundary conditions, i.e., the inlet pressure and the injection velocity.
Keywords: capillary pressure, EOR process, immiscible flow, numerical modelling
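For reference, two of the named retention models can be written down in a few lines; the parameter values below are illustrative, not those used in the simulations:

```python
# Capillary pressure as a function of effective water saturation for two of the
# retention models named in the abstract. Parameter values are illustrative only.
import numpy as np

def brooks_corey(se, entry_pressure=2.0e3, lam=2.0):
    """Brooks-Corey: Pc = Pe * Se^(-1/lambda), Pc in Pa."""
    return entry_pressure * se ** (-1.0 / lam)

def van_genuchten(se, alpha=2.0e-4, n=2.5):
    """Van Genuchten: Pc = (1/alpha) * (Se^(-1/m) - 1)^(1/n), with m = 1 - 1/n."""
    m = 1.0 - 1.0 / n
    return (1.0 / alpha) * (se ** (-1.0 / m) - 1.0) ** (1.0 / n)

se = np.linspace(0.1, 0.99, 5)          # effective wetting-phase saturation
print("Se:              ", np.round(se, 2))
print("Brooks-Corey Pc: ", np.round(brooks_corey(se)))
print("van Genuchten Pc:", np.round(van_genuchten(se)))
```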
The Case for Strategic Participation: How Facilitated Engagement Can Be Shown to Reduce Resistance and Improve Outcomes Through the Use of Strategic Models
Authors: Tony Mann
Abstract:
This paper sets out the case for involving and engaging employees, workers, stakeholders, and staff in any significant change being considered by the senior executives of the organization. It establishes the rationale, the approach, the methodology of engagement, and the benefits of a participative approach. It challenges the new norm of imposing change for fear of resistance and instead suggests that involving people produces better outcomes and a longer-lasting impact. Various strategic models are introduced and illustrated to explain how the process can be most effective. The paper highlights one model in particular (the Process Iceberg® Organizational Change model) that has proven to be instrumental in developing effective change. Its use is demonstrated in its various forms, explaining why so much change fails to address the key elements and how we can be more productive in managing change. 'Participation' in change is too often seen as negative, expensive, and unwieldy. The paper aims to show that another model, UIA=O+E, can offset the difficulties and, in fact, produce much more positive and effective change.
Keywords: facilitation, stakeholders, buy-in, digital workshops
Reliability Estimation of Bridge Structures with Updated Finite Element Models
Authors: Ekin Ozer
Abstract:
Assessment of structural reliability is essential for efficient use of civil infrastructure that is subjected to hazardous events. Dynamic analysis of finite element models is a commonly used tool to simulate structural behavior and estimate performance accordingly. However, theoretical models based purely on preliminary assumptions and design drawings may deviate from the actual behavior of the structure. This study proposes an up-to-date reliability estimation procedure which uses actual bridge vibration data to modify finite element models, performing finite element model updating followed by reliability estimation. The proposed method utilizes vibration response measurements of bridge structures to identify modal parameters, then uses these parameters to calibrate finite element models which are originally based on design drawings. The proposed method not only shows that reliability estimates based on updated models differ from those of the original models, but also infers that non-updated models may overestimate the structural capacity.
Keywords: earthquake engineering, engineering vibrations, reliability estimation, structural health monitoring
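A toy single-degree-of-freedom illustration of the updating idea, with invented numbers: a stiffness is calibrated so the model frequency matches an identified frequency, and a simple Monte Carlo failure probability is compared before and after updating. All quantities (mass, stiffness, load distribution, drift limit) are assumptions for illustration only.

```python
# Toy sketch: calibrate one stiffness parameter so the model's natural frequency
# matches the frequency identified from vibration data, then compare a crude
# reliability estimate before and after updating. All numbers are invented.
import numpy as np

mass = 2.0e5                     # kg, assumed known
k_design = 8.0e7                 # N/m, stiffness from design drawings
f_measured = 2.85                # Hz, identified from bridge vibration data

def natural_frequency(k, m):
    return np.sqrt(k / m) / (2.0 * np.pi)

# Model updating: choose k so the predicted frequency matches the measured one.
k_updated = mass * (2.0 * np.pi * f_measured) ** 2

def failure_probability(k, n=200_000, seed=0):
    """Monte Carlo estimate of P(displacement demand > capacity) under a random
    lateral load; a stand-in for the full reliability analysis."""
    rng = np.random.default_rng(seed)
    load = rng.lognormal(mean=np.log(1.5e6), sigma=0.4, size=n)   # N
    capacity = 0.05                                               # m, drift limit
    return np.mean(load / k > capacity)

print(f"f(design model) = {natural_frequency(k_design, mass):.2f} Hz, "
      f"f(updated) = {natural_frequency(k_updated, mass):.2f} Hz")
print(f"P_f design model:  {failure_probability(k_design):.4f}")
print(f"P_f updated model: {failure_probability(k_updated):.4f}")
```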
Planning a Supply Chain with Risk and Environmental Objectives
Authors: Ghanima Al-Sharrah, Haitham M. Lababidi, Yusuf I. Ali
Abstract:
The main objective of the current work is to introduce sustainability factors into the optimization of a supply chain model for process industries. Supply chain models are normally based on purely economic considerations related to costs and profits. To account for sustainability, two additional factors have been introduced: environment and risk. A supply chain for an entire petroleum organization has been considered for implementing and testing the proposed optimization models. The environmental and risk factors were introduced as indicators reflecting the anticipated impact of the optimal production scenarios on sustainability. The aggregation method used to extend the single-objective function to a multi-objective function proves quite effective in balancing the contribution of each objective term. The results indicate that introducing the sustainability factors would slightly reduce the economic benefit while improving the environmental and risk-reduction performance of the process industries.
Keywords: environmental indicators, optimization, risk, supply chain
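A minimal sketch of the weighted-sum aggregation, assuming a toy three-route planning problem with invented cost, environmental, and risk coefficients (not the petroleum case study's data):

```python
# Weighted-sum aggregation of cost, environmental and risk objectives for a
# toy three-route planning problem. Coefficients are illustrative only.
import numpy as np
from scipy.optimize import linprog

# Per-unit cost, environmental indicator and risk indicator for three routes.
cost = np.array([10.0, 12.0, 9.0])
env  = np.array([ 4.0,  2.0, 6.0])
risk = np.array([ 1.0,  3.0, 2.0])

w_cost, w_env, w_risk = 0.5, 0.3, 0.2          # aggregation weights (sum to 1)
objective = w_cost * cost + w_env * env + w_risk * risk

# Constraints: total production meets demand, each route limited by capacity.
demand = 100.0
A_eq, b_eq = np.ones((1, 3)), np.array([demand])
bounds = [(0.0, 60.0)] * 3                     # route capacities

result = linprog(objective, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
print("Production per route:", np.round(result.x, 1))
print("Aggregated objective:", round(result.fun, 1))
```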
Efficient Deep Neural Networks for Real-Time Strawberry Freshness Monitoring: A Transfer Learning Approach
Authors: Mst. Tuhin Akter, Sharun Akter Khushbu, S. M. Shaqib
Abstract:
A real-time system architecture is highly effective for monitoring and detecting various damaged products or fruits that may deteriorate over time or become infected with diseases. Deep learning models have proven to be effective in building such architectures. However, building a deep learning model from scratch is a time-consuming and costly process. A more efficient solution is to utilize deep neural network (DNN) based transfer learning models in the real-time monitoring architecture. This study focuses on using a novel strawberry dataset to develop effective transfer learning models for the proposed real-time monitoring system architecture, specifically for evaluating and detecting strawberry freshness. Several state-of-the-art transfer learning models were employed, and the best performing model was found to be Xception, demonstrating higher performance across evaluation metrics such as accuracy, recall, precision, and F1-score.
Keywords: strawberry freshness evaluation, deep neural network, transfer learning, image augmentation
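A sketch of a typical Xception transfer-learning setup of the kind described; the directory name, image size, number of classes, and training settings are assumptions for illustration and may differ from the study's configuration:

```python
# Sketch of transfer learning with Xception as a frozen backbone.
# "strawberries/train", image size and hyper-parameters are assumptions.
import tensorflow as tf

IMG_SIZE = (299, 299)
NUM_CLASSES = 2                    # e.g. fresh vs. not fresh (assumed)

train_ds = tf.keras.utils.image_dataset_from_directory(
    "strawberries/train", image_size=IMG_SIZE, batch_size=32)

base = tf.keras.applications.Xception(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,))
base.trainable = False             # freeze the ImageNet features

inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
x = tf.keras.applications.xception.preprocess_input(inputs)
x = base(x, training=False)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0.3)(x)
outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, epochs=5)
```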
Detection of Chaos in General Parametric Model of Infectious Disease
Authors: Javad Khaligh, Aghileh Heydari, Ali Akbar Heydari
Abstract:
Mathematical epidemiological models for the spread of disease through a population are used to predict the prevalence of a disease or to study the impacts of treatment or prevention measures. Initial conditions for these models are measured from statistical data collected from a population. Since these initial conditions can never be exact, the presence of chaos in mathematical models has serious implications for the accuracy of the models as well as for how epidemiologists interpret their findings. This paper confirms the chaotic behavior of a model for dengue fever and an SI model by investigating sensitive dependence, bifurcation, and the 0-1 test under a variety of initial conditions.
Keywords: epidemiological models, SEIR disease model, bifurcation, chaotic behavior, 0-1 test
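A compact implementation of the 0-1 test for chaos, applied here to the logistic map as a stand-in time series rather than to the epidemic models studied in the paper; K near 1 indicates chaos, K near 0 indicates regular dynamics:

```python
# Basic Gottwald-Melbourne 0-1 test, demonstrated on the logistic map.
import numpy as np

def zero_one_test(series, n_c=20, seed=0):
    rng = np.random.default_rng(seed)
    series = np.asarray(series, dtype=float)
    N = series.size
    n_cut = N // 10                              # displacement times used
    ks = []
    for c in rng.uniform(np.pi / 5, 4 * np.pi / 5, n_c):   # avoid resonant c
        j = np.arange(1, N + 1)
        p = np.cumsum(series * np.cos(j * c))
        q = np.cumsum(series * np.sin(j * c))
        M = np.array([np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
                      for n in range(1, n_cut)])
        ks.append(np.corrcoef(np.arange(1, n_cut), M)[0, 1])
    return np.median(ks)

def logistic_map(r, n=3000, x0=0.3):
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = r * x[i - 1] * (1.0 - x[i - 1])
    return x[500:]                               # discard transient

print("K (r=3.9, chaotic): ", round(zero_one_test(logistic_map(3.9)), 2))
print("K (r=3.5, periodic):", round(zero_one_test(logistic_map(3.5)), 2))
```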
Quantum Kernel Based Regressor for Prediction of Non-Markovianity of Open Quantum Systems
Authors: Diego Tancara, Raul Coto, Ariel Norambuena, Hoseein T. Dinani, Felipe Fanchini
Abstract:
Quantum machine learning is a growing research field that aims to perform machine learning tasks assisted by a quantum computer. Kernel-based quantum machine learning models are paradigmatic examples where the kernel involves quantum states, and the Gram matrix is calculated from the overlap between these states. With the kernel at hand, a regular machine learning model is used for the learning process. In this paper we investigate quantum support vector machine and quantum kernel ridge models to predict the degree of non-Markovianity of a quantum system. We perform digital quantum simulation of amplitude damping and phase damping channels to create our quantum dataset. We elaborate on different kernel functions to map the data and on kernel circuits to compute the overlap between quantum states. We observe a good performance of the models.
Keywords: quantum, machine learning, kernel, non-markovianity
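A sketch of the kernel-regression step with a precomputed overlap kernel; the parametrised states and the "non-Markovianity" target below are synthetic toys, not the amplitude/phase-damping data used in the paper:

```python
# Kernel ridge regression on a Gram matrix built from state overlaps
# |<psi_i|psi_j>|^2. States and target values are synthetic stand-ins.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(3)

def state(gamma):
    """Toy three-level state parametrised by a damping-like parameter gamma."""
    v = np.array([1.0, gamma, gamma ** 2])
    return v / np.linalg.norm(v)

def overlap_kernel(A, B):
    """K[i, j] = |<psi_i|psi_j>|^2 for states stacked row-wise in A and B."""
    return np.abs(A @ B.conj().T) ** 2

gammas = rng.uniform(0.0, 1.0, 80)
states = np.array([state(g) for g in gammas])
target = np.sin(np.pi * gammas) ** 2            # invented degree of non-Markovianity

train, test = slice(0, 60), slice(60, 80)
K_train = overlap_kernel(states[train], states[train])
K_test = overlap_kernel(states[test], states[train])

model = KernelRidge(alpha=1e-3, kernel="precomputed").fit(K_train, target[train])
pred = model.predict(K_test)
print("Mean absolute error:", round(np.mean(np.abs(pred - target[test])), 3))
```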
Dry Relaxation Shrinkage Prediction of Bordeaux Fiber Using a Feed Forward Neural Network
Authors: Baeza S. Roberto
Abstract:
Knitted fabric suffers dimensional deformation due to transverse stretching and longitudinal tension during the process on rectilinear knitting machines, so a dry relaxation shrinkage procedure and a prefixing thermal treatment are performed to obtain stable conditions in the knitting. This paper presents a dry relaxation shrinkage prediction for Bordeaux fiber using a feed-forward neural network and linear regression models. Six operational shrinkage alternatives were predicted. A comparison of the results was performed, finding that the neural network models offer higher levels of explained variability and better prediction. The presence of different repose periods is included. The models were obtained with the neural network toolbox of Matlab and with Minitab software, using real data from a knitting company in southern Guanajuato. The results allow predicting the dry relaxation shrinkage of each alternative operation.
Keywords: neural network, dry relaxation, knitting, linear regression
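A sketch of the comparison described above using scikit-learn in place of the Matlab/Minitab tools of the study; the knitting variables, their ranges, and the shrinkage relationship are synthetic stand-ins:

```python
# Side-by-side comparison of a feed-forward network and linear regression for
# shrinkage prediction. All data are synthetic, invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
X = rng.uniform([80, 2, 1], [120, 10, 6], size=(300, 3))   # tension, stitch length, repose days
shrinkage = (0.04 * X[:, 0] - 0.3 * X[:, 1] + 0.5 * np.log(X[:, 2])
             + rng.normal(0, 0.2, 300))                     # invented relationship (%)

X_tr, X_te, y_tr, y_te = train_test_split(X, shrinkage, random_state=0)

linear = LinearRegression().fit(X_tr, y_tr)
mlp = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                 random_state=0)).fit(X_tr, y_tr)

print("Linear regression R^2:   ", round(linear.score(X_te, y_te), 3))
print("Feed-forward network R^2:", round(mlp.score(X_te, y_te), 3))
```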
Innovative Methods of Improving Train Formation in Freight Transport
Authors: Jaroslav Masek, Juraj Camaj, Eva Nedeliakova
Abstract:
The paper is focused on an operational model for transporting single wagon consignments on the railway network using two different models of train formation. The paper gives an overview of possibilities for improving the quality of transport services. It deals with two models used in the train formation problem: time-continuous and time-discrete. By applying these models in practice, a transport company can guarantee a higher quality of service and expect an increase in transport performance. The models are also applicable to other transport networks. The models supplement the theoretical problem of train formation with new ways of looking at, and affecting, the organization of wagon flows.
Keywords: train formation, wagon flows, marshalling yard, railway technology
Modelling and Simulation Efforts in Scale-Up and Characterization of Semi-Solid Dosage Forms
Authors: Saurav S. Rath, Birendra K. David
Abstract:
The generic pharmaceutical industry has to operate within strict timelines of product development and scale-up from lab to plant. Hence, detailed product and process understanding and the implementation of appropriate mechanistic modelling and Quality-by-Design (QbD) approaches are imperative in the product life cycle. This work provides example cases of such efforts in topical dosage products. Topical products are typically in the form of emulsions, gels, thick suspensions, or even simple solutions. The efficacy of such products is determined by characteristics like rheology and morphology. Defining and scaling up the right manufacturing process with a given set of ingredients, to achieve the right product characteristics, presents a challenge to the process engineer. For example, the non-Newtonian rheology varies not only with CPPs and CMAs but is also an implicit function of globule size (CQA). Hence, this calls for various mechanistic models to help predict the product behaviour. This paper focuses on such models obtained from computational fluid dynamics (CFD) coupled with population balance modelling (PBM) and constitutive models (like shear and energy density). In the special case of the use of high shear homogenisers (HSHs) for the manufacture of thick emulsions/gels, this work presents some findings on (i) a scale-up algorithm for HSHs using shear strain, a novel scale-up parameter for estimating mixing parameters, (ii) the non-linear relationship between viscosity and the shear imparted into the system, and (iii) the effect of hold time on the rheology of the product. Specific examples of how this approach enabled scale-up across 1L, 10L, 200L, 500L and 1000L scales will be discussed.
Keywords: computational fluid dynamics, morphology, quality-by-design, rheology
Statistical Analysis of Natural Images after Applying ICA and ISA
Authors: Peyman Sheikholharam Mashhadi
Abstract:
Difficulties in analyzing real-world images within the classical image processing and machine vision framework have motivated researchers to consider biology-based vision. It is a common belief that the mammalian visual cortex has been adapted to the statistics of real-world images through the evolutionary process. There are two well-known successful models of mammalian visual cortical cells: Independent Component Analysis (ICA) and Independent Subspace Analysis (ISA). In this paper, we statistically analyze the dependencies which remain in the components after applying these models to natural images. We also investigate the response of feature detectors to gratings with various parameters in order to find the optimal parameters of the feature detectors. Finally, the phase selectivity of the feature detectors in both models is considered.
Keywords: statistics, independent component analysis, independent subspace analysis, phase, natural images
Urban Big Data: An Experimental Approach to Building-Value Estimation Using Web-Based Data
Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin
Abstract:
Current real-estate value estimation, which is difficult for laymen, is usually performed by specialists. This paper presents an automated estimation process, based on big data and machine-learning technology, that calculates the influence of building conditions on real-estate price measurement. The present study analyzed actual building sales sample data for Nonhyeon-dong, Gangnam-gu, Seoul, Korea, measuring the major influencing factors among the various building conditions. Based on that analysis, a prediction model was established and applied using RapidMiner Studio, a graphical user interface (GUI)-based tool for the derivation of machine-learning prototypes. The prediction model is formulated by reference to previous examples; when new examples are applied, it analyses and predicts accordingly. The analysis process discerns the crucial factors affecting price increases by calculation of weighted values. The model was verified, and its accuracy determined, by comparing its predicted values with actual price increases.
Keywords: apartment complex, big data, life-cycle building value analysis, machine learning
Powder Flow with Normalized Powder Particles Size Distribution and Temperature Analyses in Laser Melting Deposition: Analytical Modelling and Experimental Validation
Authors: Muhammad Arif Mahmood, Andrei C. Popescu, Mihai Oane, Diana Chioibascu, Carmen Ristoscu, Ion N. Mihailescu
Abstract:
Powder flow and temperature distributions are recognized as influencing factors in the laser melting deposition (LMD) process that affect not only the consolidation rate but also the characteristics of the deposited layers. Herein, two simplified analytical models are presented to simulate the powder flow, including the powder particle size distribution in Gaussian form under three powder jet nozzles, and the temperature during the LMD process. The output of the first model serves as the input to the second model. The models are validated with experimental data, i.e., the weight measurement method for the powder particle distribution and infrared imaging for the temperature analyses. This study will increase the cost-efficiency of the LMD process by adjusting the operating parameters to reach the optimal powder flow rate and energy. This research has received funds under the Marie Sklodowska-Curie grant agreement No. 764935, from the European Union's Horizon 2020 research and innovation program.
Keywords: laser additive manufacturing, powder particles size distribution in Gaussian form, powder stream distribution, temperature analyses
Providing a Suitable Model for Launching New Home Appliances Products to the Market
Authors: Ebrahim Sabermaash Eshghi, Donna Sandsmark
Abstract:
In the changing economic conditions of the modern world, one of the most important issues facing managers of firms is increasing sales and profitability through the sale of newly developed products. At the same time, reducing unnecessary costs is one of the essential programs of smart managers for adapting to new conditions in current business. In modern business life, uncertainty is dominant across all industries. Accordingly, this research investigates the influence of different aspects of introducing products to the market. The study follows a quantitative-qualitative approach (interviews and questionnaire). In total, 103 informed managers and experts of the Pars-Khazar Company were examined through a census. The validity of the measurement tools was confirmed through expert judgment, and their reliability was established through a Cronbach's alpha coefficient of 0.930; overall, the validity and reliability of the tools were confirmed. Results of the regression test revealed that all aspects of product introduction had a positive and significant influence on product performance. In addition, the influence of two new factors raised in the interviews, namely human resource management and management of the product pre-test, on product performance was confirmed.
Keywords: introducing products, performance, home appliances, price, advertisement, production
The Effects of Transformational Leadership on Process Innovation through Knowledge Sharing
Authors: Sawsan J. Al-Husseini, Talib A. Dosa
Abstract:
Transformational leadership has been identified as the most important factor affecting innovation and knowledge sharing; it leads to increased goal-directed behavior exhibited by followers and thus to enhanced performance and innovation for the organization. However, there is a lack of models linking transformational leadership, knowledge sharing, and process innovation within higher education (HE) institutions in developing countries in general, and in Iraq in particular. This research aims to examine the mediating role of knowledge sharing in the relationship between transformational leadership and process innovation. A quantitative approach was taken and 254 usable questionnaires were collected from public HE institutions in Iraq. Structural equation modelling with AMOS 22 was used to analyze the causal relationships among the factors. The research found that knowledge sharing plays a pivotal role in the relationship between transformational leadership and process innovation, and that transformational leadership would be ideal in an educational context, promoting knowledge sharing activities and influencing process innovation in public HE in Iraq. The research has developed some guidelines for researchers as well as leaders and provides evidence to support the use of transformational leadership to increase process innovation within the HE environment in developing countries, particularly Iraq.
Keywords: transformational leadership, knowledge sharing, process innovation, structural equation modelling, developing countries
The Benefits of End-To-End Integrated Planning from the Mine to Client Supply for Minimizing Penalties
Authors: G. Martino, F. Silva, E. Marchal
Abstract:
Control over the characteristics of the delivered iron ore blend is one of the most important aspects of the mining business. The iron ore price is a function of its composition, which is the outcome of the beneficiation process. End-to-end integrated planning of mine operations can therefore reduce the risk of penalties on the iron ore price. In a standard iron mining company, the production chain is composed of mining, ore beneficiation, and client supply. When mine planning and client supply decisions are made in an uncoordinated way, the beneficiation plant struggles to deliver the best possible blend. Technological improvements in several fields have allowed bridging the gap between departments and boosting integrated decision-making processes. Clusterization and classification algorithms applied to historical production data generate reasonable predictions for the quality and volume of iron ore produced from each pile of run-of-mine (ROM) processed. Mathematical modeling can use those deterministic relations to propose iron ore blends that better fit specifications within a delivery schedule. Additionally, a model capable of representing the whole production chain can clearly compare the overall impact of different decisions in the process. This study shows how flexibilization combined with a planning optimization model spanning the mine and the ore beneficiation processes can reduce the risk of out-of-specification deliveries. The model capabilities are illustrated on a hypothetical iron ore mine with a magnetic separation process. Finally, this study shows ways of reducing cost or increasing profit by optimizing process indicators across the production chain and integrating the different plannings with the sales decisions.
Keywords: clusterization and classification algorithms, integrated planning, mathematical modeling, optimization, penalty minimization
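A toy sketch of the blend-optimization step that such an integrated model coordinates, with invented pile grades, specifications, and costs (not the study's data):

```python
# Choose fractions of three ROM piles so the delivered blend meets Fe and
# silica specifications at minimum cost. All numbers are illustrative.
import numpy as np
from scipy.optimize import linprog

fe     = np.array([66.0, 62.0, 58.0])     # % Fe per pile
silica = np.array([ 2.5,  4.5,  6.5])     # % SiO2 per pile
cost   = np.array([30.0, 22.0, 15.0])     # $/t processing + handling

# Variables: blend fraction taken from each pile (fractions sum to 1).
A_ub = np.array([-fe,            # blend Fe >= 63%   ->  -fe @ x <= -63
                 silica])        # blend SiO2 <= 4.0%
b_ub = np.array([-63.0, 4.0])
A_eq, b_eq = np.ones((1, 3)), np.array([1.0])

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0.0, 1.0)] * 3, method="highs")
print("Blend fractions:", np.round(res.x, 3))
print("Blend Fe %:", round(fe @ res.x, 2), " SiO2 %:", round(silica @ res.x, 2))
```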
SOM Map vs Hopfield Neural Network: A Comparative Study in Microscopic Evacuation Application
Authors: Zouhour Neji Ben Salem
Abstract:
Microscopic evacuation focuses on evacuee behavior and the way evacuees search for a safe place in an egress situation. In recent years, several models have handled the microscopic evacuation problem. Among them, we have proposed Artificial Neural Networks (ANN) as an alternative to mathematical models that can deal with such a problem. In this paper, we present two ANN models, a SOM map and a Hopfield network, used to predict evacuee behavior in a disaster situation. These models are tested on a real case: the evacuation of the second floor of a Tunisian children's hospital in case of fire. The two models are studied and compared in order to evaluate their performance.
Keywords: artificial neural networks, self-organization map, hopfield network, microscopic evacuation, fire building evacuation