Search results for: sales process ARIMA models
20432 Defining the Customers' Color Preference for the Apparel Industry in Terms of Chromaticity Coordinates
Authors: Banu Hatice Gürcüm, Pınar Arslan, Mahmut Yalçın
Abstract:
Fashion designers create lots of dresses, suits, shoes, and other clothing and accessories, which are purchased every year by consumers. Fashion trends, sketches of designs, and accessories affect apparel goods, but colors make the finishing touches to an outfit. In all fields of apparel, including men's, women's, and children's wear, casual wear, suits, sportswear, formal wear, outerwear, maternity, and intimate apparel, color sells. Thus, specialization in color in apparel is a basic concern each season. The perception of color is the key to sales for every sector of the textile business. The mechanism of color perception, cognition in the brain, and color emotion are unique subjects, which scientists have been investigating for many years. The parameters of color may not correspond to visual scales since human emotions induced by color are completely subjective. However, with very few exceptions, each manufacturer determines its top-selling colors for each season through the seasonal sales reports of apparel companies. This paper examines sensory and instrumental methods for quantifying the color of fabrics and investigates the relationship between fabric color and sales numbers. The five top-selling colors for each season from 10 leading apparel companies in the same segment are taken. The compilation is based on the companies' sales over 5 to 10 years. The research's main concern is the correlation between the magnitude of seasonal color sales figures and the CIE chromaticity coordinates. The colors are chosen from the globally accepted Pantone Textile Color System, and the three-dimensional measurement system CIE L*a*b* (CIELAB) is used, with L* representing the degree of lightness of a color, a* the degree of color ranging from magenta to green, and b* the degree of color ranging from blue to yellow. The objective of this paper is to demonstrate the feasibility of relating color perception to a laboratory instrument yielding measurements in the CIELAB system.
Our approach is to obtain a total of a hundred reference fabrics to be measured on a laboratory spectrophotometer calibrated to the CIELAB color system. Relationships between the CIE tristimulus values (X, Y, Z) and CIELAB (L*, a*, b*) are examined and reported herein.
Keywords: CIELAB, CIE tristimulus, color preference, fashion
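The CIELAB quantities described above can be computed directly from CIE tristimulus values. As an illustration only, the sketch below implements the standard CIE 1976 XYZ-to-L*a*b* conversion in Python; the D65 white point used here is an assumption, not stated in the abstract.

```python
def f(t, delta=6/29):
    # piecewise cube-root function used in the CIE 1976 L*a*b* definition
    return t ** (1/3) if t > delta ** 3 else t / (3 * delta ** 2) + 4/29

def xyz_to_lab(X, Y, Z, white=(95.047, 100.0, 108.883)):
    """Convert CIE XYZ tristimulus values to CIELAB (D65 white by default)."""
    Xn, Yn, Zn = white
    fx, fy, fz = f(X / Xn), f(Y / Yn), f(Z / Zn)
    L = 116 * fy - 16      # lightness, 0 (black) to 100 (white)
    a = 500 * (fx - fy)    # green (negative) to magenta (positive)
    b = 200 * (fy - fz)    # blue (negative) to yellow (positive)
    return L, a, b
```

At the white point the conversion yields L* = 100 and a* = b* = 0, matching the interpretation of L* as lightness and a*, b* as the chromatic axes.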
Procedia PDF Downloads 335
20431 Analysis on Prediction Models of TBM Performance and Selection of Optimal Input Parameters
Authors: Hang Lo Lee, Ki Il Song, Hee Hwan Ryu
Abstract:
Accurate prediction of TBM (Tunnel Boring Machine) performance is very difficult, yet it is essential for reliable estimation of the construction period and cost in the preconstruction stage. For this purpose, the aim of this study is to analyze the evaluation process of various prediction models for TBM performance published since 2000, and to select the optimal input parameters for the prediction model. A classification system of TBM performance prediction models and the applied methodology are proposed in this research. The input and output parameters applied in the prediction models are also presented. Based on these results, a statistical analysis is performed using data collected from a shield TBM tunnel in South Korea. By performing a simple regression and residual analysis utilizing the statistical program R, the optimal input parameters are selected. These results are expected to be used for the development of a prediction model of TBM performance.
Keywords: TBM performance prediction model, classification system, simple regression analysis, residual analysis, optimal input parameters
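The study carries out its regression and residual analysis in R. As an illustration only, a minimal Python sketch of that screening step might look like the following; the thrust/penetration numbers are invented.

```python
import numpy as np

def simple_regression(x, y):
    """Ordinary least-squares fit y ~ b0 + b1*x, returning slope, intercept,
    residuals, and R^2 -- the quantities inspected when screening candidate
    input parameters."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    b1 = np.cov(x, y, bias=True)[0, 1] / np.var(x)   # slope from covariance
    b0 = y.mean() - b1 * x.mean()                    # intercept
    resid = y - (b0 + b1 * x)                        # residuals for diagnostics
    r2 = 1 - resid.var() / y.var()                   # coefficient of determination
    return b0, b1, resid, r2

# hypothetical screening: penetration rate vs. thrust force
thrust = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
rate = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
b0, b1, resid, r2 = simple_regression(thrust, rate)
```

A parameter with a high R-squared and structureless residuals would be retained as an optimal input; one whose residuals show a pattern would prompt a different model form.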
Procedia PDF Downloads 309
20430 Comparison of Deep Convolutional Neural Networks Models for Plant Disease Identification
Authors: Megha Gupta, Nupur Prakash
Abstract:
Identification of plant diseases has been performed using machine learning and deep learning models on datasets containing images of healthy and diseased plant leaves. The current study carries out an evaluation of some deep learning models based on convolutional neural network (CNN) architectures for the identification of plant diseases. For this purpose, the publicly available New Plant Diseases Dataset, an augmented version of the PlantVillage dataset available on the Kaggle platform and containing 87,900 images, has been used. The dataset contains images of 26 diseases of 14 different plants and images of 12 healthy plants. The CNN models selected for the study presented in this paper are AlexNet, ZFNet, VGGNet (four models), GoogLeNet, and ResNet (three models). The selected models are trained using PyTorch, an open-source machine learning library, on Google Colaboratory. A comparative study has been carried out to analyze the accuracy achieved by these models. The highest test accuracy and F1-score of 99.59% and 0.996, respectively, were achieved by GoogLeNet with a mini-batch momentum-based gradient descent learning algorithm.
Keywords: comparative analysis, convolutional neural networks, deep learning, plant disease identification
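The two scores used to rank the architectures, test accuracy and F1, can be computed from predictions alone. A minimal NumPy sketch with a toy two-class example follows; the study itself trains the CNNs in PyTorch on the full dataset, which is not reproduced here.

```python
import numpy as np

def accuracy_and_macro_f1(y_true, y_pred, n_classes):
    """Accuracy and macro-averaged F1 over class labels."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    acc = float(np.mean(y_true == y_pred))
    f1s = []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))   # true positives
        fp = np.sum((y_pred == c) & (y_true != c))   # false positives
        fn = np.sum((y_pred != c) & (y_true == c))   # false negatives
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return acc, float(np.mean(f1s))

# toy example: 4 test images, 2 classes
acc, f1 = accuracy_and_macro_f1([0, 0, 1, 1], [0, 1, 1, 1], n_classes=2)
```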
Procedia PDF Downloads 198
20429 The Promotion Effects for a Supply Chain System with a Dominant Retailer
Authors: Tai-Yue Wang, Yi-Ho Chen
Abstract:
In this study, we investigate a two-echelon supply chain with two suppliers and three retailers, among which one retailer dominates the other retailers. A price competition demand function is used to model this dominant retailer, which leads the market. The promotion strategies and negotiation schemes are integrated to form decision-making models under different scenarios. These models are then formulated as different mathematical programming models. Decision variables such as promotional costs, retailer prices, wholesale price, and order quantity are included in these models. Next, the distributions of promotion costs under different cost allocation strategies are discussed. Finally, an empirical example is used to validate our models. The results from this empirical example show that the profit model creates the largest profit for the supply chain but with different profit-sharing results. At the same time, the more risk a member can take, the more profit is distributed to that member in the utility model.
Keywords: supply chain, price promotion, mathematical models, dominant retailer
Procedia PDF Downloads 400
20428 Optimising Post-Process Heat Treatments of Selective Laser Melting-Produced Ti-6Al-4V Parts to Achieve Superior Mechanical Properties
Authors: Gerrit Ter Haar, Thorsten Becker, Deborah Blaine
Abstract:
The Additive Manufacturing (AM) process of Selective Laser Melting (SLM) has seen exponential growth in sales and development in the past fifteen years. Whereas the capability of SLM was initially limited to rapid prototyping, progress in research and development (R&D) has allowed SLM to produce fully functional parts. This technology is still at a primitive stage, and technical knowledge of the vast number of variables influencing final part quality is limited. Ongoing research and development of the sensitive printing process and post-processes is of utmost importance in order to qualify SLM parts to meet international standards. Quality concerns in Ti-6Al-4V manufactured through SLM have been identified, including high residual stresses, part porosity, low ductility, and anisotropic mechanical properties. Whereas significant quality improvements have been made through optimising printing parameters, research indicates as-produced part ductility to be a major limiting factor when compared to its wrought counterpart. This study aims at achieving an in-depth understanding of the underlying links between the microstructure of SLM-produced Ti-6Al-4V and its mechanical properties. Knowledge of the microstructural transformation kinetics of Ti-6Al-4V allows for the optimisation of post-process heat treatments, thereby achieving the required process route to manufacture high-quality SLM-produced Ti-6Al-4V parts. The experimental methods used to evaluate the kinetics of microstructural transformation of SLM Ti-6Al-4V are optical microscopy and electron backscatter diffraction. Results show that a low-temperature heat treatment is capable of transforming the as-produced, martensitic microstructure into a dual-phase microstructure exhibiting both high strength and improved ductility. Furthermore, isotropy of mechanical properties can be achieved through certain annealing routes.
Mechanical properties identical to those of wrought Ti-6Al-4V can, therefore, be achieved through an optimised process route.
Keywords: EBSD analysis, heat treatments, microstructural characterisation, selective laser melting, tensile behaviour, Ti-6Al-4V
Procedia PDF Downloads 421
20427 The Effect of Institutions on Economic Growth: An Analysis Based on Bayesian Panel Data Estimation
Authors: Mohammad Anwar, Shah Waliullah
Abstract:
This study investigated panel data regression models. This paper used Bayesian and classical methods to study the impact of institutions on economic growth using data from 1990-2014, especially in developing countries. Under the classical and Bayesian methodologies, two panel data models were estimated: common effects and fixed effects. For the Bayesian approach, prior information is used in this paper, and a normal-gamma prior is used for the panel data models. The analysis was done through the WinBUGS14 software. The estimated results of the study showed that panel data models are valid models in the Bayesian methodology. In the Bayesian approach, all independent variables had positive and significant effects on the dependent variable. Based on the standard errors of all models, the fixed effect model is the best model in the Bayesian estimation of panel data models, as it has the lowest standard error compared to the other models.
Keywords: Bayesian approach, common effect, fixed effect, random effect, dynamic random effect model
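A normal-gamma prior is conjugate for a normal likelihood, so its posterior can be written in closed form. The sketch below (Python, with illustrative hyperparameters that the abstract does not specify) shows the basic update that MCMC software such as WinBUGS generalizes to full panel models.

```python
import numpy as np

def normal_gamma_update(data, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    """Posterior hyperparameters of a Normal-Gamma prior for an unknown
    normal mean and precision, given an i.i.d. sample."""
    x = np.asarray(data, float)
    n, xbar = len(x), x.mean()
    kappa_n = kappa0 + n                              # updated precision scale
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n        # posterior mean location
    alpha_n = alpha0 + n / 2                          # updated shape
    beta_n = (beta0 + 0.5 * np.sum((x - xbar) ** 2)   # updated rate
              + kappa0 * n * (xbar - mu0) ** 2 / (2 * kappa_n))
    return mu_n, kappa_n, alpha_n, beta_n

mu_n, kappa_n, alpha_n, beta_n = normal_gamma_update([1.0, 1.0, 1.0, 1.0])
```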
Procedia PDF Downloads 68
20426 Experimental Assessment of Micromechanical Models for Mechanical Properties of Recycled Short Fiber Composites
Authors: Mohammad S. Rouhi, Magdalena Juntikka
Abstract:
The processing of polymer fiber composites has a remarkable influence on their mechanical performance. These mechanical properties are even more influenced when using recycled reinforcement. Therefore, we pay particular attention to the evaluation of micromechanical models for estimating the mechanical properties and compare them against the experimental results of the manufactured composites. For the manufacturing process, an epoxy matrix and carbon fiber production cut-offs as reinforcing material are incorporated using a vacuum infusion process. In addition, continuous textile reinforcement in combination with the epoxy matrix is used as a reference material to evaluate the drop in mechanical performance of the recycled composite. The experimental results show less degradation of the composite stiffness compared to the strength properties. Observations from the modeling show the same trend, as the error between the theoretical and experimental results is lower for stiffness comparisons than for the strength calculations. Nevertheless, good mechanical performance for specific applications can be expected from these materials.
Keywords: composite recycling, carbon fibers, mechanical properties, micromechanics
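As an illustration of the kind of micromechanical estimate being evaluated, the simplest such model is the rule of mixtures for longitudinal stiffness. The sketch below is a hedged example only: the property values are invented, and the efficiency factor for short, randomly oriented recycled fibres is an assumption, not a number from the study.

```python
def rule_of_mixtures(Ef, Em, Vf, eta=1.0):
    """Voigt (upper-bound) stiffness estimate for a fibre composite:
    E_c = eta * Vf * Ef + (1 - Vf) * Em,
    where eta is a fibre length/orientation efficiency factor."""
    return eta * Vf * Ef + (1 - Vf) * Em

# hypothetical recycled-carbon/epoxy estimate: 230 GPa fibres, 3 GPa matrix,
# 30% fibre volume fraction, efficiency 0.375 for a random in-plane mat
Ec = rule_of_mixtures(Ef=230.0, Em=3.0, Vf=0.3, eta=0.375)
```

Comparing such closed-form estimates against tensile test data is what reveals the smaller error for stiffness than for strength reported above.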
Procedia PDF Downloads 161
20425 The Impact of Information Technology Monitoring on Employee Theft and Productivity
Authors: Ajayi Oluwasola Felix
Abstract:
This paper examines how firm investments in technology-based employee monitoring impact both misconduct and productivity. We use unique and detailed theft and sales data from 392 restaurant locations from five firms that adopt a theft monitoring information technology (IT) product. We use difference-in-differences (DD) models with staggered adoption dates to estimate the treatment effect of IT monitoring on theft and productivity. We find significant treatment effects in reduced theft and improved productivity that appear to be primarily driven by changed worker behavior rather than worker turnover. We examine four mechanisms that may drive this productivity result: economic and cognitive multitasking, fairness-based motivation, and perceived increases of general oversight. The observed productivity results represent substantial financial benefits to both firms and the legitimate tip-based earnings of workers. Our results suggest that employee misconduct is not solely a function of individual differences in ethics or morality, but can also be influenced by managerial policies that can benefit both firms and employees.
Keywords: information technology, monitoring, misconduct, employee theft
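The difference-in-differences logic reduces, in the simplest 2x2 case, to comparing pre/post changes across treated and control groups. A minimal Python sketch follows; the loss figures are invented, and the actual study estimates staggered-adoption DD regressions rather than this two-period comparison.

```python
import numpy as np

def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """2x2 difference-in-differences: (treated post - treated pre)
    minus (control post - control pre)."""
    return ((np.mean(treat_post) - np.mean(treat_pre))
            - (np.mean(ctrl_post) - np.mean(ctrl_pre)))

# hypothetical weekly theft losses (USD) around monitoring adoption
effect = did_estimate(treat_pre=[110, 95, 105], treat_post=[60, 70, 65],
                      ctrl_pre=[100, 90, 110], ctrl_post=[95, 92, 98])
```

A negative estimate corresponds to the reduction in theft attributed to the monitoring technology, net of the trend observed at non-adopting locations.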
Procedia PDF Downloads 420
20424 The Methods of Customer Satisfaction Measurement and Its Statistical Analysis towards Sales and Logistic Activities in Food Sector
Authors: Seher Arslankaya, Bahar Uludağ
Abstract:
Meeting the needs and demands of customers and pleasing them are important requirements for companies in the food sector, where the growth of competition is significantly unpredictable. Customer satisfaction is also one of the key concepts, driven mainly by a wide range of customer preferences and expectations regarding the products and services introduced and delivered to them. In order to meet customer demands, companies in the food sector are expected to have a well-managed Total Quality Management (TQM) system, which sets out to improve the quality of products and services, reduce costs, and increase customer satisfaction by restructuring traditional management practices. TQM aims to increase customer satisfaction by meeting customer expectations and requirements. The achievement is determined with the help of customer satisfaction surveys, which are conducted to obtain immediate feedback and provide quick responses. In addition, the surveys assist strategic planning, which helps to anticipate customers' future needs and expectations. Meanwhile, periodic measurement of customer satisfaction is a must, because with a better understanding of customers' perceptions from the surveys (conducted via questionnaires), companies have a clear idea of their own strengths and weaknesses, which helps them retain loyal customers, benchmark themselves against their competitors, and map out their future progress and improvement. In this study, we propose a survey-based customer satisfaction measurement method and its statistical analysis for the sales and logistics activities of food firms. Customer satisfaction is discussed in detail. Furthermore, after analysing the questionnaire data collected from customers using the SPSS software, the results obtained from the application are presented.
By also applying an ANOVA test, the study analyses whether meaningful differences exist between customer demographic groups and their perceptions. The purpose of this study is also to identify requirements that help remove the factors that decrease customer satisfaction and help produce loyal customers in the food industry. For this purpose, customer complaints are collected. Additionally, comments and suggestions are made based on the survey results, which would be useful for strategic planning in the food industry.
Keywords: customer satisfaction measurement and analysis, food industry, SPSS, TQM
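The study runs its ANOVA in SPSS; for illustration only, the one-way F statistic it relies on can be computed by hand as the ratio of between-group to within-group mean squares. The Python sketch below uses toy satisfaction scores for two invented demographic groups.

```python
import numpy as np

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    groups = [np.asarray(g, float) for g in groups]
    all_x = np.concatenate(groups)
    grand = all_x.mean()
    k, n = len(groups), len(all_x)
    # variation of group means around the grand mean
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    # variation of observations around their own group mean
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

A large F relative to the F distribution's critical value indicates a meaningful difference in perceptions between the demographic groups.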
Procedia PDF Downloads 249
20423 Variability of Hydrological Modeling of the Blue Nile
Authors: Abeer Samy, Oliver C. Saavedra Valeriano, Abdelazim Negm
Abstract:
The Blue Nile Basin is the most important tributary of the Nile River. Egypt and Sudan are almost entirely dependent on water originating from the Blue Nile. This multi-dependency creates conflicts among the three countries Egypt, Sudan, and Ethiopia, making the management of these conflicts an international issue. A good assessment of the water resources of the Blue Nile is important to help manage such conflicts. Hydrological models are a good tool for such assessment. This paper presents a critical review of the nature and variability of the climate and hydrology of the Blue Nile Basin as a first step toward using hydrological modeling to assess the water resources of the Blue Nile. Several attempts have been made to develop basin-scale hydrological models of the Blue Nile. Lumped and semi-distributed models use averages of meteorological inputs and watershed characteristics in hydrological simulation to analyze runoff for flood control and water resource management. Distributed models include the temporal and spatial variability of catchment conditions and meteorological inputs to allow better representation of the hydrological processes. The main challenge for all of the models used to assess the water resources of the basin is the shortage of the data needed for model calibration and validation. It is recommended to use distributed models for their higher accuracy, to cope with the great variability and complexity of the Blue Nile Basin, and to collect sufficient data to enable more sophisticated and accurate hydrological modeling.
Keywords: Blue Nile Basin, climate change, hydrological modeling, watershed
Procedia PDF Downloads 366
20422 Framework for Developing Change Team to Maximize Change Initiative Success
Authors: Mohammad Z. Ansari, Lisa Brodie, Marilyn Goh
Abstract:
Change facilitators are individuals who utilize change philosophy to make a positive change to organizations. The application of change facilitators can be seen in various change models: Lewin, Lippitt, etc. The facilitators within numerous change models are considered internal/external consultants. Whilst most scholarly papers consider change facilitation as a consensus attempt to improve an organization, there is a lack of a framework that develops both the organization and the change facilitator, creating a self-sustaining change environment. This research paper introduces the development of the framework for change Leaders, Planners, and Executers (LPE), aimed at various organizational levels (process, departmental, and organisational). The LPE framework is derived by exploring interrelated characteristics between facilitator(s) and the organization through qualitative research, drawing the understanding of change management techniques and facilitators' behavioral aspects from existing change management models and the organisational behavior literature. The introduced framework assists in identifying the most appropriate change team to successfully deliver the change initiative within any organization.
Keywords: change initiative, LPE framework, change facilitator(s), sustainable change
Procedia PDF Downloads 196
20421 A Comparative Study of Primary Revenue Sources in the U.S. Professional Sports, Intercollegiate Sports, and Sporting Goods Industry
Authors: Chenghao Ma
Abstract:
This paper examines and compares the primary revenue sources of the professional sports, intercollegiate sports, and sporting goods industries in the U.S. In professional team sports, revenues may come from different sources, including broadcasting rights, ticket sales, corporate partnerships, naming rights, licensed merchandise, luxury suites, club seating, ancillary activities, and transfer fees. Many universities use university budgets and student fees to cover the cost of collegiate athletics. Other sources of revenue include ticket sales, broadcast rights, concessions, corporate partnerships, cash contributions from alumni, and others. Revenues in the sporting goods industry are very different compared with professional sports teams and collegiate athletics: sporting goods companies mainly sell a line of products and equipment to generate revenue. Revenues are critical for sports organizations, including professional sports teams, intercollegiate athletics, and sporting goods companies. There are similarities and differences among these areas. Sports managers are looking for new ways to generate revenues, and revenue sources are changing because of the development of the internet and technology. Compared with intercollegiate athletics, professional sports and sporting goods companies will create more revenue opportunities globally.
Keywords: revenue sources, professional sports, intercollegiate athletics, sporting goods industry
Procedia PDF Downloads 220
20420 Copula-Based Estimation of Direct and Indirect Effects in Path Analysis Models
Authors: Alam Ali, Ashok Kumar Pathak
Abstract:
Path analysis is a statistical technique used to evaluate the direct and indirect effects of variables in path models. One or more structural regression equations are used to estimate a series of parameters in path models to find a better fit to the data. However, sometimes the assumptions of classical regression models, such as ordinary least squares (OLS), are violated by the nature of the data, resulting in insignificant direct and indirect effects of exogenous variables. This article aims to explore the effectiveness of a copula-based regression approach as an alternative to classical regression, specifically when variables are linked through an elliptical copula.
Keywords: path analysis, copula-based regression models, direct and indirect effects, k-fold cross validation technique
Procedia PDF Downloads 41
20419 E-Consumers' Attribute Non-Attendance Switching Behavior: Effect of Providing Information on Attributes
Authors: Leonard Maaya, Michel Meulders, Martina Vandebroek
Abstract:
Discrete Choice Experiments (DCE) are used to investigate how product attributes affect decision-makers' choices. In DCEs, choice situations consisting of several alternatives are presented from which choice-makers select the preferred alternative. Standard multinomial logit models based on random utility theory can be used to estimate the utilities for the attributes. The overarching principle in these models is that respondents understand and use all the attributes when making choices. However, studies suggest that respondents sometimes ignore some attributes (commonly referred to as Attribute Non-Attendance/ANA). The choice modeling literature presents ANA as a static process, i.e., respondents' ANA behavior does not change throughout the experiment. However, respondents may ignore attributes due to changing factors like availability of information on attributes, learning/fatigue in experiments, etc. We develop a dynamic mixture latent Markov model to model changes in ANA when information on attributes is provided. The model is illustrated on e-consumers' webshop choices. The results indicate that the dynamic ANA model describes the behavioral changes better than modeling the impact of information using changes in parameters. Further, we find that providing information on attributes leads to an increase in the attendance probabilities for the investigated attributes.
Keywords: choice models, discrete choice experiments, dynamic models, e-commerce, statistical modeling
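The standard multinomial logit model mentioned above assigns each alternative a choice probability proportional to the exponential of its deterministic utility. A minimal Python sketch follows; the webshop attributes and taste weights are invented for illustration and are not the study's estimates.

```python
import numpy as np

def mnl_choice_probs(X, beta):
    """Multinomial logit choice probabilities for one choice set:
    a softmax over the deterministic utilities V = X @ beta
    (random utility theory). X is alternatives x attributes."""
    v = np.asarray(X, float) @ np.asarray(beta, float)
    e = np.exp(v - v.max())   # subtract the max for numerical stability
    return e / e.sum()

# hypothetical webshop choice set: attributes = (price, delivery days)
X = [[10.0, 2.0], [12.0, 1.0]]
beta = [-0.3, -0.5]           # illustrative negative taste weights
p = mnl_choice_probs(X, beta)
```

Attribute non-attendance can be represented in this framework by zeroing out the ignored attribute's coefficient for the non-attending latent class.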
Procedia PDF Downloads 140
20418 Impact of Tobacco Control Policy to Cancer Mortalities in South Africa
Authors: Cyprian M. Mostert
Abstract:
This paper investigates the effectiveness of tobacco control policy (TCP) in averting cancer mortalities in both the educated and uneducated segments of the South African population. A two-stage least squares (2SLS) model was used covering the period 2009-2013. The results show that the TCP caused a 26 percent average decrease in cancer mortalities in both the educated and uneducated segments of the population. However, limiting the sales of cheap and illegal tobacco cigarettes is necessary for advancing the effectiveness of the TCP in averting cancer mortalities in the uneducated population, as the paper noted an insignificant decrease in cancer mortalities in 2012-2013 due to the presence of cheaper cigarettes. The paper also discovered evidence of persistent purchases of branded cigarettes in the educated population group, which limited the effectiveness of the TCP in 2009-2011. Hikes in the real tobacco tax to a 0.8 USD price level in 2012 limited tobacco consumption in the educated group, resulting in a 29 percent decrease in cancer mortalities. Other developing countries may learn from the South African case and strive to limit the sales of cheap illegal cigarettes while hiking the real tobacco tax on branded cigarettes as a key strategy to reduce cancer deaths across educated and uneducated population groups.
Keywords: cancer, health policy, health system, tobacco tax
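A two-stage least squares estimator can be sketched in a few lines: the endogenous regressors are first projected onto the instruments, and the outcome is then regressed on those fitted values. The Python sketch below is generic; the study's actual instruments, controls, and data are not reproduced here.

```python
import numpy as np

def two_stage_least_squares(y, X, Z):
    """2SLS: regress the endogenous regressors X on the instruments Z
    (first stage), then regress y on the fitted values (second stage).
    X and Z should each include a constant column."""
    X, Z, y = np.asarray(X, float), np.asarray(Z, float), np.asarray(y, float)
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]   # first-stage fitted values
    beta = np.linalg.lstsq(X_hat, y, rcond=None)[0]    # second-stage coefficients
    return beta
```

When the instruments coincide with the regressors (no endogeneity), 2SLS collapses to ordinary least squares, which provides a quick sanity check.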
Procedia PDF Downloads 145
20417 Debriefing Practices and Models: An Integrative Review
Authors: Judson P. LaGrone
Abstract:
Simulation-based education in curricula was once a luxury component of nursing programs but now serves as a vital element of an individual's learning experience. A debriefing occurs after the simulation scenario or clinical experience is completed, allowing the instructor(s) or trained professional(s) to act as a debriefer and guide a reflection with the purpose of acknowledging, assessing, and synthesizing the thought process, decision-making process, and actions/behaviors performed during the scenario or clinical experience. Debriefing is a vital component of the simulation process and educational experience, allowing the learner(s) to progressively build upon past experiences and current scenarios within a safe and welcoming environment, with a guided dialog to enhance future practice. The aim of this integrative review was to assess current practices of debriefing models in simulation-based education for health care professionals and students. The following databases were utilized for the search: CINAHL Plus, Cochrane Database of Systematic Reviews, EBSCO (ERIC), PsycINFO (Ovid), and Google Scholar. The advanced search option was used to narrow down the search for articles (full text, Boolean operators, English language, peer-reviewed, published in the past five years). Key terms included debrief, debriefing, debriefing model, debriefing intervention, psychological debriefing, simulation, simulation-based education, simulation pedagogy, health care professional, nursing student, and learning process. Included studies focus on debriefing after clinical scenarios of nursing students, medical students, and interprofessional teams conducted between 2015 and 2020. Common themes were identified after the analysis of articles matching the search criteria. Several debriefing models are addressed in the literature with similar effectiveness for participants in clinical simulation-based pedagogy.
Themes identified included (a) the importance of debriefing in simulation-based pedagogy, (b) the environment in which debriefing takes place as an important consideration, (c) the individuals who should conduct the debrief, (d) the length of the debrief, and (e) the methodology of the debrief. Debriefing models supported by theoretical frameworks and facilitated by trained staff are vital for a successful debriefing experience. Models ranged from self-debriefing, facilitator-led debriefing, video-assisted debriefing, and rapid cycle deliberate practice to reflective debriefing. A recurring finding centered on the emphasis on continued research into systematic tool development and analysis of the validity and effectiveness of current debriefing practices. There is a lack of consistency of debriefing models among nursing curricula, with an increasing rate of faculty ill-prepared to facilitate the debriefing phase of the simulation.
Keywords: debriefing model, debriefing intervention, health care professional, simulation-based education
Procedia PDF Downloads 142
20416 A Spatial Information Network Traffic Prediction Method Based on Hybrid Model
Authors: Jingling Li, Yi Zhang, Wei Liang, Tao Cui, Jun Li
Abstract:
Compared with terrestrial networks, the traffic of a spatial information network has both self-similarity and short-correlation characteristics. By studying its traffic prediction methods, the resource utilization of a spatial information network can be improved, and such methods can provide an important basis for the traffic planning of a spatial information network. In this paper, considering the accuracy and complexity of the algorithm, the spatial information network traffic is decomposed into an approximation component with long correlation and detail components with short correlation, and a time series hybrid prediction model based on wavelet decomposition is proposed to predict the spatial network traffic. First, the original traffic data are decomposed into approximation and detail components using a wavelet decomposition algorithm. According to the tailing and truncation characteristics of the autocorrelation and partial autocorrelation functions of each component, the corresponding model (AR/MA/ARMA) of each detail component can be established directly, while the approximation component can be modeled by an ARIMA model after smoothing. Finally, the prediction results of the multiple models are combined to obtain the prediction results for the original data. The method not only considers the self-similarity of a spatial information network but also takes into account the short correlation caused by bursty network information, which is verified using the measured data of a certain backbone network released by the MAWI working group in 2018. Compared with a typical time series model, the predicted data of the hybrid model are closer to the real traffic data and have a smaller relative root mean square error, making the model more suitable for a spatial information network.
Keywords: spatial information network, traffic prediction, wavelet decomposition, time series model
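The decomposition step can be illustrated with the simplest wavelet, the Haar. The sketch below (Python/NumPy) is illustrative only: the study's actual wavelet basis, decomposition depth, and model orders are not specified in the abstract. It splits a series into approximation and detail parts that sum back to the original, and fits the kind of low-order AR model the hybrid scheme assigns to a component.

```python
import numpy as np

def haar_decompose(x):
    """One-level Haar wavelet split of a traffic series (even length) into a
    long-correlation approximation and a short-correlation detail component;
    summing the two reconstructions recovers the original series."""
    x = np.asarray(x, float)
    avg = (x[0::2] + x[1::2]) / 2                     # pairwise averages
    dif = (x[0::2] - x[1::2]) / 2                     # pairwise differences
    approx = np.repeat(avg, 2)                        # reconstruct from averages
    detail = np.repeat(dif, 2) * np.tile([1, -1], len(dif))
    return approx, detail

def fit_ar1(x):
    """Least-squares AR(1) coefficient -- the simplest member of the
    AR/MA/ARMA family fitted to each component."""
    x = np.asarray(x, float)
    return float(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]))
```

Each component's one-step forecast would then be produced by its own model, and the component forecasts summed to give the forecast of the original traffic.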
Procedia PDF Downloads 146
20415 Adding a Degree of Freedom to Opinion Dynamics Models
Authors: Dino Carpentras, Alejandro Dinkelberg, Michael Quayle
Abstract:
Within agent-based modeling, opinion dynamics is the field that focuses on modeling people's opinions. In this prolific field, most of the literature is dedicated to the exploration of two 'degrees of freedom' and how they impact the model's properties (e.g., the average final opinion, the number of final clusters, etc.). These degrees of freedom are (1) the interaction rule, which determines how agents update their own opinions, and (2) the network topology, which defines the possible interactions among agents. In this work, we show that a third degree of freedom exists. It can be used to change a model's output by up to 100% of its initial value or to transform two models (both from the literature) into each other. Since opinion dynamics models are representations of the real world, it is fundamental to understand how people's opinions can be measured. Even for abstract models (i.e., those not intended for fitting real-world data), it is important to understand whether the way of numerically representing opinions is unique and, if this is not the case, how the model dynamics would change by using different representations. The process of measuring opinions is non-trivial, as it requires transforming a real-world opinion (e.g., supporting most of the liberal ideals) into a number. Such a process is usually not discussed in the opinion dynamics literature, but it has been intensively studied in a subfield of psychology called psychometrics. In psychometrics, opinion scales can be converted into each other, similarly to how meters can be converted to feet. Indeed, psychometrics routinely uses both linear and non-linear transformations of opinion scales. Here, we analyze how such transformations affect opinion dynamics models. We analyze this effect by using mathematical modeling and then validating our analysis with agent-based simulations. Firstly, we study the case of perfect scales.
In this way, we show that scale transformations affect the model's dynamics even at the qualitative level. This means that if two researchers use the same opinion dynamics model and even the same dataset, they could make totally different predictions just because they followed different renormalization processes. A similar situation appears if two different scales are used to measure opinions on the same population. This effect may be as strong as producing an uncertainty of 100% on the simulation's output (i.e., all results are possible). Still, using perfect scales, we show that scale transformations can be used to transform one model exactly into another. We test this using two models from the standard literature. Finally, we test the effect of scale transformation in the case of finite precision using a 7-point Likert scale. In this way, we show how a relatively small scale transformation introduces changes both at the qualitative level (i.e., the most shared opinion at the end of the simulation) and in the number of opinion clusters. Thus, scale transformation appears to be a third degree of freedom of opinion dynamics models. This result deeply impacts both theoretical research on model properties and the application of models to real-world data.
Keywords: degrees of freedom, empirical validation, opinion scale, opinion dynamics
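The scale-transformation effect described in this abstract can be illustrated with a minimal sketch. The bounded-confidence (Deffuant-style) update rule and the square-root rescaling below are assumptions chosen for demonstration, not the authors' exact models: the same agents, interacting in the same order, end up in different states depending only on which scale their opinions are recorded on.

```python
import numpy as np

def run_deffuant(opinions, pairs, eps=0.3, mu=0.5):
    # Bounded-confidence dynamics: two interacting agents whose opinions
    # differ by less than eps each move toward the other by a fraction mu.
    x = opinions.copy()
    for i, j in pairs:
        if abs(x[i] - x[j]) < eps:
            shift = mu * (x[j] - x[i])
            x[i] += shift
            x[j] -= shift
    return x

rng = np.random.default_rng(0)
orig = rng.uniform(0.0, 1.0, 100)                # opinions on scale A in [0, 1]
pairs = [tuple(rng.integers(0, 100, size=2)) for _ in range(20000)]

final_a = run_deffuant(orig, pairs)              # dynamics on scale A
final_b = run_deffuant(np.sqrt(orig), pairs)     # same agents on scale B = sqrt(A)

# Mapping the scale-B result back to scale A does not recover the
# scale-A result: the measurement scale changes the prediction.
print(np.abs(final_a - final_b ** 2).max())
```

Because the square root is monotone, both runs encode identical opinion orderings; only the pairwise distances (and hence which interactions pass the confidence threshold) differ.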
Procedia PDF Downloads 119
20414 Building Information Modeling Acting as Protagonist and Link between the Virtual Environment and the Real-World for Efficiency in Building Production
Authors: Cristiane R. Magalhaes
Abstract:
Advances in Information and Communication Technologies (ICT) have led to changes in different sectors, particularly in the architecture, engineering, construction, and operation (AECO) industry. In this context, the advent of BIM (Building Information Modeling) has brought a number of opportunities in the field of the digital architectural design process, introducing integrated design concepts that impact the development, elaboration, coordination, and management of ventures. The project scope has begun to contemplate, from its original stage, the third dimension by means of virtual environments (VEs) composed of models containing different specialties, substituting the two-dimensional products. The possibility of simulating the construction process of a venture in a VE starts at the beginning of the design process, offering, through new technologies, many possibilities beyond geometric digital modeling. This is a significant change and relates not only to form but also to how information is appropriated in architectural and engineering models and exchanged among professionals. In order to achieve the main objective of this work, the Design Science Research Method will be adopted to elaborate an artifact containing strategies for the application and use of ICTs in BIM flows, spanning from pre-construction to the execution of the building. This article intends to discuss and investigate how BIM can be extended to the construction site, acting as a protagonist and link between the virtual environment and the real world, as well as its contribution to the integration of the value chain and the consequent increase of efficiency in the production of the building. The virtualization of the design process has reached high levels of development through the use of BIM. Therefore, it is essential that the lessons learned with the virtual models be transposed to the actual building production, increasing precision and efficiency. 
Thus, this paper discusses how the Fourth Industrial Revolution has impacted property development and how BIM can act as the driving link between the virtual environment and real production for the structuring of flows, information management, and efficiency in this process. The results obtained are partial and not definitive as of the date of this publication. This research is part of a doctoral thesis in development, which focuses on the discussion of the impact of digital transformation on the construction of residential buildings in Brazil.
Keywords: building information modeling, building production, digital transformation, ICT
Procedia PDF Downloads 122
20413 Defect Management Life Cycle Process for Software Quality Improvement
Authors: Aedah Abd Rahman, Nurdatillah Hasim
Abstract:
Software quality issues require special attention, especially in view of the demand for quality software products that meet customer satisfaction. Software development projects in most organisations need a proper defect management process in order to produce high quality software products and reduce the number of defects. The research question of this study is how to produce high quality software while reducing the number of defects. Therefore, the objective of this paper is to provide a framework for managing software defects by following defined life cycle processes. The methodology starts by reviewing defects, defect models, best practices, and standards. A framework for the defect management life cycle is proposed. The major contribution of this study is to define a defect management road map in software development. The adoption of an effective defect management process helps to achieve the ultimate goal of producing high quality software products and contributes towards continuous software process improvement.
Keywords: defects, defect management, life cycle process, software quality
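As one illustration of a defect management life cycle, a defect can be modeled as a small state machine that rejects illegal transitions. The state names and allowed transitions below are generic assumptions, not the specific framework proposed in this paper:

```python
from dataclasses import dataclass, field

# Allowed transitions of a typical defect life cycle (assumed, illustrative).
TRANSITIONS = {
    "new": {"assigned", "rejected"},
    "assigned": {"fixed"},
    "fixed": {"verified", "reopened"},
    "reopened": {"assigned"},
    "verified": {"closed"},
    "rejected": set(),
    "closed": set(),
}

@dataclass
class Defect:
    defect_id: int
    state: str = "new"
    history: list = field(default_factory=list)  # past states, oldest first

    def move(self, new_state: str) -> None:
        # Enforce the life cycle: only transitions in TRANSITIONS are legal.
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.history.append(self.state)
        self.state = new_state

d = Defect(1)
for s in ("assigned", "fixed", "verified", "closed"):
    d.move(s)
print(d.state, d.history)
```

Encoding the road map as data (the `TRANSITIONS` table) rather than scattered `if` statements makes the process auditable: the full history of each defect is retained for process-improvement metrics.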
Procedia PDF Downloads 306
20412 Contribution of Supply Chain Management Practices for Enhancing Healthcare Service Quality: A Quantitative Analysis in Delhi’s Healthcare Sector
Authors: Chitrangi Gupta, Arvind Bhardwaj
Abstract:
This study seeks to investigate and quantify the influence of various dimensions of supply chain management (namely, supplier relationships, compatibility, specifications and standards, delivery processes, and after-sales service) on distinct dimensions of healthcare service quality (specifically, responsiveness, trustworthiness, and security) within the operational framework of XYZ Superspeciality Hospital, situated in Delhi. The hospital's name is withheld in accordance with its privacy policy. The primary objective of this research is to elucidate the impact of supply chain management practices on the overall quality of healthcare services offered within hospital settings. Employing a quantitative research design, this study uses a hypothesis-testing approach to systematically discern the relationship between supply chain management dimensions and the quality of health services. The findings of this study underscore the significant influence exerted by supply chain management dimensions, specifically supplier relationships, specifications and standards, delivery processes, and after-sales service, on the enhancement of healthcare service quality. Moreover, the study's results reveal that demographic factors such as gender, qualifications, age, and experience do not yield discernible disparities in the relationship between supply chain management and healthcare service quality.
Keywords: supply chain management, healthcare, hospital operations, service delivery
Procedia PDF Downloads 67
20411 Various Models of Quality Management Systems
Authors: Mehrnoosh Askarizadeh
Abstract:
People, processes, and IT are the most important assets of any organization. Optimal utilization of these resources has been a question of research in business for many decades. The business world has responded by inventing various methodologies that can be used to address problems of quality improvement, efficiency of processes, continuous improvement, reduction of waste, automation, strategy alignment, etc. Some of these methodologies can be collectively called Business Process Quality Management (BPQM) methodologies. In essence, the first references to process management can be traced back to Frederick Taylor and scientific management. Time and motion study was addressed to the improvement of manufacturing process efficiency. The ideas of scientific management were in use for quite a long period until more advanced quality management techniques were developed in Japan and the USA. One of the first prominent methods was Total Quality Management (TQM), which evolved during the 1980s. About the same time, Six Sigma (SS) originated at Motorola as a separate method. SS spread and evolved, and later joined with the ideas of Lean manufacturing to form Lean Six Sigma. In the 1990s, due to emerging IT technologies, the beginning of globalization, and strengthening competition, companies recognized the need for better process and quality management. Business Process Management (BPM) emerged as a novel methodology that takes all this into account and helps to align IT technologies with business processes and quality management. In this article, we study various aspects of the above-mentioned methods and identify their relations.
Keywords: e-process, quality, TQM, BPM, lean, six sigma, CPI, information technology, management
Procedia PDF Downloads 440
20410 Development of a Turbulent Boundary Layer Wall-pressure Fluctuations Power Spectrum Model Using a Stepwise Regression Algorithm
Authors: Zachary Huffman, Joana Rocha
Abstract:
Wall-pressure fluctuations induced by the turbulent boundary layer (TBL) developed over aircraft are a significant source of aircraft cabin noise. Since the power spectral density (PSD) of these pressure fluctuations is directly correlated with the amount of sound radiated into the cabin, the development of accurate empirical models that predict the PSD has been an important ongoing research topic. The sound emitted can be represented from the pressure fluctuation term in the Reynolds-averaged Navier-Stokes (RANS) equations. Therefore, early TBL empirical models (including those of Lowson, Robertson, Chase, and Howe) were primarily derived by simplifying and solving the RANS equations for the pressure fluctuation and adding appropriate scales. Most subsequent models (including the Goody, Efimtsov, Laganelli, Smol’yakov, and Rackl and Weston models) were derived by modifying these early models or from physical principles. Overall, these models have had varying levels of accuracy; in general, they are most accurate under the specific Reynolds and Mach numbers they were developed for, while being less accurate under other flow conditions. Despite this, research into alternative methods for deriving such models has been rather limited. More recent studies have demonstrated that an artificial neural network model was more accurate than traditional models and could be applied more generally, but the accuracy of other machine learning techniques has not been explored. In the current study, an original model is derived using a stepwise regression algorithm in the statistical programming language R, together with TBL wall-pressure fluctuation PSD data gathered at the Carleton University wind tunnel. The theoretical advantage of a stepwise regression approach is that it automatically filters out redundant or uncorrelated input variables (through the process of feature selection), and it is computationally faster than machine learning. 
The main disadvantage is the potential risk of overfitting. The accuracy of the developed model is assessed by comparing it to independently sourced datasets.
Keywords: aircraft noise, machine learning, power spectral density models, regression models, turbulent boundary layer wall-pressure fluctuations
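Although the study above works in R, the forward-stepwise idea it relies on can be sketched in Python. The synthetic data, the variable names, and the AIC-based selection criterion below are illustrative assumptions, not the authors' dataset or exact algorithm:

```python
import numpy as np

def aic(y, yhat, k):
    # AIC for a Gaussian model with k fitted parameters.
    n = len(y)
    rss = np.sum((y - yhat) ** 2)
    return n * np.log(rss / n) + 2 * k

def forward_stepwise(X, y, names):
    """Greedy forward selection: repeatedly add the feature that lowers
    AIC the most; stop when no candidate improves the score."""
    chosen, best_aic = [], np.inf
    improved = True
    while improved:
        improved = False
        for f in [c for c in range(X.shape[1]) if c not in chosen]:
            cols = chosen + [f]
            A = np.column_stack([np.ones(len(y)), X[:, cols]])
            beta, *_ = np.linalg.lstsq(A, y, rcond=None)
            score = aic(y, A @ beta, len(cols) + 1)
            if score < best_aic - 1e-9:
                best_aic, best_f, improved = score, f, True
        if improved:
            chosen.append(best_f)
    return [names[c] for c in chosen]

# Synthetic stand-in for wind-tunnel data: y depends on two of four inputs.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = 3.0 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(scale=0.1, size=200)
result = forward_stepwise(X, y, ["Re", "Mach", "delta", "Uinf"])
print(result)
```

The feature-selection behavior claimed in the abstract is visible here: the uncorrelated inputs are (usually) never entered, because the AIC penalty outweighs their negligible reduction in residual sum of squares.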
Procedia PDF Downloads 135
20409 Management of Cultural Heritage: Bologna Gates
Authors: Alfonso Ippolito, Cristiana Bartolomei
Abstract:
A growing demand is felt today for realistic 3D models enabling the cognition and popularization of historical-artistic heritage. The evaluation and preservation of cultural heritage are inextricably connected with the innovative processes of gaining, managing, and using knowledge. The development and perfecting of techniques for acquiring and elaborating photorealistic 3D models have made them pivotal elements for popularizing information about objects on the scale of architectonic structures.
Keywords: cultural heritage, databases, non-contact survey, 2D-3D models
Procedia PDF Downloads 423
20408 A Risk Management Framework for Selling a Mega Power Plant Project in a New Market
Authors: Negar Ganjouhaghighi, Amirali Dolatshahi
Abstract:
The origin of most risks of a mega project usually lies in the phases before the closing of the contract. From a practical point of view, using project risk management techniques to prepare a proposal is not a complete solution for managing the risks of a contract. The objective of this paper is to cover all the activities associated with risk management of a mega project's sales process: from entrance into a new market, through awarding activities, to the review of contract performance. In this study, risk management happens in six consecutive steps that are divided into three distinct but interdependent phases upstream of the award of the contract: pre-tendering, tendering, and closing. In the first step, by preparing a standard market risk report, the risks of the new market are identified. The next step is the bid/no-bid decision, made on the basis of the previously gathered data. During the next three steps, in the tendering phase, project risk management techniques are applied to determine how much contingency reserve must be added to or removed from the estimated cost in order to bring the residual risk to an acceptable level. Finally, the last step, which happens in the closing phase, is an overview of the project risks and a final clarification of residual risks. The sales experience of more than 20,000 MW of turn-key power plant projects is used, alongside this framework, to develop software that assists the sales team in better project risk management.
Keywords: project marketing, risk management, tendering, project management, turn-key projects
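One common way to size the contingency reserve mentioned in the tendering phase is an expected-monetary-value (EMV) sum over a risk register. The sketch below uses invented risks and figures purely for illustration; it is not the paper's framework or software:

```python
# Hypothetical risk register: (description, probability, cost impact in M$).
risks = [
    ("currency fluctuation", 0.30, 12.0),
    ("late site handover",   0.20,  8.0),
    ("grid-code change",     0.10, 25.0),
]

base_cost = 400.0  # assumed estimated project cost in M$

# EMV contingency: sum of probability-weighted impacts across the register.
contingency = sum(p * impact for _, p, impact in risks)
print(f"contingency reserve: {contingency:.1f} M$")
print(f"bid price floor:     {base_cost + contingency:.1f} M$")
```

As risks are retired or clarified between the pre-tendering and closing phases, their probabilities change and the reserve is recomputed, which is how the reserve can be "added to or removed from" the estimated cost over time.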
Procedia PDF Downloads 329
20407 The Role of Information Technology in Supply Chain Management
Authors: V. Jagadeesh, K. Venkata Subbaiah, P. Govinda Rao
Abstract:
This paper explains the significance of information technology tools and software packages in supply chain management (SCM) for managing the entire supply chain. With the aid of these tools and packages, material, financial, and information flows can be managed effectively and efficiently, so that the right quantity of goods, with the right quality, is delivered at the right time using the right methods and technology. Information technology plays a vital role in streamlining sales forecasting, demand planning, inventory control, and transportation in supply networks, and finally deals with production planning and scheduling. It achieves these objectives by streamlining business processes and integrating them within the enterprise and its extended enterprise. SCM starts with the customer and involves a sequence of activities from customer, retailer, distributor, manufacturer, and supplier within the supply chain framework. It is the process of integrating demand planning, supply network planning, and production planning and control. Forecasting indicates the direction for planning raw materials in order to meet the production planning requirements. Inventory control and transportation planning allocate the optimal or economic order quantity, using the shortest possible routes to deliver the goods to the customer. Production planning and control utilize the optimal resource mix in order to meet capacity requirements planning. The above operations can be achieved by using appropriate information technology tools and software packages for supply chain management.
Keywords: supply chain management, information technology, business process, extended enterprise
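The economic order quantity referred to above is classically computed with the Wilson EOQ formula, Q* = sqrt(2DS/H), where D is annual demand, S the fixed cost per order, and H the annual holding cost per unit. A minimal sketch with assumed figures:

```python
import math

def economic_order_quantity(demand_per_year, order_cost, holding_cost):
    """Wilson EOQ: the order size that minimizes the sum of annual
    ordering cost (D/Q * S) and annual holding cost (Q/2 * H)."""
    return math.sqrt(2.0 * demand_per_year * order_cost / holding_cost)

# Assumed illustrative figures: 12,000 units/year demand, $100 per order,
# $2.50 per unit per year to hold stock.
q = economic_order_quantity(demand_per_year=12000, order_cost=100.0,
                            holding_cost=2.5)
orders_per_year = 12000 / q
print(round(q), round(orders_per_year, 1))
```

In an SCM software package this calculation sits inside the inventory control module, fed by the demand forecast the abstract describes.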
Procedia PDF Downloads 376
20406 The Use of Haar Wavelet Mother Signal Tool for Performance Analysis Response of Distillation Column (Application to Moroccan Case Study)
Authors: Mahacine Amrani
Abstract:
This paper reviews some Moroccan industrial applications of wavelets, especially the dynamic identification of a process model using the Haar mother wavelet response. Two recent Moroccan case studies are described, using dynamic data originating from a distillation column and an industrial polyethylene process plant. The purpose of the wavelet scheme is to build on-line dynamic models. In both case studies, a comparison is carried out between the Haar mother wavelet response model and a linear difference equation model. Finally, on the basis of the comparison of process performances and best responses, it concludes which model may be useful for creating an estimated on-line internal model control and for its application towards model-predictive controllers (MPC). All calculations were implemented using the AutoSignal software.
Keywords: process performance, model, wavelets, Haar, Moroccan
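The Haar mother wavelet underlying such schemes can be sketched as a one-level discrete transform. This is illustrative only (the studies above use the AutoSignal package, not this code); the step signal is an invented stand-in for a set-point move in a process reading:

```python
import numpy as np

def haar_transform(signal):
    """One level of the orthonormal discrete Haar wavelet transform."""
    x = np.asarray(signal, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: local averages
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: local changes
    return approx, detail

def haar_inverse(approx, detail):
    # Invert the transform pairwise; reconstruction is exact.
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

# A step change shows up as a single large detail coefficient, which is
# why Haar responses localize abrupt process dynamics well.
y = np.array([1.0, 1.0, 1.0, 5.0, 5.0, 5.0, 5.0, 5.0])
a, d = haar_transform(y)
print(d)
```

The approximation coefficients feed the coarse on-line model, while the detail coefficients flag where the dynamics change, the property exploited when comparing against a linear difference equation model.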
Procedia PDF Downloads 317
20405 Data-Centric Anomaly Detection with Diffusion Models
Authors: Sheldon Liu, Gordon Wang, Lei Liu, Xuefeng Liu
Abstract:
Anomaly detection, also referred to as one-class classification, plays a crucial role in identifying product images that deviate from the expected distribution. This study introduces Data-centric Anomaly Detection with Diffusion Models (DCADDM), presenting a systematic strategy for data collection and further diversifying the data through image generation via diffusion models. The algorithm addresses data collection challenges in real-world scenarios and points toward data augmentation through the integration of generative AI capabilities. The paper explores the generation of normal images using diffusion models. The experiments demonstrate that, with 30% of the original normal image set, modeling in an unsupervised setting with state-of-the-art approaches can achieve equivalent performance. With the addition of images generated via diffusion models (equivalent to 10% of the original dataset size), the proposed algorithm achieves better or equivalent anomaly localization performance.
Keywords: diffusion models, anomaly detection, data-centric, generative AI
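For context on the one-class setting described above, a generic baseline (explicitly not the DCADDM pipeline) scores a test sample by its mean distance to the k nearest "normal" training features; more normal data, collected or generated, tightens that reference set. The feature vectors below are invented Gaussian stand-ins for image embeddings:

```python
import numpy as np

def knn_anomaly_scores(train_feats, test_feats, k=5):
    """Mean distance to the k nearest normal training features.
    A generic one-class baseline for illustration only."""
    # Pairwise Euclidean distances: (n_test, n_train).
    d = np.linalg.norm(test_feats[:, None, :] - train_feats[None, :, :], axis=2)
    nearest = np.sort(d, axis=1)[:, :k]
    return nearest.mean(axis=1)

rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, size=(200, 8))      # features of normal samples
test = np.vstack([rng.normal(0.0, 1.0, (5, 8)),   # 5 normal test samples
                  rng.normal(6.0, 1.0, (5, 8))])  # 5 anomalous test samples
scores = knn_anomaly_scores(normal, test)
print(scores.round(2))
```

Thresholding these scores yields the one-class decision; the paper's contribution is on the data side, i.e., how the `normal` reference set is collected and augmented.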
Procedia PDF Downloads 82
20404 Copula Markov Switching Multifractal Models for Forecasting Value-at-Risk
Authors: Giriraj Achari, Malay Bhattacharyya
Abstract:
In this paper, the effectiveness of Copula Markov Switching Multifractal (MSM) models at forecasting the Value-at-Risk of a two-stock portfolio is studied. The innovations are allowed to be drawn from distributions that can capture skewness and leptokurtosis, which are well-documented empirical characteristics of financial returns. The candidate distributions considered for this purpose are the Johnson-SU, Pearson Type-IV, and α-stable distributions. The two univariate marginal distributions are combined using the Student-t copula. The estimation of all parameters is performed by maximum likelihood estimation. Finally, the models are compared in terms of accurate Value-at-Risk (VaR) forecasts using tests of unconditional coverage and independence. It is found that Copula-MSM models with leptokurtic innovation distributions perform slightly better than the Copula-MSM model with normal innovations. Copula-MSM models, in general, produce better VaR forecasts than traditional methods such as the historical simulation method, the variance-covariance approach, and Copula-Generalized Autoregressive Conditional Heteroscedasticity (Copula-GARCH) models.
Keywords: Copula, Markov switching, multifractal, value-at-risk
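The unconditional coverage test used to compare such VaR forecasts is typically Kupiec's proportion-of-failures (POF) test. A sketch with invented backtest numbers (250 trading days and 6 violations are assumptions, not the paper's results):

```python
import math

def kupiec_pof(num_obs, num_violations, alpha):
    """Kupiec proportion-of-failures likelihood-ratio statistic.

    H0: the VaR violation rate equals alpha. Under H0 the statistic is
    asymptotically chi-square with 1 degree of freedom. Assumes
    0 < num_violations < num_obs.
    """
    n, x = num_obs, num_violations
    pi_hat = x / n  # observed violation rate

    def loglik(p):
        return x * math.log(p) + (n - x) * math.log(1.0 - p)

    return -2.0 * (loglik(alpha) - loglik(pi_hat))

# 250 trading days of 1% VaR forecasts with 6 observed violations.
lr = kupiec_pof(250, 6, 0.01)
# Reject H0 at the 5% level if lr exceeds the chi-square(1) critical
# value of 3.841.
print(round(lr, 3), lr > 3.841)
```

The independence test mentioned in the abstract (usually Christoffersen's) additionally checks that violations do not cluster in time, which the POF statistic alone cannot detect.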
Procedia PDF Downloads 164
20403 Digital Marketing Maturity Models: Overview and Comparison
Authors: Elina Bakhtieva
Abstract:
The variety of available digital tools, strategies, and activities might confuse and disorient even an experienced marketer. This applies in particular to B2B companies, which are usually less flexible in the uptake of digital technology than B2C companies. B2B companies lack a framework that corresponds to the specifics of the B2B business and that helps to evaluate a company's capabilities and to choose an appropriate path. A B2B digital marketing maturity model helps to fill this gap. However, modern marketing offers no widely approved digital marketing maturity model, and thus some marketing institutions provide their own tools. The purpose of this paper is to build an optimized B2B digital marketing maturity model based on a SWOT (strengths, weaknesses, opportunities, and threats) analysis of existing models. The current study provides an analytical review of the existing digital marketing maturity models with open access. The results of the research are twofold. First, the provided SWOT analysis outlines the main advantages and disadvantages of the existing models. Secondly, the strengths of the existing digital marketing maturity models help to identify the main characteristics and the structure of an optimized B2B digital marketing maturity model. The research findings indicate that only one of the three analyzed models could be used as a separate tool. This study is among the first to examine the use of maturity models in digital marketing. It helps businesses choose the most effective of the existing digital marketing maturity models. Moreover, it creates a base for future research on digital marketing maturity models. 
This study contributes to the emerging B2B digital marketing literature by providing a SWOT analysis of the existing digital marketing maturity models and by suggesting a structure and the main characteristics of an optimized B2B digital marketing maturity model.
Keywords: B2B digital marketing strategy, digital marketing, digital marketing maturity model, SWOT analysis
Procedia PDF Downloads 344