Search results for: ESCA potential model
25377 Metamorphic Computer Virus Classification Using Hidden Markov Model
Authors: Babak Bashari Rad
Abstract:
A metamorphic computer virus uses different code transformation techniques to mutate its body across duplicated instances. The characteristics and function of new instances are mostly similar to those of their parents, but they cannot be easily detected by the majority of antivirus products on the market, as these rely on string signature-based detection techniques. The purpose of this research is to propose a Hidden Markov Model for the classification of metamorphic viruses in executable files. In the proposed solution, portable executable files are inspected to extract the instruction opcodes needed for the examination of code. A Hidden Markov Model trained on portable executable files is employed to classify metamorphic viruses of the same family. The proposed model is able to generate and recognize common statistical features of mutated code. The model has been evaluated by examining it on a test data set, and its performance has been practically tested and evaluated based on False Positive Rate, Detection Rate and Overall Accuracy. The results showed acceptable performance, with a high average Detection Rate of 99.7%.
Keywords: malware classification, computer virus classification, metamorphic virus, metamorphic malware, Hidden Markov Model
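The evaluation metrics named in the abstract (Detection Rate, False Positive Rate, Overall Accuracy) are standard confusion-matrix quantities; a minimal sketch, with purely hypothetical counts chosen to reproduce a 99.7% Detection Rate (the source does not give the raw confusion counts):

```python
def classification_metrics(tp, fp, tn, fn):
    """Confusion-matrix metrics commonly used to evaluate a malware classifier."""
    detection_rate = tp / (tp + fn)            # true positive rate (recall)
    false_positive_rate = fp / (fp + tn)
    overall_accuracy = (tp + tn) / (tp + fp + tn + fn)
    return detection_rate, false_positive_rate, overall_accuracy

# Hypothetical counts: 997 of 1000 viral samples flagged, 2 of 500 benign misflagged
dr, fpr, acc = classification_metrics(tp=997, fp=2, tn=498, fn=3)
print(f"DR={dr:.1%}  FPR={fpr:.1%}  Accuracy={acc:.1%}")
```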
Procedia PDF Downloads 315
25376 Monitoring Large-Coverage Forest Canopy Height by Integrating LiDAR and Sentinel-2 Images
Authors: Xiaobo Liu, Rakesh Mishra, Yun Zhang
Abstract:
Continuous monitoring of forest canopy height with large coverage is essential for obtaining forest carbon stocks and emissions, quantifying biomass estimation, analyzing vegetation coverage, and determining biodiversity. LiDAR can be used to collect accurate woody vegetation structural parameters such as canopy height. However, LiDAR's coverage is usually limited because of its high cost and limited maneuverability, which constrains its use for dynamic and large-area forest canopy monitoring. On the other hand, optical satellite images, like Sentinel-2, have the ability to cover large forest areas with a high repeat rate, but they do not contain height information. Hence, exploring ways of integrating LiDAR data and Sentinel-2 images to enlarge the coverage of forest canopy height prediction and increase the prediction repeat rate has been an active research topic in the environmental remote sensing community. In this study, we explore the potential of training a Random Forest Regression (RFR) model and a Convolutional Neural Network (CNN) model, respectively, to develop two predictive models for predicting and validating the forest canopy height of the Acadia Forest in New Brunswick, Canada, with a 10 m ground sampling distance (GSD), for the years 2018 and 2021. Two 10 m airborne LiDAR-derived canopy height models, one for 2018 and one for 2021, are used as ground truth to train and validate the RFR and CNN predictive models. To evaluate the prediction performance of the trained RFR and CNN models, two new predicted canopy height maps (CHMs), one for 2018 and one for 2021, are generated using the trained RFR and CNN models and 10 m Sentinel-2 images of 2018 and 2021, respectively. The two 10 m predicted CHMs from Sentinel-2 images are then compared with the two 10 m airborne LiDAR-derived canopy height models for accuracy assessment.
The validation results show that for 2018 the mean absolute error (MAE) of the RFR model is 2.93 m and that of the CNN model is 1.71 m, while for 2021 the MAE of the RFR model is 3.35 m and that of the CNN model is 3.78 m. These results demonstrate the feasibility of using the RFR and CNN models developed in this research to predict large-coverage forest canopy height at 10 m spatial resolution and a high revisit rate.
Keywords: remote sensing, forest canopy height, LiDAR, Sentinel-2, artificial intelligence, random forest regression, convolutional neural network
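The MAE figures quoted above are per-pixel averages; a minimal sketch (pixel values hypothetical) of how such an error would be computed between a predicted CHM and the LiDAR-derived reference:

```python
def mean_absolute_error(predicted, truth):
    """Per-pixel MAE between a predicted canopy height map and a LiDAR reference."""
    return sum(abs(p - t) for p, t in zip(predicted, truth)) / len(truth)

# Hypothetical 10 m pixels: Sentinel-2-predicted vs LiDAR canopy heights (metres)
print(round(mean_absolute_error([18.2, 21.5, 15.0], [20.0, 20.0, 16.5]), 2))  # 1.6
```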
Procedia PDF Downloads 94
25375 Reaching New Levels: Using Systems Thinking to Analyse a Major Incident Investigation
Authors: Matthew J. I. Woolley, Gemma J. M. Read, Paul M. Salmon, Natassia Goode
Abstract:
The significance of high-consequence workplace failures within construction continues to resonate, with a combined average of 12 fatal incidents occurring daily throughout Australia, the United Kingdom, and the United States. Within the Australian construction domain, more than 35 serious, compensable injury incidents are reported daily. These alarming figures, in conjunction with the continued occurrence of fatal and serious occupational injury incidents globally, suggest existing approaches to incident analysis may not be achieving the required injury prevention outcomes. One reason may be that incident analysis methods used in construction have not kept pace with advances in the field of safety science and are not uncovering the full range of system-wide contributory factors that are required to achieve optimal levels of construction safety performance. Another reason underpinning this global issue may be the absence of information surrounding the construction operating and project delivery system. For example, it is not clear who shares the responsibility for construction safety in different contexts. To respond to this issue, to the authors' best knowledge, a first-of-its-kind control structure model of the construction industry is presented and then used to analyse a fatal construction incident. The model was developed by applying and extending the Systems-Theoretic Accident Model and Processes (STAMP) method to hierarchically represent the actors, constraints, feedback mechanisms, and relationships that are involved in managing construction safety performance. The Causal Analysis based on Systems Theory (CAST) method was then used to identify the control and feedback failures involved in the fatal incident. The conclusions from the Coronial investigation into the event are compared with the findings stemming from the CAST analysis.
The CAST analysis highlighted additional issues across the construction system that were not identified in the coroner's recommendations, suggesting there is a potential benefit in applying a systems theory approach to incident analysis in construction. The findings demonstrate the utility of applying systems theory-based methods to the analysis of construction incidents. Specifically, this study shows the utility of the construction control structure and the potential benefits for project leaders, construction entities, regulators, and construction clients in controlling construction performance.
Keywords: construction project management, construction performance, incident analysis, systems thinking
Procedia PDF Downloads 131
25374 UML Model for Double-Loop Control Self-Adaptive Braking System
Authors: Heung Sun Yoon, Jong Tae Kim
Abstract:
In this paper, we present an activity diagram model for a double-loop control self-adaptive braking system. Since the activity diagram helps to improve the visibility of self-adaptation, we can easily find where improvement is needed in the double-loop control. Double-loop control is adopted because the design conditions and actual conditions can differ; the system is reconfigured at runtime by using double-loop control. We ran simulations in MATLAB to verify and validate our model, comparing the single-loop control model with the double-loop control model. Simulation results show that double-loop control provides more consistent brake power control than single-loop control.
Keywords: activity diagram, automotive, braking system, double-loop, self-adaptive, UML, vehicle
Procedia PDF Downloads 417
25373 Entrepreneurship Skills Acquisition through Education: Impact of the Nurturance of Knowledge, Skills, and Attitude on New Venture Creation
Authors: Satya Ranjan Acharya, Yamini Chandra
Abstract:
Entrepreneurship education in higher education has taken a paradigm shift from the traditional classroom lecture method to a modern approach that lays emphasis on nurturing competencies and enhancing knowledge, skills, and attitudes/abilities (KSA), which has a positive impact on the development of core capabilities. The present paper focuses on the analysis of entrepreneurship education as a pedagogical intervention for the post-graduate program offered at the Entrepreneurship Development Institute of India, Gujarat, India. The study is focused on a model with special emphasis on developing KSA and its effect on nurturing entrepreneurial spirit within students. The findings represent a demographic and thematic assessment of the implemented pedagogical model, with the outcome of students choosing a career in new venture creation or in the growth/diversification of family-owned businesses. This research will be helpful for academicians, research scholars, potential entrepreneurs, ecosystem enablers and students to infer the effectiveness of nurturing entrepreneurial skills and bringing about changes in personal attitudes by enhancing the knowledge and skills required for the execution of an entrepreneurial career. This research is original in nature, as it provides an in-depth insight into an implemented model of curriculum focused on the development and nurturance of basic skills and its impact on the career choice of students.
Keywords: attitude, entrepreneurship education, knowledge, new venture creation, pedagogical intervention, skills
Procedia PDF Downloads 193
25372 Digital Reconstruction of Museum's Statue Using 3D Scanner for Cultural Preservation in Indonesia
Authors: Ahmad Zaini, F. Muhammad Reza Hadafi, Surya Sumpeno, Muhtadin, Mochamad Hariadi
Abstract:
The lack of information about a museum's collection reduces the number of visits to the museum, so museum revitalization is an urgent activity to increase visits. The research roadmap is to build a web-based application that visualizes the museum in virtual form, including the reconstruction of the museum's statues in 3D. This paper describes the implementation of a three-dimensional model reconstruction method based on a light-strip pattern applied to a museum statue using a 3D scanner. Noise removal, alignment, meshing and model refinement processes are implemented to obtain a better 3D object reconstruction. The model's texture is derived from surface texture mapping between the object's images and the reconstructed 3D model. The dimensional accuracy of the model is measured by calculating the relative error of the virtual model's dimensions against those of the original object. The result is a realistic, textured three-dimensional model with a relative error of around 4.3% to 5.8%.
Keywords: 3D reconstruction, light pattern structure, texture mapping, museum
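The dimensional accuracy test described above reduces to a simple relative-error calculation; a sketch with hypothetical statue dimensions (the source does not give raw measurements):

```python
def relative_error_percent(model_dim, object_dim):
    """Relative error (%) of a virtual-model dimension vs. the physical original."""
    return abs(model_dim - object_dim) / object_dim * 100.0

# Hypothetical dimension: real statue height 1.250 m, reconstructed 1.196 m
print(round(relative_error_percent(1.196, 1.250), 1))  # 4.3, within the reported 4.3-5.8% band
```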
Procedia PDF Downloads 468
25371 Photocatalytic Degradation of Acid Dye Over Ag-Loaded ZnO Under UV/Solar Light
Authors: Farida Kaouah, Wassila Hachi, Lamia Brahmi, Chahida Ousselah, Salim Boumaza, Mohamed Trari
Abstract:
The feasibility of using solar irradiation instead of UV light in photocatalysis is a promising approach for water treatment. In this study, the photocatalytic degradation of a widely used textile dye, Acid Blue 25 (AB25), with a noble metal loaded ZnO photocatalyst (Ag/ZnO) was investigated in aqueous suspension under solar light. The results showed that the deposition of Ag as a noble metal onto the ZnO surface improved the photodegradation of AB25. The effect of different parameters such as catalyst dose, initial dye concentration, and contact time was optimized, and the optimal degradation of AB25 (97%) was achieved for an initial AB25 concentration of 24 mg L−1 and a catalyst dose of 1 g L−1 at natural pH (5.42) after 180 min. The kinetic studies revealed that the photocatalytic degradation process obeyed the Langmuir–Hinshelwood model and followed a pseudo-first-order rate expression. This work envisages the great potential that sunlight photocatalysis has in the degradation of dyes from wastewater.
Keywords: acid dye, photocatalytic degradation, sunlight, zinc oxide, noble metal, Langmuir–Hinshelwood model
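At low dye concentrations, the Langmuir–Hinshelwood rate law reduces to the pseudo-first-order form ln(C0/C) = k_app·t mentioned above; a sketch of fitting the apparent rate constant by least squares through the origin, using synthetic (not measured) data:

```python
import math

def pseudo_first_order_k(times, concentrations):
    """Fit ln(C0/C) = k_app * t through the origin by least squares."""
    c0 = concentrations[0]
    y = [math.log(c0 / c) for c in concentrations]
    return sum(t * yi for t, yi in zip(times, y)) / sum(t * t for t in times)

# Synthetic AB25 decay data (min, mg/L) generated with k_app = 0.02 min^-1
t = [0.0, 30.0, 60.0, 90.0, 120.0, 180.0]
c = [24.0 * math.exp(-0.02 * ti) for ti in t]
print(round(pseudo_first_order_k(t, c), 4))  # recovers 0.02
```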
Procedia PDF Downloads 112
25370 Socioeconomic Status and Gender Influence on Linguistic Change: A Case Study on Language Competence and Confidence of Multilingual Minority Language Speakers
Authors: Stefanie Siebenhütter
Abstract:
Male and female speakers use language differently and with varying confidence levels. This paper contrasts gendered differences in language use with socioeconomic status and age factors. It specifically examines how Kui minority language use and competence are conditioned by the variable of gender and discusses potential reasons for this variation by examining gendered language awareness and sociolinguistic attitudes. Moreover, it discusses whether women in Kui society function as 'leaders of linguistic change', as represented in Labov's sociolinguistic model, and whether societal role expectations in collectivistic cultures influence the model of linguistic change. The findings reveal current Kui speaking preferences and give predictions on prospective language use, which is a stable situation of multilingualism, because the current Kui speakers will socialize and teach the prospective Kui speakers in the near future. They further confirm that Lao is losing importance in Kui speakers' (especially females') daily lives.
Keywords: gender, identity construction, language change, minority language, multilingualism, sociolinguistics, social networks
Procedia PDF Downloads 178
25369 Quantitative Structure-Activity Relationship Analysis of Binding Affinity of a Series of Anti-Prion Compounds to Human Prion Protein
Authors: Strahinja Kovačević, Sanja Podunavac-Kuzmanović, Lidija Jevrić, Milica Karadžić
Abstract:
The present study is based on a quantitative structure-activity relationship (QSAR) analysis of eighteen compounds with anti-prion activity. The structures and anti-prion activities (expressed in response units, RU%) of the analyzed compounds are taken from the CHEMBL database. In the first step of the analysis, 85 molecular descriptors were calculated, and based on them, hierarchical cluster analysis (HCA) and principal component analysis (PCA) were carried out in order to detect potential significant similarities or dissimilarities among the studied compounds. The calculated molecular descriptors were physicochemical, lipophilicity and ADMET (absorption, distribution, metabolism, excretion and toxicity) descriptors. The first stage of the QSAR analysis was simple linear regression modeling. It resulted in one acceptable model that correlates Henry's law constant with RU% units. The obtained 2D-QSAR model was validated by cross-validation as an internal validation method. The validation procedure confirmed the model's quality, so it can be used for the prediction of anti-prion activity. The next stage of the analysis will include 3D-QSAR and molecular docking approaches in order to select the most promising compounds for the treatment of prion diseases. These results are part of project No. 114-451-268/2016-02, financially supported by the Provincial Secretariat for Science and Technological Development of AP Vojvodina.
Keywords: anti-prion activity, chemometrics, molecular modeling, QSAR
Procedia PDF Downloads 304
25368 Applying Laser Scanning and Digital Photogrammetry for Developing an Archaeological Model Structure for Old Castle in Germany
Authors: Bara' Al-Mistarehi
Abstract:
Documentation and assessment of the conservation state of an archaeological structure is a significant procedure in any management plan. However, it has always been a challenge to do this with a low-cost and safe methodology, and it is also time-demanding. Therefore, a low-cost, efficient methodology for documenting the state of a structure is needed. Within the scope of this research, this paper applies digital photogrammetry and laser scanning to one of the most significant structures in Germany, the Old Castle (German: Altes Schloss). The site is well known for its unique features; however, the castle suffers from serious deterioration threats because of the environmental conditions and the absence of continuous monitoring, maintenance and repair plans. Digital photogrammetry is a generally accepted technique for the collection of 3D representations of the environment. For this reason, this image-based technique has been extensively used to produce high-quality 3D models of heritage sites and historical buildings for documentation and presentation purposes. Additionally, terrestrial laser scanners are used, which directly measure 3D surface coordinates based on the run-time of reflected light pulses. These systems feature high data acquisition rates, good accuracy and high spatial data density. Despite the potential of each single approach, in this research work maximum benefit is expected from a combination of data from both digital cameras and terrestrial laser scanners. Within the paper, the usage, application and advantages of the technique are investigated in terms of building a highly realistic 3D textured model for some parts of the Old Castle. The model will be used as a diagnostic tool for the conservation state of the castle and as a means of monitoring future changes.
Keywords: digital photogrammetry, terrestrial laser scanners, 3D textured model, archaeological structure
Procedia PDF Downloads 180
25367 Evaluation of Turbulence Modelling of Gas-Liquid Two-Phase Flow in a Venturi
Authors: Mengke Zhan, Cheng-Gang Xie, Jian-Jun Shu
Abstract:
A venturi flowmeter is a common device used in multiphase flow rate measurement in the upstream oil and gas industry. Having a robust computational model for multiphase flow in a venturi is desirable for understanding the gas-liquid and fluid-pipe interactions and predicting pressure and phase distributions under various flow conditions. A steady Eulerian-Eulerian framework is used to simulate upward gas-liquid flow in a vertical venturi. The simulation results are compared with experimental measurements of venturi differential pressure and chord-averaged gas holdup in the venturi throat section. The choice of turbulence model is nontrivial in multiphase flow modelling in a venturi. A performance cross-comparison of the k-ϵ model, the Reynolds stress model (RSM) and the shear-stress transport (SST) k-ω turbulence model is made in this study. In terms of accuracy and computational cost, the SST k-ω turbulence model is observed to be the most efficient.
Keywords: computational fluid dynamics (CFD), gas-liquid flow, turbulence modelling, venturi
Procedia PDF Downloads 173
25366 Evaluation of High Damping Rubber Considering Initial History through Dynamic Loading Test and Program Analysis
Authors: Kyeong Hoon Park, Taiji Mazuda
Abstract:
High damping rubber (HDR) bearings are energy-dissipating devices mainly used in seismic isolation systems and offer great damping performance. Although many studies have been conducted on the dynamic model of HDR bearings, few models can reflect phenomena such as the dependency on experienced shear strain of the initial history. In order to develop a model that can represent the dependency on the experienced shear strain of HDR caused by the Mullins effect, a dynamic loading test was conducted using an HDR specimen. The reaction of the HDR was measured by applying a horizontal vibration using a hybrid actuator under a constant vertical load. Dynamic program analysis was also performed after the dynamic loading test. The dynamic model applied in the program analysis is a bilinear-type double-target model, modified from the typical bilinear model, which can express the nonlinear characteristics related to the initial history of HDR bearings. Based on the dynamic loading test and program analysis results, the equivalent stiffness and equivalent damping ratio were calculated to evaluate the mechanical properties of HDR, and the feasibility of the bilinear-type double-target model was examined.
Keywords: base-isolation, bilinear model, high damping rubber, loading test
Procedia PDF Downloads 123
25365 Analysis of Reliability of Mining Shovel Using Weibull Model
Authors: Anurag Savarnya
Abstract:
The reliability of the various parts of an electric mining shovel has been assessed through the application of the Weibull model. The study was initiated to find the reliability of the components of the electric mining shovel, with the aim of optimizing component reliability and increasing component life cycles. A multilevel decomposition of the electric mining shovel was done, maintenance records were used to evaluate the failure data, and an appropriate system characterization was carried out to model the system in terms of a reasonable number of components. The approach develops a mathematical model to assess the reliability of the electric mining shovel components; the model can be used to predict component reliability and system performance. Reliability is an inherent attribute of a system. When the life-cycle costs of a system are analyzed, reliability plays an important role as a major driver of these costs and has considerable influence on system performance. Reliability analysis is an iterative process that begins with the specification of reliability goals consistent with cost and performance objectives. The data were collected from an Indian open cast coal mine, and the reliability of the various components of the electric mining shovel has been assessed using the Weibull model.
Keywords: reliability, Weibull model, electric mining shovel
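For reference, the two-parameter Weibull reliability function underlying such an analysis is R(t) = exp(-(t/η)^β); a minimal sketch with hypothetical shape and scale values (the fitted parameters are not given in the abstract):

```python
import math

def weibull_reliability(t, shape, scale):
    """Two-parameter Weibull reliability R(t) = exp(-(t / scale) ** shape)."""
    return math.exp(-((t / scale) ** shape))

# At t equal to the characteristic life (t = scale), R = 1/e ~ 0.368,
# regardless of the shape parameter.
print(round(weibull_reliability(5000.0, shape=1.8, scale=5000.0), 3))  # 0.368
```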
Procedia PDF Downloads 515
25364 R Software for Parameter Estimation of Spatio-Temporal Model
Authors: Budi Nurani Ruchjana, Atje Setiawan Abdullah, I. Gede Nyoman Mindra Jaya, Eddy Hermawan
Abstract:
In this paper, we propose an application package to estimate the parameters of a spatiotemporal model based on multivariate time series analysis using the open-source software R. The package is built mainly to estimate the parameters of the Generalized Space Time Autoregressive (GSTAR) model. GSTAR is a combination of time series and spatial models whose parameters vary by location. We use the method of Ordinary Least Squares (OLS) and the Mean Absolute Percentage Error (MAPE) to fit the model to real spatiotemporal phenomena. As case studies, we use oil production data from a volcanic layer at Jatibarang, Indonesia, and climate data such as rainfall in Indonesia. R is very user-friendly: it makes calculation easier, and data processing is accurate and fast. A limitation is that the R script built for estimating the parameters of the spatiotemporal GSTAR model is still restricted to stationary time series models. Therefore, the R program under Windows can be further developed both for theoretical studies and for applications.
Keywords: GSTAR Model, MAPE, OLS method, oil production, R software
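The MAPE criterion mentioned above has a one-line definition; a sketch in Python (the package itself is in R, and the data here are hypothetical):

```python
def mape(actual, forecast):
    """Mean Absolute Percentage Error used to assess model fit."""
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / len(actual)

# Hypothetical observed vs fitted monthly oil production
print(round(mape([100.0, 120.0, 80.0], [95.0, 126.0, 84.0]), 2))  # 5.0
```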
Procedia PDF Downloads 243
25363 Forecasting Models for Steel Demand Uncertainty Using Bayesian Methods
Authors: Watcharin Sangma, Onsiri Chanmuang, Pitsanu Tongkhow
Abstract:
A forecasting model for steel demand uncertainty in Thailand is proposed. It consists of trend, autocorrelation, and outlier components in a hierarchical Bayesian framework. The proposed model uses a cumulative Weibull distribution function, latent first-order autocorrelation, and binary selection to account for trend, time-varying autocorrelation, and outliers, respectively. Gibbs sampling Markov Chain Monte Carlo (MCMC) is used for parameter estimation. The proposed model is applied to steel demand index data in Thailand. The root mean square error (RMSE), mean absolute percentage error (MAPE), and mean absolute error (MAE) criteria are used for model comparison. The study reveals that the proposed model is more appropriate than the exponential smoothing method.
Keywords: forecasting model, steel demand uncertainty, hierarchical Bayesian framework, exponential smoothing method
Procedia PDF Downloads 350
25362 Developing Fuzzy Logic Model for Reliability Estimation: Case Study
Authors: Soroor K. H. Al-Khafaji, Manal Mohammad Abed
Abstract:
The aim of this paper is to evaluate the reliability of a complex engineering system and to design a fuzzy model for reliability estimation. The designed model has been applied to a vegetable oil purification system (neutralization system) to help the specialist user, based on the concept of Failure Mode and Effect Analysis (FMEA), estimate the reliability of this repairable system in the vegetable oil industry. The fuzzy model has been used to predict the system reliability for a future time period, depending on a historical database covering the two past years. The model can help to identify system malfunctions and to predict system reliability during a future period, with more accurate and reasonable results than those obtained by the traditional method of reliability estimation.
Keywords: fuzzy logic, reliability, repairable systems, FMEA
Procedia PDF Downloads 616
25361 Developing a Systems Dynamics Model for Security Management
Authors: Kuan-Chou Chen
Abstract:
This paper demonstrates a simulation model of an information security system using the systems dynamics approach. The relationships in the system model are designed to be simple and functional and do not necessarily represent any particular information security environment. The paper aims to develop a generic system dynamics information security model with implications for information security research. The interrelated and interdependent relationships of five primary sectors in the system dynamics model are presented: (1) information security characteristics, (2) users, (3) technology, (4) business functions, and (5) policy and management. Environments, attacks, government and social culture are defined as the external sector. The interactions within each of these sectors are depicted by system loop maps as well. The proposed system dynamics model will not only provide a conceptual framework for information security analysts and designers but will also allow information security managers to remove the incongruity between the management of risk incidents and the management of knowledge, and will further give information security managers and decision makers a foundation for managerial actions and policy decisions.
Keywords: system thinking, information security systems, security management, simulation
Procedia PDF Downloads 431
25360 Location Quotients Model in Turkey’s Provinces and Nuts II Regions
Authors: Semih Sözer
Abstract:
One of the most common issues in economic systems is understanding the characteristics of economic activities in cities and regions. Although economic base models have been criticized on conceptual and empirical grounds, they are useful tools for examining the economic structure of a nation, region or city. This paper uses one of the methodologies of economic base models, namely the location quotients model. Data for this model comprise employment numbers for the provinces and NUTS II regions of Turkey, with a time series covering the years 1990, 2000, 2003, and 2009. The aim of this study is to find which sectors are export-base and which sectors are import-base in each province and region. Model results show that big provinces and powerful regions (in population, size, etc.) mostly have basic sectors in their economic systems; however, interesting findings also emerged for different sectors in different provinces and regions.
Keywords: economic base, location quotients model, regional economics, regional development
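The export-base/import-base classification above follows from the location quotient, LQ = (regional sector share) / (national sector share), with LQ > 1 read as an export-base sector; a sketch with hypothetical employment figures:

```python
def location_quotient(region_sector, region_total, nation_sector, nation_total):
    """LQ = (regional employment share of a sector) / (national share).
    LQ > 1 suggests an export-base (basic) sector; LQ < 1 an import-base one."""
    return (region_sector / region_total) / (nation_sector / nation_total)

# Hypothetical figures: 12,000 of 100,000 regional jobs in a sector that
# holds 600,000 of 10,000,000 jobs nationally
lq = location_quotient(12_000, 100_000, 600_000, 10_000_000)
print(round(lq, 2))  # 2.0 -> export-base in this region
```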
Procedia PDF Downloads 426
25359 An Investigation into the Crystallization Tendency/Kinetics of Amorphous Active Pharmaceutical Ingredients: A Case Study with Dipyridamole and Cinnarizine
Authors: Shrawan Baghel, Helen Cathcart, Niall J. O'Reilly
Abstract:
Amorphous drug formulations have great potential to enhance the solubility and thus bioavailability of BCS class II drugs. However, the higher free energy and molecular mobility of the amorphous form lower the activation energy barrier for crystallization and thermodynamically drive the system towards the crystalline state, which makes such formulations unstable. Accurate determination of the crystallization tendency/kinetics is the key to the successful design and development of such systems. In this study, dipyridamole (DPM) and cinnarizine (CNZ) have been selected as model compounds. Thermodynamic fragility (m_T) is measured from the heat capacity change at the glass transition temperature (Tg), whereas dynamic fragility (m_D) is evaluated using methods based on the extrapolation of configurational entropy to zero (m_D_CE) and on the heating rate dependence of Tg (m_D_Tg). The mean relaxation time of the amorphous drugs was calculated from the Vogel-Tammann-Fulcher (VTF) equation. Furthermore, the correlation between fragility and glass forming ability (GFA) of the model drugs has been established, and the relevance of these parameters to the crystallization of amorphous drugs is also assessed. Moreover, the crystallization kinetics of the model drugs under isothermal conditions has been studied using the Johnson-Mehl-Avrami (JMA) approach to determine the Avrami constant 'n', which provides insight into the mechanism of crystallization. To probe further into the crystallization mechanism, the non-isothermal crystallization kinetics of the model systems was also analysed by statistically fitting the crystallization data to 15 different kinetic models, and the relevance of a model-free kinetic approach has been established. In addition, the crystallization mechanism for DPM and CNZ at each extent of transformation has been predicted. The calculated fragility, glass forming ability (GFA) and crystallization kinetics are found to be in good correlation with the stability prediction of amorphous solid dispersions. Thus, this research work involves a multidisciplinary approach to establish fragility, GFA and crystallization kinetics as stability predictors for amorphous drug formulations.
Keywords: amorphous, fragility, glass forming ability, molecular mobility, mean relaxation time, crystallization kinetics, stability
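The JMA (Avrami) analysis referred to above fits the crystallized fraction X(t) = 1 − exp(−k·tⁿ); a sketch with hypothetical rate parameters (the fitted k and n values are not given in the abstract):

```python
import math

def avrami_fraction(t, k, n):
    """JMA/Avrami crystallized fraction X(t) = 1 - exp(-k * t**n)."""
    return 1.0 - math.exp(-k * t ** n)

# Hypothetical parameters; the half-crystallization time satisfies k * t_half**n = ln 2
k, n = 1e-4, 2.0
t_half = (math.log(2.0) / k) ** (1.0 / n)
print(round(avrami_fraction(t_half, k, n), 3))  # 0.5 by construction
```

The Avrami exponent n extracted from such a fit is what hints at the nucleation and growth mechanism, which is why the study reports it explicitly.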
Procedia PDF Downloads 354
25358 A General Iterative Nonlinear Programming Method to Synthesize Heat Exchanger Network
Authors: Rupu Yang, Cong Toan Tran, Assaad Zoughaib
Abstract:
This work provides an iterative nonlinear programming method to synthesize a heat exchanger network by manipulating the trade-offs between the heat load of process heat exchangers (HEs) and utilities. We consider two cases of the synthesis problem: the first without fixed costs for HEs and the second with fixed costs. For the no-fixed-cost problem, the nonlinear programming (NLP) model with all the potential HEs is optimized to obtain the global optimum. For the case with fixed costs, the NLP model is iterated by adding/removing HEs. The method was applied in five case studies and demonstrated good effectiveness; among them, the approach reaches the lowest TAC (2,904,026 $/year) compared with the best record for the well-known aromatics plant problem. It also locates a slightly better design than records in the literature for a 10-stream case without fixed costs, with only 1/9 of the computational time. Moreover, compared to the traditional mixed-integer nonlinear programming approach, the iterative NLP method opens the possibility of considering constraints (such as controllability or dynamic performance) that require knowing the structure of the network to be calculated.
Keywords: heat exchanger network, synthesis, NLP, optimization
Procedia PDF Downloads 164
25357 Media Richness Perspective on Web 2.0 Usage for Knowledge Creation: The Case of the Cocoa Industry in Ghana
Authors: Albert Gyamfi
Abstract:
Cocoa plays a critical role in the socio-economic development of Ghana; meanwhile, smallholder farmers, most of whom are illiterate, dominate the industry. According to the cocoa-based agricultural knowledge and information system (AKIS) model, knowledge is created and transferred within the industry between three key actors: cocoa researchers, extension experts, and cocoa farmers. Drawing on the SECI model, the media richness theory (MRT), and the AKIS model, a conceptual model of a web 2.0-based AKIS (AKIS 2.0) is developed and used to assess the possible effects of social media usage on knowledge creation in the Ghanaian cocoa industry. A mixed methods approach with a survey questionnaire was employed, and a second-order multi-group structural equation model (SEM) was used to analyze the data. The study concludes that the use of web 2.0 applications for knowledge creation would lead to sustainable interactions among the key knowledge actors for effective knowledge creation in the cocoa industry in Ghana.
Keywords: agriculture, cocoa, knowledge, media, web 2.0
Procedia PDF Downloads 335
25356 Artificial Neural Network Based Approach for Estimation of Individual Vehicle Speed under Mixed Traffic Condition
Authors: Subhadip Biswas, Shivendra Maurya, Satish Chandra, Indrajit Ghosh
Abstract:
Developing a speed model is a challenging task, particularly under mixed traffic conditions, where the traffic composition plays a significant role in determining vehicular speed. The present research models individual vehicular speed in the context of mixed traffic on an urban arterial. Traffic speed and volume data were collected from three midblock arterial road sections in New Delhi. Using the field data, a volume-based speed prediction model was developed using an Artificial Neural Network (ANN). The model is capable of estimating the speed of each vehicle category individually. Validation results show close agreement between the observed speeds and the values predicted by the model, and the ANN-based model was found to outperform existing models in terms of accuracy. Finally, a sensitivity analysis was performed with the model to examine the effects of traffic volume and its composition on individual speeds.
Keywords: speed model, artificial neural network, arterial, mixed traffic
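The shape of such a volume-based ANN speed model can be sketched as follows. The synthetic data, layer size, and training setup here are invented for illustration and are not the network or the New Delhi data used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mixed-traffic data: [total volume (veh/h), share of heavy
# vehicles] -> speed of one vehicle category (km/h). Purely illustrative.
X = rng.uniform([500, 0.0], [3000, 0.5], size=(200, 2))
y = 60 - 0.01 * X[:, 0] - 20 * X[:, 1] + rng.normal(0, 1, 200)

# Normalize inputs, then train one hidden layer by plain gradient descent.
Xn = (X - X.mean(0)) / X.std(0)
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8);      b2 = 0.0

def forward(Xn):
    h = np.tanh(Xn @ W1 + b1)          # hidden layer activations
    return h, h @ W2 + b2              # linear output: predicted speed

lr = 0.05
for _ in range(2000):
    h, pred = forward(Xn)
    err = pred - y
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h ** 2)   # backprop through tanh
    gW1 = Xn.T @ gh / len(y); gb1 = gh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, pred = forward(Xn)
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print(f"training RMSE: {rmse:.2f} km/h")
```

The network learns the volume-speed relationship down to roughly the noise level of the synthetic data, which is the mechanism the abstract relies on at much larger scale.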
Procedia PDF Downloads 388
25355 Prioritization of Customer Order Selection Factors by Utilizing Conjoint Analysis: A Case Study for a Structural Steel Firm
Authors: Burcu Akyildiz, Cigdem Kadaifci, Y. Ilker Topcu, Burc Ulengin
Abstract:
In today’s business environment, companies must make strategic decisions to gain a sustainable competitive advantage. Order selection is a crucial decision among these, especially in the structural steel production industry. When a company has allocated a high proportion of its design and production capacity to ongoing projects, determining which customer order to accept among the potential orders without exceeding the remaining capacity becomes the central problem. This study aims to identify and prioritize the evaluation factors for the customer order selection problem. Conjoint analysis is used to estimate the importance of each factor: the potential profit rate per unit of time, the compatibility of the potential order with available capacity, the likelihood of future orders with higher profit, the customer credit associated with future business opportunities, and the negotiability of the production schedule for the order.
Keywords: conjoint analysis, order prioritization, profit management, structural steel firm
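The core conjoint computation, estimating part-worths by least squares and converting them to relative importances, can be sketched for three of the factors named above. The factor levels, effect-coded profiles, and ratings are invented for the sketch, not the firm's data:

```python
import numpy as np

# Hypothetical 2-level profiles for three factors: profit rate (low/high),
# capacity fit (poor/good), schedule negotiability (rigid/flexible).
# Columns are effect-coded (-1/+1); the preference ratings are invented.
profiles = np.array([
    [-1, -1, -1], [-1, -1, +1], [-1, +1, -1], [-1, +1, +1],
    [+1, -1, -1], [+1, -1, +1], [+1, +1, -1], [+1, +1, +1],
], dtype=float)
ratings = np.array([2.0, 3.0, 4.0, 5.0, 5.0, 6.0, 7.0, 8.0])

# Part-worths via least squares: rating ~ intercept + sum(beta_j * x_j).
Xd = np.column_stack([np.ones(len(profiles)), profiles])
beta, *_ = np.linalg.lstsq(Xd, ratings, rcond=None)
part_worths = beta[1:]

# Relative importance = a factor's part-worth range over the total range.
ranges = 2 * np.abs(part_worths)       # spread between the two levels
importance = ranges / ranges.sum()
for name, imp in zip(["profit rate", "capacity fit", "negotiability"],
                     importance):
    print(f"{name}: {imp:.0%}")
```

With an orthogonal design like this one, each part-worth is simply half the difference between the mean ratings at the factor's two levels, which makes the prioritization easy to audit.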
Procedia PDF Downloads 384
25354 COVID-19 Analysis with Deep Learning Model Using Chest X-Rays Images
Authors: Uma Maheshwari V., Rajanikanth Aluvalu, Kumar Gautam
Abstract:
COVID-19 is a highly contagious viral infection with major worldwide health implications, and the global economy has suffered as a result. The spread of this pandemic disease can be slowed if positive patients are identified early, so COVID-19 prediction is beneficial for identifying patients at risk. Deep learning and machine learning algorithms for COVID prediction from X-rays have the potential to be extremely useful in mitigating the scarcity of doctors and clinicians in remote places. In this paper, a convolutional neural network (CNN) with deep layers is presented for recognizing COVID-19 patients using real-world datasets. We gathered around 6,000 X-ray scan images from various sources and split them into two categories: normal and COVID-affected. Our model examines chest X-ray images to recognize such patients. Because X-rays are commonly available and affordable, our findings show that X-ray analysis is effective for COVID diagnosis. The model performed well, with an average accuracy of 99% on training images and 88% on X-ray test images.
Keywords: deep CNN, COVID-19 analysis, feature extraction, feature map, accuracy
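The "feature map" idea behind a CNN's feature extraction can be illustrated with a bare-bones 2D convolution. The toy image and edge-detecting kernel below are stand-ins for illustration, not the network or data used in the paper:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide the kernel over the image ('valid' mode) to get a feature map."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# Toy 28x28 'X-ray': zeros with a bright vertical band in the middle.
image = np.zeros((28, 28))
image[:, 12:16] = 1.0

# Vertical-edge kernel: responds where intensity changes left to right.
kernel = np.array([[-1.0, 0.0, 1.0]] * 3)

fmap = conv2d_valid(image, kernel)
print(fmap.shape)  # (26, 26) feature map, peaking at the band's edges
```

A trained CNN stacks many such learned kernels, so each layer's feature maps highlight progressively more abstract patterns in the chest X-ray.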
Procedia PDF Downloads 81
25353 A Hybrid Genetic Algorithm and Neural Network for Wind Profile Estimation
Authors: M. Saiful Islam, M. Mohandes, S. Rehman, S. Badran
Abstract:
The increasing need for wind power calls for precise knowledge of wind resources, and methodical investigation of potential locations is required before wind power deployment. High penetration of wind energy into the grid leads to multi-megawatt installations with huge investment costs, which makes it essential to identify appropriate sites for wind farm operation. Accurate assessment requires detailed examination of the wind speed profile, relative humidity, temperature, and other geological or atmospheric parameters. Among all the uncertainty factors influencing wind power estimation, vertical extrapolation of wind speed is perhaps the most difficult and critical. Different approaches have been used to extrapolate wind speed to hub height, mainly based on the log law, the power law, and various modifications of the two. This paper proposes an Artificial Neural Network (ANN) and Genetic Algorithm (GA) based hybrid model, GA-NN, for vertical extrapolation of wind speed. The model is simple in the sense that it does not require parametric estimates such as the wind shear coefficient, roughness length, or atmospheric stability, and it is also reliable compared with other methods. It uses measured wind speeds at 10 m, 20 m, and 30 m heights to estimate wind speeds up to 100 m. Good agreement is found between measured and estimated wind speeds at 30 m and 40 m, with approximately 3% mean absolute percentage error. Comparisons with a plain ANN and the power law further demonstrate the feasibility of the proposed method.
Keywords: wind profile, vertical extrapolation of wind, genetic algorithm, artificial neural network, hybrid machine learning
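For reference, the power-law baseline mentioned above, against which GA-NN is compared, fits in a few lines. The shear exponent of 1/7 is the textbook neutral-conditions value, not a site-specific estimate, which is exactly the parametric assumption GA-NN avoids:

```python
def power_law_speed(v_ref, h_ref, h_target, alpha=1.0 / 7.0):
    """Power-law vertical extrapolation: v(h) = v_ref * (h / h_ref)**alpha.
    alpha is the wind shear exponent (1/7 for neutral open terrain)."""
    return v_ref * (h_target / h_ref) ** alpha

# Extrapolate a 5 m/s measurement at 10 m up to a 100 m hub height.
v100 = power_law_speed(5.0, 10.0, 100.0)
print(f"{v100:.2f} m/s")
```

The sensitivity of the result to the assumed alpha is what motivates data-driven extrapolators like the proposed hybrid model.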
Procedia PDF Downloads 490
25352 A Case Study on Machine Learning-Based Project Performance Forecasting for an Urban Road Reconstruction Project
Authors: Soheila Sadeghi
Abstract:
In construction projects, predicting project performance metrics accurately is essential for effective management and successful delivery. However, conventional methods often depend on fixed baseline plans, disregarding the evolving nature of project progress and external influences. To address this issue, we introduce a distinct approach based on machine learning to forecast key performance indicators, such as cost variance and earned value, for each Work Breakdown Structure (WBS) category within an urban road reconstruction project. Our proposed model leverages time series forecasting techniques, namely Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) networks, to predict future performance by analyzing historical data and project progress. Additionally, the model incorporates external factors, including weather patterns and resource availability, as features to improve forecast accuracy. By harnessing the predictive capabilities of machine learning, our performance forecasting model enables project managers to proactively identify potential deviations from the baseline plan and take timely corrective measures. To validate the effectiveness of the proposed approach, we conduct a case study on an urban road reconstruction project, comparing the model's predictions with actual project performance data. The outcomes of this research contribute to the advancement of project management practices in the construction industry by providing a data-driven solution for enhancing project performance monitoring and control.
Keywords: project performance forecasting, machine learning, time series forecasting, cost variance, schedule variance, earned value management
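The quantities being forecast follow standard earned value management definitions, which can be stated as a quick reference (the dollar figures in the example are made up):

```python
def evm_metrics(pv, ev, ac):
    """Standard earned value management indicators.
    pv: planned value, ev: earned value, ac: actual cost (same units)."""
    return {
        "cost_variance": ev - ac,        # CV > 0 means under budget
        "schedule_variance": ev - pv,    # SV > 0 means ahead of schedule
        "cpi": ev / ac,                  # cost performance index
        "spi": ev / pv,                  # schedule performance index
    }

# Example WBS category: $120k of work planned, $100k earned, $110k spent.
m = evm_metrics(pv=120_000, ev=100_000, ac=110_000)
print(m)
```

The forecasting models in the study predict how these per-category indicators will evolve, rather than computing them only against a fixed baseline.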
Procedia PDF Downloads 40
25351 Environmental Impact Assessment of Conventional Tyre Manufacturing Process
Authors: G. S. Dangayach, Gaurav Gaurav, Alok Bihari Singh
Abstract:
The popularity of vehicles in both industrialized and developing economies has led to a rise in tyre production, and over the last two decades people have become increasingly concerned about the tyre industry's potential environmental impact. The life cycle assessment (LCA) methodology is used to assess the environmental impacts of industrial tyres throughout their life cycle, which includes four stages: manufacture, transportation, consumption, and end-of-life. The majority of prior studies focused on tyre recycling and disposal; only a few have examined the environmental impact of the tyre production process itself. Here, LCA was employed to determine the environmental impact of the tyre manufacturing process (gate to gate) at an Indian firm, and a comparative analysis was conducted to identify the environmental hotspots in the various stages of tyre manufacturing. The study is limited to a gate-to-gate analysis of the manufacturing process, with a functional unit of a single tyre weighing 50 kg. GaBi software was used for both qualitative and quantitative analysis. Environmental impact indicators are measured in terms of CO2, SO2, NOx, global warming potential (GWP), acidification potential (AP), eutrophication potential (EP), photochemical oxidant formation potential (POCP), and human toxicity potential (HTP). The results demonstrate that electricity is the major contributor to environmental pollution, and that the Banbury mixing process has a particularly strong negative environmental impact, causing respiratory problems for workers and operators.
Keywords: life cycle assessment (LCA), environmental impact indicators, tyre manufacturing process, environmental impact assessment
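How an aggregate indicator such as GWP is obtained from an inventory can be sketched as the LCA characterization step. The inventory amounts below are invented, and the CO2-equivalence factors are commonly cited 100-year values; the exact factors depend on the impact assessment method configured in the LCA tool:

```python
# Characterization: impact = sum(inventory amount * characterization factor).
# GWP100 factors in kg CO2-eq per kg emitted (commonly cited values; the
# exact numbers depend on the assessment method chosen in tools like GaBi).
GWP100 = {"CO2": 1.0, "CH4": 28.0, "N2O": 265.0}

# Hypothetical gate-to-gate inventory for one 50 kg tyre (kg emitted).
inventory = {"CO2": 140.0, "CH4": 0.5, "N2O": 0.01}

gwp = sum(inventory[gas] * GWP100[gas] for gas in inventory)
print(f"GWP: {gwp:.2f} kg CO2-eq per tyre")
```

The other indicators (AP, EP, POCP, HTP) follow the same weighted-sum pattern with their own characterization factors, which is why electricity-heavy process steps dominate several categories at once.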
Procedia PDF Downloads 153
25350 Orientational Pair Correlation Functions Modelling of the LiCl6H2O by the Hybrid Reverse Monte Carlo: Using an Environment Dependence Interaction Potential
Authors: Mohammed Habchi, Sidi Mohammed Mesli, Rafik Benallal, Mohammed Kotbi
Abstract:
On the basis of four partial correlation functions and some geometric constraints obtained from neutron scattering experiments, a Reverse Monte Carlo (RMC) simulation has been performed to study the aqueous electrolyte LiCl·6H2O in the glassy state. The resulting three-dimensional model allows pair radial and orientational distribution functions to be computed in order to explore the structural features of the system. Unrealistic features appeared in some coordination peaks. To remedy this, we use the Hybrid Reverse Monte Carlo (HRMC) method, which incorporates an energy constraint in addition to the usual constraints derived from experiments. The energy of the system is calculated using an Environment Dependence Interaction Potential (EDIP). Ion effects are studied by comparing correlations between water molecules in the solution and in pure water at room temperature. Our results show good agreement between experimental and computed partial distribution functions (PDFs), as well as a significant improvement in the orientational distribution curves.
Keywords: LiCl·6H2O, glassy state, RMC, HRMC
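The HRMC move-acceptance rule combines the usual RMC fit-to-data test with the energy penalty. A schematic version is sketched below; the Metropolis form and the energy weighting are the generic HRMC recipe, not the specific parameters or potential of this study:

```python
import math
import random

def hrmc_accept(d_chi2, d_energy, w_energy=1.0, rng=random.random):
    """Accept a trial atomic move with probability
    min(1, exp(-(d_chi2 / 2 + w_energy * d_energy))), where d_chi2 is the
    change in the fit to the experimental data and d_energy is the change
    in the (e.g. EDIP) energy in units of kT. Generic HRMC sketch."""
    cost = d_chi2 / 2.0 + w_energy * d_energy
    if cost <= 0.0:          # combined cost improves: always accept
        return True
    return rng() < math.exp(-cost)

# A move improving both the fit and the energy is always accepted;
# a strongly worsening move is almost always rejected.
print(hrmc_accept(-0.5, -0.1))
print(hrmc_accept(50.0, 10.0, rng=lambda: 0.999))
```

Setting `w_energy` to zero recovers plain RMC, which is how the energy constraint removes the unrealistic coordination features that data-only fitting permits.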
Procedia PDF Downloads 471
25349 Parking Service Effectiveness at Commercial Malls
Authors: Ahmad AlAbdullah, Ali AlQallaf, Mahdi Hussain, Mohammed AlAttar, Salman Ashknani, Magdy Helal
Abstract:
We study the effectiveness of the parking service provided at Kuwaiti commercial malls and explore potential problems and feasible improvements. Commercial malls are important to Kuwaitis as entertainment and shopping centers due to the lack of other alternatives, and the difficulty and relatively long time wasted in finding a parking spot at a mall are a real annoyance. We applied queuing analysis to one of the major malls that offers paid parking (1,040 parking spots) in addition to free parking. Patrons of the mall commonly complained of traffic jams and delays when entering the paid parking (the average delay to park exceeds 15 minutes for about 62% of patrons, while the average time spent in the mall is about 2.6 hours). However, the analysis showed acceptable service levels at the check-in gates of the parking garage. A detailed review of vehicle movement at the gateways indicated that arriving and departing cars had to share parts of the gateway to the garage, which caused the traffic jams and delays. A simple comparison indicated that the largest commercial mall in Kuwait does not suffer such parking issues, while other smaller yet important malls do, including the one we studied. It was suggested that the well-designed inlets and outlets of that gigantic mall permit smooth parking even though its parking is totally free and the mall is the first choice of most people for entertainment and shopping. A simulation model is being developed for further analysis and verification, since simulation can overcome the mathematical difficulty of using non-Poisson queuing models. The simulation model is used to explore potential changes to the layout of the parking garage entrance, and by including drivers' behavior inside the parking garage, effectiveness indicators can be derived to address the economic feasibility of extending the parking capacity and increasing service levels. Outcomes of the study are planned to be generalized, as appropriate, to other commercial malls in Kuwait.
Keywords: commercial malls, parking service, queuing analysis, simulation modeling
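The kind of queuing computation applied to the check-in gates can be sketched with the standard M/M/c (Erlang C) formulas; the arrival and service rates below are illustrative, not the mall's measured values:

```python
import math

def mmc_wait(lam, mu, c):
    """Mean wait in queue for an M/M/c system via the Erlang C formula.
    lam: arrival rate, mu: service rate per gate, c: number of gates."""
    rho = lam / (c * mu)
    assert rho < 1.0, "system is unstable"
    a = lam / mu                         # offered load in Erlangs
    p0 = 1.0 / (sum(a ** k / math.factorial(k) for k in range(c))
                + a ** c / (math.factorial(c) * (1 - rho)))
    p_wait = a ** c / (math.factorial(c) * (1 - rho)) * p0   # Erlang C
    return p_wait / (c * mu - lam)       # mean time in queue

# e.g. 3 cars/min arriving at 2 gates that each process 2 cars/min.
print(f"{mmc_wait(3.0, 2.0, 2):.2f} min average wait")
```

Because the observed gate arrivals may not be Poisson, formulas like this give only a first-cut service level, which is why the abstract turns to simulation for the detailed layout questions.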
Procedia PDF Downloads 340
25348 A Hybrid Feature Selection Algorithm with Neural Network for Software Fault Prediction
Authors: Khalaf Khatatneh, Nabeel Al-Milli, Amjad Hudaib, Monther Ali Tarawneh
Abstract:
Software fault prediction identifies potential faults in software modules during the development process. In this paper, we present a novel approach to software fault prediction that combines a feedforward neural network with particle swarm optimization (PSO). The PSO algorithm is employed as a feature selection technique to identify the most relevant metrics as inputs to the neural network, which enhances the quality of feature selection and subsequently improves the performance of the neural network model. In comprehensive experiments on software fault prediction datasets, the proposed hybrid approach outperforms traditional classification methods. The integration of PSO-based feature selection with the neural network enables the identification of the critical metrics that yield more accurate fault prediction. The results show the effectiveness of the proposed approach and its potential for reducing development cost and effort by detecting faults early in the software development lifecycle. Further research and validation on diverse datasets will help solidify the practical applicability of the approach in real-world software engineering scenarios.
Keywords: feature selection, neural network, particle swarm optimization, software fault prediction
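A binary-PSO feature selector of the kind described can be sketched as follows. The swarm size, coefficients, and toy fitness function (a correlation score standing in for a trained classifier's validation accuracy) are illustrative assumptions, not the paper's configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dataset: 8 candidate software metrics; only the first three actually
# relate to the fault label. Entirely synthetic.
X = rng.normal(size=(300, 8))
y = (X[:, 0] + X[:, 1] - X[:, 2] + rng.normal(0, 0.3, 300)) > 0

def fitness(mask):
    """Score a feature subset: summed |correlation| with the label minus a
    small penalty per selected metric (a cheap stand-in for training the
    neural network and measuring validation accuracy)."""
    if mask.sum() == 0:
        return -1.0
    corr = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in np.flatnonzero(mask)]
    return float(sum(corr)) - 0.05 * mask.sum()

# Binary PSO: positions are 0/1 masks; velocities pass through a sigmoid
# to give the probability that each bit is set.
n_particles, n_feats, iters = 12, 8, 40
pos = rng.integers(0, 2, (n_particles, n_feats))
vel = rng.normal(0, 0.1, (n_particles, n_feats))
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_feats))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = (rng.random((n_particles, n_feats)) < 1 / (1 + np.exp(-vel))).astype(int)
    f = np.array([fitness(p) for p in pos])
    improved = f > pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmax()].copy()

print("selected metric indices:", np.flatnonzero(gbest))
```

In the full approach, each fitness evaluation would train and validate the feedforward network on the masked metrics, so the swarm converges toward the subsets that predict faults best.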
Procedia PDF Downloads 96