Search results for: empirical distribution
6749 The Mediating Effect of Individual Readiness for Change in the Relationship between Organisational Culture and Individual Commitment to Change
Authors: Mohamed Haffar, Lois Farquharson, Gbola Gbadamosi, Wafi Al-Karaghouli, Ramadane Djbarni
Abstract:
A few recent research studies, mostly conceptual in nature, have paid attention to the relationship between organizational culture (OC), individual readiness for change (IRFC) and individual affective commitment to change (IACC). Surprisingly, there is a lack of empirical studies investigating the influence of all four OC types on IRFC and IACC. Moreover, very limited research has investigated the mediating role of individual readiness for change between OC types and individual affective commitment to change. Therefore, this study fills this gap by providing empirical evidence that advances the understanding of the direct and indirect influences of OC on individual affective commitment to change. To achieve this, a questionnaire-based survey was developed and self-administered to 226 middle managers in Algerian manufacturing organizations (AMOs). The results of this study indicate that group culture and adhocracy culture positively affect IACC. Furthermore, the findings show support for the mediating roles of self-efficacy and personal valence in the relationship between OC and IACC.
Keywords: individual readiness for change, individual commitment to change, organisational culture, manufacturing organisations
Procedia PDF Downloads 503
6748 Reliability-Based Method for Assessing Liquefaction Potential of Soils
Authors: Mehran Naghizaderokni, Asscar Janalizadechobbasty
Abstract:
This paper explores a probabilistic method for assessing the liquefaction potential of sandy soils. The current simplified methods for assessing soil liquefaction potential use a deterministic safety factor in order to determine whether liquefaction will occur or not. However, these methods are unable to determine the liquefaction probability related to a safety factor. A solution to this problem can be found by reliability analysis. This paper presents a reliability analysis method based on a popular deterministic liquefaction analysis method. The proposed probabilistic method is formulated based on the results of reliability analyses of 190 field records and observations of soil performance against liquefaction. The results of the present study show that a safety factor greater or smaller than 1 does not by itself guarantee safety or liquefaction, respectively; to assess the liquefaction probability, a reliability-based analysis should be used. This reliability method uses the empirical acceleration attenuation law in the Chalos area to derive the probability density distribution function and the statistics for the earthquake-induced cyclic shear stress ratio (CSR). The CSR and CRR (cyclic resistance ratio) statistics are used with the first-order second-moment method to calculate the relation between the liquefaction probability, the safety factor and the reliability index. Based on the proposed method, the liquefaction probability related to a safety factor can be easily calculated, and the influence of some of the soil parameters on the liquefaction probability can be quantitatively evaluated.
Keywords: liquefaction, reliability analysis, chalos area, civil and structural engineering
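As an illustration of the first-order second-moment step described above, the following sketch maps assumed lognormal CSR and CRR statistics to a reliability index and a liquefaction probability. The numerical values are hypothetical and not taken from the Chalos records.

```python
import math
from statistics import NormalDist

def liquefaction_probability(mu_ln_crr, sd_ln_crr, mu_ln_csr, sd_ln_csr):
    """FOSM reliability index for the limit state g = ln(CRR) - ln(CSR)."""
    beta = (mu_ln_crr - mu_ln_csr) / math.sqrt(sd_ln_crr**2 + sd_ln_csr**2)
    p_liq = NormalDist().cdf(-beta)   # P(g < 0), i.e. liquefaction
    return beta, p_liq

# Hypothetical log-space means and standard deviations of CRR and CSR
beta, p_liq = liquefaction_probability(mu_ln_crr=-1.0, sd_ln_crr=0.3,
                                       mu_ln_csr=-1.2, sd_ln_csr=0.25)
print(f"reliability index = {beta:.2f}, liquefaction probability = {p_liq:.3f}")
```

Note that a mean safety factor above 1 can still carry a non-trivial liquefaction probability when the dispersions are large, which is the abstract's central point.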
Procedia PDF Downloads 470
6747 Cultural Heritage, War and Heritage Legislations: An Empirical Review
Authors: Gebrekiros Welegebriel Asfaw
Abstract:
The conservation of cultural heritage during times of war is a topic of significant importance and concern in the field of heritage studies. The destruction, looting, and illicit acts against cultural heritage have devastating consequences. International and national legislations have been put in place to address these issues and provide a legal framework for protecting cultural heritage during armed conflicts. Thus, the aim of this review is to examine the existing heritage legislations and evaluate their effectiveness in protecting cultural heritage during times of war, with special insight into the Tigray war. The review is based on a comprehensive empirical analysis of existing heritage legislations related to the protection of cultural heritage during war, with a special focus on the Tigray war. The review reveals that there are several international and national legislations in place to protect cultural heritage during times of war. However, the implementation of these legislations has been insufficient and ineffective in the case of the Tigray war. The priceless cultural heritages in Tigray, which were once centers of investment and world pride, have been subjected to destruction, looting, and other illicit acts, in violation of both international conventions such as the UNESCO Convention and national legislations. Therefore, there is a need for consistent intervention and enforcement of different legislations by the international community and organizations to rehabilitate, repatriate, and reinstitute the irreplaceable heritages of Tigray.
Keywords: cultural heritage, heritage legislations, tigray, war
Procedia PDF Downloads 155
6746 Radial Distribution Network Reliability Improvement by Using Imperialist Competitive Algorithm
Authors: Azim Khodadadi, Sahar Sadaat Vakili, Ebrahim Babaei
Abstract:
This study presents a numerical method to optimize the failure rate and repair time of a typical radial distribution system. Failure rate and repair time are effective parameters in the customer- and energy-based indices of reliability; decreasing these parameters improves the reliability indices and thus boosts system stability. The penalty functions indirectly reflect the cost of the investment spent to improve these indices. Constraints on customer- and energy-based indices, i.e. SAIFI, SAIDI, CAIDI and AENS, have been considered by using a new method which reduces the number of controlling parameters of the optimization algorithm. The Imperialist Competitive Algorithm (ICA) is used as the main optimization technique, and particle swarm optimization (PSO), simulated annealing (SA) and differential evolution (DE) have been applied for further investigation. These algorithms have been implemented on a test system in MATLAB, and the obtained results have been compared with each other. The optimized values of repair time and failure rate are much lower than the current values, which reduces the investment cost; moreover, ICA gives better answers than the other algorithms used.
Keywords: imperialist competitive algorithm, failure rate, repair time, radial distribution network
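For readers unfamiliar with the customer-based indices constrained here, a minimal sketch of how SAIFI, SAIDI and CAIDI follow from per-load-point failure rates and repair times is shown below; the three-load-point feeder is invented for illustration.

```python
def reliability_indices(load_points):
    """load_points: list of (failure_rate [1/yr], repair_time [h], customers)."""
    total_customers = sum(n for _, _, n in load_points)
    saifi = sum(lam * n for lam, _, n in load_points) / total_customers
    saidi = sum(lam * r * n for lam, r, n in load_points) / total_customers
    caidi = saidi / saifi   # average outage duration per interruption
    return saifi, saidi, caidi

# Hypothetical radial feeder with three load points
points = [(0.20, 4.0, 200), (0.15, 6.0, 150), (0.30, 3.0, 100)]
print(reliability_indices(points))
```

Lowering the failure rates and repair times in `points` lowers all three indices, which is exactly the effect the optimization seeks.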
Procedia PDF Downloads 668
6745 Hard Disk Failure Predictions in Supercomputing System Based on CNN-LSTM and Oversampling Technique
Authors: Yingkun Huang, Li Guo, Zekang Lan, Kai Tian
Abstract:
Hard disk drive (HDD) failures in an exascale supercomputing system may lead to service interruption, invalidate previous calculations, and cause permanent data loss. Therefore, initiating corrective actions before hard drive failures materialize is critical to the continued operation of jobs. In this paper, a highly accurate analysis model based on CNN-LSTM and an oversampling technique is proposed, which can correctly predict the necessity of a disk replacement even ten days in advance. Generally, learning-based methods perform poorly on training datasets with a long-tail distribution, and fault prediction is a classic instance of this because of the scarcity of failure data. To overcome this problem, a new oversampling technique was employed to augment the data, and then an improved CNN-LSTM with a shortcut connection was built to learn more effective features. The shortcut transmits the results of the previous CNN layer and, after weighted fusion with the output of the next layer, uses them as the input of the LSTM model. Finally, a detailed empirical comparison of 6 prediction methods is presented and discussed on a public dataset for evaluation. The experiments indicate that the proposed method predicts disk failure with 0.91 precision, 0.91 recall, 0.91 F-measure, and 0.90 MCC for a 10-day prediction horizon. Thus, the proposed algorithm is an efficient algorithm for predicting HDD failure in supercomputing.
Keywords: HDD replacement, failure, CNN-LSTM, oversampling, prediction
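The abstract does not specify the oversampling scheme; as a stand-in, the sketch below shows plain random oversampling of the minority (failure) class, which illustrates the general idea of rebalancing a long-tailed training set.

```python
import numpy as np

def random_oversample(X, y, minority_label=1, seed=0):
    """Duplicate minority-class rows until both classes are the same size."""
    rng = np.random.default_rng(seed)
    idx_min = np.flatnonzero(y == minority_label)
    idx_maj = np.flatnonzero(y != minority_label)
    extra = rng.choice(idx_min, size=len(idx_maj) - len(idx_min), replace=True)
    order = rng.permutation(len(y) + len(extra))   # shuffle after augmenting
    X_bal = np.concatenate([X, X[extra]])[order]
    y_bal = np.concatenate([y, y[extra]])[order]
    return X_bal, y_bal

# Toy example: 95 healthy disks, 5 failed disks, 8 SMART-like features each
X = np.random.rand(100, 8)
y = np.array([0] * 95 + [1] * 5)
X_bal, y_bal = random_oversample(X, y)
print(np.bincount(y_bal))   # -> [95 95]
```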
Procedia PDF Downloads 79
6744 Determining Inventory Replenishment Policy for Major Component in Assembly-to-Order of Cooling System Manufacturing
Authors: Tippawan Nasawan
Abstract:
The objective of this study is to find the replenishment policy in Assembly-to-Order (ATO) manufacturing in which some of the major components have lead times longer than the customer lead time. The variety of products, independent component demand, and long component lead times are the difficulties that result in the overstock problem. In addition, the ordering cost is trivial when compared to the material cost of the major component. A conceptual design of a Decision Supporting System (DSS) is introduced to assist with the replenishment policy. One of the keys is making the component replenishment decision by using the variable called Available to Promise (ATP). The Poisson distribution is adopted to model demand patterns in order to calculate the Safety Stock (SS) at the specified Customer Service Level (CSL); when the distribution cannot be identified, a nonparametric approach is applied instead. Test results comparing the ending inventory between the new policy and the old policy show that overstock is significantly reduced by 46.9 percent, or about 469,891.51 US dollars in major-component cost (material cost only). Besides, the quantity of major-component inventory is also reduced by about 41 percent, which helps to mitigate the chance of damage and the cost of keeping stock.
Keywords: Assembly-to-Order, Decision Supporting System, Component replenishment, Poisson distribution
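A minimal sketch of the Poisson safety-stock step, assuming demand over the component lead time is Poisson with a known mean; the parameter values are illustrative only.

```python
from scipy.stats import poisson

def safety_stock(mean_lead_time_demand, csl):
    """Smallest reorder level S with P(demand <= S) >= CSL, minus mean demand."""
    reorder_level = int(poisson.ppf(csl, mean_lead_time_demand))
    return reorder_level, reorder_level - mean_lead_time_demand

# Hypothetical: mean demand of 40 units over the component lead time, 95% CSL
level, ss = safety_stock(40, 0.95)
print(f"reorder level = {level}, safety stock = {ss:.0f} units")
```

Raising the CSL pushes the reorder level up the Poisson tail, which is exactly the trade-off between overstock cost and service level that the DSS mediates.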
Procedia PDF Downloads 126
6743 Finite Element Modeling of Ultrasonic Shot Peening Process using Multiple Pin Impacts
Authors: Chao-xun Liu, Shi-hong Lu
Abstract:
In spite of its importance to the aerospace and automobile industries, little or no attention has been devoted to the accurate modeling of the ultrasonic shot peening (USP) process. It is therefore the purpose of this study to conduct finite element analysis of the process using a realistic multiple pin impacts model with the explicit solver of ABAQUS. In this paper, the effect of several key parameters on the residual stress distribution within the target was investigated, including impact velocity, incident angle, friction coefficient between pins and target, and number of impacts. The results reveal that the impact velocity and the number of impacts have an obvious effect, and that impacting vertically produces the most favorable residual stress distribution. We then compare the results with data from USP experiments and verify the accuracy of the model. The analysis of the multiple pin impact data reveals the relationships between peening process parameters and peening quality, which are useful for identifying the parameters which need to be controlled and regulated in order to produce a more beneficial compressive residual stress distribution within the target.
Keywords: ultrasonic shot peening, finite element, multiple pins, residual stress, numerical simulation
Procedia PDF Downloads 448
6742 Assessment of Artists’ Socioeconomic and Working Conditions: The Empirical Case of Lithuania
Authors: Rusne Kregzdaite, Erika Godlevska, Morta Vidunaite
Abstract:
The main aim of this research is to explore existing methodologies for studying the artistic labour force and to build an assessment model of artists' socio-economic and creative conditions. Artists have dual aims in their creative working process: 1) income and 2) artistic self-expression. The valuation of their conditions takes both sides into consideration: the factors related to income and the satisfaction with the creative process and its result. The problem addressed in the study is the set of tangible and intangible criteria used for assessing creative conditions. The proposed model includes objective factors (working time, income, etc.) and subjective factors (salary covering essential needs, self-satisfaction). Other intangible indicators are taken into account: the impact on the common culture, social values, and the possibility to receive awards and to represent the country in the international market. The empirical model consists of 59 separate indicators, grouped into eight categories. The deviation of each indicator from the general evaluation allows for identifying the strongest and the weakest components of artists' conditions.
Keywords: artist conditions, artistic labour force, cultural policy, indicator, assessment model
Procedia PDF Downloads 151
6741 Advanced Hybrid Particle Swarm Optimization for Congestion and Power Loss Reduction in Distribution Networks with High Distributed Generation Penetration through Network Reconfiguration
Authors: C. Iraklis, G. Evmiridis, A. Iraklis
Abstract:
Renewable energy sources and distributed power generation units already have an important role in electrical power generation. A mixture of different technologies penetrating the electrical grid adds complexity to the management of distribution networks. High penetration of distributed power generation units creates node over-voltages, huge power losses, unreliable power management, reverse power flow and congestion. This paper presents an optimization algorithm capable of reducing congestion and power losses, both combined into a weighted-sum objective. Two factors that describe congestion are proposed. An upgraded selective particle swarm optimization (SPSO) algorithm is used as the solution tool, focusing on the technique of network reconfiguration. The upgrade of the SPSO algorithm is achieved by adding a heuristic algorithm specializing in the reduction of power losses, and several scenarios are tested. Results show significant improvement in the minimization of losses and congestion while achieving very small calculation times.
Keywords: congestion, distribution networks, loss reduction, particle swarm optimization, smart grid
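The weighted-sum objective described above can be written compactly. The sketch below assumes two normalized congestion factors and a loss term; the weights and input values are placeholders, not the paper's calibration.

```python
def fitness(losses_kw, congestion_factors, weights=(0.5, 0.25, 0.25),
            loss_base_kw=1000.0):
    """Weighted sum of normalized power losses and two congestion factors.

    congestion_factors: (c1, c2), each already scaled to [0, 1]; the two
    factors stand in for the congestion measures proposed in the paper.
    """
    w_loss, w_c1, w_c2 = weights
    return (w_loss * losses_kw / loss_base_kw
            + w_c1 * congestion_factors[0]
            + w_c2 * congestion_factors[1])

# Hypothetical candidate network configurations evaluated by the SPSO
print(fitness(820.0, (0.40, 0.35)))   # configuration A
print(fitness(760.0, (0.55, 0.30)))   # configuration B
```

Each SPSO particle encodes one switch configuration; the configuration with the lower fitness wins, trading loss reduction against congestion relief through the weights.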
Procedia PDF Downloads 445
6740 Aggregate Fluctuations and the Global Network of Input-Output Linkages
Authors: Alexander Hempfing
Abstract:
The desire to understand business cycle fluctuations, trade interdependencies and co-movement has a long tradition in economic thinking. From input-output economics to business cycle theory, researchers have aimed to find appropriate answers from an empirical as well as a theoretical perspective. This paper empirically analyses how the production structure of the global economy and several states developed over time, what their distributional properties are, and whether there are network-specific metrics that allow identifying structurally important nodes on a global, national and sectoral scale. For this, the World Input-Output Database was used, and different statistical methods were applied. Empirical evidence is provided that the importance of the Eastern hemisphere in the global production network increased significantly between 2000 and 2014. Moreover, it was possible to show that the sectoral eigenvector centrality indices on a global level are power-law distributed, providing evidence that specific national sectors exist which are more critical to the world economy than others while serving as hubs within the global production network. However, further findings suggest that global production cannot be characterized as a scale-free network.
Keywords: economic integration, industrial organization, input-output economics, network economics, production networks
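A minimal sketch of the centrality computation on a toy input-output network using networkx; the sectors and edge weights are invented stand-ins for the inter-sector flows of the World Input-Output Database.

```python
import networkx as nx

# Toy directed input-output network: edge (i, j, w) = sector i supplies j
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("steel", "autos", 5.0), ("electronics", "autos", 3.0),
    ("chemicals", "electronics", 2.0), ("autos", "services", 1.0),
    ("services", "steel", 0.5), ("services", "chemicals", 0.8),
])

centrality = nx.eigenvector_centrality(G, weight="weight", max_iter=1000)
for sector, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{sector:12s} {score:.3f}")
```

High-scoring sectors are the hubs the abstract refers to: nodes whose importance propagates through their heavily weighted linkages.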
Procedia PDF Downloads 276
6739 Indoor Visible Light Communication Channel Characterization for User Mobility: A Use-Case Study
Authors: Pooja Sanathkumar, Srinidhi Murali, Sethuraman TV, Saravanan M, Paventhan Arumugam, Ashwin Ashok
Abstract:
The last decade has witnessed a significant interest in visible light communication (VLC) technology, as VLC can potentially achieve high data rate links and secure communication channels. However, the use of VLC in mobile settings is fundamentally limited, as it is a line-of-sight (LOS) technology, and there have been limited breakthroughs in realizing VLC for mobile settings. In this regard, this work studies the VLC channel under mobility. Through a use-case analysis with experimental data traces, this paper presents an empirical VLC channel study considering the application of VLC for smart lighting in an indoor room environment. This paper contributes a calibration study of a prototype VLC smart lighting system in an indoor environment and, through the inferences gained from the calibration, and considering a user carrying a mobile device fitted with a VLC receiver, presents recommendations for the user's position adjustments, with the goal of ensuring maximum connectivity across the room.
Keywords: visible light communication, mobility, empirical study, channel characterization
Procedia PDF Downloads 126
6738 Sensor Validation Using Bottleneck Neural Network and Variable Reconstruction
Authors: Somia Bouzid, Messaoud Ramdani
Abstract:
The success of any diagnosis strategy critically depends on the sensors measuring process variables. This paper presents a method for the detection and diagnosis of sensor faults based on a Bottleneck Neural Network (BNN). The BNN approach is used as a statistical process control tool for drinking water distribution (DWD) systems to detect and isolate sensor faults. The variable reconstruction approach is very useful for sensor fault isolation; the method is validated in simulation on a nonlinear system, an actual drinking water distribution system, and several results are presented.
Keywords: fault detection, localization, PCA, NLPCA, auto-associative neural network
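The bottleneck idea (compress, reconstruct, and flag sensors whose reconstruction error is abnormal) can be sketched with a linear bottleneck, i.e. PCA, one of the paper's own keywords, standing in for the neural network; all data below are simulated.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Simulated DWD measurements: 500 samples x 6 correlated sensors
latent = rng.normal(size=(500, 2))
X_train = latent @ rng.normal(size=(2, 6)) + 0.05 * rng.normal(size=(500, 6))

bottleneck = PCA(n_components=2).fit(X_train)   # linear stand-in for the BNN

def sensor_residuals(x):
    """Per-sensor reconstruction error; a large entry isolates the faulty sensor."""
    x_hat = bottleneck.inverse_transform(bottleneck.transform(x.reshape(1, -1)))
    return np.abs(x - x_hat.ravel())

x = X_train[0].copy()
x[3] += 2.0                        # inject a bias fault on sensor 3
print(sensor_residuals(x).round(3))
```

Variable reconstruction takes this one step further: each suspect sensor is re-estimated from the others, and the fault is isolated at the sensor whose reconstruction removes the anomaly.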
Procedia PDF Downloads 389
6737 Research on Modern Semiconductor Converters and the Usage of SiC Devices in the Technology Centre of Ostrava
Authors: P. Vaculík, P. Kaňovský
Abstract:
The following article presents the Technology Centre of Ostrava (TCO) in the Czech Republic. It describes the structure and main research areas realized by the project ENET - Energy Units for Utilization of Non-Traditional Energy Sources. More details are presented from the research program dealing with the transformation, accumulation, and distribution of electric energy. The Technology Centre has its own energy mix consisting of alternative fuel sources that make use of process gases from the storage part, as well as energy from the distribution network. The article focuses on the properties and application possibilities of SiC semiconductor devices in power semiconductor converters for photovoltaic systems.
Keywords: SiC, Si, technology centre of Ostrava, photovoltaic systems, DC/DC Converter, simulation
Procedia PDF Downloads 610
6736 Temperature-Dependent Barrier Characteristics of Inhomogeneous Pd/n-GaN Schottky Barrier Diodes Surface
Authors: K. Al-Heuseen, M. R. Hashim
Abstract:
The current-voltage (I-V) characteristics of Pd/n-GaN Schottky barriers were studied at temperatures above room temperature (300-470 K). The values of the ideality factor (n), zero-bias barrier height (φB0), flat-band barrier height (φBF) and series resistance (Rs) obtained from I-V-T measurements were found to be strongly temperature dependent: φB0 increases while n, φBF and Rs decrease with increasing temperature. The apparent Richardson constant was found to be 2.1×10⁻⁹ A·cm⁻²·K⁻² with a mean barrier height of 0.19 eV. After correction for barrier height inhomogeneities, by assuming a Gaussian distribution (GD) of the barrier heights, the Richardson constant and the mean barrier height were obtained as 23 A·cm⁻²·K⁻² and 1.78 eV, respectively. The corrected Richardson constant is much closer to the theoretical value of 26 A·cm⁻²·K⁻².
Keywords: electrical properties, Gaussian distribution, Pd-GaN Schottky diodes, thermionic emission
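For context, the barrier parameters above come from the standard Richardson analysis of thermionic emission: since Is = A·A*·T²·exp(−φB/kT), a plot of ln(Is/T²) against 1/T is linear with slope −φB/k and intercept ln(A·A*). A sketch with invented saturation-current data:

```python
import numpy as np

k_B = 8.617e-5          # Boltzmann constant, eV/K
area = 7.85e-3          # hypothetical diode area, cm^2

T = np.array([300.0, 340.0, 380.0, 420.0, 460.0])        # K
Is = np.array([2e-13, 9e-12, 1.8e-10, 2.1e-9, 1.6e-8])   # hypothetical A

# Richardson plot: ln(Is/T^2) = ln(area * A_star) - phi_B / (k_B * T)
slope, intercept = np.polyfit(1.0 / T, np.log(Is / T**2), 1)
phi_B = -slope * k_B                # apparent barrier height, eV
A_star = np.exp(intercept) / area   # apparent Richardson constant, A cm^-2 K^-2
print(f"phi_B = {phi_B:.2f} eV, A* = {A_star:.2e} A cm^-2 K^-2")
```

An anomalously small fitted A*, as in the abstract, is the usual signature of barrier inhomogeneity that the Gaussian-distribution correction repairs.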
Procedia PDF Downloads 277
6735 Temporal Variation of Shorebirds Population in Two Different Mudflats Areas
Authors: N. Norazlimi, R. Ramli
Abstract:
A study was conducted to determine the diversity and abundance of shorebird species inhabiting the mudflat areas of Jeram Beach and Remis Beach, Selangor, Peninsular Malaysia. A direct observation technique (using binoculars and a video camera) was applied to record the presence of bird species in the sampling sites from August 2013 until July 2014. A total of 32 species of shorebird were recorded during both migratory and non-migratory seasons. Of these, eleven species (47.8%) are migrants, six species (26.1%) have both migrant and resident populations, four species (17.4%) are vagrants and two species (8.7%) are residents. The composition of the birds differed significantly in all months (χ²=84.35, p<0.001). There is a significant difference in avian abundance between migratory and non-migratory seasons (Mann-Whitney, t=2.39, p=0.036). Avian abundance differed significantly between Jeram and Remis Beaches during migratory periods (t=4.39, p=0.001) but not during non-migratory periods (t=0.78, p=0.456). Shorebird diversity was also affected by the tidal cycle; there is a significant difference between high tide and low tide (Mann-Whitney, t=78.0, p<0.005). The frequency of disturbance also affected the shorebird distribution (Mann-Whitney, t=57.0, p=0.0134). Therefore, this study concludes that tides and disturbances are two factors affecting the temporal distribution of shorebirds in mudflat areas.
Keywords: biodiversity, distribution, migratory birds, direct observation
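A minimal sketch of the kind of non-parametric seasonal comparison reported above; the monthly counts are invented for illustration.

```python
from scipy.stats import mannwhitneyu

# Hypothetical monthly shorebird counts at one site
migratory = [210, 185, 240, 198, 260, 175]      # e.g. Aug-Jan
non_migratory = [40, 55, 38, 62, 47, 51]        # e.g. Feb-Jul

stat, p = mannwhitneyu(migratory, non_migratory, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")   # small p -> abundance differs between seasons
```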
Procedia PDF Downloads 391
6734 Role of Empirical Evidence in Law-Making: Case Study from India
Authors: Kaushiki Sanyal, Rajesh Chakrabarti
Abstract:
In India, on average, about 60 Bills are passed every year in both Houses of Parliament – Lok Sabha and Rajya Sabha (calculated from information on the websites of both Houses). These are debated in both the Lok Sabha (House of the People) and the Rajya Sabha (Council of States) before they are passed. However, lawmakers rarely use empirical evidence to make a case for a law. Most of the time, they support a law on the basis of anecdote, intuition, and common sense. While these do play a role in law-making, without the necessary empirical evidence, laws often fail to achieve their desired results. The quality of legislative debates is an indicator of the efficacy of the legislative process through which a Bill is enacted. However, the study of legislative debates has not received much attention either in India or worldwide due to the difficulty of objectively measuring the quality of a debate. Broadly, three approaches have emerged in the study of legislative debates. The rational-choice or formal approach shows that speeches vary based on different institutional arrangements, intra-party politics, and the political culture of a country. The discourse approach focuses on the underlying rules and conventions and how they impact the content of the debates. The deliberative approach posits that legislative speech can be reasoned, respectful, and informed. This paper aims to (a) develop a framework to judge the quality of debates by using the deliberative approach; (b) examine the legislative debates of three Bills passed in different periods as a demonstration of the framework; and (c) examine the broader structural issues that disincentivise MPs from scrutinizing Bills. The framework includes qualitative and quantitative indicators to judge a debate. The idea is that the framework would provide useful insights into the legislators' knowledge of the subject, the depth of their scrutiny of Bills, and their inclination toward evidence-based research. The three Bills that the paper examines are as follows: 1. The Narcotic Drugs and Psychotropic Substances Act, 1985: This act was passed to curb drug trafficking and abuse. However, it mostly failed to fulfill its purpose. Consequently, it was amended thrice, but without much impact on the ground. 2. The Criminal Law (Amendment) Act, 2013: This act amended the Indian Penal Code to add a section on human trafficking. The purpose was to curb trafficking and penalise traffickers, pimps, and middlemen. However, the crime rate remains high while the conviction rate is low. 3. The Surrogacy (Regulation) Act, 2021: This act bans commercial surrogacy, allowing only relatives to act as surrogates as long as there is no monetary payment. Experts fear that instead of preventing commercial surrogacy, it would drive the activity underground. The consequences would be borne by the surrogate, who would not be protected by law. The purpose of the paper is to objectively analyse the quality of parliamentary debates, gain insights into how MPs understand the evidence, and deliberate on steps to incentivise them to use empirical evidence.
Keywords: legislature, debates, empirical, India
Procedia PDF Downloads 86
6733 Content-Based Mammograms Retrieval Based on Breast Density Criteria Using Bidimensional Empirical Mode Decomposition
Authors: Sourour Khouaja, Hejer Jlassi, Nadia Feddaoui, Kamel Hamrouni
Abstract:
Most medical images, and especially mammograms, are now stored in large databases. Retrieving a desired image is considered of great importance in order to find diagnoses of previous similar cases. Our method is implemented to assist radiologists in retrieving mammographic images containing breasts with a density aspect similar to the one seen on the query mammogram. This is a challenge given the importance of density criteria in cancer risk assessment and their effect on segmentation issues. We used Bidimensional Empirical Mode Decomposition (BEMD) to characterize the content of the images and the Euclidean distance to measure similarity between images. Through experiments on the MIAS mammography image database, we confirm that the results are promising. The performance was evaluated using precision and recall curves comparing query and retrieved images. Computing recall-precision proved the effectiveness of applying CBIR to large mammographic image databases: we found a precision of 91.2% with a recall of 86.8%.
Keywords: BEMD, breast density, content-based, image retrieval, mammography
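Independently of how the BEMD features are computed, the retrieval step reduces to a nearest-neighbour search under the Euclidean distance. A sketch follows, with random vectors standing in for the BEMD descriptors (the 322-image database size matches MIAS, but the features are simulated):

```python
import numpy as np

def retrieve(query_feat, db_feats, top_k=5):
    """Rank database mammograms by Euclidean distance to the query features."""
    dists = np.linalg.norm(db_feats - query_feat, axis=1)
    order = np.argsort(dists)[:top_k]
    return order, dists[order]

def precision_recall(retrieved_ids, relevant_ids):
    hits = len(set(retrieved_ids) & set(relevant_ids))
    return hits / len(retrieved_ids), hits / len(relevant_ids)

rng = np.random.default_rng(1)
db = rng.normal(size=(322, 16))    # one 16-dim descriptor per image (simulated)
query = db[10] + 0.1 * rng.normal(size=16)
ids, dists = retrieve(query, db)
print(ids, precision_recall(ids.tolist(), relevant_ids=[10, 11, 12]))
```

Sweeping `top_k` and recomputing precision and recall at each cut-off yields the recall-precision curve used for evaluation in the abstract.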
Procedia PDF Downloads 232
6732 Modelling and Optimisation of Floating Drum Biogas Reactor
Authors: L. Rakesh, T. Y. Heblekar
Abstract:
This study entails the development and optimization of a mathematical model for a floating drum biogas reactor from first principles, using thermal and empirical considerations. The model was derived on the basis of mass conservation, lumped-mass heat transfer formulations and empirical biogas formation laws. The treatment leads to a system of coupled nonlinear ordinary differential equations whose solution maps four time-independent controllable parameters to five output variables which adequately serve to describe the reactor performance. These equations were solved numerically using the fourth-order Runge-Kutta method for a range of input parameter values. Using the data so obtained, an Artificial Neural Network with a single hidden layer was trained using the Levenberg-Marquardt Damped Least Squares (DLS) algorithm. This network was then fine-tuned for optimal mapping by varying the hidden layer size. This fast forward model was then employed as a health score generator in the Bacterial Foraging Optimization code. The optimal operating state of the simplified biogas reactor was thus obtained.
Keywords: biogas, floating drum reactor, neural network model, optimization
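A generic fourth-order Runge-Kutta step of the kind used to integrate the coupled reactor ODEs; the two-state demo system is a placeholder, since the abstract does not give the model equations.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical RK4 step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Placeholder 2-state system (e.g. slurry temperature, cumulative biogas)
def f(t, y):
    T, V = y
    return np.array([-0.1 * (T - 25.0),
                     0.4 * np.exp(-0.05 * (T - 35.0) ** 2)])

y, t, h = np.array([40.0, 0.0]), 0.0, 0.1
for _ in range(100):
    y = rk4_step(f, t, y, h)
    t += h
print(t, y)
```

Sweeping the controllable parameters through such an integrator produces the input-output table on which the surrogate neural network is then trained.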
Procedia PDF Downloads 143
6731 Repair Workshop Queue System Modification Using Priority Scheme
Authors: C. Okonkwo Ugochukwu, E. Sinebe Jude, N. Odoh Blessing, E. Okafor Christian
Abstract:
In this paper, a modification of a repair workshop queuing system using a multi-priority scheme was carried out. The chi-square goodness-of-fit test was used to determine the distributions of the inter-arrival times and service times of crankshafts that come for maintenance in the workshop. The chi-square values obtained for all the prioritized classes show that the distributions conform to the Poisson distribution. The mean waiting times in queue under non-preemptive priority for the 1st, 2nd and 3rd classes are 0.066, 0.09, and 0.224 day respectively, while under preemptive priority they are 0.007, 0.036 and 0.258 day. However, when no priority is used, which obviously makes no class distinction, the mean waiting time amounts to 0.17 day. From the results, one can observe that the preemptive priority system provides a very dramatic improvement over non-preemptive priority for arrivals of higher priority; however, the improvement has a detrimental effect on the lowest-priority class. The trend of the mean waiting times in the system is similar, since they only add the actual service time. Even though the mean waiting times in queue and in system without priority are lower than those of the lowest-priority class, urgent and semi-urgent jobs would suffer terribly, most likely resulting in reneging or balking of many urgent jobs. Hence, the adoption of a priority scheme in this type of scenario will result in huge profit to the company and more customer satisfaction.
Keywords: queue, priority class, preemptive, non-preemptive, mean waiting time
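For reference, the textbook mean-wait formulas for a non-preemptive priority M/M/1 queue, W_k = W0/((1 − σ_{k−1})(1 − σ_k)) with W0 the mean residual service time and σ_k the cumulative utilization of classes 1..k, can be sketched as follows; the arrival and service rates are hypothetical, not the workshop's data.

```python
def npp_waits(lams, mus):
    """Mean queue wait per class, non-preemptive priority M/M/1 (class 1 highest).

    W0 = sum(lam_i * E[S_i^2] / 2); for exponential service E[S^2] = 2/mu^2.
    """
    rho = [l / m for l, m in zip(lams, mus)]
    w0 = sum(l / m**2 for l, m in zip(lams, mus))
    waits, cum = [], 0.0
    for r in rho:
        waits.append(w0 / ((1 - cum) * (1 - cum - r)))
        cum += r
    return waits

# Hypothetical: urgent, semi-urgent, routine crankshaft arrivals (per day)
print(npp_waits(lams=[0.5, 1.0, 1.5], mus=[4.0, 4.0, 4.0]))
```

The output reproduces the qualitative pattern in the abstract: the highest class waits far less than under no priority, at the expense of the lowest class.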
Procedia PDF Downloads 396
6730 Slip Limit Prediction of High-Strength Bolt Joints Based on Local Approach
Authors: Chang He, Hiroshi Tamura, Hiroshi Katsuchi, Jiaqi Wang
Abstract:
In this study, the aim is to infer the slip limit (static friction limit) of contact interfaces in bolted friction joints by analyzing other bolted friction joints with the same contact surface but a different shape. By using the Weibull distribution to treat the microelements on the contact surface statistically, the slip limit of a certain type of bolt joint was predicted from other types of bolt joints with the same contact surface. As a result, this research succeeded in predicting the slip limit of bolt joints with different numbers of contact surfaces and different numbers of bolt rows.
Keywords: bolt joints, slip coefficient, finite element method, Weibull distribution
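A sketch of the weakest-link reasoning implied by the Weibull treatment: if micro-element slip strengths follow a Weibull law, the survival probability of a larger contact surface is the per-element survival raised to the number of elements, which lets the slip-limit distribution be transferred between joint geometries. All parameter values are hypothetical.

```python
import math

def slip_probability(tau, tau0, m, area_ratio):
    """Weakest-link slip probability for a contact surface `area_ratio`
    times larger than the reference surface; Weibull modulus m, scale tau0."""
    return 1.0 - math.exp(-area_ratio * (tau / tau0) ** m)

def median_slip_limit(tau0, m, area_ratio):
    """Stress at which slip probability reaches 50% for the scaled surface."""
    return tau0 * (math.log(2.0) / area_ratio) ** (1.0 / m)

# Reference joint vs. a joint with 4 contact surfaces (hypothetical parameters)
print(median_slip_limit(tau0=0.45, m=12.0, area_ratio=1.0))
print(median_slip_limit(tau0=0.45, m=12.0, area_ratio=4.0))
```

The second value is lower: a joint with more contact surfaces contains more weak elements, so its predicted slip limit drops, which is the size effect the local approach exploits.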
Procedia PDF Downloads 170
6729 Real-Time Monitoring of Drinking Water Quality Using Advanced Devices
Authors: Amani Abdallah, Isam Shahrour
Abstract:
The quality of drinking water is a major concern for public health. The control of this quality is generally performed in the laboratory, which requires a long time. This type of control is not adapted to accidental pollution from sudden events, which can have serious consequences for population health. Therefore, it is of major interest to develop real-time innovative solutions for the detection of accidental contamination in drinking water systems. This paper presents research conducted within the SunRise Demonstrator for 'Smart and Sustainable Cities' with a particular focus on the supervision of water quality. This work aims at (i) implementing a smart water system in a large water network (campus of the University Lille1), including innovative equipment for the real-time detection of abnormal events, such as those related to the contamination of drinking water, and (ii) developing a numerical model of contamination diffusion in the water distribution system. The first step included verification of the water quality sensors and their effectiveness on a 50 m long network prototype. This part included the evaluation of the efficiency of these sensors in detecting both bacterial and chemical contamination events in drinking water distribution systems. An on-line optical sensor integrated with a laboratory-scale distribution system (LDS) was shown to respond rapidly to changes in refractive index induced by injected loads of chemical (cadmium, mercury) and biological (Escherichia coli) contaminants. All injected substances were detected by the sensor; the magnitude of the response depends on the type of contaminant introduced and is proportional to the concentration of the injected substance.
Keywords: distribution system, drinking water, refraction index, sensor, real-time
Procedia PDF Downloads 354
6728 Automation of Savitsky's Method for Power Calculation of High Speed Vessel and Generating Empirical Formula
Authors: M. Towhidur Rahman, Nasim Zaman Piyas, M. Sadiqul Baree, Shahnewaz Ahmed
Abstract:
The design of high-speed craft has recently become one of the most active areas of naval architecture. Increased speed makes these vehicles more efficient and useful for military, economic or leisure purposes. The planing hull is designed specifically to achieve relatively high speed on the surface of the water, and speed on the water surface is closely related to the size of the vessel and the installed power. The Savitsky method, first presented in 1964 and later extended for application to non-monohedric and stepped hulls, is well known as a reliable alternative to CFD analysis of hull resistance. A computer program based on Savitsky's method has been developed using MATLAB, and the power of high-speed vessels has been computed in this research. At first, the program reads some principal parameters such as displacement, LCG, speed, deadrise angle, inclination of the thrust line with respect to the keel line, etc., and calculates the resistance of the hull using the empirical planing equations of Savitsky. However, some functions used in the empirical equations are available only in graphical form, which is not suitable for automatic computation, so we use a digital plotting system to extract data from the nomograms. As a result, the values of the wetted length-beam ratio and trim angle can be determined directly from the input of the initial variables, which automates the power calculation without manual plotting of secondary variables such as p/b and other coefficients; the regression equations of those functions are derived using data from the different charts. Finally, the trim angle, mean wetted length-beam ratio, frictional coefficient, resistance, and power are computed and compared with the results of Savitsky, and good agreement has been observed.
Keywords: nomogram, planing hull, principal parameters, regression
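The core of the automation is replacing each nomogram curve with a fitted regression, as sketched below with numpy.polyfit on hypothetical digitized chart points.

```python
import numpy as np

# Hypothetical points digitized from one nomogram curve:
# x = speed coefficient, lam = mean wetted length-beam ratio
x = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
lam = np.array([4.8, 3.9, 3.3, 2.9, 2.6, 2.4, 2.25])

coeffs = np.polyfit(x, lam, deg=3)   # cubic regression of the digitized curve
lam_fit = np.poly1d(coeffs)

print(lam_fit(2.2))                  # evaluate instead of reading the chart
residuals = lam - lam_fit(x)
print(f"max fit error: {np.abs(residuals).max():.3f}")
```

Once every chart is encoded this way, the whole resistance-and-power chain runs without any manual chart lookup.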
Procedia PDF Downloads 404
6727 Optimal Capacitors Placement and Sizing Improvement Based on Voltage Reduction for Energy Efficiency
Authors: Zilaila Zakaria, Muhd Azri Abdul Razak, Muhammad Murtadha Othman, Mohd Ainor Yahya, Ismail Musirin, Mat Nasir Kari, Mohd Fazli Osman, Mohd Zaini Hassan, Baihaki Azraee
Abstract:
Energy efficiency can be realized by minimizing the power loss while a sufficient amount of energy is used in an electrical distribution system. In this report, a detailed analysis of the energy efficiency of an electric distribution system was carried out with an implementation of optimal capacitor placement and sizing (OCPS). Particle swarm optimization (PSO) is used to determine the optimal locations and sizes of the capacitors, whereas the minimization of energy consumption and power losses improves the energy efficiency. In addition, a certain number of busbars or locations are identified in advance, before the PSO is performed to solve the OCPS; in this case study, three techniques are performed for the pre-selection of busbars or locations, one of which is the power-loss index (PLI). The PSO is designed to provide a new population with improved sizing and locations of capacitors. The total costs of power losses, energy consumption and capacitor installation are the components considered in the objective and fitness functions of the proposed optimization technique. The voltage magnitude limit, total harmonic distortion (THD) limit, power factor limit and capacitor size limit are the parameters considered as constraints. In this research, the proposed methodology, implemented in the MATLAB® software, transfers the information, executes the three-phase unbalanced load flow solution, and then retrieves and collects the results from the three-phase unbalanced electrical distribution systems modeled in the SIMULINK® software. The effectiveness of the proposed methods in improving energy efficiency has been verified through several case studies, with results obtained from the IEEE 13-bus unbalanced electrical distribution test system and a practical electrical distribution system model of the Sultan Salahuddin Abdul Aziz Shah (SSAAS) government building in Shah Alam, Selangor.
Keywords: particle swarm optimization, pre-determine of capacitor locations, optimal capacitors placement and sizing, unbalanced electrical distribution system
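A sketch of the power-loss-index pre-selection, assuming the common definition PLI_i = (LR_i − LR_min)/(LR_max − LR_min): the loss reduction from compensating each candidate bus is normalized, and only high-index buses are passed to the PSO. The loss values are hypothetical; the study itself obtains them from three-phase unbalanced load flow.

```python
def power_loss_index(loss_reductions):
    """PLI_i = (LR_i - LR_min) / (LR_max - LR_min), in [0, 1] per candidate bus."""
    lo, hi = min(loss_reductions.values()), max(loss_reductions.values())
    return {bus: (lr - lo) / (hi - lo) for bus, lr in loss_reductions.items()}

# Hypothetical loss reduction (kW) when a test capacitor is placed at each bus
# (bus names follow the IEEE 13-bus feeder convention)
reductions = {"632": 4.1, "671": 9.8, "675": 8.7, "680": 2.3, "684": 6.0}
pli = power_loss_index(reductions)
candidates = [bus for bus, v in pli.items() if v >= 0.5]   # shortlist for PSO
print(pli, candidates)
```

Shrinking the search space this way lets the PSO spend its iterations on sizing rather than on screening unpromising locations.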
Procedia PDF Downloads 434
6726 Distribution and Historical Trends of PAHs Deposition in Recent Sediment Cores of the Imo River, SE Nigeria
Authors: Miranda I. Dosunmu, Orok E. Oyo-Ita, Inyang O. Oyo-Ita
Abstract:
Polycyclic aromatic hydrocarbons (PAHs) are a class of priority-listed organic pollutants due to their carcinogenicity, mutagenicity, acute toxicity and persistence in the environment. The distribution and historical changes of PAH contamination in recent sediment cores from the Imo River were investigated using gas chromatography coupled with mass spectrometry. The concentrations of total PAHs (TPAHs), ranging from 402.37 ng/g dry weight (dw) at the surface layer of the Estuary zone (ESC6; 0-5 cm) to 92,388.59 ng/g dw at the near-surface layer of the Afam zone (ASC5; 5-10 cm), indicate that PAH contamination was localized not only between sample sites but also within the same cores. Sediment-depth profiles for the four cores (Afam, Mangrove, Estuary and illegal petroleum refinery) revealed irregular distribution patterns in the TPAH concentrations, except that these levels became maximal at the near-surface layers (5-10 cm), corresponding to a geological time frame of about 1996-2004. This time scale coincided with the period of intensive bunkering and oil pipeline vandalization by the Niger Delta militant groups. A general slight decline was also found in the TPAH levels from the near-surface layers (5-10 cm) to the most recent top layers (0-5 cm) of the cores, attributable to the recent effort by the Nigerian government in clamping down on the illegal activities of the economic saboteurs. Therefore, the recent amnesty granted to the militant groups should be extended. Although the mechanism of perylene formation remains enigmatic, examination of its distributions down the cores indicates natural biogenic, pyrogenic and petrogenic origins for the compound in different zones. Thus, the characteristic features of the Imo River environment provide a means of tracing diverse origins for perylene.
Keywords: perylene, historical trend, distribution, origin, Imo River
Procedia PDF Downloads 251
6725 Analysis of Urban Population Using Twitter Distribution Data: Case Study of Makassar City, Indonesia
Authors: Yuyun Wabula, B. J. Dewancker
Abstract:
In the past decade, social networking apps have grown very rapidly. Geolocation data is one of the important features of social media that can attach a user's location coordinates in the real world. This paper proposes the use of geolocation data from the Twitter social media application to gain knowledge about urban dynamics, especially human mobility behavior, and aims to explore the relation between Twitter geolocation and the presence of people in the urban area. Firstly, the study analyzes the spread of people in a particular area within the city using Twitter social media data. Secondly, we match and categorize the existing places based on visits by the same individuals. Then, we combine the Twitter data from the tracking result with questionnaire data to capture the Twitter users' profile. To do that, we used frequency distribution analysis to learn the visitors' percentages. To validate the hypothesis, we compare the result with local population statistics and the land use mapping released by the city planning department of the Makassar local government. The results show that there is a correlation between Twitter geolocation and the questionnaire data. Thus, integrating Twitter data and survey data can reveal the profile of social media users.
Keywords: geolocation, Twitter, distribution analysis, human mobility
Procedia PDF Downloads 314
6724 The Determinants of Enterprise Risk Management: Literature Review, and Future Research
Authors: Sylvester S. Horvey, Jones Mensah
Abstract:
The growing complexities and dynamics of the business environment have led to a new approach to risk management, known as enterprise risk management (ERM). ERM is a system and an approach to managing the risks of an organization in an integrated manner to achieve corporate goals and strategic objectives. Regardless of the diversities in the business environment, ERM has become an essential factor in managing individual and business risks, because ERM is believed to enhance shareholder value and firm growth. Despite the growing body of literature on ERM, the question of what factors drive ERM remains under-studied. This study provides a comprehensive literature review of the main factors that contribute to ERM implementation. Google Scholar was the leading search engine used to identify empirical literature, and the review spanned 2000 to 2020. Articles published in Scimago-ranked and Scopus-indexed journals were examined. Thirteen firm characteristics and sixteen articles were considered in the empirical review. Most empirical studies agreed that firm size, institutional ownership, industry type, auditor type, industrial diversification, earnings volatility, stock price volatility, and internal auditor had a positive relationship with ERM adoption, and firm size, institutional ownership, auditor type, and type of industry were mostly found to be statistically significant. Other factors such as financial leverage, profitability, asset opacity, international diversification, and firm complexity yielded inconclusive results. The growing literature on ERM is not without limitations; hence, this study suggests that further research should examine ERM determinants within a new geographical context while considering a new and more robust way of measuring ERM, rather than relying on a simple proxy (dummy) for ERM measurement. Other firm characteristics, such as organizational culture and context, corporate scandals and losses, and governance, could be considered determinants of ERM adoption.
Keywords: enterprise risk management, determinants, ERM adoption, literature review
Procedia PDF Downloads 173
6723 Assessing Significance of Correlation with Binomial Distribution
Authors: Vijay Kumar Singh, Pooja Kushwaha, Prabhat Ranjan, Krishna Kumar Ojha, Jitendra Kumar
Abstract:
Present-day high-throughput genomic technologies, NGS/microarrays, are producing large volumes of data that require improved analysis methods to make sense of the data. The correlation between genes and samples has been regularly used to gain insight into many biological phenomena including, but not limited to, co-expression/co-regulation, gene regulatory networks, clustering and pattern identification. However, the presence of outliers and violations of the assumptions underlying the Pearson correlation are frequent and may distort the actual correlation between the genes and lead to spurious conclusions. Here, we report a method to measure the strength of association between genes. The method assumes that the expression values of a gene are Bernoulli random variables whose outcome depends on the sample being probed. The method considers two genes uncorrelated if the number of samples with the same outcome for both genes (Ns) is equal to the expected number (Es); the extent of correlation depends on how far Ns deviates from Es. The method does not assume normality for the parent population, is fairly unaffected by the presence of outliers, can be applied to qualitative data, and uses the binomial distribution to assess the significance of association. At this stage, we do not claim superiority of the method over other existing correlation methods, but our method could be another way of calculating correlation in addition to the existing methods. The method uses the binomial distribution, which has not been used for this purpose until now, to assess the significance of association between two variables. We are evaluating the performance of our method on NGS/microarray data, which is noisy and pierced by outliers, to see if our method can differentiate between spurious and actual correlations. While working with the method, it has not escaped our notice that the method could also be generalized to measure the association of more than two variables, which has proven difficult with the existing methods.
Keywords: binomial distribution, correlation, microarray, outliers, transcriptome
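A minimal sketch of the proposed test: two genes are dichotomized, Ns is counted, Es is computed, and an exact binomial test supplies the significance. The abstract does not spell out how Es is derived, so the sketch assumes it comes from the marginal frequencies under independence; the data are toy values.

```python
import numpy as np
from scipy.stats import binomtest

def binomial_association(x, y):
    """Test association of two genes dichotomized across n samples.

    Assumption: under independence, both genes give the same outcome in a
    sample with probability p = p1*p2 + (1-p1)*(1-p2), so Es = n*p; Ns is
    compared to Es with an exact binomial test.
    """
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    p1, p2 = x.mean(), y.mean()
    p_same = p1 * p2 + (1 - p1) * (1 - p2)
    ns = int(np.sum(x == y))
    result = binomtest(ns, n, p_same)
    return ns, n * p_same, result.pvalue

# Toy example: two genes dichotomized (1 = above median) in 20 samples
g1 = [1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1]
g2 = [1, 1, 0, 0, 1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1]
print(binomial_association(g1, g2))   # Ns far above Es -> strong association
```

Because it only uses the agreement count, the test is insensitive to the magnitude of any single outlying expression value, which is the robustness property the abstract emphasizes.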
Procedia PDF Downloads 415
6722 Undercooling of Refractory High-Entropy Alloy
Authors: Liang Hu
Abstract:
The innovation of refractory high-entropy alloys (RHEAs) formed from the refractory metals W, Ta, Mo, Nb, Hf, V, and Zr was first implemented in 2010 to obtain better strength at high temperature than conventional HEAs based on Al, Co, Cr, Cu, Fe and Ni. Due to their refractory character and high chemical activity at elevated temperature, the electrostatic levitation (ESL) technique has been utilized to fulfill the rapid solidification of RHEAs. Several RHEAs consisting of W, Ta, Mo, Nb, and Zr have been selected to perform the undercooling and rapid solidification by ESL, and they are substantially undercooled by up to 0.2 TL (where TL is the liquidus temperature). The evolution of the as-solidified microstructure and component redistribution with undercooling has been investigated by SEM, EBSD, and EPMA analysis. Based on the EPMA results for the constituent elements at different undercooling levels, the chemical distribution relevant to undercooling was also analyzed.
Keywords: chemical distribution, high-entropy alloy, rapid solidification, undercooling
Procedia PDF Downloads 128
6721 Reliability Indices Evaluation of SEIG Rotor Core Magnetization with Minimum Capacitive Excitation for WECs
Authors: Lokesh Varshney, R. K. Saket
Abstract:
This paper presents a reliability indices evaluation of the rotor core magnetization of an induction motor operated as a self-excited induction generator (SEIG), using a probability distribution approach and Monte Carlo simulation. During the experimental study, parallel capacitors with the calculated minimum capacitance were connected across the terminals of the induction motor operating as a SEIG with unregulated shaft speed. A three-phase, 4-pole, 50 Hz, 5.5 hp, 12.3 A, 230 V induction motor coupled with a DC shunt motor was tested in the electrical machines laboratory with variable reactive loads. Based on this experimental study, it is possible to choose a reliable induction machine operating as a SEIG for unregulated renewable energy applications in remote areas or where the grid is not available. The failure density function, cumulative failure distribution function, survivor function, hazard model, probability of success and probability of failure for the reliability evaluation of the three-phase induction motor operating as a SEIG are presented graphically in this paper.
Keywords: residual magnetism, magnetization curve, induction motor, self excited induction generator, probability distribution, Monte Carlo simulation
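A sketch of the reliability functions listed above under an assumed exponential failure model for the loss of residual magnetism, with a small Monte Carlo check of the survivor function; the failure rate is hypothetical.

```python
import numpy as np

lam = 0.05          # hypothetical failure rate of rotor magnetization, 1/year
t = np.linspace(0.0, 40.0, 5)

f = lam * np.exp(-lam * t)      # failure density function
F = 1.0 - np.exp(-lam * t)      # cumulative failure distribution function
R = np.exp(-lam * t)            # survivor function (probability of success)
h = f / R                       # hazard rate (constant = lam for exponential)
print("survivor R(t):", R.round(3), " hazard:", h.round(3))

# Monte Carlo check of P(success) over a 10-year mission
rng = np.random.default_rng(42)
samples = rng.exponential(scale=1.0 / lam, size=100_000)
print("analytic R(10) =", np.exp(-lam * 10.0))
print("Monte Carlo    =", (samples > 10.0).mean())
```

The same machinery generalizes to other fitted distributions; the exponential choice here is only for illustration.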
Procedia PDF Downloads 558
6720 Utilizing Extended Reality in Disaster Risk Reduction Education: A Scoping Review
Authors: Stefano Scippo, Damiana Luzzi, Stefano Cuomo, Maria Ranieri
Abstract:
Background: In response to the rise in natural disasters linked to climate change, numerous studies on Disaster Risk Reduction Education (DRRE) have emerged since the '90s, mainly using a didactic, transmission-based approach. Effective DRRE should align with an interactive, experiential, and participatory educational model, which can be costly and risky. A potential solution is using simulations facilitated by eXtended Reality (XR). Research Question: This study aims to conduct a scoping review to explore educational methodologies that use XR to enhance knowledge among teachers, students, and citizens about environmental risks, natural disasters (including climate-related ones), and their management. Method: A search string of 66 keywords was formulated, spanning three domains: 1) education and target audience, 2) environment and natural hazards, and 3) technologies. On June 21st, 2023, the search string was used across five databases: EBSCOhost, IEEE Xplore, PubMed, Scopus, and Web of Science. After deduplication and removal of papers without abstracts, 2,152 abstracts (published between 2013 and 2023) were analyzed and 2,062 papers were excluded, followed by the exclusion of 56 papers after full-text scrutiny. Excluded studies focused on unrelated technologies or non-environmental risks, or lacked educational outcomes or accessible texts. Main Results: The 34 reviewed papers were analyzed for context, risk type, research methodology, learning objectives, XR technology use, outcomes, and educational affordances of XR. Notably, since 2016 there has been a rise in scientific publications, focusing mainly on seismic events (12 studies) and floods (9), with a significant contribution from Asia (18 publications), particularly Japan (7 studies). Methodologically, the studies were categorized into empirical (26) and non-empirical (8). Empirical studies involved user or expert validation of XR tools, while non-empirical studies included systematic reviews and theoretical proposals without experimental validation. Empirical studies were further classified into quantitative, qualitative, or mixed-method approaches. Six qualitative studies involved small groups of users or experts, while 20 quantitative or mixed-method studies used seven different research designs, with most (17) employing a quasi-experimental, one-group post-test design, focusing on XR technology usability over educational effectiveness. Non-experimental studies had methodological limitations, making their results hypothetical and in need of further empirical validation. Educationally, the learning objectives centered on knowledge and skills for surviving natural disaster emergencies. All studies recommended XR technologies for simulations or serious games, but did not develop comprehensive educational frameworks around these tools. XR-based tools showed potential superiority over traditional methods in teaching risk and emergency management skills. However, conclusions were more valid in studies with experimental designs; otherwise, they remained hypothetical without empirical evidence. The educational affordances of XR, mainly user engagement, were confirmed by the studies. Authors' Conclusions: The analyzed literature lacks specific educational frameworks for XR in DRRE, focusing mainly on survival knowledge and skills. There is a need to expand educational approaches to include uncertainty education, developing competencies that encompass knowledge, skills, and attitudes such as risk perception.
Keywords: disaster risk reduction education, educational technologies, scoping review, XR technologies
Procedia PDF Downloads 24