Search results for: risk optimization
7759 Spatially Random Sampling for Retail Food Risk Factors Study
Authors: Guilan Huang
Abstract:
In 2013 and 2014, the U.S. Food and Drug Administration (FDA) collected data from selected fast food and full service restaurants to track changes in the occurrence of foodborne illness risk factors. This paper discusses how we customized a spatially random sampling method to account for the FDA's financial position and resource availability, and how we enriched the restaurant data with location information. Location information makes it possible to draw random samples quantitatively within non-governmental units (e.g., a 240-kilometer radius around each data collector). Spatial analysis can also optimize data collectors' work plans and resource allocation. A spatial analytic and processing platform helped us handle the challenges of spatially random sampling. Our method supports the FDA's ability to pinpoint features of foodservice establishments and reduced both the time and expense of data collection.
Keywords: geospatial technology, restaurant, retail food risk factor study, spatially random sampling
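The core sampling step, drawing random candidate locations within a fixed radius of each data collector, can be sketched as follows. This is a minimal illustration, not the FDA's actual procedure: the 240 km radius comes from the abstract, while the rejection-sampling approach, the haversine distance, and the example coordinates are assumptions made for the sketch.

```python
import math
import random

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometers."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def sample_points_around(lat, lon, radius_km, n, rng=None):
    """Rejection-sample n points uniformly within radius_km of (lat, lon)."""
    rng = rng or random.Random(0)
    # Bounding box half-widths in degrees (longitude span widens with latitude).
    dlat = math.degrees(radius_km / EARTH_RADIUS_KM)
    dlon = dlat / max(math.cos(math.radians(lat)), 1e-9)
    points = []
    while len(points) < n:
        cand = (lat + rng.uniform(-dlat, dlat), lon + rng.uniform(-dlon, dlon))
        if haversine_km(lat, lon, *cand) <= radius_km:
            points.append(cand)
    return points

# Example: 5 candidate inspection sites within 240 km of a hypothetical collector.
sites = sample_points_around(33.75, -84.39, 240.0, 5)
```

In practice the candidate points would be snapped to actual restaurant locations from the enriched data, rather than used as raw coordinates.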
Procedia PDF Downloads 350
7758 Risk Assessment of Contamination by Heavy Metals in Sarcheshmeh Copper Complex of Iran Using Topsis Method
Authors: Hossein Hassani, Ali Rezaei
Abstract:
In recent years, soil contamination around mines and smelting plants has attracted serious attention from environmental experts. Because they do not degrade chemically, heavy metals are counted among the most stable and persistent environmental contaminants. Their variability in soil, together with time and financial constraints on environmental intervention, makes it necessary to rank these contaminants so that risk management efforts can be targeted where they reduce the risk of irreparable environmental consequences most. In this study, we use contamination factor, average concentration, enrichment factor, and geoaccumulation indices to evaluate the metal contaminants Pb, Ni, Se, Mo, and Zn in the soil of the Sarcheshmeh copper mine area. For this purpose, 120 surface soil samples were taken from the study area at depths of up to 30 cm, and the metals were analyzed using the ICP-MS method. Comparing the concentrations of heavy and potentially toxic elements in the soil samples with the world average values for uncontaminated soil and with the shale average shows that Zn, Pb, Ni, Se, and Mo all exceed the world average, and only Ni falls below the shale average. Expert opinions on the relative importance of each indicator were used to assign final weights to the metals, which were then ranked using the TOPSIS approach. This enables efficient environmental proceedings, leading to a reduction of the environmental risks from the contaminants. According to the results, Ni, Pb, Mo, Zn, and Se have the highest contamination risk in the soil samples of the study area.
Keywords: contamination coefficient, geoaccumulation factor, TOPSIS techniques, Sarcheshmeh copper complex
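The TOPSIS ranking step can be sketched in a few lines. The decision matrix and weights below are hypothetical placeholders (the paper derives its weights from expert opinion and does not publish the index values); only the procedure itself — vector normalization, weighting, and closeness to the ideal and anti-ideal profiles — is the standard TOPSIS recipe, here with every criterion read as "higher index = more contaminated".

```python
import math

def topsis_rank(matrix, weights, labels):
    """Rank alternatives by TOPSIS closeness coefficient
    (all criteria treated as 'higher value = higher contamination risk')."""
    ncols = len(weights)
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[w * x / n for x, w, n in zip(row, weights, norms)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]  # most contaminated profile
    anti = [min(col) for col in zip(*v)]   # least contaminated profile
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # closeness to the ideal
    return sorted(zip(labels, scores), key=lambda t: -t[1])

# Hypothetical index values (contamination factor, enrichment factor,
# geoaccumulation index) for each metal -- NOT the paper's data.
metals = ["Ni", "Pb", "Mo", "Zn", "Se"]
matrix = [
    [5.1, 4.0, 2.2],  # Ni
    [4.2, 3.6, 1.9],  # Pb
    [3.8, 3.1, 1.4],  # Mo
    [3.0, 2.5, 1.1],  # Zn
    [2.1, 1.8, 0.6],  # Se
]
weights = [0.5, 0.3, 0.2]  # hypothetical expert weights, summing to 1
ranking = topsis_rank(matrix, weights, metals)
```

The alternative closest to the ideal (most contaminated) profile receives the highest closeness coefficient and thus the top risk rank.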
Procedia PDF Downloads 274
7757 Artificial Intelligence for Safety Related Aviation Incident and Accident Investigation Scenarios
Authors: Bernabeo R. Alberto
Abstract:
With the tremendous improvements in the processing power of computers, artificial intelligence will increasingly be used in aviation, making autonomous flight, preventive maintenance, ATM (Air Traffic Management) optimization, and the training of pilots, cabin crew, ground staff, and airport staff possible in a cost-saving, less time-consuming, and less polluting way. Using artificial intelligence, we foresee an interviewing scenario in which the interviewee interacts with an artificial intelligence tool that contextualizes the character and the necessary information in a way that aligns reasonably with the character and the scenario. We are creating simulated scenarios connected with aviation incidents or accidents, integrating artificial intelligence and augmented reality tools, to also enhance the training of future accident/incident investigators. The project's goal is to improve the learning and teaching scenario through academic and professional expertise in aviation and in the artificial intelligence field. Thus, we intend to contribute to the needed high innovation capacity, skills, and training development and management of artificial intelligence, supported by appropriate regulations and attention to ethical problems.
Keywords: artificial intelligence, aviation accident, aviation incident, risk, safety
Procedia PDF Downloads 22
7756 Integrating Deterministic and Probabilistic Safety Assessment to Decrease Risk & Energy Consumption in a Typical PWR
Authors: Ebrahim Ghanbari, Mohammad Reza Nematollahi
Abstract:
Integrated deterministic and probabilistic safety assessment (IDPSA) is one of the most widely used approaches in the safety analysis of power plant accidents. It is also recognized today that human error plays no smaller a role in causing these accidents than systemic failures, so both human actions and system errors must be represented in fault and event sequences. Integrating these analyses is reflected in the core damage frequency and in the study of water resource consumption during an accident such as the loss of all electrical power at the plant. In this regard, a station blackout (SBO) accident was simulated for a pressurized water reactor as the deterministic analysis, and by analyzing the operator's behavior in controlling the accident, the results of the combined deterministic and probabilistic assessment were identified. The results showed that the best performance by the plant operator would reduce the risk of an accident by 10%, as well as decrease consumption of the plant's water sources by 6.82 liters per second.
Keywords: IDPSA, human error, SBO, risk
Procedia PDF Downloads 129
7755 Health Risk Assessment of Exposing to Benzene in Office Building around a Chemical Industry Based on Numerical Simulation
Authors: Majid Bayatian, Mohammadreza Ashouri
Abstract:
Releasing hazardous chemicals is one of the major problems for office buildings in the chemical industry, and environmental risks are therefore inherent to these environments. The adverse health effects of airborne benzene have been a matter of significant concern, especially in oil refineries. The chronic and acute adverse health effects caused by benzene exposure have attracted wide attention. Acute exposure to benzene through inhalation can cause headaches, dizziness, drowsiness, and irritation of the skin. Chronic exposure has been reported to cause aplastic anemia and leukemia in occupational settings; the association between chronic occupational exposure to benzene and the development of these diseases has been documented by several epidemiological studies. Numerous research works have investigated benzene emissions, determined benzene concentrations at different locations of refinery plants, and reported considerable health risks. The high cost of industrial control measures requires justification through lifetime health risk assessment of exposed workers and the public. In the present study, a Computational Fluid Dynamics (CFD) model is proposed to assess the exposure risk of an office building near a refinery due to the refinery's release of benzene. For the simulation, GAMBIT, FLUENT, and CFD Post software were used as the pre-processor, processor, and post-processor, and the model was validated by comparison with experimental measurements of benzene concentration and wind speed. The validation results showed good agreement, so the model can be used for health risk assessment. The simulation and risk assessment results showed that benzene could disperse to a nearby office building and that the exposure risk was unacceptable.
According to the results of this study, a validated CFD model could be very useful to decision-makers for control measures and could support emergency planning for probable accidents. The model can also be used to assess exposure in various types of accidents, as well as to other pollutants such as toluene, xylene, and ethylbenzene, under different atmospheric conditions.
Keywords: health risk assessment, office building, benzene, numerical simulation, CFD
Procedia PDF Downloads 130
7754 Changes in Textural Properties of Zucchini Slices Under Effects of Partial Predrying and Deep-Fat-Frying
Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner
Abstract:
Changes in the textural properties of a food material during processing are significant for consumers' evaluation and directly affect their purchase decisions, so any food material should be characterized in terms of textural properties after processing. In the present study, zucchini slices were partially predried to control and reduce the product's final oil content. A conventional oven was used for the partial dehydration of the zucchini slices, and the subsequent frying was carried out in an industrial fryer with a temperature controller. This study examined the effect of the predrying step on the textural properties of fried zucchini slices. Texture profile analysis was performed; hardness, elasticity, chewiness, and cohesiveness were the texture parameters studied. Temperature and weight loss were the monitored parameters of the predrying process, whereas oil temperature and process time were controlled during frying. The two successive processes were optimized by response surface methodology, one of the most commonly used statistical process optimization tools. The models developed for each texture parameter predicted its value as a function of the studied process conditions with high accuracy. Process optimization was performed against target values for each property, determined from the directly fried zucchini slices that received the highest sensory evaluation score. The results indicate that the textural properties of predried and then fried zucchini slices can be controlled by well-established equations. This is significant for the fried-food industry, where controlling sensorial properties, with texture foremost among them, is crucial to shaping consumer perception. This project (113R015) has been supported by TUBITAK.
Keywords: optimization, response surface methodology, texture profile analysis, conventional oven, modelling
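The optimization step — finding predrying and frying conditions whose predicted texture matches the targets taken from the directly fried control — can be sketched as below. The quadratic response-surface coefficients, variable ranges, and target value are invented placeholders (the abstract does not publish the fitted models); only the idea of minimizing the deviation of model predictions from target texture values follows the text.

```python
# Hypothetical second-order response surface for one texture parameter
# (hardness, in N) as a function of predrying time x1 (min) and frying
# time x2 (s). All coefficients are placeholders, not the fitted model.
def hardness(x1, x2):
    return (12.0 + 0.40 * x1 - 0.050 * x2
            - 0.010 * x1 ** 2 + 0.0001 * x2 ** 2 + 0.002 * x1 * x2)

TARGET_HARDNESS = 15.0  # placeholder target from the directly fried control

def best_conditions():
    """Grid-search the (predrying, frying) plane for the closest match
    between the model prediction and the target hardness."""
    best = None
    for x1 in range(0, 31):           # predrying time, 0..30 min
        for x2 in range(60, 241, 5):  # frying time, 60..240 s
            err = abs(hardness(x1, x2) - TARGET_HARDNESS)
            if best is None or err < best[0]:
                best = (err, x1, x2)
    return best

err, t_dry, t_fry = best_conditions()
```

Real RSM work would fit the coefficients from designed experiments and typically optimize a desirability function over several responses at once; the grid search here simply shows how a fitted model is inverted to pick process settings.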
Procedia PDF Downloads 433
7753 Closed Incision Negative Pressure Therapy Dressing as an Approach to Manage Closed Sternal Incisions in High-Risk Cardiac Patients: A Multi-Centre Study in the UK
Authors: Rona Lee Suelo-Calanao, Mahmoud Loubani
Abstract:
Objective: Sternal wound infection (SWI) following cardiac operations has a significant impact on patient morbidity and mortality. It also contributes to longer hospital stays and increased treatment costs. SWI management is mainly focused on treatment rather than prevention. This study looks at the effect of closed incision negative pressure therapy (ciNPT) dressing in helping reduce the incidence of superficial SWI in high-risk patients after cardiac surgery. The ciNPT dressing was evaluated at 3 cardiac hospitals in the United Kingdom. Methods: All patients who had cardiac surgery from 2013 to 2021 were included in the study. Patients were classed as high risk if they had two or more of the recognised risk factors: obesity, age above 80 years, diabetes, and chronic obstructive pulmonary disease. Patients receiving standard dressing (SD) and patients receiving ciNPT were propensity matched, and Fisher's exact test (two-tailed) and the unpaired t-test were used to analyse categorical and continuous data, respectively. Results: There were 766 matched cases in each group. Total SWI incidence was lower in the ciNPT group than in the SD group (43 (5.6%) vs 119 (15.5%), p=0.0001). There were fewer deep sternal wound infections (14 (1.8%) vs. 31 (4.04%), p=0.0149) and fewer superficial infections (29 (3.7%) vs. 88 (11.4%), p=0.0001) in the ciNPT group compared to the SD group. However, the ciNPT group showed a longer average length of stay (11.23 ± 13 days versus 9.66 ± 10 days; p=0.0083) and a higher mean logistic EuroSCORE (11.143 ± 13 versus 8.094 ± 11; p=0.0001). Conclusion: Utilization of ciNPT as an approach to help reduce the incidence of superficial and deep SWI may be effective in high-risk patients requiring cardiac surgery.
Keywords: closed incision negative pressure therapy, surgical wound infection, cardiac surgery complication, high risk cardiac patients
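The two-tailed Fisher's exact test used for the categorical comparisons can be checked against the published counts. The implementation below is a generic exact hypergeometric test written from scratch (not the authors' statistics software); the 43/766 vs 119/766 total-SWI counts come directly from the abstract.

```python
from math import comb

def fisher_exact_two_tailed(a, b, c, d):
    """Two-tailed Fisher's exact test for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables (with the same
    margins) that are no more likely than the observed one.
    """
    row1 = a + b
    col1, n = a + c, a + b + c + d
    denom = comb(n, row1)

    def p_table(x):  # probability of a table with top-left cell x
        return comb(col1, x) * comb(n - col1, row1 - x) / denom

    p_obs = p_table(a)
    lo, hi = max(0, row1 - (n - col1)), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Total SWI: 43 of 766 ciNPT patients vs 119 of 766 standard-dressing patients.
p = fisher_exact_two_tailed(43, 766 - 43, 119, 766 - 119)
```

For counts this large the exact p-value is far below the 0.0001 the abstract reports as its threshold, consistent with the stated significance.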
Procedia PDF Downloads 96
7752 Updating Stochastic Hosting Capacity Algorithm for Voltage Optimization Programs and Interconnect Standards
Authors: Nicholas Burica, Nina Selak
Abstract:
The ADHCAT (Automated Distribution Hosting Capacity Assessment Tool) was designed to run hosting capacity analysis on the ComEd system via stochastic DER (Distributed Energy Resource) placement on multiple power flow simulations against a set of violation criteria. The violation criteria in the initial version of the tool captured only a limited set of the issues that individual departments design against for DER interconnections. Enhancements were made to the tool to align it more closely with individual departments' violation and operation criteria, and new modules were added for future load profile analysis. A reporting engine was created for future analytical use based on the simulations and observations in the tool.
Keywords: distributed energy resources, hosting capacity, interconnect, voltage optimization
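The stochastic hosting capacity idea — repeatedly placing a DER at random locations and growing its size until a violation criterion trips — can be sketched with a toy model. Everything below (the single overvoltage criterion, the linear voltage-rise approximation, the feeder resistances) is a simplifying assumption for illustration only; the actual tool runs full power flow simulations against multiple violation criteria.

```python
import random

V_NOM = 1.0   # per-unit nominal voltage
V_MAX = 1.05  # overvoltage violation threshold (pu)
# Cumulative feeder resistance from the substation to each node (pu, invented).
NODE_R = [0.005, 0.010, 0.018, 0.025, 0.034]

def voltage_at(node, der_kw):
    """Linearized voltage rise: dV ~ P * R_cum (toy model, not power flow)."""
    return V_NOM + der_kw / 1000.0 * NODE_R[node]

def hosting_capacity(node):
    """Largest DER size (kW) at `node` that stays below the overvoltage limit."""
    lo, hi = 0.0, 1e5
    for _ in range(60):  # bisection on DER size
        mid = (lo + hi) / 2
        if voltage_at(node, mid) <= V_MAX:
            lo = mid
        else:
            hi = mid
    return lo

def stochastic_hosting_capacity(trials=200, seed=1):
    """Monte Carlo over random DER placements; report the worst case."""
    rng = random.Random(seed)
    caps = [hosting_capacity(rng.randrange(len(NODE_R))) for _ in range(trials)]
    return min(caps)

worst_case_kw = stochastic_hosting_capacity()
```

The worst case lands at the electrically weakest (highest-impedance) node; reporting a percentile of the Monte Carlo distribution instead of the minimum is a common variation.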
Procedia PDF Downloads 190
7751 Modeling and Mapping of Soil Erosion Risk Using Geographic Information Systems, Remote Sensing, and Deep Learning Algorithms: Case of the Oued Mikkes Watershed, Morocco
Authors: My Hachem Aouragh, Hind Ragragui, Abdellah El-Hmaidi, Ali Essahlaoui, Abdelhadi El Ouali
Abstract:
This study investigates soil erosion susceptibility in the Oued Mikkes watershed, located in the Meknes-Fez region of northern Morocco, utilizing deep learning algorithms and remote sensing integrated within Geographic Information Systems (GIS). Spanning approximately 1,920 km², the watershed is characterized by a semi-arid Mediterranean climate with irregular rainfall and limited water resources. The waterways within the watershed, especially the Oued Mikkes, are vital for agricultural irrigation and potable water supply. The research assesses the extent of erosion risk upstream of the Sidi Chahed dam while developing a spatial model of soil loss. Several important factors, including topography, land use/land cover, and climate, were analyzed, with data on slope, NDVI, and rainfall erosivity processed using deep learning models (DLNN, CNN, RNN). The results demonstrated excellent predictive performance, with AUC values of 0.92, 0.90, and 0.88 for DLNN, CNN, and RNN, respectively. The resulting susceptibility maps provide critical insights for soil management and conservation strategies, identifying 24% of the study area as at high risk of erosion. The highest-risk areas are concentrated on steep slopes, particularly near the Ifrane district and the surrounding mountains, while low-risk areas lie in flatter regions with less rugged topography. The combined use of remote sensing and deep learning offers a powerful tool for accurate erosion risk assessment and resource management in the Mikkes watershed, highlighting the implications of soil erosion for dam siltation and operational efficiency.
Keywords: soil erosion, GIS, remote sensing, deep learning, Mikkes Watershed, Morocco
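The AUC figures used to compare the three networks have a simple rank-based definition that is easy to compute directly. The labels and scores below are toy values, not the study's data; the function itself is the standard Mann-Whitney formulation of AUC.

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen positive outranks a randomly
    chosen negative (ties count as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = eroded cell, 0 = stable cell (not the paper's data).
labels = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
print(auc(labels, scores))  # → 0.75
```

An AUC of 0.92 therefore means that in 92% of eroded/stable cell pairs, the model assigns the eroded cell the higher susceptibility score.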
Procedia PDF Downloads 18
7750 The Exploration of Psychosocial Risk and the Handling of Unsafe Acts and Misconduct
Authors: Jacquelene Swanepoel, J. C. Visagie, H. M. Linde
Abstract:
Purpose: The aim of this article is to investigate the psychosocial risk environment influencing employee behaviour, and subsequently the trust relationship between employer and employee. Design/methodology/approach: The unique nature and commonness of negative acts, such as unsafe behaviour, human error, poor performance, and negligence, also referred to as unsafe practice, are explored. A literature review is conducted to investigate the nature of negative acts or unsafe behaviour. The findings of this study are used to draw comparisons between unsafe behaviour/misconduct and accidents in the workplace, and finally to conclude how they should be addressed from a labour relations point of view. Findings: The results indicate parallels between unsafe practice/misconduct and occupational injuries and accidents arising from system flaws, human error, or psychosocial risk.
Keywords: occupational risks, unsafe practice, misconduct, organisational safety culture, ergonomics, management commitment and leadership, labour relations
Procedia PDF Downloads 357
7749 Development of a Geomechanical Risk Assessment Model for Underground Openings
Authors: Ali Mortazavi
Abstract:
The main objective of this research project is to investigate the geomechanical risks associated with the various mining methods employed in the underground mining industry. The geotechnical design parameters and operational factors affecting the selection of a suitable mining technique for a given underground mining condition are considered from a risk assessment point of view. Important geomechanical challenges are investigated as appropriate and relevant to the commonly used underground mining methods. Given the complicated nature of in-situ rock masses and the complicated boundary conditions and operational complexities associated with the various underground mining methods, the selection of a safe and economic mining operation is of paramount significance. Rock failure at varying scales within underground mining openings is a constant threat to mining operations and causes human and capital losses worldwide. Geotechnical design is a major design component of all underground mines and essentially dominates the safety of an underground mine. Given the uncertainties that exist in rock characterization prior to mine development, there are always risks associated with inappropriate design as a function of mining conditions and the selected mining method. Uncertainty often results from the inherent variability of rock masses, which in turn is a function of both the geological materials and the in-situ rock mass conditions. The focus of this research is on developing a methodology that enables a geomechanical risk assessment of given underground mining conditions. The outcome of this research is a geotechnical risk analysis algorithm, which can be used as an aid in selecting the appropriate mining method as a function of mine design parameters (e.g., in-situ rock properties, design method, governing boundary conditions such as in-situ stress and groundwater, etc.).
Keywords: geomechanical risk assessment, rock mechanics, underground mining, rock engineering
Procedia PDF Downloads 145
7748 Risk Assessment on New Bio-Composite Materials Made from Water Resource Recovery
Authors: Arianna Nativio, Zoran Kapelan, Jan Peter van der Hoek
Abstract:
Bio-composite materials are becoming increasingly popular in various applications, such as the automotive industry. Usually, bio-composite materials are made from natural resources recovered from plants; now, a new type of bio-composite material has begun to be produced in the Netherlands. This material is made from resources recovered from drinking water treatment (calcite), wastewater treatment (cellulose), and surface water management (aquatic plants). Surface water, raw drinking water, and wastewater can be contaminated with pathogens and chemical compounds. Therefore, it would be valuable to develop a framework to assess, monitor, and control the potential risks. Indeed, the goal is to define the major risks in terms of human health, material quality, and the environment associated with the production and application of these new materials. This study describes the general risk assessment framework, starting with a qualitative risk assessment. The qualitative risk analysis was carried out using the HAZOP methodology for the hazard identification phase. The HAZOP methodology is logical and structured and able to identify hazards at the first stage of design, when hazards and their associated risks are not yet well known. The identified hazards were analyzed to define the potential associated risks, which were then evaluated using qualitative Event Tree Analysis (ETA). ETA is a logical methodology used to define the consequences of a specific hazardous incident by evaluating the failure modes of safety barriers and the dangerous intermediate events that lead to the final scenario (risk). This paper shows the effectiveness of combining the HAZOP and qualitative ETA methodologies for hazard identification and risk mapping. Key risks were then identified, and a quantitative framework was developed based on the types of risks identified, such as QMRA and QCRA.
These two models were applied to assess human health risks due to the presence of pathogens and chemical compounds, such as heavy metals, in the bio-composite materials. Owing to such contamination, the bio-composite product might, during its application, release toxic substances into the environment, leading to a negative environmental impact. Therefore, leaching tests are planned to simulate the application of these materials in the environment and to evaluate the potential leaching of inorganic substances, assessing the environmental risk.
Keywords: bio-composite, risk assessment, water reuse, resource recovery
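The qualitative Event Tree Analysis described above has a natural quantitative counterpart once branch probabilities are assigned: each end-state probability is the product of the branch probabilities along its path. The initiating event and barrier probabilities below are invented placeholders used only to show the mechanics, not values from the study.

```python
from itertools import product

# Hypothetical barriers for one initiating event (pathogen present in
# recovered cellulose); all probabilities are placeholders.
P_INITIATOR = 0.01  # initiating-event frequency (per batch)
BARRIERS = [
    ("thermal treatment", 0.95),   # P(barrier succeeds)
    ("quality screening", 0.90),
]

def event_tree(p_init, barriers):
    """Enumerate all success/failure paths and their probabilities."""
    outcomes = {}
    for path in product([True, False], repeat=len(barriers)):
        p = p_init
        for (name, p_ok), ok in zip(barriers, path):
            p *= p_ok if ok else (1 - p_ok)
        label = ", ".join(f"{name}:{'ok' if ok else 'FAIL'}"
                          for (name, _), ok in zip(barriers, path))
        outcomes[label] = p
    return outcomes

outcomes = event_tree(P_INITIATOR, BARRIERS)
# The path on which every barrier fails is the residual risk scenario.
```

The end-state probabilities always sum back to the initiating-event frequency, which is a useful consistency check when trees get large.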
Procedia PDF Downloads 109
7747 Cost Based Analysis of Risk Stratification Tool for Prediction and Management of High Risk Choledocholithiasis Patients
Authors: Shreya Saxena
Abstract:
Background: Choledocholithiasis is a common complication of gallstone disease. Risk scoring systems exist to guide the need for further imaging or endoscopy in managing choledocholithiasis. We completed an audit reviewing the American Society for Gastrointestinal Endoscopy (ASGE) scoring system for the prediction and management of choledocholithiasis against current practice at a tertiary hospital, to assess its utility in resource optimisation. We have now conducted a cost-focused sub-analysis of patients categorized as high-risk for choledocholithiasis according to the guidelines, to determine any associated cost benefits. Method: Data from our prior audit were used to retrospectively identify thirteen patients considered high-risk for choledocholithiasis. Their ongoing management was mapped against the guidelines. Individual costs for the key investigations were obtained from our hospital financial data. Total costs for the different management pathways identified in clinical practice were calculated and compared against the predicted costs of following the recommendations in the guidelines. We excluded the cost of laparoscopic cholecystectomy and assumed a set figure for per-day hospital admission expenses. Results: Based on our previous audit data, we identified a 77% positive predictive value for the ASGE risk stratification tool in determining patients at high risk of choledocholithiasis. 47% (6/13) had a magnetic resonance cholangiopancreatography (MRCP) prior to endoscopic retrograde cholangiopancreatography (ERCP), whilst 53% (7/13) went straight to ERCP. The average length of stay in the hospital was 7 days, with an additional day and cost of £328.00 (£117 for ERCP) for patients awaiting an MRCP prior to ERCP. Per-day hospital admission was valued at £838.69. When calculating total cost, we assumed all patients had admission bloods and an ultrasound, as the gold standard.
Doing an MRCP prior to ERCP incurred a 130% increase in cost (£580.04 vs £252.04) per patient. When also considering hospital admission and the average length of stay, this amounted to an additional £1166.69 per patient. We then calculated the exact costs incurred by the department over a three-month period, for all patients, for the key investigations and procedures done in the management of choledocholithiasis. This was compared to an estimated cost derived from the recommended pathways in the ASGE guidelines. Overall, an 81% (£2048.45) saving was associated with following the guidelines compared with clinical practice. Conclusion: MRCP is the most expensive test associated with the diagnosis and management of choledocholithiasis. The ASGE guidelines recommend endoscopy without an MRCP in patients stratified as high-risk for choledocholithiasis. Our audit assessing the utility of the ASGE risk scoring system showed it to be relatively reliable for identifying high-risk patients. Our cost analysis has shown significant savings, per patient and in the average length of stay, associated with direct endoscopy rather than an additional MRCP; part of this is because of the increased average length of stay associated with waiting for an MRCP. The above data support the ASGE guidelines for the management of patients at high risk for choledocholithiasis from a cost perspective. The only caveat is our small data set, which may affect the validity of our average length of hospital stay figures and hence the total cost calculations.
Keywords: cost-analysis, choledocholithiasis, risk stratification tool, general surgery
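The per-patient figures quoted above are internally consistent and can be checked with a few lines of arithmetic. The £135.04 figure for admission bloods plus ultrasound is inferred here from the quoted totals (the £252.04 direct-ERCP total minus £117 for ERCP); every other number comes straight from the abstract.

```python
ERCP = 117.00
MRCP = 328.00           # additional investigation cost for the MRCP-first pathway
BLOODS_AND_US = 135.04  # inferred: 252.04 total minus 117.00 for ERCP
PER_DAY_STAY = 838.69

direct = BLOODS_AND_US + ERCP  # straight-to-ERCP pathway
with_mrcp = direct + MRCP      # MRCP first, then ERCP
increase_pct = (with_mrcp / direct - 1) * 100

# Adding the extra day in hospital while awaiting the MRCP:
extra_per_patient = MRCP + PER_DAY_STAY

print(round(direct, 2), round(with_mrcp, 2))  # → 252.04 580.04
print(round(increase_pct))                    # → 130
print(round(extra_per_patient, 2))            # → 1166.69
```

This reproduces the 130% cost increase and the £1166.69 additional cost per patient reported in the abstract.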
Procedia PDF Downloads 98
7746 Corporate Governance and Firm Performance in the UAE
Authors: Bakr Ali Al-Gamrh, Ku Nor Izah B. Ku Ismail
Abstract:
We investigate the relationship between corporate governance, leverage, risk, and firm performance. We use a firm-level panel spanning the period 2008 to 2012 of all listed firms on the Abu Dhabi Stock Exchange and the Dubai Financial Market. After constructing an index of corporate governance strength, we find a negative effect of corporate governance on firm performance. We discover, however, that corporate governance strength indirectly mitigates the negative influence of leverage on firm performance in normal times. On the contrary, the results are completely reversed when there is a black swan event: corporate governance strength plays a significantly negative role in moderating the relationship between leverage and firm performance during the financial crisis. We also find that corporate governance strength increases firms' risk and deteriorates performance during the crisis. The results provide evidence that corporate governance indirectly plays a completely different role in different time periods.
Keywords: corporate governance, firm performance, risk, leverage, the UAE
Procedia PDF Downloads 550
7745 Design Optimization and Thermoacoustic Analysis of Pulse Tube Cryocooler Components
Authors: K. Aravinth, C. T. Vignesh
Abstract:
The usage of pulse tube cryocoolers has increased significantly, mainly due to the advantage of having no moving parts. The underlying idea of this project is to optimize the design of the pulse tube, regenerator, and resonator in the cryocooler and to analyze the thermoacoustic oscillations with respect to the design parameters. A Computational Fluid Dynamics (CFD) model with time-dependent validation is developed to predict performance. The continuity, momentum, and energy equations are solved for the various porous media regions. The effect of changing the geometries and orientation on performance will be validated and investigated. The pressure, temperature, and velocity fields in the regenerator and pulse tube are evaluated. The performance of this optimized design will be compared with the existing pulse tube cryocooler design. The sinusoidal behavior of the cryocooler's acoustic streaming patterns in the pulse tube will also be evaluated.
Keywords: acoustics, cryogenics, design, optimization
Procedia PDF Downloads 175
7744 Treatment of Type 2 Diabetes Mellitus: Physicians’ Adherence to the American Diabetes Association Guideline in Central Region, Saudi Arabia
Authors: Ibrahim Mohammed
Abstract:
Background: Diabetes mellitus is a chronic disease that can cause devastating secondary complications, reducing the quality and length of life and increasing medical costs for the patient and society. The guidelines recommend both clinical and preventive strategies for diabetes management and are regularly updated. The aim of the study is to assess physicians' level of adherence to the American Diabetes Association (ADA) guidelines. Method: An observational, multicenter, retrospective study was conducted among different hospitals in the central region. Patient data were collected from records of the last three years (2017-2020), with records selected randomly using a completely randomized design. The study focuses on those aspects of type 2 diabetes management that have not changed in the last three updates of the ADA standards: all patients should take Metformin at the recommended dose of 1500 to 2000 mg/day; patients should receive a high-dose statin if at high risk of ASCVD, or a moderate statin if not at risk; and patients with both hypertension and diabetes should take an ACE inhibitor or ARB. Result: The study evaluated the adherence of physicians in the central region to the ADA guidelines. Of the 153 selected patients, only 17% were able to control their diabetes, with an average A1c below 7. The ADA states that to obtain the minimum benefit of Metformin, the daily dose should be between 1500 and 2000 mg. Results showed that 110 patients were on Metformin, 68% of them at the recommended dose. The ADA recommends a high-dose statin for diabetic patients at ASCVD risk, while diabetic patients without ASCVD risk should be on a moderate statin. Results showed that 61.5% of patients at ASCVD risk were on a high-dose statin, while only 36% of patients without ASCVD risk were on a moderate statin. Results showed that 89 patients had hypertension, and 80% of them were receiving ACE/ARBs as recommended by the ADA.
Recommendation: Periodic training courses should be implemented for physicians to enhance and update their knowledge.
Keywords: American Diabetic Association, diabetes mellitus, atherosclerotic cardiovascular disease, ACE inhibitors
Procedia PDF Downloads 85
7743 Optimization of Flexible Job Shop Scheduling Problem with Sequence-Dependent Setup Times Using Genetic Algorithm Approach
Authors: Sanjay Kumar Parjapati, Ajai Jain
Abstract:
This paper presents optimization of the makespan for an 'n' jobs and 'm' machines flexible job shop scheduling problem with sequence-dependent setup times using a genetic algorithm (GA) approach. A restart scheme has also been applied to prevent premature convergence. Two case studies are considered. Results are obtained with a crossover probability pc = 0.85 and a mutation probability pm = 0.15. Five simulation runs are performed for each case study, and the minimum value among them is taken as the optimal makespan. Results indicate that the optimal makespan can be achieved with more than one sequence of jobs in a production order.
Keywords: flexible job shop, genetic algorithm, makespan, sequence dependent setup times
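The GA machinery can be illustrated on a deliberately simplified instance: a single machine sequencing problem with sequence-dependent setup times, rather than the paper's full flexible job shop. The instance data, population size, and generation count below are invented; only the permutation encoding, order crossover, swap mutation, and the pc = 0.85 / pm = 0.15 rates follow the abstract.

```python
import random

# Toy single-machine instance with sequence-dependent setup times.
PROC = [5, 3, 6, 2, 4]                # processing time of each job
SETUP = [[0, 2, 4, 1, 3],             # SETUP[i][j]: setup before j when i precedes it
         [3, 0, 2, 4, 1],
         [1, 3, 0, 2, 4],
         [4, 1, 3, 0, 2],
         [2, 4, 1, 3, 0]]
PC, PM = 0.85, 0.15                   # crossover/mutation rates from the paper

def makespan(seq):
    total = PROC[seq[0]]
    for prev, cur in zip(seq, seq[1:]):
        total += SETUP[prev][cur] + PROC[cur]
    return total

def order_crossover(a, b, rng):
    """OX: copy a random slice of parent a, fill the rest in parent b's order."""
    i, j = sorted(rng.sample(range(len(a)), 2))
    child = [None] * len(a)
    child[i:j] = a[i:j]
    fill = [g for g in b if g not in child]
    for k in range(len(a)):
        if child[k] is None:
            child[k] = fill.pop(0)
    return child

def ga(pop_size=30, generations=60, seed=7):
    rng = random.Random(seed)
    n = len(PROC)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    best = min(pop, key=makespan)
    for _ in range(generations):
        nxt = [best]                               # elitism
        while len(nxt) < pop_size:
            a, b = rng.sample(pop, 2)
            child = order_crossover(a, b, rng) if rng.random() < PC else a[:]
            if rng.random() < PM:                  # swap mutation
                i, j = rng.sample(range(n), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
        best = min(pop, key=makespan)
    return best, makespan(best)

seq, span = ga()
```

A restart scheme of the kind the paper applies would reinitialize the population (keeping the incumbent best) when the makespan stagnates for some number of generations.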
Procedia PDF Downloads 332
7742 Prevalence of Pre Hypertension and Its Association to Risk Factors for Cardiovascular Diseases Among Male Undergraduate Students in Chennai
Authors: R. S. Dinesh Madhavan, M. Logaraj
Abstract:
Background: Recent studies have documented an increase in the risk of cardiovascular diseases (CVD) and a high rate of progression to hypertension in persons with prehypertension. The risk factors for the growing burden of cardiovascular diseases, especially hypertension, diabetes, overweight or obesity, and waist-hip ratio, are increasing. Few studies have examined the cardiovascular risk factors associated with blood pressure (BP) among college students in the Indian population. Objectives: The objective of our study was to estimate the prevalence of prehypertension among male students and to assess the association between prehypertension and risk factors for cardiovascular diseases. Material and Methods: A cross-sectional study was conducted among students of a university situated in a suburban area of Chennai. A total of 403 students were studied, comprising 200 medical and 203 engineering students. Information on selected socio-demographic variables was collected with the help of a pretested structured questionnaire. Measurements of height, weight, blood pressure, and postprandial blood glucose were carried out as per standard procedure. Results: The mean age of the participants was 19.56 ± 1.67 years. The mean systolic and diastolic blood pressures were 125.80 ± 10.03 mm Hg and 78.96 ± 11.75 mm Hg. The average intake of fruits and vegetables per week was 4.34 ± 3.47 days and 6.55 ± 4.39 days, respectively. Use of smoked and smokeless tobacco was 27.3% and 3%, respectively. About 30.3% of the students consume alcohol. Nearly 45.9% of them did not practice regular exercise. About 29% were overweight and 5.7% were obese; 24.8% had a waist circumference above 90 centimeters. The prevalence of prehypertension and hypertension was 49.6% and 19.1% among male students. The prevalence of prehypertension was higher in medical students (51.5%) compared to engineering students (47.8%).
Higher risk of being pre hypertensive were noted above the age of 20 years (OR=4.32), fruit intake less than 3 days a week (OR= 1.03), smokers (OR= 1.13), alcohol intake (OR=1.56), lack of physical exercise (OR=1.90), BMI of more than 25 kg/m2 (OR=1.99). But statistically significant difference was noted between pre hypertensive and normotensive for age (p<0.0001), lack of physical exercise (p=0.004) and BMI (p=0.015). Conclusion: In conclusion nearly half of the students were pre hypertensive. Higher prevalence of smoking, alcohol intake, lack of physical exercise, overweight and increased waist circumference and postprandial blood sugar more than 140 mg/dl was noted among pre-hypertensive compared to normotensive.Keywords: cardiovascular diseases, prehypertension, risk factors, undergraduate Students
Procedia PDF Downloads 439
7741 Perception of Risks of the Telecommunication Towers in Malaysia: A Qualitative Inquiry
Authors: Y. Kamarulzaman, A. Madun, F. D. Yusop, N. Abdullah, N. K. Hoong
Abstract:
In 2011, the Malaysian Government initiated a nationwide project called 1BestariNet, which adopts technology in teaching and learning and has resulted in the construction of telecommunication towers inside public schools' premises. Using a qualitative approach, this study investigated public perception of the risks associated with the project, particularly the telecommunication towers. Data collection involved observation and in-depth interviews with 22 individuals drawn from a segment of the public anxious about the risks of radio frequency electromagnetic fields (RFEMF), including two employees of telecommunication companies (telcos) and five employees of Government agencies. Observation of the tower locations at 10 public schools, a public forum, and media reports provided valuable information for our analysis. The study finds that the main concern relates to health risks. It also shows that it is not easy for the Government to manage public perception, mainly because it involves public trust. We find that risk perception is related to public trust, as well as to perceived benefits and level of knowledge. Efficient communication and continuous engagement with local communities help to build and maintain public trust, reduce public fear and anxiety, and hence mitigate risk perception among the public.
Keywords: risk perception, risk communication, trust, telecommunication tower, radio frequency electromagnetic field (RFEMF)
Procedia PDF Downloads 320
7740 Optimization of Black Grass Jelly Formulation to Reduce Leaching and Increase Floating Rate
Authors: M. M. Nor, H. I. Sheikh, M. F. H. Hassan, S. Mokhtar, A. Suganthi, A. Fadhlina
Abstract:
Black grass jelly (BGJ) is a popular black jelly used in preparing various drinks and desserts. Food industries often use preservatives to maintain the physicochemical properties of foods, such as color and texture. Some of these preservatives (e.g., phosphoric acid) are linked with deleterious health effects such as kidney disease. Using the gelling agents carrageenan and gelatin to make BGJ could instead improve its physicochemical and textural properties. This study was designed to optimize selected physicochemical and textural properties of BGJ using carrageenan and gelatin. Various black grass jelly formulations (BGJF) were designed using an I-optimal mixture design in Design Expert® software. Data from commercial BGJ were used as a reference during the optimization process. The combined amount of carrageenan and gelatin added to the formulations was up to 14.38 g (~5%). The results showed that adding 2.5 g carrageenan and 2.5 g gelatin (approximately 5 g in total, ~5%) effectively maintained most of the physicochemical properties, with an overall desirability function of 0.81. This formulation was selected as the optimum black grass jelly formulation (OBGJF). The leaching properties and floating duration were measured on the OBGJF and commercial grass jelly for 20 min and 40 min, respectively. The results indicated that the OBGJF showed a significantly lower leaching rate (p<0.0001) and floating time (p<0.05). Hence, further optimization is needed to increase the floating duration of carrageenan- and gelatin-based BGJ.
Keywords: cincau, Mesona chinensis, black grass jelly, carrageenan, gelatin
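As an aside on the method: the "overall desirability" reported above is, in Derringer-style response-surface optimization, the geometric mean of per-response desirabilities. A minimal sketch, with entirely hypothetical responses and targets (not the study's actual measurements):

```python
import math

def desirability_target(y, low, target, high):
    """Derringer-Suich desirability for a target-is-best response:
    0 outside [low, high], 1 at the target, linear ramps in between."""
    if y <= low or y >= high:
        return 0.0
    if y <= target:
        return (y - low) / (target - low)
    return (high - y) / (high - target)

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities."""
    if any(d == 0 for d in ds):
        return 0.0
    return math.exp(sum(math.log(d) for d in ds) / len(ds))

# Hypothetical responses for one candidate formulation (normalized units)
d_hardness = desirability_target(0.62, 0.2, 0.7, 1.0)
d_syneresis = desirability_target(0.18, 0.0, 0.1, 0.5)
D = overall_desirability([d_hardness, d_syneresis])
```

Each response is mapped to [0, 1], and the geometric mean forces any formulation with one unacceptable response (d = 0) down to an overall score of 0, which is why a single poor property can disqualify a candidate.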
Procedia PDF Downloads 82
7739 Research on the Optimization of Satellite Mission Scheduling
Authors: Pin-Ling Yin, Dung-Ying Lin
Abstract:
Satellites play an important role in our daily lives, from monitoring the Earth's environment and providing real-time disaster imagery to predicting extreme weather events. As technology advances and demands increase, the tasks undertaken by satellites have become increasingly complex, with more stringent resource management requirements. A common challenge in satellite mission scheduling is the limited availability of resources, including onboard memory, ground station accessibility, and satellite power. In this context, efficiently scheduling and managing increasingly complex satellite missions under constrained resources has become a critical issue. The core of Satellite Onboard Activity Planning (SOAP) lies in optimizing the scheduling of the received tasks, arranging them on a timeline to form an executable onboard mission plan. This study develops an optimization model that considers the various constraints involved in satellite mission scheduling, such as non-overlapping execution periods for certain types of tasks, the requirement that tasks fall within the contact range of specified types of ground stations during their execution, onboard memory capacity limits, and collaborative constraints between different types of tasks. Specifically, this research constructs a mixed-integer programming model and solves it with a commercial optimization package. As the problem size increases, however, the problem becomes harder to solve; a heuristic algorithm has therefore been developed to address the limitations of the commercial optimization package at scale. The goal is to plan satellite missions effectively, maximizing the total number of executable tasks while considering task priorities and ensuring that tasks are completed as early as possible without violating feasibility constraints.
To verify the feasibility and effectiveness of the algorithm, test instances of various sizes were generated, and the results were validated through feedback from on-site users and compared against solutions obtained from the commercial optimization package. Numerical results show that the algorithm performs well under various scenarios, consistently meeting user requirements. The satellite mission scheduling algorithm proposed in this study can be flexibly extended to different types of satellite mission demands, achieving optimal resource allocation and enhancing the efficiency and effectiveness of satellite mission execution.
Keywords: mixed-integer programming, meta-heuristics, optimization, resource management, satellite mission scheduling
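The abstract does not specify the heuristic's internals; one plausible minimal sketch of priority-driven greedy scheduling with non-overlap and visibility-window constraints (task names, priorities, durations, and windows are all hypothetical):

```python
def greedy_schedule(tasks):
    """Greedy satellite task scheduling: sort by priority (high first),
    then place each task at the earliest feasible start inside its
    window that does not overlap already-scheduled tasks.
    tasks: list of dicts with name, priority, duration, window=(start, end).
    Returns {name: start_time} for the tasks that fit."""
    placed = []      # (start, end) intervals already booked on the timeline
    schedule = {}
    for t in sorted(tasks, key=lambda t: -t["priority"]):
        w_start, w_end = t["window"]
        start = w_start
        while start + t["duration"] <= w_end:
            end = start + t["duration"]
            clash = next((iv for iv in placed if iv[0] < end and start < iv[1]), None)
            if clash is None:
                placed.append((start, end))
                schedule[t["name"]] = start
                break
            start = clash[1]   # jump past the blocking interval and retry
    return schedule

# Hypothetical tasks: imaging outranks downlink outranks calibration
tasks = [
    {"name": "imaging",  "priority": 3, "duration": 20, "window": (0, 60)},
    {"name": "downlink", "priority": 2, "duration": 30, "window": (10, 70)},
    {"name": "calib",    "priority": 1, "duration": 25, "window": (0, 50)},
]
plan = greedy_schedule(tasks)
```

Here the low-priority calibration task is dropped because no conflict-free slot remains inside its window, mirroring the paper's objective of maximizing executed tasks subject to priorities.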
Procedia PDF Downloads 25
7738 Review of Capitalization of Construction Industry on Sustainable Risk Management in Nigeria
Authors: Nnadi Ezekiel Ejiofor
Abstract:
The construction industry plays a decisive role in the healthy development of any nation; not only large but even small construction projects contribute to a country's economic growth. Good management is needed to ensure successful delivery and sustainability because of the plethora of risks that have resulted in low profit margins for contractors, cost and schedule overruns, poor-quality delivery, and abandoned projects. This research reviewed capitalization in the construction industry and its bearing on sustainable risk management. Questionnaires and oral interviews were utilized as means of data collection. One hundred and ninety-eight (198) large construction firms in Nigeria formed the population of this study, and fifteen (15) companies that emerged from mergers and acquisitions were used for the study. The instruments used for data collection were a researcher-developed structured questionnaire based on a five-point rating scale, interviews, focus group discussions, and secondary sources (bills of quantities and securities and exchange commission records). The instrument was validated by two experts in the field, and its reliability was established by applying the split-half method. Kendall's coefficient of concordance was used to test the data, and a degree of agreement was obtained. Data were subjected to descriptive statistics and analyzed using analysis of variance and t-tests in SPSS. The identified impacts of capitalization were an increase in turnover (24.5%), improvement in corporate image (24.5%), risk reduction (20%), business expansion (17.3%), and geographical spread (13.6%). The study strongly advocates the inclusion of risk management evaluation as part of the construction procurement process.
Keywords: capitalization, project delivery, risks, risk management, sustainability
Procedia PDF Downloads 60
7737 Iterative Replanning of Diesel Generator and Energy Storage System for Stable Operation of an Isolated Microgrid
Authors: Jiin Jeong, Taekwang Kim, Kwang Ryel Ryu
Abstract:
The target microgrid in this paper is isolated from the large central power system and is assumed to consist of wind generators, photovoltaic power generators, an energy storage system (ESS), a diesel power generator, the community load, and a dump load. The operation of such a microgrid can be hazardous because of the uncertainty in predicting power supply and demand, and especially because of the high fluctuation of the output from the wind generators. In this paper, we propose an iterative replanning method for determining the appropriate level of diesel generation and the charging/discharging cycles of the ESS for the upcoming one-hour horizon. To cope with the uncertainty of the supply and demand estimates, the one-hour plan is rebuilt at regular one-minute intervals by rolling the one-hour horizon forward. Since the plan must be built with a sufficiently large safety margin to avoid any possible blackout, some energy waste through the dump load is inevitable. In our approach, the level of the safety margin is optimized through learning from past experience. The simulation experiments show that our method combined with margin optimization can reduce the dump load compared to the method without such optimization.
Keywords: microgrid, operation planning, power efficiency optimization, supply and demand prediction
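A minimal sketch of one rolling-horizon replanning step of the kind described above, assuming per-slot forecasts in kW and a fixed safety margin (all quantities are hypothetical, and the paper learns the margin from experience rather than fixing it):

```python
def replan(forecast_renewable, forecast_load, soc, capacity, margin=0.1):
    """One replanning step over the coming horizon (per-slot kW values):
    cover the margin-inflated net load with ESS discharge first, then
    diesel; charge the ESS or burn the dump load when renewables exceed
    demand. Returns per-slot diesel setpoints, dumped energy, final SOC."""
    diesel, dumped = [], 0.0
    for pv_wind, load in zip(forecast_renewable, forecast_load):
        need = load * (1 + margin) - pv_wind   # planned with safety margin
        if need > 0:
            from_ess = min(need, soc)          # discharge storage first
            soc -= from_ess
            diesel.append(need - from_ess)
        else:
            surplus = -need
            charge = min(surplus, capacity - soc)
            soc += charge
            dumped += surplus - charge         # leftover goes to dump load
            diesel.append(0.0)
    return diesel, dumped, soc

# Hypothetical one-hour horizon collapsed to two slots for brevity
diesel, dumped, soc_end = replan([50.0, 120.0], [100.0, 60.0],
                                 soc=20.0, capacity=40.0, margin=0.1)
```

A larger `margin` lowers blackout risk at the cost of more dumped energy, which is exactly the trade-off the paper's learned margin optimization targets.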
Procedia PDF Downloads 432
7736 Process Optimization for Albanian Crude Oil Characterization
Authors: Xhaklina Cani, Ilirjan Malollari, Ismet Beqiraj, Lorina Lici
Abstract:
Oil characterization is an essential step in the design, simulation, and optimization of refining facilities. To achieve optimal crude selection and processing decisions, a refiner must have accurate information on crude oil quality. This includes the crude oil TBP curve, the main input for correct operation of refinery crude oil atmospheric distillation plants. Crude oil is typically characterized on the basis of a distillation assay. This procedure is reasonably well defined and rests on representing the mixture of actual components that boil within a boiling-point interval by hypothetical components that boil at the average boiling temperature of the interval. The crude oil assay typically includes TBP distillation according to ASTM D-2892, which can characterize the part of the oil that boils up to a 400 °C atmospheric equivalent boiling point. To model the yield curves obtained by physical distillation, it is necessary to compare the differences between the modeled and the experimental data. Most commercial simulators use a different number of components and pseudo-components to represent crude oil. Laboratory tests include distillations, vapor pressures, flash points, pour points, cetane numbers, octane numbers, densities, and viscosities. The aim of the study is the drawing of true boiling point curves for different crude oil resources in Albania and the comparison of the differences between the modeled and the experimental data for optimal characterization of the crude oil.
Keywords: TBP distillation curves, crude oil, optimization, simulation
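The pseudo-component idea described above can be sketched as cutting a linearly interpolated TBP curve into equal-volume fractions, each represented by its mid-cut boiling temperature as a simple stand-in for the average boiling point of the interval (the assay points below are hypothetical, not Albanian crude data):

```python
def tbp_temperature(curve, vol):
    """Linearly interpolate the boiling temperature (degrees C) at a
    cumulative volume fraction from TBP assay points [(vol_frac, temp), ...]."""
    for (v0, t0), (v1, t1) in zip(curve, curve[1:]):
        if v0 <= vol <= v1:
            return t0 + (t1 - t0) * (vol - v0) / (v1 - v0)
    raise ValueError("volume fraction outside assay range")

def pseudo_components(curve, n_cuts):
    """Split the TBP curve into n_cuts equal-volume pseudo-components,
    each represented by its mid-cut boiling temperature."""
    width = (curve[-1][0] - curve[0][0]) / n_cuts
    start = curve[0][0]
    return [tbp_temperature(curve, start + (i + 0.5) * width)
            for i in range(n_cuts)]

# Hypothetical assay points: (cumulative volume fraction, T in degrees C)
assay = [(0.0, 30.0), (0.25, 120.0), (0.5, 210.0), (0.75, 300.0), (1.0, 400.0)]
cuts = pseudo_components(assay, 4)
```

Commercial characterization packages refine this by many narrow cuts with estimated densities and molecular weights per pseudo-component; the sketch only shows the cutting step.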
Procedia PDF Downloads 304
7735 Multi-Objective Optimization of an Aerodynamic Feeding System Using Genetic Algorithm
Authors: Jan Busch, Peter Nyhuis
Abstract:
Considering the challenges of short product life cycles and growing variant diversity, cost minimization and manufacturing flexibility increasingly gain importance for maintaining a competitive edge in today's global and dynamic markets. In this context, an aerodynamic part feeding system for high-speed industrial assembly applications has been developed at the Institute of Production Systems and Logistics (IFA), Leibniz Universitaet Hannover. The aerodynamic part feeding system outperforms conventional systems with respect to its process safety, reliability, and operating speed. In this paper, a multi-objective optimization of the aerodynamic feeding system regarding the orientation rate, the feeding velocity, and the required nozzle pressure is presented.
Keywords: aerodynamic feeding system, genetic algorithm, multi-objective optimization, workpiece orientation
Procedia PDF Downloads 577
7734 Optimization of Technical and Technological Solutions for the Development of Offshore Hydrocarbon Fields in the Kaliningrad Region
Authors: Pavel Shcherban, Viktoria Ivanova, Alexander Neprokin, Vladislav Golovanov
Abstract:
Currently, LLC «Lukoil-Kaliningradmorneft» is implementing a comprehensive program for the development of offshore fields of the Kaliningrad region. This is largely associated with the depletion of the region's onshore resource base, as well as with the positive results of geological investigation of the surrounding Baltic Sea area and the data on the volume of hydrocarbon recovery from the single offshore field currently operating in the Kaliningrad region, D-6 «Kravtsovskoye». The article analyzes the main stages of LLC «Lukoil-Kaliningradmorneft»'s program for the development of the hydrocarbon resources of the region's shelf and suggests an optimization algorithm that allows managing the multi-criteria process of developing shelf deposits. The algorithm is formed on the basis of the problem of sequential decision making, a branch of dynamic programming. Application of the algorithm during the consolidation of the initial data, the elaboration of project documentation, and the further exploration and development of offshore fields will make it possible to optimize the complex of technical and technological solutions and increase the economic efficiency of the field development project implemented by LLC «Lukoil-Kaliningradmorneft».
Keywords: offshore fields of hydrocarbons of the Baltic Sea, development of offshore oil and gas fields, optimization of the field development scheme, solution of multicriteria tasks in oil and gas complex, quality management in oil and gas complex
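The sequential decision-making formulation mentioned above can be illustrated with generic backward-induction dynamic programming over development stages; the states, actions, and payoffs below are purely hypothetical toy values, not the project's data:

```python
def backward_induction(T, states, actions, reward, transition):
    """Finite-horizon dynamic programming: work backwards from the last
    stage, keeping for every (stage, state) the action that maximizes
    immediate reward plus the value of the successor state."""
    V = {(T, s): 0.0 for s in states}
    policy = {}
    for t in range(T - 1, -1, -1):
        for s in states:
            best_a, best_v = None, float("-inf")
            for a in actions(s):
                v = reward(a) + V[(t + 1, transition(s, a))]
                if v > best_v:
                    best_a, best_v = a, v
            V[(t, s)] = best_v
            policy[(t, s)] = best_a
    return V, policy

# Hypothetical 3-stage field development: explore, then develop, then produce
STATES = ["idle", "explored", "developed"]
ACTIONS = {"idle": ["explore", "wait"], "explored": ["develop", "wait"],
           "developed": ["produce"]}
REWARD = {"explore": -2, "develop": -5, "produce": 10, "wait": 0}
NEXT = {("idle", "explore"): "explored", ("idle", "wait"): "idle",
        ("explored", "develop"): "developed", ("explored", "wait"): "explored",
        ("developed", "produce"): "developed"}

V, policy = backward_induction(3, STATES,
                               lambda s: ACTIONS[s],
                               lambda a: REWARD[a],
                               lambda s, a: NEXT[(s, a)])
```

With these toy payoffs, the optimal plan is to explore at stage 0 only because the later production payoff repays the upfront costs; shorten the horizon and the same procedure recommends waiting, which is the essence of staging decisions by sequential optimization.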
Procedia PDF Downloads 200
7733 An Evolutionary Approach for QAOA for Max-Cut
Authors: Francesca Schiavello
Abstract:
This work aims to create a hybrid algorithm, combining the Quantum Approximate Optimization Algorithm (QAOA) with an Evolutionary Algorithm (EA) in place of traditional gradient-based optimization processes. QAOAs were first introduced in 2014, when their algorithm performed better than the best known classical algorithm for Max-Cut. While classical algorithms have since improved and returned to being faster and more efficient, this was a huge milestone for quantum computing, and the work is often used as a benchmark and a foundation for exploring variants of QAOA. It, alongside other famous algorithms like Grover's or Shor's, highlights to the world the potential that quantum computing holds, and it presents the prospect of a real quantum advantage which, if the hardware continues to improve, could constitute a revolutionary era. Given that the hardware is not there yet, many scientists are working on the software side of things in the hope of future progress. Some of the major limitations holding back quantum computing are the quality of qubits and the noisy interference they generate when creating solutions, the barren plateaus that effectively hinder the optimization search in the latent space, and the limited number of available qubits, which restricts the scale of the problems that can be solved. These three issues are intertwined and are part of the motivation for using EAs in this work. Firstly, EAs are not based on gradient or linear optimization methods for the search in the latent space, and because of their freedom from gradients, they should suffer less from barren plateaus. Secondly, given that this algorithm performs a search in the solution space through a population of solutions, it can also be parallelized to speed up the search.
The evaluation of the cost function, as in many other algorithms, is notoriously slow, and the ability to parallelize it can drastically improve the competitiveness of QAOAs with respect to purely classical algorithms. Thirdly, because of the nature and structure of EAs, solutions can be carried forward in time, making them more robust to noise and uncertainty. Preliminary results show that the EA attached to QAOA can perform on par with the traditional QAOA using the COBYLA optimizer, a linear-approximation-based method, and in some instances it can even produce a better Max-Cut. While the final objective of the work is to create an algorithm that can consistently beat the original QAOA or its variants, through either speedups or solution quality, this initial result is promising and shows the potential of EAs in this field. Further tests need to be performed on an array of different graphs, with the parallelization aspect of the work commencing in October 2023 and tests on real hardware scheduled for early 2024.
Keywords: evolutionary algorithm, max cut, parallel simulation, quantum optimization
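Independently of the quantum circuit, the evolutionary loop itself can be illustrated directly on Max-Cut with a bitstring population (note: the paper's EA searches QAOA's continuous circuit parameters, not cut assignments; this toy graph and all hyperparameters are illustrative only):

```python
import random

def cut_value(bits, edges):
    """Number of edges crossing the partition encoded by the bitstring."""
    return sum(1 for u, v in edges if bits[u] != bits[v])

def evolve_max_cut(n_nodes, edges, pop_size=30, generations=60, seed=0):
    """(mu + lambda)-style EA: keep the best half of the population each
    generation, refill with mutated copies (single random bit flips)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_nodes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda b: -cut_value(b, edges))
        survivors = pop[: pop_size // 2]   # elitism: best individuals survive
        children = []
        for parent in survivors:
            child = parent[:]
            child[rng.randrange(n_nodes)] ^= 1   # flip one random bit
            children.append(child)
        pop = survivors + children
    return max(pop, key=lambda b: cut_value(b, edges))

# 4-cycle: the optimal cut crosses all 4 edges (alternating partition)
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
best = evolve_max_cut(4, edges)
```

Because every fitness evaluation is independent, the per-generation evaluation of the population is exactly the part that parallelizes, which is the speedup argument made above.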
Procedia PDF Downloads 60
7732 Implementation of an Economic – Probabilistic Model to Risk Analysis of ERP Project in Technological Innovation Firms – A Case Study of ICT Industry in Iran
Authors: Reza Heidari, Maryam Amiri
Abstract:
In a technological world, many countries tend to fortify their companies and technological infrastructures. One of the most important requirements for developing technology is innovation, so companies strive to adopt innovation as a basic principle. Since expanding a product requires combining different technologies, various innovative projects are run within firms as a basis for technology development. In such an environment, enterprise resource planning (ERP) has special significance for developing and strengthening innovation. In this article, an economic-probabilistic analysis is provided for an ERP implementation project in technological innovation (TI) based firms. The model used assesses risk and economic performance simultaneously, weighting each event by its probability of occurrence and thereby joining the economic approach with the risk investigation approach. To provide an economic-probabilistic analysis of project risk, activities and milestones in the cash flow were extracted, and the probability of occurrence of each was assessed. Since resource planning in an innovative firm is the object of this project, we extracted the various risks related to innovative projects and evaluated them in the form of cash flows. By considering the risks affecting the project and the probability of each, and assigning them to the project's cash flow categories, the model presents a risk-adjusted cash flow based on Net Present Value (NPV) using a probabilistic simulation approach. In other words, the model delivers a risk-adjusted economic analysis of the project: it measures the NPV under the risks that most affect technological innovation projects, and then measures the probability associated with the NPV for each category.
Applying the presented model in the information and communication technology (ICT) industry provided an appropriate analysis of the feasibility of the project from the point of view of cash flow under risk. The obtained results can be given to decision makers so that they can systematically and practically analyze the viability of the project with a moderated economic approach.
Keywords: cash flow categorization, economic evaluation, probabilistic, risk assessment, technological innovation
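A minimal sketch of the kind of risk-adjusted NPV computation described above, via Monte Carlo sampling of risk events (all cash flows, probabilities, and impacts are hypothetical, and the paper's actual model and categories are richer than this):

```python
import random

def npv(cash_flows, rate):
    """Net present value of cash flows indexed by period (t = 0, 1, ...)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def risk_adjusted_npv(base_flows, risks, rate, n_sims=10000, seed=1):
    """Monte Carlo risk adjustment: each risk occurs with probability p
    and, when it does, adds its (negative) impact in the given period.
    Returns (mean NPV, probability the project stays positive)."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n_sims):
        flows = list(base_flows)
        for p, period, impact in risks:
            if rng.random() < p:        # does this risk event occur?
                flows[period] += impact
        samples.append(npv(flows, rate))
    positive = sum(1 for v in samples if v > 0) / n_sims
    return sum(samples) / n_sims, positive

# Hypothetical ERP project: initial outlay, then annual benefits
base = [-500.0, 220.0, 220.0, 220.0]
risks = [(0.3, 1, -100.0),   # e.g. integration delay hits year-1 benefits
         (0.1, 2, -150.0)]   # e.g. key-user turnover hits year 2
mean_npv, p_positive = risk_adjusted_npv(base, risks, rate=0.08)
```

Beyond the mean, the sampled distribution gives exactly the quantity the abstract emphasizes: the probability associated with the NPV, here the chance the project remains economically viable.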
Procedia PDF Downloads 403
7731 An Optimization Algorithm for Reducing the Liquid Oscillation in the Moving Containers
Authors: Reza Babajanivalashedi, Stefania Lo Feudo, Jean-Luc Dion
Abstract:
Liquid sloshing is a crucial problem for the dynamics of moving containers in the packaging industries. Sloshing has so far mainly been modeled within the framework of fluid dynamics or by using equivalent mechanical models for different kinds of movements and container shapes. Nevertheless, these approaches do not allow determining the shape of the free surface of the liquid for irregularly shaped moving containers, so experimental measurements may be required. If there is too much slosh in the moving tank, the liquid can be splashed out onto the packages, so the free surface oscillation must be controlled or reduced to eliminate splashing. The purpose of this research is to propose an optimization algorithm for finding an optimum command law that reduces surface elevation. In the first step, the free surface of the liquid is simulated based on separation-of-variables and weak-formulation models. Then genetic and gradient algorithms are developed for finding the optimum command law. The optimum command law is compared with existing command laws, and the results show a significant difference in surface oscillation between the optimum and existing command laws. The algorithm is applicable to different varieties of bottles when a camera is used for detecting the liquid elevation, and it can produce new command laws for different kinds of tanks to reduce surface oscillation and eliminate the splashing phenomenon.
Keywords: sloshing phenomenon, separation of variables, weak formulation, optimization algorithm, command law
Procedia PDF Downloads 151
7730 Spatial Distribution and Cluster Analysis of Sexual Risk Behaviors and STIs Reported by Chinese Adults in Guangzhou, China: A Representative Population-Based Study
Authors: Fangjing Zhou, Wen Chen, Brian J. Hall, Yu Wang, Carl Latkin, Li Ling, Joseph D. Tucker
Abstract:
Background: The economic and social reforms designed to open China to the world have been successful, but they also appear to have rapidly laid the foundation for the reemergence of STIs since the 1980s. Changes in sexual behaviors, relationships, and norms among Chinese adults have contributed to the STI epidemic. With massive population movement over the last 30 years, early coital debut, multiple sexual partnerships, and unprotected sex have increased within the general population. Our objectives were to assess associations between residence location, sexual risk behaviors, and sexually transmitted infections (STIs) among adults living in Guangzhou, China. Methods: Stratified cluster sampling following a two-step process was used to select people aged 18-59 years in Guangzhou, China. Spatial methods including Geographic Information Systems (GIS) were utilized to identify 1400 coordinates with latitude and longitude. Face-to-face household interviews were conducted to collect self-reported data on sexual risk behaviors and diagnosed STIs. Kulldorff's spatial scan statistic was implemented to detect the spatial distribution and clusters of sexual risk behaviors and STIs. The presence and location of statistically significant clusters were mapped across the study areas using ArcGIS software. Results: In this study, 1215 of the 1400 households attempted the survey; with 368 refusals, this resulted in a sample of 751 completed surveys. The prevalence of self-reported sexual risk behaviors ranged between 5.1% and 50.0%. The self-reported lifetime prevalence of diagnosed STIs was 7.06%. Anal intercourse clustered in an area located along the border within the rural-urban continuum (p=0.001). High-rate clusters for alcohol or other drug use before sex (p=0.008) and for migrants who had lived in Guangzhou less than one year (p=0.007) overlapped this cluster. Excess cases of sex without a condom (p=0.031) overlapped the cluster of college students (p<0.001).
Conclusions: Short-term migrants and college students reported greater sexual risk behaviors. Programs promoting safer sex within these communities to reduce the risk of STIs are warranted in Guangzhou. Spatial analysis identified geographical clusters of sexual risk behaviors, which is critical for optimizing surveillance and targeting control measures in these locations in the future.
Keywords: cluster analysis, migrant, sexual risk behaviors, spatial distribution
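A simplified, purely spatial version of Kulldorff's Bernoulli scan over circular windows might look like the following (toy coordinates and case labels; a real analysis would also compute Monte Carlo p-values for the best window, as standard implementations do):

```python
import math

def bernoulli_llr(c, n, C, N):
    """Log-likelihood ratio of Kulldorff's Bernoulli scan statistic for a
    window with c cases among n points, against C cases among N overall.
    Only windows with an elevated rate count."""
    if c / n <= (C - c) / (N - n):
        return 0.0
    def ll(k, m):                    # log-likelihood at the MLE rate k/m
        if k in (0, m):
            return 0.0
        p = k / m
        return k * math.log(p) + (m - k) * math.log(1 - p)
    return ll(c, n) + ll(C - c, N - n) - ll(C, N)

def best_cluster(points, cases, max_radius):
    """Scan circular windows centred on each point, with radii at the
    observed inter-point distances; return (llr, centre, radius) of the
    window with the highest likelihood ratio."""
    N, C = len(points), sum(cases)
    best = (0.0, None, None)
    for centre in points:
        for r in sorted(math.dist(centre, p) for p in points):
            if r > max_radius:
                break
            inside = [i for i, p in enumerate(points)
                      if math.dist(centre, p) <= r]
            n = len(inside)
            if n == 0 or n == N:
                continue
            llr = bernoulli_llr(sum(cases[i] for i in inside), n, C, N)
            if llr > best[0]:
                best = (llr, centre, r)
    return best

# Toy data: all 4 cases sit in one corner of the study area
points = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10), (10, 11), (11, 10), (11, 11)]
cases = [1, 1, 1, 1, 0, 0, 0, 0]
llr, center, radius = best_cluster(points, cases, max_radius=3.0)
```

The scan correctly isolates the case-dense corner; in practice the significance of the best window is then judged by replicating the scan on randomly relabelled data.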
Procedia PDF Downloads 340