Search results for: safety instrumented function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8255

6305 Human Factors Considerations in New Generation Fighter Planes to Enhance Combat Effectiveness

Authors: Chitra Rajagopal, Indra Deo Kumar, Ruchi Joshi, Binoy Bhargavan

Abstract:

The role of fighter planes in modern network-centric military warfare scenarios has changed significantly in the recent past. New generation fighter planes have multirole capability, engaging both air and ground targets with high precision. Multirole aircraft undertake missions such as air-to-air combat, air defense, air-to-surface roles (including air interdiction, close air support, maritime attack, and suppression and destruction of enemy air defenses), reconnaissance, and electronic warfare missions. Designers have primarily focused on developing technologies that enhance the combat performance of fighter planes, while the human factors aspects of these technologies have received very little attention. Unique physical and psychological challenges are imposed on pilots to meet operational requirements during these missions. Newly evolved technologies have enhanced aircraft performance in terms of speed, firepower, stealth, electronic warfare, situational awareness, and vulnerability reduction capabilities. This paper highlights the impact of emerging technologies on human factors for various military operations and missions. Technologies such as cooperative knowledge-based systems to aid the pilot's decision making in military conflict scenarios, as well as simulation technologies to enhance human performance, are also studied as part of this research. Current and emerging pilot protection technologies and systems, which form part of the integrated life support systems in new generation fighter planes, are discussed. The application of system safety analysis to quantify human reliability in military operations is also studied.
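The system safety analysis mentioned above can be illustrated with a toy human-reliability calculation; the task structure and the error probabilities below are hypothetical illustration values, not figures from the study.

```python
# Minimal sketch: mission-level human reliability from a simple
# series/parallel event model. All probabilities are hypothetical.

def series_failure(probs):
    """Mission fails if ANY task fails (independent tasks in series)."""
    p_success = 1.0
    for p in probs:
        p_success *= (1.0 - p)
    return 1.0 - p_success

def parallel_failure(probs):
    """Failure only if ALL redundant defences fail
    (e.g. pilot error AND a knowledge-based decision aid both failing)."""
    p_fail = 1.0
    for p in probs:
        p_fail *= p
    return p_fail

# Hypothetical human error probabilities for three mission phases
phase_heps = [0.01, 0.005, 0.02]
print(series_failure(phase_heps))      # overall mission failure probability

# A decision aid backing up the pilot on one critical task
print(parallel_failure([0.02, 0.1]))
```

Redundancy (the parallel case) is how a cooperative decision aid lowers the effective human error probability for a given task.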

Keywords: combat effectiveness, emerging technologies, human factors, systems safety analysis

Procedia PDF Downloads 143
6304 Slope Stability Assessment in Metasedimentary Deposit of an Opencast Mine: The Case of the Dikuluwe-Mashamba (DIMA) Mine in the DR Congo

Authors: Dina Kon Mushid, Sage Ngoie, Tshimbalanga Madiba, Kabutakapua Kakanda

Abstract:

Slope stability assessment is still the biggest challenge in mining activities and civil engineering structures. The slope in an opencast mine frequently intersects multiple weak layers that lead to the instability of the pit. Faults and soft layers throughout the rock increase weathering and erosion rates. Therefore, it is essential to investigate the stability of these complex strata. In the Dikuluwe-Mashamba (DIMA) area, the lithology of the stratum is a set of metamorphic rocks whose parent rocks are sedimentary rocks with a low degree of metamorphism. Thus, due to the composition and metamorphism of the parent rock, the rock formation varies in hardness: when the dolomitic and siliceous content is high, the rock is hard; it is softer when the argillaceous and sandy content is high. Therefore, in the vertical direction the stratum appears as alternating weak and hard layers, and in the horizontal direction soft and hard zones occur within the same rock layer. From the structural point of view, the main structures in the mining area are the Dikuluwe dipping syncline and the Mashamba dipping anticline, and the occurrence of rock formations varies greatly. During the folding of the rock formation, stress concentrates on the soft layers, causing the weak layers to break, and interlayer dislocation occurs at the same time. This article aimed to evaluate the stability of the metasedimentary rocks of the Dikuluwe-Mashamba (DIMA) open-pit mine using limit equilibrium and stereographic methods. Based on statistically measured structural planes, stereographic projection was used to study the slope's stability and to examine the discontinuity orientation data to identify failure zones along the mine. The results revealed that the slope angle is too steep and can easily induce landslides.
The numerical method's sensitivity analysis showed that the slope angle and groundwater significantly impact the slope safety factor: an increase in the groundwater level substantially reduces the stability of the slope. Among the factors driving variation of the safety factor, the influence of bulk density is greater for the soil than for the rock mass, the influence of cohesion is smaller for the soil mass than for the rock mass, and the influence of the friction angle is much larger in the rock mass than in the soil mass. The analysis showed that the rock mass structure types are mostly scattered and fragmented, the stratum changes considerably, and the variation of rock and soil mechanics parameters is significant.
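The sensitivity of the safety factor to slope angle and groundwater described above can be sketched with the simplest limit-equilibrium model, an infinite slope with seepage; the strength and slope parameters below are illustrative values, not data from the DIMA mine.

```python
import math

def infinite_slope_fs(c, phi_deg, gamma, z, beta_deg, m=0.0, gamma_w=9.81):
    """Factor of safety of an infinite slope (limit equilibrium).
    c: effective cohesion (kPa), phi_deg: friction angle (deg),
    gamma: unit weight (kN/m^3), z: depth of slip surface (m),
    beta_deg: slope angle (deg), m: fraction of z below the water table."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    u = m * z * gamma_w * math.cos(beta) ** 2      # pore pressure on slip plane
    normal = gamma * z * math.cos(beta) ** 2       # total normal stress
    shear = gamma * z * math.sin(beta) * math.cos(beta)
    return (c + (normal - u) * math.tan(phi)) / shear

# Dry vs. fully saturated: rising groundwater lowers the factor of safety
fs_dry = infinite_slope_fs(c=15, phi_deg=30, gamma=20, z=5, beta_deg=35, m=0.0)
fs_wet = infinite_slope_fs(c=15, phi_deg=30, gamma=20, z=5, beta_deg=35, m=1.0)
print(fs_dry, fs_wet)
```

With these placeholder values the dry slope is marginally stable (FS just above 1) while the saturated slope fails (FS below 1), mirroring the reported groundwater sensitivity.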

Keywords: slope stability, weak layer, safety factor, limit equilibrium method, stereography method

Procedia PDF Downloads 266
6303 Illustrative Effects of Social Capital on Perceived Health Status and Quality of Life among Older Adult in India: Evidence from WHO-Study on Global AGEing and Adults Health India

Authors: Himansu, Bedanga Talukdar

Abstract:

The aim of the present study is to investigate the prevalence of various health outcomes and quality of life, and to analyze the moderating role of social capital on health outcomes (i.e., self-rated good health (SRH), depression, functional health, and quality of life) among the elderly in India. Using the WHO Study on Global AGEing and adult health (SAGE) data, a sample of 6,559 adults aged 50 and above (Mage=61.81, SD=9.00) was selected for analysis. Multivariate analysis assessed the prevalence of SRH, depression, functional limitation, and quality of life among older adults. Logistic regression evaluated the effect of social capital, along with other confounders, on SRH, depression, and functional limitation, whereas linear regression evaluated the effect of social capital with other confounders on quality of life (QoL) among the elderly. Empirical results reveal that 74% of respondents were married, 70% had low social action, 46% medium sociability, 45% low trust and solidarity, 58% high safety, 65% medium civic engagement, and 37% reported medium psychological resources. The multivariate analysis shows that SRH is associated with age, being female, having education, higher social action, greater trust, safety, and greater psychological resources. Depression among the elderly is strongly related to age, sex, education, higher wealth, higher sociability, and psychological resources. QoL is negatively associated with age, sex, and being Muslim, whereas it is positively associated with higher education, being currently married, civic engagement, wealth, social action, trust and solidarity, safety, and strong psychological resources.

Keywords: depressive symptom, functional limitation, older adults, quality of life, self-rated health, social capital

Procedia PDF Downloads 225
6302 Revolutionizing Autonomous Trucking Logistics with Customer Relationship Management Cloud

Authors: Sharda Kumari, Saiman Shetty

Abstract:

Autonomous trucking is just one of the numerous significant shifts impacting fleet management services. The Society of Automotive Engineers (SAE) has defined six levels of vehicle automation that have been adopted internationally, including by the United States Department of Transportation. On public highways in the United States, organizations are testing driverless vehicles with at least Level 4 automation which indicates that a human is present in the vehicle and can disable automation, which is usually done while the trucks are not engaged in highway driving. However, completely driverless vehicles are presently being tested in the state of California. While autonomous trucking can increase safety, decrease trucking costs, provide solutions to trucker shortages, and improve efficiencies, logistics, too, requires advancements to keep up with trucking innovations. Given that artificial intelligence, machine learning, and automated procedures enable people to do their duties in other sectors with fewer resources, CRM (Customer Relationship Management) can be applied to the autonomous trucking business to provide the same level of efficiency. In a society witnessing significant digital disruptions, fleet management is likewise being transformed by technology. Utilizing strategic alliances to enhance core services is an effective technique for capitalizing on innovations and delivering enhanced services. Utilizing analytics on CRM systems improves cost control of fuel strategy, fleet maintenance, driver behavior, route planning, road safety compliance, and capacity utilization. Integration of autonomous trucks with automated fleet management, yard/terminal management, and customer service is possible, thus having significant power to redraw the lines between the public and private spheres in autonomous trucking logistics.

Keywords: autonomous vehicles, customer relationship management, customer experience, autonomous trucking, digital transformation

Procedia PDF Downloads 111
6301 Selective Solvent Extraction of Co from Ni and Mn through Outer-Sphere Interactions

Authors: Korban Oosthuizen, Robert C. Luckay

Abstract:

Due to the growing popularity of electric vehicles and the importance of cobalt as part of the cathode material for lithium-ion batteries, demand for this metal is on the rise. Recycling of the cathode materials by means of solvent extraction is an attractive means of recovering cobalt and easing the pressure on limited natural resources. In this study, a series of straight-chain and macrocyclic diamine ligands were developed for the selective recovery of cobalt, by means of solvent extraction, from solutions containing nickel and manganese; this combination of metals is the major cathode material used in electric vehicle batteries. The ligands can be protonated and then function as ion-pairing ligands targeting the anionic [CoCl₄]²⁻, a species which is not observed for Ni or Mn. Selectivity for Co was found to be good at very high chloride concentrations and low pH. Longer chains or larger macrocycles were found to enhance selectivity, and linear chains on the amide side groups also resulted in greater selectivity than branched groups. The cation of the chloride salt used to adjust the chloride concentration appears to play a major role in extraction through salting-out effects. The ligands developed in this study show good selectivity for Co over Ni and Mn but require very high chloride concentrations to function. This research does, however, open the door for further investigations into using diamines as solvent extraction ligands for the recovery of cobalt from spent lithium-ion batteries.
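The selectivity reported above is conventionally quantified by distribution ratios and a separation factor; a minimal sketch, with hypothetical equilibrium concentrations rather than measured values:

```python
def distribution_ratio(conc_org, conc_aq):
    """D = metal concentration in the organic phase / aqueous phase."""
    return conc_org / conc_aq

def separation_factor(d_target, d_impurity):
    """beta = D_Co / D_Ni; beta >> 1 means good Co selectivity over Ni."""
    return d_target / d_impurity

# Hypothetical post-extraction concentrations (mol/L) at high chloride, low pH
d_co = distribution_ratio(0.095, 0.005)   # Co strongly extracted as [CoCl4]2-
d_ni = distribution_ratio(0.002, 0.098)   # Ni stays aqueous (no chloro-anion)
print(separation_factor(d_co, d_ni))
```

A separation factor in the hundreds, as in this invented example, is what "good selectivity" typically means in solvent-extraction screening.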

Keywords: hydrometallurgy, solvent extraction, cobalt, lithium-ion batteries

Procedia PDF Downloads 78
6300 Shared Vision System Support for Maintenance Tasks of Wind Turbines

Authors: Buket Celik Ünal, Onur Ünal

Abstract:

Communication is the most challenging part of maintenance operations. Communication between the expert and the fieldworker is crucial for effective maintenance, and it also affects the safety of the fieldworkers. To support a machine user in a remote collaborative physical task, both a mobile and a stationary device are needed. Such a system, called a shared vision system, supports two people in different places in solving a problem together. This system reduces errors and provides reliable support for qualified and less qualified users. Through this research, we aimed to validate the effectiveness of using a shared vision system to facilitate communication between on-site workers and those issuing instructions regarding maintenance or inspection work over long distances. The system is designed around a head-worn display. As part of this study, a substitute system is used and implemented with a shared vision system for a maintenance operation. The benefits of using a shared vision system are analyzed, and the results are adapted to wind turbines to improve occupational safety and health for maintenance technicians. The motivation for the research effort in this study can be summarized in the following research questions: -How can an expert support a technician over long distances during a maintenance operation? -What are the advantages of using a shared vision system? Experience from the experiment shows that using a shared vision system is an advantage for both electrical and mechanical system failures. The results support that the shared vision system can be used for wind turbine maintenance and repair tasks, because the wind turbine generator/gearbox and the substitute system exhibit similar failures: electrical failures such as voltage irregularities and wiring failures, and mechanical failures such as misalignment, vibration, and over-speed conditions, are common to both.
Furthermore, the effectiveness of the shared vision system was analyzed using smart glasses during the maintenance task performed on the substitute system under four different conditions, namely a shared vision system, audio communication, a smartphone, and working unassisted. Dependencies between factors were determined with the chi-square test, the chi-square test for independence was used to determine relationships between two qualitative variables, and the Mann-Whitney U test was used to compare pairs of data sets. Based on this experiment, no relation was found between the results and gender. Participants' responses confirmed that the shared vision system is efficient and helpful for maintenance operations. The results show a statistically significant difference in the average time taken by subjects using the shared vision system compared with the other conditions. Additionally, this study confirmed that a shared vision system reduces the time to diagnose and resolve maintenance issues, reduces diagnosis errors, lowers travel costs for experts, and increases reliability in service.
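The Mann-Whitney U statistic used above for comparing two conditions can be computed directly from ranks; the completion times below are invented for illustration, not the study's measurements.

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U for two independent samples (smaller U of the pair).
    Average ranks are assigned to tied values."""
    values = sorted(a + b)
    # average rank for each distinct value
    rank_of = {}
    i = 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        rank_of[values[i]] = (i + 1 + j) / 2.0   # mean of ranks i+1 .. j
        i = j
    r1 = sum(rank_of[v] for v in a)              # rank sum of sample a
    n1, n2 = len(a), len(b)
    u1 = r1 - n1 * (n1 + 1) / 2.0
    return min(u1, n1 * n2 - u1)

# Hypothetical task-completion times (s): shared vision vs. audio only
shared = [41, 38, 45, 40, 43]
audio = [55, 60, 52, 58, 57]
print(mann_whitney_u(shared, audio))   # 0.0: the samples are fully separated
```

A U of 0 (complete separation of the two samples) is the most extreme outcome; in practice one would then look up or compute the corresponding p-value.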

Keywords: communication support, maintenance and inspection tasks, occupational health and safety, shared vision system

Procedia PDF Downloads 261
6299 Arginase Activity and Nitric Oxide Levels in Patients Undergoing Open Heart Surgery with Cardiopulmonary Bypass

Authors: Mehmet Ali Kisaçam, P. Sema Temizer Ozan, Ayşe Doğan, Gonca Ozan, F. Sarper Türker

Abstract:

Cardiovascular disease, one of the most common health problems worldwide, is of crucial importance because of its morbidity and mortality rates. Nitric oxide synthase and arginase both use L-arginine as a substrate, producing nitric oxide (NO) and citrulline, and urea and ornithine, respectively. Endothelial dysfunction is characterized by reduced bioavailability of the vasodilator and anti-inflammatory molecule NO. The purpose of the study was to assess endothelial function via arginase activity and NO levels in patients undergoing coronary artery bypass grafting (CABG) surgery. The study was conducted on 26 patients (14 male, 12 female) undergoing CABG surgery. Blood samples were collected from the subjects before surgery, after the termination of surgery, and 24 hours after surgery. Arginase activity and NO levels were measured in the collected samples spectrophotometrically. Arginase activity decreased significantly after the termination of surgery compared to before surgery. Twenty-four hours after surgery, arginase activity did not differ significantly from either the before-surgery or the post-termination values. On the other hand, NO levels increased significantly in the subjects after the termination of surgery, with only an insignificant increase remaining 24 hours after surgery relative to the before-surgery data. The results indicate that after the termination of surgery, vascular and endothelial function improved, and 24 hours after surgery, arginase activity and NO levels returned to normal.

Keywords: arginase, bypass, cardiopulmonary, nitric oxide

Procedia PDF Downloads 207
6298 An Analysis of the Impact of Immunosuppression upon the Prevalence and Risk of Cancer

Authors: Aruha Khan, Brynn E. Kankel, Paraskevi Papadopoulou

Abstract:

In recent years, extensive research upon 'stress' has provided insight into its two distinct guises, namely the short-term (fight-or-flight) response versus the long-term (chronic) response. Specifically, the long-term or chronic response is associated with the suppression or dysregulation of immune function. It is also widely noted that the occurrence of cancer is strongly correlated with the suppression of the immune system. It is thus necessary to explore the impact of long-term or chronic stress upon the prevalence and risk of cancer. To what extent can the dysregulation of immune function caused by long-term exposure to stress be controlled or minimized? This study focuses explicitly upon immunosuppression due to its ability to increase disease susceptibility, including to cancer itself. Based upon an analysis of the literature relating to the fundamental structure of the immune system alongside the prospective linkage of chronic stress and the development of cancer, immunosuppression may not necessarily correlate directly with the acquisition of cancer, although it remains a contributing factor. A cross-sectional analysis of survey data from the University of Tennessee Medical Center (UTMC) and Harvard Medical School (HMS) will provide additional evidence for (or against) the study's hypothesis that immunosuppression caused by the chronic stress response notably impacts the prevalence of cancer. Finally, a multidimensional framework related to education on chronic stress and its effects is proposed.

Keywords: immune system, immunosuppression, long-term (chronic) stress, risk of cancer

Procedia PDF Downloads 135
6297 Using Computer Vision and Machine Learning to Improve Facility Design for Healthcare Facility Worker Safety

Authors: Hengameh Hosseini

Abstract:

Design of large healthcare facilities, such as hospitals, multi-service-line clinics, and nursing facilities, that can accommodate patients with wide-ranging disabilities is a challenging endeavor, and one that is poorly understood among healthcare facility managers, administrators, and executives. An even less understood extension of this problem is the implications of weakly or insufficiently accommodative facility design for healthcare workers in physically intensive jobs who may themselves have a range of disabilities and who are therefore at increased risk of workplace accident and injury. Combine this reality with the vast range of facility types, ages, and designs, and the problem of universal accommodation becomes even more daunting and complex. In this study, we focus on the implications of facility design for healthcare workers with low vision who also have physically active jobs. The points of difficulty are myriad: health service infrastructure, the equipment used in health facilities, and transport to and from appointments and other services can all pose a barrier to health care if they are inaccessible, less accessible, or simply less comfortable for people with various disabilities. We conduct a series of surveys and interviews with employees and administrators of seven facilities of a range of sizes and ownership models in the Northeastern United States, and combine that corpus with in-facility observations and data collection to identify five major points of failure, common to all the facilities, that we conclude could pose safety threats, ranging from very minor to severe, to employees with vision impairments. We determine that lack of design empathy is a major commonality among facility management and ownership.
We subsequently propose three methods for remedying this lack of empathy-informed design, to address the dangers posed to employees: the use of an existing open-sourced augmented reality application to simulate the low-vision experience for designers and managers; the use of a machine learning model we develop to automatically infer facility shortcomings from large datasets of recorded patient and employee reviews and feedback; and the use of a computer vision model fine-tuned on images of each facility to infer and predict facility features, locations, and workflows that could again pose meaningful dangers to visually impaired employees. After conducting a series of real-world comparative experiments with each of these approaches, we conclude that each is a viable solution under particular sets of conditions, and finally characterize the range of facility types, workforce composition profiles, and work conditions under which each of these methods would be most apt and successful.
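As a rough illustration of the review-mining idea (the authors' actual machine learning model is not described here), a trivial keyword scorer can flag reviews suggesting hazards for low-vision staff; all review strings and keyword weights below are invented:

```python
# Hedged sketch: flagging facility-review text that may indicate hazards
# for visually impaired staff. A real system would use a trained model;
# this keyword scorer only illustrates the data flow.

HAZARD_TERMS = {"dim": 2, "glare": 2, "unmarked": 3, "step": 1,
                "cluttered": 2, "signage": 1, "lighting": 1}

def hazard_score(review):
    """Sum the weights of hazard-related words found in one review."""
    words = review.lower().replace(",", " ").replace(".", " ").split()
    return sum(HAZARD_TERMS.get(w, 0) for w in words)

def flag_reviews(reviews, threshold=3):
    """Return reviews whose hazard score meets the threshold."""
    return [r for r in reviews if hazard_score(r) >= threshold]

reviews = [
    "Hallway lighting is dim and the step near the loading dock is unmarked",
    "Front desk staff were friendly and helpful",
]
print(flag_reviews(reviews))
```

The flagged subset would then feed the kind of facility-level shortcoming analysis the abstract describes.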

Keywords: artificial intelligence, healthcare workers, facility design, disability, visually impaired, workplace safety

Procedia PDF Downloads 117
6296 Numerical Simulation of Two-Dimensional Flow over a Stationary Circular Cylinder Using Feedback Forcing Scheme Based Immersed Boundary Finite Volume Method

Authors: Ranjith Maniyeri, Ahamed C. Saleel

Abstract:

Two-dimensional fluid flow over a stationary circular cylinder is one of the benchmark problems in the field of fluid-structure interaction in computational fluid dynamics (CFD). Motivated by this, in the present work, a two-dimensional computational model is developed using an improved version of the immersed boundary method which combines the feedback forcing scheme of the virtual boundary method with Peskin's regularized delta function approach. Lagrangian coordinates are used to represent the cylinder and Eulerian coordinates are used to describe the fluid flow. A two-dimensional Dirac delta function is used to transfer quantities between the solid and fluid domains. Further, the continuity and momentum equations governing the fluid flow are solved using a fractional-step-based finite volume method on a staggered Cartesian grid system. The developed code is validated by comparing the drag coefficients obtained for different Reynolds numbers with the results reported by other researchers. Also, through numerical simulations for different Reynolds numbers, the flow behavior is well captured. The stability of the improved version of the immersed boundary method is tested for different values of the feedback forcing coefficients.
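Two ingredients of the method, Peskin's 4-point regularized delta function and the virtual-boundary feedback forcing term, can be sketched as follows; the forcing gains are placeholder values, not those used in the paper.

```python
import math

def peskin_delta(r):
    """Peskin's 4-point regularized delta function (1D kernel, grid units)."""
    r = abs(r)
    if r < 1.0:
        return (3 - 2 * r + math.sqrt(1 + 4 * r - 4 * r * r)) / 8.0
    if r < 2.0:
        return (5 - 2 * r - math.sqrt(-7 + 12 * r - 4 * r * r)) / 8.0
    return 0.0

# Kernel weights at the 4 nearest grid nodes around a Lagrangian point;
# the weights partition unity, so transferred quantities are conserved.
x_lag = 0.3                        # Lagrangian point, grid spacing h = 1
weights = [peskin_delta(x_lag - i) for i in (-1, 0, 1, 2)]
print(sum(weights))

def feedback_force(err_int, err, alpha=-100.0, beta=-10.0):
    """Virtual-boundary feedback forcing, F = alpha*int(u-U)dt + beta*(u-U),
    driving the interpolated fluid velocity u toward the boundary velocity U.
    alpha and beta are the feedback gains whose stability the paper tests."""
    return alpha * err_int + beta * err
```

In the full scheme the same kernel spreads this force from the Lagrangian cylinder points back onto the Eulerian grid.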

Keywords: feedback forcing scheme, finite volume method, immersed boundary method, Navier-Stokes equations

Procedia PDF Downloads 305
6295 Collocation Assessment between GEO and GSO Satellites

Authors: A. E. Emam, M. Abd Elghany

Abstract:

The change in orbit evolution between collocated satellites (X, Y) inside a ±0.09° E/W and ±0.07° N/S cluster, after one of these satellites (satellite X) is placed in an inclined orbit, and the effect of this change on collocation safety inside the cluster window have been studied and evaluated. Several collocation scenarios were studied in order to adjust the location of both satellites inside their cluster so as to maximize the separation between them and keep the mission safe.
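A minimal sketch of the window and separation bookkeeping behind such an assessment, with invented offset values (the real analysis propagates full orbital elements over time):

```python
def in_window(d_lon, d_lat, ew=0.09, ns=0.07):
    """True if a satellite's offsets from the nominal slot stay in the box."""
    return abs(d_lon) <= ew and abs(d_lat) <= ns

def angular_separation(sat_a, sat_b):
    """Approximate angular separation (deg) between two collocated
    satellites, from small longitude/latitude offsets off the shared slot."""
    d_lon = sat_a[0] - sat_b[0]
    d_lat = sat_a[1] - sat_b[1]
    return (d_lon ** 2 + d_lat ** 2) ** 0.5

# Hypothetical offsets (deg): X on the inclined orbit swings in latitude
sat_x = (0.04, 0.06)
sat_y = (-0.05, -0.02)
print(in_window(*sat_x), in_window(*sat_y))
print(angular_separation(sat_x, sat_y))
```

Collocation strategies then choose the offsets (e.g. opposite corners of the box, or phased eccentricity/inclination vectors) so this separation stays large throughout the orbit.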

Keywords: satellite, GEO, collocation, risk assessment

Procedia PDF Downloads 397
6294 Nanoparaquat Effects on Oxidative Stress Status and Liver Function in Male Rats

Authors: Zahra Azizi, Ashkan Karbasi, Farzin Firouzian, Sara Soleimani Asl, Akram Ranjbar

Abstract:

Background: One of the most often used herbicides in agriculture is paraquat (PQ), which is very harmful to both people and animals. Chitosan is a well-known, non-toxic polymer commonly used in preparing particles via ionotropic gelation facilitated by negatively charged agents such as sodium alginate. This study aimed to compare the effects of PQ and paraquat nanoparticles (PQNPs) on liver function in male rats. Materials & Methods: Rats were exposed to PQ and PQNPs (4 mg/kg/day, intraperitoneally) for seven days. Then, rats were anesthetized, and serum and liver samples were collected. Later, enzymatic activities such as alanine transaminase (ALT), aspartate transaminase (AST), and alkaline phosphatase (ALP) in serum, and oxidative stress biomarkers such as lipid peroxidation (LPO), total antioxidant capacity (TAC), and total thiol groups (TTG) levels in liver tissue, were measured by colorimetric methods. Also, histological changes in the liver were evaluated. Results: PQ altered the levels of ALT, AST, and ALP while inducing oxidative stress in the liver; PQ-exposed liver homogenates showed altered LPO, TAC, and TTG levels. Severe liver damage was indicated by a significant increase in the serum activity of AST, ALT, and ALP. According to the results of the current study, PQNPs, as compared to PQ and the control group, lowered ALT, AST, ALP, and LPO levels while increasing TAC and TTG levels. Conclusion: According to biochemical and histological investigations, PQ loaded in chitosan-alginate particles is more efficient than free PQ at reducing liver toxicity.

Keywords: paraquat, paraquat nanoparticles, liver, oxidative stress

Procedia PDF Downloads 71
6293 Over Expression of Mapk8ip3 Patient Variants in Zebrafish to Establish a Spectrum of Phenotypes in a Rare-Neurodevelopmental Disorder

Authors: Kinnsley Travis, Camerron M. Crowder

Abstract:

Mapk8ip3 (mitogen-activated protein kinase 8 interacting protein 3) is a gene that codes for the JIP3 protein, part of the JIP scaffolding protein family. This protein is involved in axonal vesicle transport, elongation, and regeneration. Variants in the Mapk8ip3 gene are associated with a rare genetic condition that results in a neurodevelopmental disorder with a range of phenotypes, including global developmental delay and intellectual disability. Currently, there are 18 known individuals with sequence-confirmed Mapk8ip3 genetic disorders. This project focuses on examining the impact of a subset of missense patient variants on Jip3 protein function by overexpressing the mRNA of these variants in a zebrafish knockout model for Jip3. Plasmids containing cDNA with individual missense variants were transcribed, purified, and injected into single-cell zebrafish embryos (wild type, Jip3 -/+, and Jip3 -/-). At 6 days post mRNA microinjection, morphological, behavioral, and microscopic phenotypes were examined in zebrafish larvae. Morphologically, we compared the size and shape of the zebrafish during their development over a 5-day period. Total locomotive activity was assessed using the Microtracker assay, and patterns of movement over time were examined using the DanioVision assay. Lastly, we used confocal microscopy to examine sensory axons for swelling and shortened length, which are phenotypes observed in the loss-of-function knockout Jip3 zebrafish model. Using these assays during embryonic development, we determined the impact of various missense variants on Jip3 protein function compared to knockout and wild-type zebrafish embryo models, allowing the spectrum of the disorder to be phenotypically characterized and the impact of variant location to be assessed.

Keywords: rare disease, neurodevelopmental disorders, mRNA overexpression, zebrafish research

Procedia PDF Downloads 117
6292 U.S. Trade and Trade Balance with China: Testing for Marshall-Lerner Condition and the J-Curve Hypothesis

Authors: Anisul Islam

Abstract:

The U.S. has a very strong trade relationship with China but runs a large and persistent trade deficit. Some have argued that the undervalued Chinese yuan is to blame for the persistent trade deficit; the empirical results are mixed at best. This paper empirically estimates the U.S. export function along with the U.S. import function for its trade with China, with the purpose of testing for the existence of the Marshall-Lerner (ML) condition as well as for the possible existence of the J-curve hypothesis. Annual export and import data will be utilized for as long a period as the time series exists. The export and import functions will be estimated using advanced econometric techniques, with appropriate diagnostic tests performed to examine the validity and reliability of the estimated results. The annual time-series data cover 1975 to 2022, a sample size of 48 years, the longest period utilized in any previous study. The data are collected from several sources, such as the World Bank's World Development Indicators, IMF International Financial Statistics, IMF Direction of Trade Statistics, and several other sources. The paper is expected to shed important light on the ongoing debate regarding the persistent U.S. trade deficit with China and the policies that may be useful in reducing such deficits over time. As such, the paper will be of great interest to academics, researchers, think tanks, global organizations, and policy makers in both China and the U.S.
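Once the export and import demand elasticities are estimated, the Marshall-Lerner condition reduces to a simple inequality; a sketch with hypothetical elasticity values, not estimates from this paper:

```python
def marshall_lerner_holds(eta_x, eta_m):
    """ML condition: a devaluation improves the trade balance (in the long
    run) if the absolute price elasticities of export and import demand
    sum to more than 1. The J-curve is the short-run case where they
    don't yet, so the balance worsens before it improves."""
    return abs(eta_x) + abs(eta_m) > 1.0

# Hypothetical estimated price elasticities for U.S.-China trade
print(marshall_lerner_holds(eta_x=-0.7, eta_m=-0.6))   # 1.3 > 1: holds
print(marshall_lerner_holds(eta_x=-0.3, eta_m=-0.4))   # 0.7 < 1: fails
```

Testing the J-curve then amounts to checking whether short-run elasticity estimates fail this inequality while long-run estimates satisfy it.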

Keywords: exports, imports, marshall-lerner condition, j-curve hypothesis, united states, china

Procedia PDF Downloads 66
6291 Workflow Based Inspection of Geometrical Adaptability from 3D CAD Models Considering Production Requirements

Authors: Tobias Huwer, Thomas Bobek, Gunter Spöcker

Abstract:

Driving forces for enhancements in production are trends like digitalization and individualized production. Currently, such developments are restricted to assembly parts. Thus, complex freeform surfaces are not addressed in this context. The need for efficient use of resources and near-net-shape production will require individualized production of complex shaped workpieces. Due to variations between nominal model and actual geometry, this can lead to changes in operations in Computer-aided process planning (CAPP) to make CAPP manageable for an adaptive serial production. In this context, 3D CAD data can be a key to realizing that objective. Along with developments in the geometrical adaptation, a preceding inspection method based on CAD data is required to support the process planner by finding objective criteria to make decisions about the adaptive manufacturability of workpieces. Nowadays, this kind of decisions is depending on the experience-based knowledge of humans (e.g. process planners) and results in subjective decisions – leading to a variability of workpiece quality and potential failure in production. In this paper, we present an automatic part inspection method, based on design and measurement data, which evaluates actual geometries of single workpiece preforms. The aim is to automatically determine the suitability of the current shape for further machining, and to provide a basis for an objective decision about subsequent adaptive manufacturability. The proposed method is realized by a workflow-based approach, keeping in mind the requirements of industrial applications. Workflows are a well-known design method of standardized processes. Especially in applications like aerospace industry standardization and certification of processes are an important aspect. Function blocks, providing a standardized, event-driven abstraction to algorithms and data exchange, will be used for modeling and execution of inspection workflows. 
Each analysis step of the inspection, such as positioning of measurement data or checking of geometrical criteria, is carried out by a function block. One advantage of this approach is its flexibility to design workflows and to adapt algorithms specific to the application domain. In general, it is checked whether a geometrical adaptation is possible within the specified tolerance range. The development of particular function blocks is predicated on workpiece-specific information, e.g. design data. Furthermore, appropriate logics and decision criteria have to be considered for the different product lifecycle phases. For example, tolerances for geometric deviations differ in type and size between new-part production and repair processes. In addition to function blocks, appropriate referencing systems are important: they need to support exact determination of the position and orientation of the actual geometries to provide a basis for precise analysis. The presented approach provides an inspection methodology for adaptive and part-individual process chains. The analysis of each workpiece results in an inspection protocol and an objective decision about further manufacturability. A representative application domain is the product lifecycle of turbine blades, comprising new-part production and a maintenance process. In both cases, a geometrical adaptation is required to calculate individual production data. In contrast to existing approaches, the proposed initial inspection method provides information to decide between different potential adaptive machining processes.
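The workflow idea described in the abstract can be sketched as a chain of function blocks, each wrapping one analysis step with a uniform interface and passing its result to the next block, ending in an objective manufacturability decision. The block names, data layout, and tolerance below are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of a function-block inspection workflow:
# positioning of measurement data, deviation analysis, and an
# objective decision about adaptive manufacturability.

def align_block(data):
    # Stand-in for positioning of measurement data against the CAD
    # model: here we simply remove a constant offset between the
    # measured and nominal point sets.
    measured_mean = sum(data["measured"]) / len(data["measured"])
    nominal_mean = sum(data["nominal"]) / len(data["nominal"])
    shift = nominal_mean - measured_mean
    data["aligned"] = [m + shift for m in data["measured"]]
    return data

def deviation_block(data):
    # Per-point deviation between aligned measurement and nominal geometry.
    data["deviation"] = [abs(a - n)
                         for a, n in zip(data["aligned"], data["nominal"])]
    return data

def decision_block(data, tolerance=0.5):
    # Objective criterion: adaptive machining is considered possible if
    # every deviation lies within the specified tolerance range.
    data["adaptable"] = all(d <= tolerance for d in data["deviation"])
    return data

def run_workflow(data, blocks):
    # Execute the inspection workflow block by block.
    for block in blocks:
        data = block(data)
    return data

preform = {"nominal": [1.0, 1.2, 1.1], "measured": [1.3, 1.5, 1.4]}
result = run_workflow(preform, [align_block, deviation_block, decision_block])
```

In a real IEC 61499-style implementation the blocks would be event-driven rather than a simple sequential chain; the sketch only illustrates the uniform interface and the final objective decision.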

Keywords: adaptive, CAx, function blocks, turbomachinery

Procedia PDF Downloads 298
6290 Development of Innovative Nuclear Fuel Pellets Using Additive Manufacturing

Authors: Paul Lemarignier, Olivier Fiquet, Vincent Pateloup

Abstract:

In line with the strong desire of nuclear energy players to have ever more effective products in terms of safety, research programs on E-ATF (Enhanced Accident Tolerant Fuels) that are more resilient, particularly to the loss of coolant, have been launched in all countries with nuclear power plants. Among the multitude of solutions being developed internationally, the French Alternative Energies and Atomic Energy Commission (CEA) and its partners are investigating a promising solution: CERMET (CERamic-METal) fuel pellets made of a matrix of fissile material, uranium dioxide (UO₂), which has a low thermal conductivity, and a metallic phase with a high thermal conductivity to improve heat evacuation. Work has focused on the development by powder metallurgy of micro-structured CERMETs, characterized by networks of metallic phase embedded in the UO₂ matrix. Other types of macro-structured CERMETs, based on concepts proposed by thermal simulation studies, have been developed with a metallic phase with a specific geometry to optimize heat evacuation. This solution could not be developed using traditional processes, so additive manufacturing, which revolutionizes traditional design principles, is used to produce these innovative prototype concepts. At CEA Cadarache, work was first carried out on a non-radioactive surrogate material, alumina, in order to acquire skills and to develop the equipment, in particular the robocasting machine, an additive manufacturing technique selected for its simplicity and for the possibility of optimizing the paste formulations. A manufacturing chain was set up, comprising paste production, 3D printing of pellets, and the associated thermal post-treatment. The work leading to the first macro-structured alumina/molybdenum CERMETs will be presented. This work was carried out with the support of Framatome and EDF.

Keywords: additive manufacturing, alumina, CERMET, molybdenum, nuclear safety

Procedia PDF Downloads 78
6289 The Impact of Board Director Characteristics on the Quality of Information Disclosure

Authors: Guo Jinhong

Abstract:

The purpose of this study is to explore the association between board member functions and information disclosure levels. Based on board of directors characteristics identified in the prior literature, a single comprehensive indicator is established as a proxy variable for board functions, and the information disclosure evaluation results published by the Securities and Futures Institute are used to measure the information disclosure level of each company. This study focuses on companies listed on the Taiwan Stock Exchange from 2006 to 2010 and uses descriptive statistics, univariate analysis, correlation analysis and ordered probit regression for the empirical analysis. The empirical results show a significant positive correlation between the function of board members and the level of information disclosure. A sensitivity test draws similar conclusions, showing that boards with better member functions have higher levels of information disclosure. In addition, this study also found that higher board independence, a lower director shareholding pledge ratio, a higher director shareholding ratio, and directors with rich professional knowledge and practical experience can help improve the level of information disclosure. The empirical results provide strong support for the regulations formulated by the competent authorities in recent years to improve the level of information disclosure.

Keywords: function of board members, information disclosure, securities, foundation

Procedia PDF Downloads 97
6288 Carbon Fiber Manufacturing Conditions to Improve Interfacial Adhesion

Authors: Filip Stojcevski, Tim Hilditch, Luke Henderson

Abstract:

Although carbon fiber composites are becoming ever more prominent in the engineering industry, interfacial failure still remains one of the most common limitations to material performance. Carbon fiber surface treatments have played a major role in advancing composite properties; however, research into the influence of manufacturing variables on a fiber manufacturing line is lacking. This project investigates the impact of altering carbon fiber manufacturing conditions on a production line (specifically electrochemical oxidization and sizing variables) on fiber-matrix adhesion. Pristine virgin fibers were manufactured, and interfacial adhesion was systematically assessed from the microscale (single fiber) to the mesoscale (12k tow) and ultimately the macroscale (laminate). Correlations between interfacial shear strength (IFSS) at each level are explored as a function of the known interfacial bonding mechanisms, namely mechanical interlocking, chemical adhesion and fiber wetting. The impact of these bonding mechanisms is assessed through extensive mechanical, topological and chemical characterisation and correlated to performance as a function of IFSS. Ultimately, this study provides a bottom-up approach to improving composite laminates. By understanding the scaling effects from a single fiber to a composite laminate and linking this knowledge to specific bonding mechanisms, material scientists can make an informed decision on the manufacturing conditions most beneficial for interfacial adhesion.

Keywords: carbon fibers, interfacial adhesion, surface treatment, sizing

Procedia PDF Downloads 265
6287 Hybrid Wind Solar Gas Reliability Optimization Using Harmony Search under Performance and Budget Constraints

Authors: Meziane Rachid, Boufala Seddik, Hamzi Amar, Amara Mohamed

Abstract:

Today’s energy industry seeks maximum benefit with maximum reliability. In order to achieve this goal, design engineers rely on reliability optimization techniques. This work uses the harmony search (HS) meta-heuristic optimization method to solve the design optimization problem of wind-solar-gas power systems. We consider the case where redundant electrical components are chosen to achieve a desirable level of reliability. The electrical power components of the system are characterized by their cost, capacity and reliability. Reliability is considered in this work as the ability to satisfy the consumer demand, which is represented as a piecewise cumulative load curve; this definition of the reliability index is widely used for power systems. The proposed meta-heuristic seeks the optimal design of series-parallel power systems in which a multiple choice of wind generators, transformers and lines is allowed from a list of products available on the market. Our approach has the advantage of allowing electrical power components with different parameters to be allocated in electrical power systems. To allow fast reliability estimation, a universal moment generating function (UMGF) method is applied. A computer program has been developed to implement the UMGF and the HS algorithm. An illustrative example is presented.
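The u-function (UMGF) technique used here for fast reliability estimation can be illustrated with a minimal sketch: each component is represented by a discrete distribution of its capacity, components are combined with composition operators (sum of capacities for parallel redundancy, min for series), and reliability is the probability that the resulting system capacity meets demand. All capacities and availabilities below are illustrative assumptions, not the paper's example:

```python
# Minimal sketch of the universal (moment) generating function method
# for a series-parallel capacity-based power system.
from itertools import product as cartesian

def compose(u1, u2, op):
    # Combine two u-functions {capacity: probability} with a structure
    # operator: sum of capacities for parallel, min for series.
    out = {}
    for (g1, p1), (g2, p2) in cartesian(u1.items(), u2.items()):
        g = op(g1, g2)
        out[g] = out.get(g, 0.0) + p1 * p2
    return out

def reliability(u, demand):
    # Probability that the system capacity satisfies the demand.
    return sum(p for g, p in u.items() if g >= demand)

# Two redundant 10 MW generators in parallel (capacities add)...
gen = {0: 0.1, 10: 0.9}                      # 90% available
parallel = compose(gen, gen, lambda a, b: a + b)
# ...feeding a 15 MW transmission line in series (min of capacities).
line = {0: 0.05, 15: 0.95}
system = compose(parallel, line, min)
```

For a demand of 10 MW, `reliability(system, 10)` gives the probability that the redundant pair and the line together can carry the load; the HS meta-heuristic would then search over such component choices to trade cost against this reliability index.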

Keywords: reliability optimization, harmony search optimization (HSA), universal generating function (UMGF)

Procedia PDF Downloads 576
6286 Minimally Invasive versus Conventional Sternotomy for Aortic Valve Replacement: A Systematic Review and Meta-Analysis

Authors: Ahmed Shaboub, Yusuf Jasim Althawadi, Shadi Alaa Abdelaal, Mohamed Hussein Abdalla, Hatem Amr Elzahaby, Mohamed Mohamed, Hazem S. Ghaith, Ahmed Negida

Abstract:

Objectives: We aimed to compare the safety and outcomes of minimally invasive approaches versus conventional sternotomy for aortic valve replacement. Methods: We conducted a PRISMA-compliant systematic review and meta-analysis. We ran an electronic search of PubMed, Cochrane CENTRAL, Scopus, and Web of Science to identify the relevant published studies. Data were extracted and pooled as standardized mean differences (SMD) or risk ratios (RR) using StataMP version 17 for macOS. Results: Forty-one studies with a total of 15,065 patients were included in this meta-analysis (minimally invasive approaches n=7231 vs. conventional sternotomy n=7834). The pooled effect sizes showed that minimally invasive approaches had a lower mortality rate (RR 0.76, 95%CI [0.59 to 0.99]), shorter intensive care unit and hospital stays (SMD -0.16 and -0.31, respectively), shorter ventilation time (SMD -0.26, 95%CI [-0.38 to -0.15]), less 24-h chest tube drainage (SMD -1.03, 95%CI [-1.53 to -0.53]), and lower rates of RBC transfusion (RR 0.81, 95%CI [0.70 to 0.93]), wound infection (RR 0.66, 95%CI [0.47 to 0.92]) and acute renal failure (RR 0.65, 95%CI [0.46 to 0.93]). However, minimally invasive approaches had longer operative, cross-clamp, and bypass times (SMD 0.47, 95%CI [0.22 to 0.72], SMD 0.27, 95%CI [0.07 to 0.48], and SMD 0.37, 95%CI [0.20 to 0.45], respectively). There were no differences between the two groups in blood loss, endocarditis, cardiac tamponade, stroke, arrhythmias, pneumonia, pneumothorax, bleeding reoperation, tracheostomy, hemodialysis, or myocardial infarction (all P>0.05). Conclusion: Current evidence shows higher safety and better operative outcomes with minimally invasive aortic valve replacement compared to the conventional approach. Future RCTs with long-term follow-up are recommended.
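As a hedged illustration of the pooling step for risk ratios, the sketch below recovers the standard error of each log RR from its 95% CI and combines studies by fixed-effect inverse-variance weighting. The study itself used StataMP and may have applied a random-effects model; the single input study here simply reuses the mortality figures quoted above for illustration:

```python
# Fixed-effect inverse-variance pooling of risk ratios on the log scale.
import math

def pooled_rr(studies):
    # studies: list of (rr, lower_95ci, upper_95ci) tuples.
    # SE of log RR is recovered from the CI width; weights are 1/SE^2.
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pool = math.sqrt(1.0 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - 1.96 * se_pool),
            math.exp(log_rr + 1.96 * se_pool))

# Single illustrative input: the pooled mortality RR quoted above.
rr, lo, hi = pooled_rr([(0.76, 0.59, 0.99)])
```

With several studies in the list, the pooled CI narrows as the summed weights grow; with one study, the point estimate is returned unchanged.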

Keywords: aortic replacement, minimally invasive, sternotomy, mini-sternotomy, aortic valve, meta-analysis

Procedia PDF Downloads 123
6285 Optimum Drilling States in Down-the-Hole Percussive Drilling: An Experimental Investigation

Authors: Joao Victor Borges Dos Santos, Thomas Richard, Yevhen Kovalyshen

Abstract:

Down-the-hole (DTH) percussive drilling is an excavation method that is widely used in the mining industry due to its high efficiency in fragmenting hard rock formations. A DTH hammer system consists of a fluid-driven (air or water) piston and a drill bit; the reciprocating movement of the piston transmits its kinetic energy to the drill bit by means of stress waves that propagate through the drill bit towards the rock formation. In the percussive drilling literature, the existence of an optimum drilling state (sweet spot) is reported in some laboratory and field experimental studies: an optimum rate of penetration is achieved for a specific range of axial thrust (or weight-on-bit), beyond which the rate of penetration decreases. Several authors advance different explanations as possible root causes of the sweet spot, but no universal explanation or consensus exists yet. The experimental investigation in this work was initiated with drilling experiments conducted at a mining site. A full-scale drilling rig (equipped with a DTH hammer system) was instrumented with high-precision sensors sampled at a very high rate (kHz). Data were collected while two boreholes were being excavated; an in-depth analysis of the recorded data confirmed that an optimum performance can be achieved for specific ranges of input thrust (weight-on-bit). The high sampling rate made it possible to identify the bit penetration at each single impact (of the piston on the drill bit) as well as the impact frequency. These measurements provide a direct method to identify when the hammer does not fire and drilling occurs without percussion, the bit propagating the borehole by shearing the rock. The second stage of the experimental investigation was conducted in a laboratory environment with a custom-built rig dubbed Woody, which allows the drilling of shallow holes a few centimetres deep by successive discrete impacts from a piston.
After each individual impact, the bit angular position is incremented by a fixed amount, the piston is moved back to its initial position at the top of the barrel, and the air pressure and thrust are set back to their pre-set values. The goal is to explore whether the observed optimum drilling state stems from the interaction between the drill bit and the rock (during impact) or is governed by the overall system dynamics (between impacts). The experiments were conducted on samples of Calca Red, with a drill bit of 74 millimetres (outside diameter) and with weight-on-bit ranging from 0.3 kN to 3.7 kN. Results show that, under the same piston impact energy and a constant angular displacement of 15 degrees between impacts, the average drill bit rate of penetration is independent of the weight-on-bit, which suggests that the sweet spot is not caused by intrinsic properties of the bit-rock interface.

Keywords: optimum drilling state, experimental investigation, field experiments, laboratory experiments, down-the-hole percussive drilling

Procedia PDF Downloads 92
6284 Integrating Best Practices for Construction Waste in Quality Management Systems

Authors: Paola Villoria Sáez, Mercedes Del Río Merino, Jaime Santa Cruz Astorqui, Antonio Rodríguez Sánchez

Abstract:

The Spanish construction industry generates large volumes of waste. However, despite the legislative improvements introduced for construction and demolition waste (CDW), the construction waste recycling rate remains well below that of other European countries and also below the target set for 2020. This situation can be attributed to many difficulties, e.g. the difficulty of on-site segregation or of estimating the total amount generated in advance. Despite these difficulties, the proper management of CDW must be one of the main aspects considered by construction companies. In this sense, some large national companies are implementing Integrated Management Systems (IMS) covering not only quality and safety aspects but also environmental issues. However, although this is a reality for large construction companies, the vast majority of companies have yet to adopt this trend. In short, it is common to find a decentralized management system in small and medium enterprises: one system for quality management, another for safety management, and a third for environmental management (EMS). In addition, the EMSs currently used address CDW only superficially and mainly focus on other environmental concerns, such as carbon emissions. Therefore, this research determines and implements a specific best-practice management system for CDW, based on eight procedures, in a Spanish construction company, and highlights the main advantages and drawbacks of its implementation. Results of this study show that establishing and implementing a CDW management system in building works improves CDW quantification, as the company obtains its own CDW generation ratio. This helps construction stakeholders when developing CDW management plans and also helps to achieve a closer adjustment of CDW management costs.
Finally, integrating this CDW system with the EMS of the company favors the cohesion of the construction process organization at all stages, establishing responsibilities in the field of waste and providing greater control over the process.

Keywords: construction and demolition waste, waste management, best practices, waste minimization, building, quality management systems

Procedia PDF Downloads 533
6283 Systems Engineering and Project Management Process Modeling in the Aeronautics Context: Case Study of SMEs

Authors: S. Lemoussu, J. C. Chaudemar, R. A. Vingerhoeds

Abstract:

The aeronautics sector is currently experiencing unprecedented growth, largely driven by innovative projects. In several cases, such innovative developments are being carried out by small and medium-sized enterprises (SMEs). For instance, in Europe, a handful of SMEs are leading projects like airships, large civil drones, or flying cars. These SMEs all have limited resources, must make strategic decisions, take considerable financial risks and, at the same time, must take into account the constraints of safety, cost, time and performance, like any commercial organization in this industry. Moreover, no international regulations fully exist today for the development and certification of this kind of project. The absence of such a precise and sufficiently detailed regulatory framework requires very close contact with regulatory instances, but SMEs do not always have sufficient resources and internal knowledge to handle this complexity and to discuss these issues. This poses additional challenges for those SMEs that have system integration responsibilities and that must provide all the necessary means of compliance to demonstrate their ability to design, produce, and operate airships with the expected level of safety and reliability. The final objective of our research is thus to provide a methodological framework supporting SMEs in their development, taking into account recent innovations and the institutional rules of the sector. We aim to contribute to this problem by developing a specific Model-Based Systems Engineering (MBSE) approach: airspace regulations, aeronautics standards and international norms on systems engineering are taken on board and formalized in a set of models. This paper presents the ongoing research project combining systems engineering and project management process modeling, taking into account the metamodeling problem.

Keywords: aeronautics, certification, process modeling, project management, SME, systems engineering

Procedia PDF Downloads 166
6282 Size Effects on Structural Performance of Concrete Gravity Dams

Authors: Mehmet Akköse

Abstract:

Concern about the seismic safety of concrete dams has been growing around the world, partly because the population at risk in locations downstream of major dams continues to expand, and partly because it is increasingly evident that the seismic design concepts in use at the time most existing dams were built were inadequate. Most investigations in the past have been conducted on large dams, typically above 100 m high. However, a large number of concrete dams in our country and in other parts of the world are less than 50 m high, and most of these were designed using pseudo-static methods, ignoring the dynamic characteristics of the structure as well as the characteristics of the ground motion. It is therefore important to investigate the seismic behavior of this category of dams in order to assess and evaluate the safety of existing dams and to improve the knowledge base for dams of different heights to be constructed in the future. In this study, size effects on the structural performance of concrete gravity dams subjected to near- and far-fault ground motions are investigated, including dam-water-foundation interaction. For this purpose, a benchmark problem proposed by ICOLD (International Commission on Large Dams) is chosen as a numerical application. The structural performance of the dam at five different heights is evaluated according to the damage criteria of USACE (U.S. Army Corps of Engineers), and it is decided on this basis whether non-linear analysis of the dams is required. The linear elastic dynamic analyses of the dams under near- and far-fault ground motions are performed using the step-by-step integration technique with an integration time step of 0.0025 s. The Rayleigh damping constants are calculated assuming a 5% damping ratio. The program NONSAP, modified for fluid-structure systems with the Lagrangian fluid finite element, is employed in the response calculations.
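The Rayleigh damping constants mentioned above follow from the assumed 5% damping ratio once two control frequencies are fixed, since C = aM + bK implies a modal damping ratio of a/(2ω) + bω/2. A small sketch of this standard calculation, with purely illustrative control frequencies (the actual values depend on the modal properties of the dam height being analysed):

```python
# Rayleigh damping: C = a*M + b*K, with a and b chosen so that the modal
# damping ratio equals zeta at two control circular frequencies w1, w2.
import math

def rayleigh_coefficients(zeta, f1, f2):
    # f1, f2 in Hz; solve zeta = a/(2*w) + b*w/2 at w1 and w2.
    w1, w2 = 2 * math.pi * f1, 2 * math.pi * f2
    a = 2 * zeta * w1 * w2 / (w1 + w2)   # mass-proportional term
    b = 2 * zeta / (w1 + w2)             # stiffness-proportional term
    return a, b

# Illustrative: 5% damping anchored at 4 Hz and 12 Hz.
a, b = rayleigh_coefficients(0.05, 4.0, 12.0)
```

Between the two control frequencies the effective modal damping dips slightly below 5%, and outside them it grows, which is why the control frequencies are usually chosen to bracket the modes that dominate the response.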

Keywords: concrete gravity dams, Lagrangian approach, near and far-fault ground motion, USACE damage criteria

Procedia PDF Downloads 267
6281 Clinical Response of Nuberol Forte® (Paracetamol 650 MG+Orphenadrine 50 MG) For Pain Management with Musculoskeletal Conditions in Routine Pakistani Practice (NFORTE-EFFECT)

Authors: Shahid Noor, Kazim Najjad, Muhammad Nasir, Irshad Bhutto, Abdul Samad Memon, Khurram Anwar, Tehseen Riaz, Mian Muhammad Hanif, Nauman A. Mallik, Saeed Ahmed, Israr Ahmed, Ali Yasir

Abstract:

Background: Musculoskeletal pain is the most common complaint presented to the health practitioner. It is well known that untreated or under-treated pain can have a significant negative impact on an individual’s quality of life (QoL). Objectives: This study was conducted across 10 sites in six (6) major cities of Pakistan to evaluate the tolerability, safety, and clinical response of Nuberol Forte® (Paracetamol 650 mg + Orphenadrine 50 mg) in musculoskeletal pain in routine Pakistani practice, and its impact on improving the patients’ QoL. Design & Methods: This NFORTE-EFFECT observational, prospective, multicenter study was conducted in compliance with Good Clinical Practice (GCP) guidelines and local regulatory requirements. The study sponsor was The Searle Company Limited, Pakistan. To maintain GCP compliance, the sponsor assigned a CRO for site and data management. Ethical approval was obtained from an independent ethics committee (IEC), which reviewed the progress of the study. Written informed consent was obtained from the study participants, and their confidentiality was maintained throughout the study. A total of 399 patients with prescreened musculoskeletal conditions and pain who attended the study sites were recruited, as per the inclusion/exclusion criteria (clinicaltrials.gov ID# NCT04765787). The recruited patients were prescribed the Paracetamol (650 mg) and Orphenadrine (50 mg) combination (Nuberol Forte®) for 7 to 14 days at the investigator's discretion, based on pain intensity. After the initial screening (visit 1), a follow-up visit was conducted after 1-2 weeks of treatment (visit 2). Study Endpoints: The primary objective was to assess the pain management response to Nuberol Forte treatment and the overall safety of the drug. The Visual Analogue Scale (VAS) was used to measure pain severity.
Secondary to pain, the patients' health-related quality of life (HRQoL) was also assessed using the Muscle, Joint Measure (MJM) scale. Safety was monitored from the first dose onwards. These assessments were done at each study visit. Results: Of the 399 enrolled patients, 49.4% were males and 50.6% were females, with a mean age of 47.24 ± 14.20 years. Most patients presented with knee osteoarthritis (OA) (148, 38%), followed by backache (70, 18.2%). A significant reduction in the mean pain score was observed after treatment with the combination of Paracetamol and Orphenadrine (p<0.05). Furthermore, an overall improvement in the patients’ QoL was also observed. During the study, only ten patients reported mild adverse events (AEs). Conclusion: The combination of Paracetamol and Orphenadrine (Nuberol Forte®) provided effective pain management in patients with musculoskeletal conditions and also improved their QoL.

Keywords: musculoskeletal pain, orphenadrine/paracetamol combination, pain management, quality of life, Pakistani population

Procedia PDF Downloads 171
6280 When Conducting an Analysis of Workplace Incidents, It Is Imperative to Meticulously Calculate Both the Frequency and Severity of Injuries Sustained

Authors: Arash Yousefi

Abstract:

Experts suggest that relying exclusively on raw parameters to convey a situation or establish a condition may not be adequate. Assessing and appraising incidents in a system on the basis of accident parameters alone, such as accident frequency, lost workdays, or fatalities, is not always precise and is occasionally erroneous. The accident frequency rate is a metric that relates the number of accidents causing lost work time due to injuries to the total working hours of personnel over a year. Traditionally, this was calculated per one million working hours, but the U.S. Occupational Safety and Health Administration (OSHA) has updated its standards, and a base of 200,000 working hours is now used to compute the frequency rate. It is crucial that the total working hours of employees are represented consistently when calculating individual event and incident figures. The accident severity rate relates the amount of work time lost over a given period, often a year, to the total number of working hours: it measures the work hours lost per unit of useful working hours and thereby provides valuable insight into the number of days lost due to work-related incidents. Calculating the severity of an incident is more difficult if a worker suffers permanent disability or death; in such cases, lost days are determined using the coefficients specified in the tables of equivalent days in the OSHA or ANSI standards for disabling injuries. The accident frequency coefficient denotes the rate at which accidents occur, while the accident severity coefficient specifies the extent of damage and injury caused by these accidents. Together, these coefficients are crucial for accurately assessing the magnitude and impact of accidents.
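The two rates described above reduce to simple proportions. A minimal sketch using the OSHA-style base of 200,000 hours (the traditional base of 1,000,000 hours can be passed instead); the incident and hours figures are illustrative:

```python
# Accident frequency and severity rates, normalized to a base number of
# working hours (200,000 per OSHA convention; 1,000,000 traditionally).

def frequency_rate(lost_time_incidents, hours_worked, base=200_000):
    # Incidents causing lost work time per `base` hours worked.
    return lost_time_incidents * base / hours_worked

def severity_rate(lost_workdays, hours_worked, base=200_000):
    # Days lost (or charged, for permanent disabilities, from the
    # OSHA/ANSI equivalent-day tables) per `base` hours worked.
    return lost_workdays * base / hours_worked

fr = frequency_rate(6, 480_000)   # 6 lost-time incidents in 480,000 h -> 2.5
sr = severity_rate(90, 480_000)   # 90 days lost in the same period -> 37.5
```

Reading the two numbers together is what the abstract recommends: the frequency rate says how often incidents occur, while the severity rate says how much working time each base period loses to them.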

Keywords: incidents, safety, analysis, frequency, severity, injuries, determine

Procedia PDF Downloads 91
6279 The Impact of Reducing Road Traffic Speed in London on Noise Levels: A Comparative Study of Field Measurement and Theoretical Calculation

Authors: Jessica Cecchinelli, Amer Ali

Abstract:

The continuing growth in road traffic and its impact on pollution levels and safety, especially in urban areas, have led local and national authorities to reduce traffic speed and flow in major towns and cities. Various boroughs of London have recently reduced the in-city speed limit from 30 mph to 20 mph, mainly to calm traffic, improve safety, and reduce noise and vibration. This paper reports detailed field measurements, using a noise sensor and analyser, and the corresponding theoretical calculations and analysis of noise levels on a number of roads in the central London Borough of Camden, where the speed limit was reduced from 30 mph to 20 mph on all roads except the major routes of Transport for London (TfL). The measurements, which included the key noise levels and scales on residential streets and main roads, were conducted during normal and rush hours on weekdays and weekends. The theoretical calculations were done according to the UK procedure ‘Calculation of Road Traffic Noise 1988’, with conversion to the European L-day, L-evening, L-night, and L-den and other important levels. The current study also includes comparable data and analysis from previously measured noise in the Borough of Camden and other boroughs of central London. Classified traffic flow and speed on the roads concerned were observed and used in the calculation part of the study. Relevant data and a description of the weather conditions are reported. The paper also reports a field survey, in the form of face-to-face interview questionnaires, carried out in parallel with the field measurement of noise in order to ascertain the opinions and views of local residents and workers in the reduced-speed 20 mph zones. The main findings are that the reduction in speed reduced noise pollution in the studied zones and that the measured and calculated noise levels for each speed zone closely match.
Among the other findings, the survey showed that local residents and workers in the reduced-speed 20 mph zones supported the scheme and felt that it had improved the quality of life in their areas, giving a sense of calmness and safety, particularly for families with children and the elderly, and encouraging pedestrians and cyclists. The key conclusions are that lowering the speed limit in built-up areas would not only reduce the number of serious accidents but would also reduce noise pollution and promote clean modes of transport, particularly walking and cycling. The details of the site observations and the corresponding calculations, together with critical comparative analysis and relevant conclusions, are reported in the full version of the paper.
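The conversion from day, evening, and night levels to the European L-den indicator mentioned above is an energy average over 24 hours with 5 dB and 10 dB penalties applied to the evening and night periods. A small sketch with illustrative input levels (the default 12/4/8-hour split can differ between member states):

```python
# European day-evening-night noise indicator L_den from the three
# period levels, per the standard 12 h / 4 h / 8 h split.
import math

def l_den(l_day, l_evening, l_night):
    # Energy average with +5 dB evening and +10 dB night penalties.
    e = (12 * 10 ** (l_day / 10)
         + 4 * 10 ** ((l_evening + 5) / 10)
         + 8 * 10 ** ((l_night + 10) / 10))
    return 10 * math.log10(e / 24)

# Illustrative levels in dB(A); here the penalized evening and night
# energies equal the day energy, so L_den comes out at 65.0.
lden = l_den(65.0, 60.0, 55.0)
```

Because the average is energy-based, a noisy night period dominates the indicator: night traffic contributes with an effective 10 dB surcharge over the same level during the day.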

Keywords: noise calculation, noise field measurement, road traffic noise, speed limit in london, survey of people satisfaction

Procedia PDF Downloads 424
6278 Lactic Acid Solution and Aromatic Vinegar Nebulization to Improve Hunted Wild Boar Carcass Hygiene at Game-Handling Establishment: Preliminary Results

Authors: Rossana Roila, Raffaella Branciari, Lorenzo Cardinali, David Ranucci

Abstract:

The wild boar (Sus scrofa) population has increased strongly across Europe in the last decades, causing severe fauna management issues. In central Italy, wild boar is the main hunted wild game species, with approximately 40,000 animals killed per year in the Umbria region alone. Game meat is characterized by high nutritional value as well as a peculiar taste and aroma, largely appreciated by consumers. This type of meat, and products thereof, can meet the current consumer demand for higher-quality foodstuffs, not only from a nutritional and sensory point of view but also in relation to environmental sustainability, the non-use of chemicals, and animal welfare. The game meat production chain, however, has some hygienic gaps: the harvest process is usually conducted in a wild environment where animals can easily be contaminated during hunting and subsequent practices. The definition and implementation of a certified and controlled supply chain could ensure quality, traceability and safety for the final consumer and therefore promote game meat products. According to European legislation, the use of weak acid solutions for carcass decontamination is envisaged in some animal species, such as bovines, in order to maintain optimal hygienic characteristics. A preliminary study was carried out to evaluate the applicability of similar strategies to control the hygienic level of wild boar carcasses. The carcasses, harvested according to the selective method and processed in the game-handling establishment, were treated by nebulization with two different solutions: a 2% food-grade lactic acid solution and aromatic vinegar. Swab samples were taken from the carcass surfaces before treatment and at different moments after treatment, and subsequently tested for total aerobic mesophilic load, total aerobic psychrophilic load, Enterobacteriaceae, Staphylococcus spp. and lactic acid bacteria.
The results obtained for the targeted microbial populations showed a positive effect of the lactic acid solution on all the populations investigated, while the aromatic vinegar showed a lower effect on bacterial growth. This study could lay the foundations for optimizing the use of a lactic acid solution to treat wild boar carcasses, aiming to guarantee a good hygienic level and the safety of the meat.

Keywords: game meat, food safety, process hygiene criteria, microbial population, microbial growth, food control

Procedia PDF Downloads 160
6277 Rational Bureaucracy and E-Government: A Philosophical Study of Universality of E-Government

Authors: Akbar Jamali

Abstract:

Hegel is the first great political philosopher to contemplate bureaucracy specifically. For Hegel, bureaucracy is the function of the state. Since the state is essentially a rational organization, its function, namely bureaucracy, must be rational; and since what is rational is universal, Hegel had to explain how bureaucracy could be understood as universal. Hegel discusses bureaucracy in his treatment of the ‘executive power’. He analyzes modern bureaucracy as a form of political organization, its constituent members, and its relation to the social environment. The essence of bureaucracy in Hegel’s philosophy is therefore the implementation of law and rules. Hegel argues that, unlike the other social classes, which are particular because they pursue their own private interests, bureaucracy as a class is ‘universal’ because its orientation is the interest of the state. The state for Hegel is essentially rational and universal; it is the actualization of ‘objective Spirit’. Marx criticizes Hegel’s argument on the universality of the state and bureaucracy. For Marx, the state is equal to bureaucracy: it constitutes a social class grounded in the interest of the bourgeois class, which dominates society and exploits the proletarian class. The main disagreement between these political philosophers is therefore whether the state (bureaucracy) is universal or particular. The growth of e-government in the modern state, as an important aspect of development, leads us to contemplate the particularity and universality of e-government. In this article, we will argue that e-government is essentially universal. E-government, in itself, is impartial; therefore, it cannot be particular. The development of e-government eliminates many side effects of the private, personal, or particular interests of the individuals who constitute the bureaucracy. Finally, we will argue that the more developed a state is, the more universal it is. The development of e-government therefore makes the state more universal and bears on the modern philosophical debate on the particularity or universality of bureaucracy and the state.

Keywords: particularity, universality, rational bureaucracy, impartiality

Procedia PDF Downloads 251
6276 Optimization of Air Pollution Control Model for Mining

Authors: Zunaira Asif, Zhi Chen

Abstract:

Sustainable air quality management is recognized as one of the most serious environmental concerns in mining regions. Mining operations emit various types of pollutants that have significant impacts on the environment. This study presents a stochastic control strategy, developing an air pollution control model to achieve a cost-effective solution. The optimization method predicts the cost of treatment using linear programming with an objective function and multiple constraints. The constraints mainly capture two factors: metal production must not exceed the available resources, and air quality must meet the standard criteria for each pollutant. The applicability of the model is explored through a case study of an open pit metal mine in Utah, USA. The method uses meteorological data in a dispersion transfer function to reflect practical local conditions, and the probabilistic analysis of uncertainties in the meteorological conditions is accomplished by Monte Carlo simulation. Reasonable results have been obtained for selecting the optimal treatment technology for PM2.5, PM10, NOx, and SO2. A comparative analysis shows that the baghouse is the least-cost option for particulate matter, ahead of the electrostatic precipitator and wet scrubbers, whereas non-selective catalytic reduction and dry flue gas desulfurization are suitable for NOx and SO2 reduction, respectively. This model can thus help planners reduce these pollutants at a marginal cost by suggesting pollution control devices, while accounting for dynamic meteorological conditions and mining activities.
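The cost-minimizing structure of such a model can be sketched as a small linear program: minimize total treatment cost subject to a required emission reduction and per-device capacity limits. This is a minimal illustration only, not the authors' actual model; all cost coefficients, capacities, and the 500-tonne reduction target are hypothetical placeholder values.

```python
# Minimal LP sketch of a least-cost treatment selection for particulate matter.
# Decision variables x = tonnes of PM removed by each device:
#   x[0] = baghouse, x[1] = electrostatic precipitator, x[2] = wet scrubber.
# All numbers below are illustrative assumptions, not data from the study.
import numpy as np
from scipy.optimize import linprog

cost_per_tonne = np.array([120.0, 180.0, 250.0])  # hypothetical $/tonne removed

# linprog expects A_ub @ x <= b_ub, so the air-quality requirement
# x1 + x2 + x3 >= 500 is negated into -(x1 + x2 + x3) <= -500.
A_ub = np.array([
    [-1.0, -1.0, -1.0],   # meet the required total reduction (>= 500 t)
    [ 1.0,  0.0,  0.0],   # baghouse capacity
    [ 0.0,  1.0,  0.0],   # electrostatic precipitator capacity
    [ 0.0,  0.0,  1.0],   # wet scrubber capacity
])
b_ub = np.array([-500.0, 400.0, 300.0, 300.0])

res = linprog(cost_per_tonne, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * 3)
print(res.x, res.fun)  # cheapest mix: fill the baghouse first, then the ESP
```

In the paper's stochastic setting, the emission-reduction requirement would itself depend on a dispersion transfer function driven by meteorological data, so a Monte Carlo loop would re-solve this LP across sampled weather conditions.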

Keywords: air pollution, linear programming, mining, optimization, treatment technologies

Procedia PDF Downloads 208