Search results for: Risk Mitigation
147 Reliability Assessment for Tie Line Capacity Assistance of Power Systems Based On Multi-Agent System
Authors: Nadheer A. Shalash, Abu Zaharin Bin Ahmad
Abstract:
Technological developments in industrial innovation are increasingly related to interconnected system assistance and distribution networks. Interconnection is important because it enables an electrical load to continue receiving power in the event of disconnection from the main power grid. This paper presents a method for reliability assessment of interconnected power systems based on a multi-agent system consisting of four agents. The first, the generator agent, connects generators to the grid depending on the state of the reserve margin and the load demand. The second, the load agent, is located at the load. The third, the so-called reserve margin agent, limits the reserve margin to 0-25% depending on the load and the size of the generating units. Finally, the reliability calculation agent computes the expected energy not supplied (EENS), the loss of load expectation (LOLE), and the effect of tie line capacity on risk levels. The Roy Billinton Test System (RBTS) is used to evaluate the reliability indices with a package developed in JADE. The estimated reliability results for the interconnected power systems are presented. The overall reliability of the power system can be improved; the system thus becomes more resilient against increasing demand, with the generation units operated in relation to the reliability indices.
Keywords: Reliability indices, Load expectation, Reserve margin, Daily load, Probability, Multi-agent system.
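The reliability indices named above (LOLE and EENS) can be illustrated with a small capacity-outage calculation. The generating units, forced outage rates, and daily peak loads below are invented placeholders, not RBTS data, and the flat 24-hour shortfall assumption is a deliberate simplification:

```python
from itertools import product

# Hypothetical generating units as (capacity in MW, forced outage rate);
# these numbers and the daily peak loads are illustrative, not RBTS data.
UNITS = [(40, 0.03), (20, 0.025), (10, 0.02)]

def outage_table(units):
    """Capacity-outage probability table built by enumerating unit states."""
    table = {}
    for states in product([0, 1], repeat=len(units)):  # 1 = unit on outage
        out = sum(cap for (cap, _), s in zip(units, states) if s)
        prob = 1.0
        for (_, q), s in zip(units, states):
            prob *= q if s else (1.0 - q)
        table[out] = table.get(out, 0.0) + prob
    return table

def lole_eens(units, daily_peaks):
    """LOLE in days/period and EENS in MWh/period (24 h shortfall assumed)."""
    installed = sum(cap for cap, _ in units)
    table = outage_table(units)
    lole = eens = 0.0
    for load in daily_peaks:
        for out, prob in table.items():
            shortfall = load - (installed - out)
            if shortfall > 0:
                lole += prob                    # expected days with load loss
                eens += prob * shortfall * 24   # expected energy not supplied
    return lole, eens

daily_peaks = [55, 60, 50, 65, 58]  # illustrative daily peak loads in MW
lole, eens = lole_eens(UNITS, daily_peaks)
```

Tie line assistance can be sketched in the same frame by adding the tie capacity to the installed capacity before computing shortfalls.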
146 Modeling Default Probabilities of the Chosen Czech Banks in the Time of the Financial Crisis
Authors: Petr Gurný
Abstract:
One of the most important tasks in risk management is the correct determination of the probability of default (PD) of particular financial subjects. In this paper, the possibility of determining a financial institution's PD with credit-scoring models is discussed. The paper is divided into two parts. The first part is devoted to the estimation of three different models (based on linear discriminant analysis, logit regression and probit regression) from a sample of almost three hundred US commercial banks. These models are then compared and verified on a control sample in order to choose the best one. The second part of the paper applies the chosen model to a portfolio of three key Czech banks to estimate their present financial stability. However, it is no less important to be able to estimate the evolution of PD in the future. For this reason, the second task in this paper is to estimate the probability distribution of the future PD for the Czech banks. The values of the particular indicators are sampled randomly and the distribution of PDs is estimated, under the assumption that the indicators follow a multidimensional subordinated Lévy model (the Variance Gamma model and the Normal Inverse Gaussian model, in particular). Although the obtained results show that all the banks are relatively healthy, there is still a high chance that “a financial crisis” will occur, at least in terms of probability. This is indicated by estimates of various quantiles of the estimated distributions. Finally, it should be noted that the applicability of the estimated model (with respect to the data used) is limited to the recessionary phase of the financial market.
Keywords: Credit-scoring Models, Multidimensional Subordinated Lévy Model, Probability of Default.
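The logit branch of a credit-scoring model reduces to a logistic function of weighted financial indicators. The indicator names and coefficients below are hypothetical placeholders, not the model estimated in the paper:

```python
import math

# Illustrative logit credit-scoring coefficients; the indicator names and
# weights are made up for demonstration, not the paper's estimated model.
COEFS = {"intercept": -2.0, "capital_ratio": -8.0, "npl_ratio": 12.0, "roa": -15.0}

def probability_of_default(indicators):
    """PD = 1 / (1 + exp(-z)), with z = b0 + sum(b_i * x_i)."""
    z = COEFS["intercept"] + sum(COEFS[k] * v for k, v in indicators.items())
    return 1.0 / (1.0 + math.exp(-z))

# Two hypothetical banks: a well-capitalized one and a stressed one.
healthy = {"capital_ratio": 0.15, "npl_ratio": 0.02, "roa": 0.012}
stressed = {"capital_ratio": 0.06, "npl_ratio": 0.12, "roa": -0.01}
```

The future PD distribution described in the abstract would then come from drawing the indicator values from the fitted Lévy model and pushing each draw through this function.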
145 Origins of Strict Liability for Abnormally Dangerous Activities in the United States, Rylands v. Fletcher and a General Clause of Strict Liability in the UK
Authors: Maria Lubomira Kubica
Abstract:
The paper reveals the birth and evolution of the British precedent Rylands v. Fletcher that, once adopted on the other side of the ocean (in the United States), gave rise to a general clause of liability for abnormally dangerous activities recognized by §20 of the American Restatement of the Law Third, Liability for Physical and Emotional Harm. The main goal of the paper is to analyze the development of the legal doctrine and of the case law subsequent to the precedent, together with the intent of the British judicature to leapfrog from the traditional rule contained in Rylands v. Fletcher to a general clause similar to that introduced in the United States and, recently, also at the European level. As is well known, within the scope of tort law two different initiatives compete with the aim of harmonizing European laws: the European Group on Tort Law with its Principles of European Tort Law (hereinafter PETL), whose article 5:101 sets forth a general clause of strict liability for abnormally dangerous activities, and the Study Group on a European Civil Code with its Common Frame of Reference (CFR), which rather promotes an ad hoc model of listing determined cases of strict liability. The very narrow scope of application of art. 5:101 PETL, restricted to abnormally dangerous activities, stands in opposition to the very broad spectrum of strict liability cases governed by the CFR. The former is a perfect example of a general clause that offers a minimum and basic standard, possibly acceptable also in those countries in which, as in the United Kingdom, this regime of liability is completely marginalized.
Keywords: Dangerous activities, general clause, risk, strict liability.
144 Impact of Changes of the Conceptual Framework for Financial Reporting on the Indicators of the Financial Statement
Authors: Nadezhda Kvatashidze
Abstract:
The International Accounting Standards Board (IASB) has updated the conceptual framework for financial reporting. The main reason is to resolve accounting issues caused by market development and by business transactions with new economic content. Investors also call for higher transparency of information and responsibility for results in order to make more accurate risk assessments and forecasts. All this makes it necessary to further develop the conceptual framework for financial reporting so that users get useful information. Market development and certain shortcomings of the conceptual framework revealed in practice require its reconsideration and new solutions. Some issues and concepts, such as the disclosure and supply of information, its qualitative characteristics, assessment, and measurement uncertainty, had to be supplemented and refined. The criteria for recognition of certain elements of reporting (assets and liabilities) had to be updated, too. All this is set out in the updated edition of the conceptual framework for financial reporting, a comprehensive collection of concepts underlying the preparation of financial statements. The main objective of the revision is to improve financial reporting through the development of a clear package of concepts. This will support the IASB in setting a common approach for similar transactions on the basis of mutually accepted concepts. As a result, companies will be able to develop coherent accounting policies for those transactions or events to which no standard applies or where a standard allows a choice of accounting policy.
Keywords: Conceptual framework, measurement basis, measurement uncertainty, neutrality, prudence, stewardship.
143 Development of Rock Engineering System-Based Models for Tunneling Progress Analysis and Evaluation: Case Study of Tailrace Tunnel of Azad Power Plant Project
Authors: S. Golmohammadi, M. Noorian Bidgoli
Abstract:
Tunneling progress is a key parameter in the blasting method of tunneling. Taking measures to enhance the tunneling advance can limit the progress distance without a supporting system, subsequently reducing or eliminating the risk of damage. This paper focuses on modeling tunneling progress using three main groups of parameters (tunneling geometry, blasting pattern, and rock mass specifications) based on the Rock Engineering Systems (RES) methodology. In the proposed models, four main parameters affecting tunneling progress are considered as inputs (RMR, Q-system, specific charge of blasting, and area), with progress as the output. Data from 86 blasts conducted at the tailrace tunnel of the Azad Dam, western Iran, were used to evaluate the progress value for each blast. The results indicated that, for the 86 blasts, the progress estimated by the model mostly aligns with the measured progress. This paper presents a method for building the interaction matrix (statistical basis) of the RES model. Additionally, a comparison was made between the results of the new RES-based model and a Multi-Linear Regression (MLR) analysis model. In the RES-based model, the weights of the effective parameters are RMR (35.62%), Q (28.6%), q (specific charge of blasting) (20.35%), and A (15.42%), respectively, whereas for the MLR analysis the main parameters are RMR, Q-system, q, and A. These findings confirm the superior performance of the RES-based model over the other proposed models.
Keywords: Rock Engineering Systems, tunneling progress, Multi Linear Regression, Specific charge of blasting.
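The RES parameter weights (percentages like those reported above) come from the interaction matrix: each parameter's cause (row sum) plus effect (column sum), normalized by the matrix total. The 4x4 matrix below is an invented placeholder for the four inputs, not the matrix estimated in the paper:

```python
# Illustrative RES interaction matrix for (RMR, Q, q, A); entry M[i][j]
# codes the influence of parameter i on parameter j. The values are made
# up to show the weighting arithmetic, not the paper's estimated matrix.
PARAMS = ["RMR", "Q", "q", "A"]
M = [
    [0, 3, 2, 1],
    [2, 0, 2, 1],
    [1, 2, 0, 1],
    [1, 1, 1, 0],
]

def res_weights(matrix):
    """Weight of each parameter: (cause C_i + effect E_i) as % of the total."""
    n = len(matrix)
    cause = [sum(matrix[i]) for i in range(n)]                        # row sums
    effect = [sum(matrix[i][j] for i in range(n)) for j in range(n)]  # column sums
    total = sum(cause) + sum(effect)
    return {PARAMS[i]: 100.0 * (cause[i] + effect[i]) / total for i in range(n)}
```

With a statistically built (data-driven) matrix, as the paper proposes, the same arithmetic yields the reported 35.62/28.6/20.35/15.42% split.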
142 A Proposal for a Secure and Interoperable Data Framework for Energy Digitalization
Authors: Hebberly Ahatlan
Abstract:
The process of digitizing energy systems involves transforming traditional energy infrastructure into interconnected, data-driven systems that enhance efficiency, sustainability, and responsiveness. As smart grids become increasingly integral to the efficient distribution and management of electricity from both fossil and renewable energy sources, the energy industry faces strategic challenges associated with digitalization and interoperability — particularly in the context of modern energy business models, such as virtual power plants (VPPs). The critical challenge in modern smart grids is to seamlessly integrate diverse technologies and systems, including virtualization, grid computing and service-oriented architecture (SOA), across the entire energy ecosystem. Achieving this requires addressing issues like semantic interoperability, Information Technology (IT) and Operational Technology (OT) convergence, and digital asset scalability, all while ensuring security and risk management. This paper proposes a four-layer digitalization framework to tackle these challenges, encompassing persistent data protection, trusted key management, secure messaging, and authentication of IoT resources. Data assets generated through this framework enable AI systems to derive insights for improving smart grid operations, security, and revenue generation. Furthermore, this paper also proposes a Trusted Energy Interoperability Alliance as a universal guiding standard in the development of this digitalization framework to support more dynamic and interoperable energy markets.
Keywords: Digitalization, IT/OT convergence, semantic interoperability, TEIA alliance, VPP.
141 Improvement to Pedestrian Walkway Facilities to Enhance Pedestrian Safety-Initiatives in India
Authors: Basavaraj Kabade, K. T. Nagaraja, Swathi Ramanathan, A. Veeraragavan, P. S. Reashma
Abstract:
The deteriorating quality of the pedestrian environment and the increasing risk of pedestrian crashes are major concerns for most cities in India. The recent shift in priority to motorized transport and the worsening condition of existing pedestrian facilities can be considered prime reasons for the increase in pedestrian-related crashes in India. Bengaluru City, the IT capital hub of the nation, is not much different; the increase in the number of pedestrian crashes in Bengaluru reflects the same. To resolve this issue and to ensure safe, sustainable and pedestrian-friendly sidewalks, the Government of Karnataka, India has implemented a new pedestrian sidewalk programme named Tender S.U.R.E. (Specifications for Urban Road Execution). Tender SURE adopts unique urban street design guidelines in which pedestrians are given prime preference. The present study assesses the quality and performance of the pedestrian sidewalks and the walkability of the newly built pedestrian-friendly sidewalks. Various physical and environmental factors affecting pedestrian safety are identified and studied in detail. Pedestrian mobility is quantified through the Pedestrian Level of Service (PLoS), and pedestrian walking comfort is measured by calculating the Walkability Index (WI). It is observed that the new initiatives taken to improve pedestrian safety have succeeded in Bengaluru, attaining a Level of Service of 'A' and a good WI score.
Keywords: Pedestrian safety, pedestrian level of service, right of way, Tender SURE, walkability index, walkway facilities.
140 Tele-Diagnosis System for Rural Thailand
Authors: C. Snae Namahoot, M. Brueckner
Abstract:
Thailand's health system is challenged by the rising number of patients and the decreasing ratio of medical practitioners to patients, especially in rural areas. This may tempt inexperienced GPs to rush through the process of anamnesis, with the risk of incorrect diagnosis. Patients have to travel far to the hospital and wait for a long time to present their case. Many patients try to cure themselves with traditional Thai medicine. Many countries are making use of the Internet for medical information gathering, distribution and storage. Telemedicine applications are a relatively new field of study in Thailand; the state of the ICT infrastructure had hampered widespread use of the Internet for medical information. With recent improvements, health and technology professionals can work out novel applications and systems to help advance telemedicine for the benefit of the people. Here we explore the use of telemedicine for people with health problems in rural areas of Thailand and present a Telemedicine Diagnosis System for Rural Thailand (TEDIST) for diagnosing certain conditions, which people with Internet access can use to establish contact with Community Health Centers, e.g. by mobile phone. The system uses a Web-based input method for individual patients' symptoms, which are taken by an expert system for the analysis of conditions and appropriate diseases. The analysis harnesses a knowledge base and a backward chaining component to find out which health professionals should be presented with the case. Doctors have the opportunity to exchange emails or chat with the patients they are responsible for, or with other specialists. Patients' data are then stored in a Personal Health Record.
Keywords: Biomedical engineering, data acquisition, expert system, information management system, and information retrieval.
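The backward chaining step mentioned in the abstract can be sketched in a few lines: a goal is proven either because it is a reported symptom or because every premise of a rule concluding it can itself be proven. The rules and symptoms below are invented placeholders, not TEDIST's medical knowledge base:

```python
# Minimal backward-chaining sketch of the TEDIST idea. Rules map a
# conclusion to the premises that support it; all names are hypothetical.
RULES = {
    "dengue_suspected": ["fever", "rash", "joint_pain"],
    "flu_suspected": ["fever", "cough"],
    "refer_to_specialist": ["dengue_suspected"],
}

def prove(goal, facts, rules=RULES):
    """A goal holds if it is a known fact, or if every premise of a rule
    concluding it can itself be proven (recursively, backward from the goal)."""
    if goal in facts:
        return True
    premises = rules.get(goal)
    if premises is None:
        return False
    return all(prove(p, facts, rules) for p in premises)

reported = {"fever", "rash", "joint_pain"}  # symptoms entered via the Web form
```

Here `prove("refer_to_specialist", reported)` chains backward through `dengue_suspected` to the reported symptoms, mirroring how the system decides which health professional should see the case.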
139 Designing Social Care Plans Considering Cause-Effect Relationships: A Study in Scotland
Authors: Sotirios N. Raptis
Abstract:
The paper links social needs to social classes through the creation of cohorts of public services, matching some services as causes to others as effects using cause-effect (CE) models. It then compares these associations with typical regression methods (LR, ARMA). The paper discusses such groupings of public services offered in Scotland over the long term to estimate the risk of multiple causes or effects, which can ultimately reduce healthcare costs by linking subsequent services to their likely causes. The same generic goal can be achieved using LR or ARMA, and the differences are discussed. The work uses Health and Social Care (H&Sc) public services data from 11 service packs offered by Public Health Scotland (PHS) that boil down to 110 single-attribute year series, called 'factors'. The study took place at Macmillan Cancer Support, UK and Abertay University, Dundee, from 2020 to 2023. The paper discusses CE relationships as the main method and compares sample findings with Linear Regression (LR) and ARMA to see how the services are linked. Relationships were found between smoking-related healthcare provision, mental-health-related services, and epidemiological weight in Primary-1-Education Body-Mass-Index (BMI) in children as CE models. Insurance companies and public policymakers can pack CE-linked services into long-term plans, such as those for the elderly or low-income people. The linkage of services was confirmed, allowing more accurate resource planning.
Keywords: Probability, regression, cause-effect cohorts, data frames, services, prediction.
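The regression baseline the study compares against can be illustrated with a toy least-squares fit of an "effect" service series on a lagged "cause" series. The two series and the one-year lag are synthetic placeholders, not the Scottish H&Sc data:

```python
# Toy lagged OLS linkage between a "cause" and an "effect" service series;
# all numbers are synthetic and chosen only to make the arithmetic visible.
def ols_fit(x, y):
    """Ordinary least squares for y = slope*x + intercept."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    sxy = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sxx = sum((a - mean_x) ** 2 for a in x)
    slope = sxy / sxx
    return slope, mean_y - slope * mean_x

cause = [10, 12, 11, 14, 16, 15, 18, 20]   # e.g. yearly smoking-related referrals
effect = [0, 21, 25, 23, 29, 33, 31, 37]   # e.g. later related service demand
lag = 1                                    # effect follows cause by one year
slope, intercept = ols_fit(cause[:-lag], effect[lag:])
```

A CE model asks the same question directionally (does the cause series precede and explain the effect series?), while LR/ARMA answer it through the fitted coefficients.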
138 Localizing and Recognizing Integral Pitches of Cheque Document Images
Authors: Bremananth R., Veerabadran C. S., Andy W. H. Khong
Abstract:
Automatic reading of handwritten cheques is a computationally complex process, and it plays an important role in financial risk management. Machine vision and learning provide a viable solution to this problem. Research effort has mostly been focused on recognizing diverse pitches of cheques and demand drafts with an identical outline. However, most of these methods employ template matching to localize the pitches, and such schemes could potentially fail when applied to different types of outline maintained by the banks. In this paper, the so-called outline problem is resolved by a cheque information tree (CIT), which generalizes the localizing method to extract active regions of entities. In addition, a weight-based density plot (WBDP) is performed to isolate text entities and read complete pitches. Recognition is based on texture features using neural classifiers. The legal amount is subsequently recognized by both texture and perceptual features. A post-processing phase is invoked to detect incorrect readings by a Type-2 grammar using the Turing machine. The performance of the proposed system was evaluated using cheques and demand drafts of 22 different banks. The test data consist of a collection of 1540 leaves obtained from 10 different account holders from each bank. Results show that this approach can easily be deployed without significant design amendments.
Keywords: Cheque reading, Connectivity checking, Text localization, Texture analysis, Turing machine, Signature verification.
137 Managing Iterations in Product Design and Development
Authors: K. Aravindhan, Trishit Bandyopadhyay, Mahesh Mehendale, Supriya Kumar De
Abstract:
The inherently iterative nature of product design and development poses a significant challenge to reducing the product design and development (PD) time. In order to shorten the time to market, organizations have adopted concurrent development, where multiple specialized tasks and design activities are carried out in parallel. The iterative nature of the work, coupled with the overlap of activities, can result in unpredictable time to completion and significant rework. Many products have missed the time-to-market window due to unanticipated, or rather unplanned, iteration and rework. The iterative and often overlapped processes introduce greater ambiguity into design and development, where the traditional methods and tools of project management provide less value. In this context, identifying critical metrics to understand the iteration probability is an open research area where a significant contribution can be made, given that iteration has been the key driver of cost and schedule risk in PD projects. Two important questions that the proposed study attempts to address are: Can we predict and identify the number of iterations in a product development flow? Can we provide managerial insights for better control over iteration? The proposal introduces the concept of decision points and, using this concept, intends to develop metrics that can provide managerial insights into iteration predictability. By characterizing the product development flow as a network of decision points, the proposed research intends to delve further into iteration probability and attempts to provide more clarity.
Keywords: Decision Points, Iteration, Product Design, Rework.
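One simple way to reason about decision points, under the strong assumption that passes are independent, is to treat the number of attempts at each point as geometric: with rework probability p, the expected number of attempts is 1/(1-p). The decision points and probabilities below are illustrative, not the study's metrics:

```python
# Sketch of the decision-point idea: each decision point sends the work
# back with some rework probability. Assuming independent passes, the
# attempt count at a point is geometric. All numbers are hypothetical.
def expected_attempts(p):
    """E[attempts] = 1 / (1 - p) for rework probability p in [0, 1)."""
    if not 0.0 <= p < 1.0:
        raise ValueError("rework probability must be in [0, 1)")
    return 1.0 / (1.0 - p)

decision_points = {"concept review": 0.2, "design review": 0.35, "verification": 0.5}

# Expected total passes through all decision points (sequential flow assumed).
total_passes = sum(expected_attempts(p) for p in decision_points.values())
```

A network of decision points, as proposed, generalizes this: overlap and dependence between points make the expectation harder than a simple sum, which is exactly where the proposed metrics would add value.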
136 Tailoring of ECSS Standard for Space Qualification Test of CubeSat Nano-Satellite
Authors: B. Tiseo, V. Quaranta, G. Bruno, G. Sisinni
Abstract:
There is an increasing demand for nano-satellite development among universities, small companies, and emerging countries. Low cost and fast delivery are the main advantages of this class of satellites, achieved by the extensive use of commercial off-the-shelf components. On the other hand, the loss of reliability and the poor success rate limit the use of nano-satellites to educational and technology demonstration rather than commercial purposes. Standardization of nano-satellite environmental testing, by tailoring the existing test standards for medium/large satellites, is then a crucial step for their market growth. Thus, it is fundamental to find the right trade-off between the improvement of reliability and the need to keep the low-cost/fast-delivery advantages. This is even more essential for satellites of the CubeSat family. Such miniaturized and standardized satellites have a 10 cm cubic form and a mass of no more than 1.33 kilograms per unit (1U). For this class of nano-satellites, the qualification process is mandatory to reduce the risk of failure during a space mission. This paper reports the description and results of the space qualification test campaign performed on EnduroSat's CubeSat nano-satellite and modules. Mechanical and environmental tests have been carried out step by step: from the testing of single subsystems up to the assembled CubeSat nano-satellite. Functional tests have been performed throughout the campaign to verify the functionality of the systems. The test durations and levels have been selected by tailoring the European space standard ECSS-E-ST-10-03C and GEVS: GSFC-STD-7000A.
Keywords: CubeSat, Nano-satellite, shock, testing, vibration.
135 Surveillance for African Swine Fever and Classical Swine Fever in Benue State, Nigeria
Authors: A. Asambe, A. K. B. Sackey, L. B. Tekdek
Abstract:
A serosurveillance study was conducted to detect the presence of antibodies to African swine fever virus (ASFV) and classical swine fever virus (CSFV) in pigs sampled from piggeries and the Makurdi central slaughter slab in Benue State, Nigeria. 416 pigs from 74 piggeries across 12 LGAs and 44 pigs at the Makurdi central slaughter slab were sampled for serum. The sera collected were analysed using an indirect enzyme-linked immunosorbent assay (ELISA) test kit for antibodies to ASFV, while a competitive ELISA test kit was used to test for antibodies to CSFV. Of the 416 pigs from piggeries and 44 pigs sampled from the slaughter slab, seven (1.7%) and six (13.6%), respectively, tested positive for ASFV antibodies, a statistically significant difference (p < 0.0001). Of the 12 LGAs sampled, Obi LGA had the highest ASFV antibody detection rate (4.8%), which was also statistically significant (p < 0.0001). None of the samples tested positive for CSFV antibodies. The study concluded that antibodies to CSFV were absent in the sampled pigs in piggeries and at the Makurdi central slaughter slab in Benue State, while antibodies to ASFV were present in both locations; hence the need to keep an eye open for CSF too, since both diseases may pose great risk in the study area. Further studies to characterise the ASFV circulating in Benue State and to investigate the possible sources are recommended. Routine surveillance to provide a comprehensive and readily accessible database for planning the prevention of any fulminating outbreak is also recommended.
Keywords: African swine fever, classical swine fever, piggery, slaughter slab, surveillance.
134 Impacts of Climate Change under the Threat of Global Warming for an Agricultural Watershed of the Kangsabati River
Authors: Sujana Dhar, Asis Mazumdar
Abstract:
The effects of global warming on India vary from the submergence of low-lying islands and coastal lands to the melting of glaciers in the Indian Himalayas, threatening the volumetric flow rate of many of the most important rivers of India and South Asia. In India, such effects are projected to impact millions of lives. As a result of ongoing climate change, the climate of India has become increasingly volatile over the past several decades; this trend is expected to continue. Climate change is one of the most important global environmental challenges, with implications for food production, water supply, health, energy, etc. Addressing climate change requires a good scientific understanding as well as coordinated action at national and global level. The climate change issue is part of the larger challenge of sustainable development. As a result, climate policies can be more effective when consistently embedded within broader strategies designed to make national and regional development paths more sustainable. The impact of climate variability and change, climate policy responses, and associated socio-economic development will affect the ability of countries to achieve sustainable development goals. A very well calibrated Soil and Water Assessment Tool (R2 = 0.9968, NSE = 0.91) was exercised over the Khatra sub basin of the Kangsabati River watershed in Bankura district of West Bengal, India, in order to evaluate projected parameters for agricultural activities. Evapotranspiration, Transmission Losses, Potential Evapotranspiration and Lateral Flow to reach are evaluated from the years 2041-2050 in order to generate a picture for sustainable development of the river basin and its inhabitants. India has a significant stake in scientific advancement as well as an international understanding to promote mitigation and adaptation. This requires improved scientific understanding, capacity building, networking and broad consultation processes. 
This paper is a commitment towards the planning, management and development of the water resources of the Kangsabati River by presenting detailed future scenarios of the Kangsabati river basin, Khatra sub-basin, over the mentioned time period. India's economy and societal infrastructures are finely tuned to the remarkable stability of the Indian monsoon, with the consequence that vulnerability to small changes in monsoon rainfall is very high. In 2002 the monsoon rains failed during July, causing a profound loss of agricultural production with a drop of over 3% in India's GDP. Neither the prolonged break in the monsoon nor the seasonal rainfall deficit was predicted. While the general features of monsoon variability and change are fairly well documented, the causal mechanisms and the role of regional ecosystems in modulating the changes are still not clear. Current climate models are very poor at modelling the Asian monsoon: this is a challenging and critical region where the ocean, atmosphere, land surface and mountains all interact. The impact of climate change on regional ecosystems is likewise unknown. The potential for the monsoon to become more volatile has major implications for India itself and for economies worldwide. Knowledge of the future variability of the monsoon system, particularly in the context of global climate change, is of great concern for regional water and food security. The major finding of this paper is that all the chosen projected parameters, namely transmission losses, soil water content, potential evapotranspiration, evapotranspiration and lateral flow to reach, display an increasing trend over the period 2041-2050.
Keywords: Change, future water availability scenario, modeling, SWAT, global warming, sustainability.
133 Classification of Acoustic Emission Based Partial Discharge in Oil Pressboard Insulation System Using Wavelet Analysis
Authors: Prasanta Kundu, N.K. Kishore, A.K. Sinha
Abstract:
The insulation used in transformers is mostly oil-pressboard insulation. Insulation failure is one of the major causes of catastrophic failure of transformers. It is established that partial discharges (PD) cause insulation degradation and premature failure of insulation. Online monitoring of PDs can reduce the risk of catastrophic failure of transformers. There are different techniques of partial discharge measurement, such as electrical, optical, acoustic, opto-acoustic and ultra-high frequency (UHF). Being non-invasive and not prone to interference, the acoustic emission technique is advantageous for online PD measurement. Acoustic detection of PD is based on the retrieval and analysis of mechanical or pressure signals produced by partial discharges. Partial discharges are classified according to the origin of the discharges; their effects on insulation deterioration differ for different types. This paper reports experimental results and analysis for the classification of partial discharges using acoustic emission signals of laboratory-simulated partial discharges in an oil-pressboard insulation system using three different electrode systems. The acoustic emission signals produced by PD are detected by sensors mounted on the experimental tank surface, stored on an oscilloscope and fed to a computer for further analysis. The measured AE signals are analyzed using discrete wavelet transform analysis and wavelet packet analysis. The energy distribution in different frequency bands of the discrete-wavelet-decomposed signal and the wavelet-packet-decomposed signal is calculated. These analyses show distinct features useful for PD classification. Wavelet packet analysis can sort out misclassifications arising from the DWT in most cases.
Keywords: Acoustic emission, discrete wavelet transform, partial discharge, wavelet packet analysis.
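The band-energy feature described above can be demonstrated with a hand-rolled one-level Haar step applied recursively (a minimal stand-in for a full DWT library). The toy signal is synthetic, not an AE measurement; because the Haar transform is orthonormal, the band energies sum to the signal energy:

```python
import math

def haar_step(signal):
    """One level of the Haar DWT: approximation and detail coefficients."""
    s = 1.0 / math.sqrt(2.0)
    approx = [s * (signal[i] + signal[i + 1]) for i in range(0, len(signal) - 1, 2)]
    detail = [s * (signal[i] - signal[i + 1]) for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def band_energies(signal, levels):
    """Energy per frequency band after `levels` of wavelet decomposition:
    one detail band per level, plus the final approximation band."""
    energies = []
    approx = list(signal)
    for _ in range(levels):
        approx, detail = haar_step(approx)
        energies.append(sum(c * c for c in detail))   # detail band energy
    energies.append(sum(c * c for c in approx))       # final approximation band
    return energies

# Toy "AE burst": a high-frequency oscillation riding on a slow ramp.
sig = [0.1 * i + ((-1) ** i) for i in range(16)]
bands = band_energies(sig, 3)
```

Classification then compares the shape of this energy-vs-band vector across discharge types; wavelet packet analysis extends the idea by also splitting the detail branches.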
132 Environmental Consequences of Metal Concentrations in Stream Sediments of Atoyac River Basin, Central Mexico: Natural and Industrial Influences
Authors: V. C. Shruti, P. F. Rodríguez-Espinosa, D. C. Escobedo-Urías, Estefanía Martinez Tavera, M. P. Jonathan
Abstract:
The Atoyac River, a major south-central river flowing through the states of Puebla and Tlaxcala in Mexico, is significantly impacted by natural volcanic inputs in addition to wastewater discharges from urban, agricultural and industrial zones. In the present study, core samples were collected from the R. Atoyac and analyzed for sediment granularity, major (Al, Fe, Ca, Mg, K, P and S) and trace elemental concentrations (Ba, Cr, Cd, Mn, Pb, Sr, V, Zn, Zr). The textural studies reveal that the sediments are mostly (over 99%) sand-sized particles, with very little to no mud fraction. It is observed that most of the metals, such as (averages, all values in μg g-1) Ca (35,528), Mg (10,789), K (7453), S (1394), Ba (203), Cr (30), Cd (4), Pb (11), Sr (435), Zn (76) and Zr (88), are enriched throughout the sediments, sourced mainly from volcanic inputs, the source rock composition of the Atoyac River basin and industrial influences from the Puebla city region. Contamination indices, such as the anthropogenic factor (AF), enrichment factor (EF) and geoaccumulation index (Igeo), were used to investigate the level of contamination and toxicity as well as to quantitatively assess the influence of human activities on metal concentrations. The AF values (>1) for Ba, Ca, Mg, Na, K, P and S suggested volcanic inputs from the study region, whereas Cd and Zn are attributed to the impacts of industrial inputs in this zone. The EF and Igeo values revealed an extreme enrichment of S and Cd. The ecological risks were evaluated using the potential ecological risk index (RI), and the results indicate that the metals Cd and V pose a major hazard to the biological community.
Keywords: Atoyac River, contamination indices, metal concentrations, Mexico, textural studies.
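The Igeo and EF indices used above follow standard definitions: Igeo = log2(Cn / (1.5·Bn)) and EF = (C/ref)sample / (C/ref)background with a conservative reference element such as Al. The sample values below echo the reported averages (Cd 4, Zn 76, Pb 11 μg g-1), but the background and Al concentrations are placeholders, not the study's values:

```python
import math

# Placeholder background concentrations in ug/g; a real assessment uses
# local background rock, so treat these numbers as assumptions.
BACKGROUND = {"Cd": 0.3, "Zn": 95.0, "Pb": 20.0, "Al": 80000.0}
SAMPLE = {"Cd": 4.0, "Zn": 76.0, "Pb": 11.0, "Al": 65000.0}

def igeo(metal):
    """Geoaccumulation index: Igeo = log2(Cn / (1.5 * Bn)); the factor 1.5
    absorbs natural lithogenic variation of the background."""
    return math.log2(SAMPLE[metal] / (1.5 * BACKGROUND[metal]))

def enrichment_factor(metal, ref="Al"):
    """EF = (C/ref)_sample / (C/ref)_background, Al as conservative reference."""
    return (SAMPLE[metal] / SAMPLE[ref]) / (BACKGROUND[metal] / BACKGROUND[ref])
```

With these placeholder backgrounds, Cd comes out heavily enriched (Igeo > 2, EF well above 10) while Zn stays near background, matching the qualitative pattern the abstract reports for industrial versus lithogenic metals.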
131 Investigation of Rehabilitation Effects on Fire Damaged High Strength Concrete Beams
Authors: Eun Mi Ryu, Ah Young An, Ji Yeon Kang, Yeong Soo Shin, Hee Sun Kim
Abstract:
When high strength reinforced concrete is exposed to high temperature due to a fire, deterioration occurs, such as loss of strength and elastic modulus, cracking, and spalling of the concrete. Therefore, it is important to understand the risk to structural safety in buildings by studying the structural behavior and rehabilitation of fire damaged high strength concrete structures. This paper aims at investigating the rehabilitation effect on fire damaged high strength concrete beams using experimental and analytical methods. In the experiments, flexural specimens with high strength concrete are exposed to high temperatures according to the ISO 834 standard time-temperature curve. Four-point loading tests show that the maximum loads of the rehabilitated beams are similar to or higher than those of the non-fire-damaged RC beam. In addition, structural analyses are performed using ABAQUS 6.10-3 under the same conditions as the experiments to provide accurate predictions of the structural and mechanical behavior of rehabilitated RC beams. The parameters are the fire cover thickness and the strength of the repairing mortar. Analytical results show good rehabilitation effects when the results predicted from the rehabilitated models are compared to the structural behavior of the non-damaged RC beams. In this study, fire damaged high strength concrete beams are rehabilitated using polymeric cement mortar. The predictions from the finite element (FE) models show good agreement with the experimental results, and the modeling approaches can be used to investigate the applicability of various rehabilitation methods in further studies.
Keywords: Fire, High strength concrete, Rehabilitation, Reinforced concrete beam.
130 The Impact of Regulatory Changes on the Development of Mobile Medical Apps
Abstract:
Mobile applications are being used to perform a wide variety of tasks in day-to-day life, ranging from checking email to controlling home heating. Application developers have recognized the potential to transform a smart device, i.e. a mobile phone or a tablet, into a medical device by using a mobile medical application. When initially conceived, these mobile medical applications performed basic functions, e.g. BMI calculation or accessing reference material; however, increasing complexity now offers clinicians and patients a wide range of functionality. As this complexity and functionality increase, so too does the potential risk associated with using such an application. Examples include applications that inflate and deflate blood pressure cuffs, as well as applications that use patient-specific parameters to calculate dosage or create a dosage plan for radiation therapy. If an unapproved mobile medical application is marketed by a medical device organization, the organization faces significant penalties, such as an FDA warning letter to cease the prohibited activity, fines, and the possibility of a criminal conviction. Regulatory bodies have finalized guidance intended to help mobile application developers establish whether their applications are subject to regulatory scrutiny. However, regulatory controls appear to contradict the approaches taken by mobile application developers, who generally work with short development cycles and very little documentation; as such, these regulations have the potential to stifle further improvement. The research presented in this paper details how, by adopting development techniques such as agile software development, mobile medical application developers can meet regulatory requirements whilst still fostering innovation.
Keywords: Medical, mobile, applications, software Engineering, FDA, standards, regulations, agile.
129 The Advent of Electronic Logbook Technology - Reducing Cost and Risk to Both Marine Resources and the Fishing Industry
Authors: Amos Barkai, Guy Meredith, Fatima Felaar, Zahrah Dantie, Dave de Buys
Abstract:
Fisheries management all around the world is hampered by the lack, or poor quality, of critical data on fish resources and fishing operations. The main reasons for the chronic inability to collect good quality data during fishing operations are the culture of secrecy common among fishers and the lack of modern data-gathering technology onboard most fishing vessels. In response, OLRAC-SPS, a South African company, developed fisheries data-logging software (eLog for short) and named it Olrac. The Olrac eLog solution is capable of collecting, analysing, plotting, mapping, reporting, tracing and transmitting all data related to fishing operations. Olrac can be used by skippers, fleet/company managers, offshore mariculture farmers, scientists, observers, compliance inspectors and fisheries management authorities. The authors believe that using eLog onboard fishing vessels has the potential to revolutionise the entire process of data collection and reporting during fishing operations and, if properly deployed and utilised, could transform the entire commercial fleet into a provider of good quality data and forever change the way fish resources are managed. In addition, it will make it possible to trace catches back to the individual fishing operation, to improve fishing efficiency and to dramatically improve control of fishing operations and enforcement of fishing regulations.
Keywords: data management, electronic logbook (eLog), electronic reporting system (ERS), fisheries management.
128 Investigation of Improved Chaotic Signal Tracking by Echo State Neural Networks and Multilayer Perceptron via Training of Extended Kalman Filter Approach
Authors: Farhad Asadi, S. Hossein Sadati
Abstract:
This paper presents the prediction performance of a feedforward Multilayer Perceptron (MLP) and Echo State Networks (ESN) trained with an extended Kalman filter. Feedforward neural networks and ESN are powerful neural networks that can track and predict nonlinear signals. However, their tracking performance depends on the specific signals or data sets, with the risk of instability accompanied by large errors. In this study, we explore this process by applying different network sizes and leaking rates for the prediction of nonlinear or chaotic signals in MLP neural networks. Major problems of ESN training, such as initialization of the network and improvement of the prediction performance, are tackled. The influence of the coefficient of the activation function in the hidden layer and other key parameters is investigated through simulation results. The extended Kalman filter is employed in order to improve the sequential and regulation learning rate of the feedforward neural networks. This training approach has vital features for training the network when signals have a chaotic or non-stationary sequential pattern. Minimization of the variance in each step of the computation, and hence smoothing of the tracking, was obtained by examining the results, indicating satisfactory tracking characteristics under certain conditions. In addition, simulation results confirmed the satisfactory performance of both neural networks with modified parameterization in tracking nonlinear signals.
Keywords: Feedforward neural networks, nonlinear signal prediction, echo state neural networks approach, leaking rates, capacity of neural networks.
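To make the echo state idea concrete, the following minimal sketch shows the standard leaky-integrator reservoir update whose leaking rate the study varies. The reservoir size, weight scale, leaking rate of 0.3 and the sinusoidal drive signal are illustrative assumptions; the paper's EKF-trained readout is not reproduced here.

```python
import math
import random

def make_reservoir(n, seed=0, scale=0.5):
    """Random input weights and a (crudely scaled) recurrent weight matrix."""
    rng = random.Random(seed)
    w_in = [rng.uniform(-scale, scale) for _ in range(n)]
    w = [[rng.uniform(-scale, scale) / n for _ in range(n)] for _ in range(n)]
    return w_in, w

def step(state, u, w_in, w, alpha=0.3):
    """Leaky-integrator update: x(t+1) = (1 - alpha) x(t) + alpha tanh(W_in u + W x(t))."""
    new = []
    for i, xi in enumerate(state):
        pre = w_in[i] * u + sum(w[i][j] * xj for j, xj in enumerate(state))
        new.append((1 - alpha) * xi + alpha * math.tanh(pre))
    return new

# Drive a 20-unit reservoir with a nonlinear input signal (illustrative only).
w_in, w = make_reservoir(20)
x = [0.0] * 20
for t in range(50):
    x = step(x, math.sin(0.3 * t) ** 3, w_in, w, alpha=0.3)
```

Because tanh is bounded, the state stays inside the unit cube regardless of the input, which is part of what makes reservoir training stable; only the readout weights would then be fitted (e.g. by the extended Kalman filter, as in the paper).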
127 Detecting Financial Bubbles Using Gap between Common Stocks and Preferred Stocks
Authors: Changju Lee, Seungmo Ku, Sondo Kim, Woojin Chang
Abstract:
How can financial bubbles be detected? Addressing this simple question has been the focus of a vast amount of empirical research spanning almost half a century. However, financial bubbles are hard to observe and vary over time, so more research in this area is needed. In this paper, we used the abnormal difference between common stock prices and the corresponding preferred stock prices to explain financial bubbles. First, we proposed the ‘W-index’, which indicates the spread between common stocks and the corresponding preferred stocks in the stock market. Second, to prove that the W-index is valid for measuring financial bubbles, we showed that there is an inverse relationship between the W-index and the S&P 500 rate of return. Specifically, our hypothesis is that when the W-index is comparably higher than in other periods, financial bubbles have built up in the stock market, and vice versa; according to our hypothesis, if investors made long-term investments when the W-index was high, they would have a negative rate of return, whereas if they made long-term investments when the W-index was low, they would have a positive rate of return. By comparing the correlation values and adjusted R-squared values of the W-index and S&P 500 return, the VIX index and S&P 500 return, and the TED index and S&P 500 return, we showed that only the W-index has a significant relationship with the S&P 500 rate of return. In addition, we determined how long investors should hold their investment position regarding the effect of financial bubbles. Using the W-index, investors can measure financial bubbles in the market and invest with low risk.
Keywords: Financial bubbles, detection, preferred stocks, pairs trading, future return, forecast.
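The abstract does not give the exact W-index construction. As a rough illustration only, one could standardise the relative price gap between each common share and its preferred counterpart; the function names, the z-scoring step and the toy prices below are all assumptions, not the authors' definition.

```python
def price_gap(common, preferred):
    """Relative spread between a common share and its preferred share."""
    return (common - preferred) / preferred

def w_proxy(common_prices, preferred_prices):
    """Z-scored gap series: large positive values flag unusually wide spreads."""
    gaps = [price_gap(c, p) for c, p in zip(common_prices, preferred_prices)]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    sd = var ** 0.5 or 1.0          # guard against a constant gap series
    return [(g - mean) / sd for g in gaps]

# Toy price series: the common stock runs up faster than the preferred,
# so the standardised gap peaks in the third period.
z = w_proxy([100, 110, 130, 125], [80, 82, 85, 88])
```

Under the paper's hypothesis, a high value of such a standardised spread would signal accumulated bubble risk and a poor expected long-term return.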
126 Novel Use of a Quality Assurance Tool for Integrating Technology to HSE
Authors: Ragi Poyyara, Vivek V., Ashish Khaparde
Abstract:
The product development process (PDP) in the Technology group plays a very important role in the launch of any product. While a manufacturing process encourages the use of certain measures to reduce health, safety and environmental (HSE) risks on the shop floor, the PDP concentrates on the use of Geometric Dimensioning and Tolerancing (GD&T) to develop a flawless design. Furthermore, the PDP distributes and coordinates activities between different departments such as marketing, purchasing, and manufacturing. However, it is seldom realized that the PDP makes a significant contribution to developing a product that reduces HSE risks by encouraging the Technology group to use effective GD&T. GD&T is a precise communication tool that uses a set of symbols, rules, and definitions to mathematically define parts to be manufactured. It is a quality assurance method widely used in the oil and gas sector. Traditionally, it is used to ensure the interchangeability of a part without affecting its form, fit, and function; parts that do not meet these requirements are rejected during quality audits. This paper discusses how the Technology group integrates this quality assurance tool into the PDP and how the tool plays a major role in helping the HSE department toward its goal of eliminating HSE incidents. The PDP involves a thorough risk assessment and establishes a method to address those risks during the design stage. An illustration shows how GD&T helped reduce safety risks by ergonomically improving assembly operations, and a brief discussion explains how the tolerances provided on a part help prevent finger injury. This tool has enabled the Technology group to produce fixtures that are used daily in operations as well as manufacturing. By applying GD&T to create good fits, HSE risks are mitigated for operating personnel. Both customers and service providers benefit from reduced safety risks.
Keywords: HSE, PDP, GD&T, risks.
125 Bone Mineral Density and Frequency of Low-Trauma Fractures in Ukrainian Women with Metabolic Syndrome
Authors: Vladyslav Povoroznyuk, Larysa Martynyuk, Iryna Syzonenko, Liliya Martynyuk
Abstract:
Osteoporosis is an important problem in postmenopausal women due to an increased risk of sudden and unexpected fractures. This study aims to determine the connection between bone mineral density (BMD) and trabecular bone score (TBS) in Ukrainian women suffering from metabolic syndrome. In the study, 566 menopausal women aged 50-79 were examined and divided into two groups: Group A included 336 women without obesity (BMI ≤ 29.9 kg/m2), and Group B included 230 women with metabolic syndrome (diagnosed according to the IDF criteria, 2005). Dual-energy X-ray absorptiometry was used to measure lumbar spine (L1-L4), femoral neck, total body and forearm BMD and bone quality indexes (the latter according to the Med-Imaps installation). Data were analyzed using Statistical Package 6.0. A significant increase of lumbar spine (L1-L4), femoral neck, total body and ultradistal radius BMD was found in women with metabolic syndrome compared to those without obesity (p < 0.001), both in the whole sample and in the 50-59, 60-69, and 70-79 year age groups. TBS was significantly higher in non-obese women compared to metabolic syndrome patients aged 50-59 years and in the general sample (p < 0.05). Analysis showed a significant positive correlation between body mass index (BMI) and BMD at all levels, and a significant negative correlation between BMI and TBS (L1-L4). Despite the fact that BMD indexes were significantly higher in women with metabolic syndrome, the frequency of vertebral and non-vertebral fractures did not differ significantly between the groups of patients.
Keywords: Bone mineral density, trabecular bone score, metabolic syndrome, fracture.
124 Occurrence of Foreign Matter in Food: Applied Identification Method - Association of Official Agricultural Chemists (AOAC) and Food and Drug Administration (FDA)
Authors: E. C. Mattos, V. S. M. G. Daros, R. Dal Col, A. L. Nascimento
Abstract:
The aim of this study is to present the results of a retrospective survey of the foreign matter found in foods analyzed at the Adolfo Lutz Institute from July 2001 to July 2015. All the analyses were conducted according to the official methods described by the Association of Official Agricultural Chemists (AOAC) for micro-analytical procedures and by the Food and Drug Administration (FDA) for macro-analytical procedures. The results showed that flours, cereals and derivatives, such as baking and pasta products, were the types of food in which foreign matter was found most frequently, followed by condiments and teas. Fragments of stored-grain insects, their larvae, webbing and excrement, dead mites and rodent excrement were the foreign matter most commonly found in food. Foreign matter that can pose a physical risk to the consumer's health, such as metal, stones, glass and wood, was found, but rarely. Miscellaneous matter (shell, sand, dirt and seeds) was also reported. Many extraneous materials are considered unavoidable since they are inherent to the product itself, such as insect fragments in grains. In contrast, avoidable extraneous materials are less tolerated because they are preventable with Good Manufacturing Practice. The conclusion of this work is that, although most extraneous materials found in food are considered unavoidable, it is necessary to follow Good Manufacturing Practice throughout food processing, as well as to maintain constant surveillance of the production process, in order to avoid accidents that may lead to the occurrence of these extraneous materials in food.
Keywords: Food contamination, extraneous materials, foreign matter, surveillance.
123 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining
Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride
Abstract:
In this work, we use machine learning and data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models are applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidity, clinical procedures and laboratory tests is analyzed. A temporal pattern mining technique called Frequent Subgraph Mining (FSM) is used, and the Model for End-Stage Liver Disease (MELD) prediction of mortality serves as a comparator. All of our models statistically significantly outperform the MELD-score model and show an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not improve the model significantly, but FSM together with a machine learning technique called ensembling further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk for higher mortality. However, due to the sparsity of the temporal information needed by FSM, the FSM model does not yield significant improvements. Our work applies modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients and builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.
Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning
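The AUC figure quoted above can be read as a rank statistic: the probability that a randomly chosen patient who died within one year is scored higher than a randomly chosen survivor. A minimal, library-free sketch of that computation (the scores and labels below are toy values, not the paper's models or data):

```python
def auc(scores, labels):
    """AUC via the Mann-Whitney statistic: fraction of positive/negative
    pairs ranked correctly, counting ties as half a win."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: a model that ranks all positives above all negatives
# achieves the maximum AUC of 1.0.
example = auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0])
```

An "average 10% improvement" over the MELD comparator would then mean the models rank high-risk patients above low-risk ones in roughly 10% more of these pairwise comparisons.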
122 Analysis of Supply Side Factors Affecting Bank Financing of Non-Oil Exports in Nigeria
Authors: Sama’ila Idi Ningi, Abubakar Yusuf Dutse
Abstract:
The banking sector poses many problems in Nigeria in general and for the non-oil export sector in particular. The banks' lack of effectiveness in handling small, medium or long-term credit risk (lack of training of loan officers, lack of information on borrowers and absence of a reliable credit registry) results in non-oil exporters being burdened with high requirements, such as up to three years of financial statements, enough collateral to cover both the loan principal and interest (including a cash deposit that may be up to 30% of the loan's net present value), and every detail of the international trade transaction in question. These problems triggered this research. Consequently, information on bank financing of non-oil exports was collected from 100 respondents from the 20 Deposit Money Banks (DMBs) in Nigeria. The data were analysed using descriptive statistics, correlation and regression. It is found that Nigerian banks participate in the financing of non-oil exports. Despite their participation, the rate of interest for credit extended to non-oil exports is usually high, ranging between 15-20%. Small and medium-sized non-oil export businesses lack the credit history for banks to judge them as reputable, and banks consider the non-oil export sector very risky for investment. The banks actually grant less credit than the exporters may require, so exporters are not properly funded by banks. Banks also grant a very low volume of foreign currency loans, and the unfavorable exchange rate at which the Naira is exchanged for the Dollar and other currencies makes importation of inputs costly and negatively impacts non-oil export performance in Nigeria.
Keywords: Supply Side Factors, Bank Financing, Non-Oil Exports.
121 Techniques of Construction Management in Civil Engineering
Authors: Mamoon M. Atout
Abstract:
The Middle East Gulf region has witnessed rapid growth and development in many areas over the last two decades. The development of the real-estate sector, the construction industry and infrastructure projects constitutes a major share of the development that has contributed to the progress of the Gulf countries. Construction industry projects were planned and managed by different types of experts, who came from all over the world with different types of experience in construction management and industry. Some of these projects were completed on time, while many were not, due to many accumulating factors. These accumulated factors are considered the principal reason for the problems experienced at the project construction stage, which reflect negatively on project success. Specific causes of delay have been identified by construction managers to avoid any unexpected delays through proper analysis and consideration of implications such as risk assessment and the analysis of many potential problems, to ensure that projects will be delivered on time. Construction management implications were adopted and considered by project managers who have experience and knowledge in applying the techniques of engineering construction management. The aim of this research is to determine the benefits of these implications for the construction team, and the level of consideration of the techniques and processes during the project development and construction phases needed to avoid any delay in the projects. It also aims to determine the factors that contribute to project completion delays when project managers are not well committed to their roles and responsibilities. The results of the analysis will determine the applications required by the project team to avoid the causes of delays and help them deliver projects on time, e.g. verifying tender documents and quantities and preparing the construction method of the project.
Keywords: Construction management, control process, cost control, planning and scheduling, roles and responsibilities.
120 Disparities versus Similarities: WHO GPPQCL and ISO/IEC 17025:2017 International Standards for Quality Management Systems in Pharmaceutical Laboratories
Authors: M. A. Okezue, K. L. Clase, S. R. Byrn, P. Shivanand
Abstract:
Medicines regulatory authorities expect pharmaceutical companies and contract research organizations to seek ways to certify that their laboratory control measurements are reliable. Establishing and maintaining laboratory quality standards are essential in ensuring the accuracy of test results. ‘ISO/IEC 17025:2017’ and the ‘WHO Good Practices for Pharmaceutical Quality Control Laboratories (GPPQCL)’ are two quality standards commonly employed in developing laboratory quality systems. A review was conducted of the two standards to elaborate on areas of convergence and divergence. The goal was to understand how differences in each standard's requirements may influence laboratories' choices as to which document is easier to adopt for quality systems. A qualitative review method compared similar items in the two standards while mapping out areas where there were specific differences in the requirements of the two documents. The review also provided a detailed description of the clauses and parts covering management and technical requirements in these laboratory standards. The review showed that both documents share requirements for over ten critical areas covering objectives, infrastructure, management systems, and laboratory processes. There were, however, differences in expectations: GPPQCL emphasizes system procedures for planning and future budgets that will ensure continuity, whereas ISO 17025 is more focused on a risk management approach to establishing laboratory quality systems. Elements in the two documents form common standard requirements that assure the validity of laboratory test results and promote mutual recognition. The ISO standard currently has more global patronage than the GPPQCL.
Keywords: ISO/IEC 17025:2017, laboratory standards, quality control, WHO GPPQCL
119 Security Analysis of Password Hardened Multimodal Biometric Fuzzy Vault
Authors: V. S. Meenakshi, G. Padmavathi
Abstract:
Biometric techniques are gaining importance for personal authentication and identification compared to traditional authentication methods. Biometric templates are vulnerable to a variety of attacks due to their inherent nature, and when a person's biometric is compromised, his identity is lost. In contrast to a password, a biometric is not revocable. Therefore, providing security to the stored biometric template is crucial. Crypto-biometric systems are authentication systems that blend the ideas of cryptography and biometrics. The fuzzy vault is a proven crypto-biometric construct used to secure biometric templates. However, the fuzzy vault suffers from certain limitations, such as non-revocability and cross matching, and its security is affected by the non-uniform nature of biometric data. A fuzzy vault hardened with a password overcomes these limitations: the password provides an additional layer of security and enhances user privacy. Retina has certain advantages over other biometric traits. Retinal scans are used in high-end security applications like access control to areas or rooms in military installations, power plants, and other high-risk security areas. This work applies the idea of the fuzzy vault to the retinal biometric template. Multimodal biometric system performance compares well to that of single-modal biometric systems. The proposed multimodal biometric fuzzy vault combines feature points from retina and fingerprint, and the combined vault is hardened with a user password to achieve a high level of security. The security of the combined vault is measured using min-entropy. The proposed password-hardened multi-biometric fuzzy vault is robust against stored biometric template attacks.
Keywords: Biometric Template Security, Crypto Biometric Systems, Hardening Fuzzy Vault, Min-Entropy.
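Min-entropy, the security measure used above, depends only on an attacker's single best guess over the candidate vault keys. A small sketch of the standard definition follows; the probability distributions are illustrative, not derived from real biometric or vault data.

```python
import math

def min_entropy(probabilities):
    """H_min = -log2(max_i p_i): bits of security against the single
    most likely guess an attacker can make."""
    return -math.log2(max(probabilities))

# A uniform distribution over 2^16 equally likely keys gives 16 bits.
uniform_16bit = min_entropy([1 / 65536] * 65536)

# A skewed distribution (non-uniform biometric data) gives far less:
# the best guess alone succeeds with probability 0.5, i.e. 1 bit.
skewed = min_entropy([0.5, 0.25, 0.25])
```

This is why the non-uniformity of biometric data weakens a plain fuzzy vault, and why adding an independently chosen password raises the effective min-entropy of the combined construct.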
118 Feasibility Study of Mine Tailing’s Treatment by Acidithiobacillus thiooxidans DSM 26636
Authors: M. Gómez-Ramírez, A. Rivas-Castillo, I. Rodríguez-Pozos, R. A. Avalos-Zuñiga, N. G. Rojas-Avelizapa
Abstract:
Among the diverse types of pollutants produced by anthropogenic activities, metals represent a serious threat due to their accumulation in ecosystems and their elevated toxicity. The tailings of abandoned mines contain high levels of metals such as arsenic (As), zinc (Zn), copper (Cu), and lead (Pb), which do not undergo any degradation process and therefore accumulate in the environment. Abandoned mine tailings could potentially contaminate rivers and aquifers, representing a risk to human health due to their high metal content. In an attempt to remove the metals and thereby mitigate the environmental pollution, an environmentally friendly and economical method of bioremediation has been introduced. Bioleaching has been actively studied over the last several years, and it is one of the bioremediation solutions used to treat heavy metals contained in sewage sludge, sediment and contaminated soil. Acidithiobacillus thiooxidans, an extremely acidophilic, chemolithoautotrophic, gram-negative, rod-shaped microorganism typically associated with Cu mining operations (bioleaching), has been well studied for industrial applications. The sulfuric acid it produces plays a major role in bioleaching. Specifically, Acidithiobacillus thiooxidans strain DSM 26636 has been able to leach Al, Ni, V, Fe, Mg, Si, and Ni contained in slags from coal combustion wastes. The present study reports the ability of A. thiooxidans DSM 26636 to bioleach metals contained in two different mine tailing samples (MT1 and MT2). It was observed that Al, Fe, and Mn were removed at 36.3±1.7, 191.2±1.6, and 4.5±0.2 mg/kg for MT1, and at 74.5±0.3, 208.3±0.5, and 20.9±0.1 mg/kg for MT2. Besides, < 1.5 mg/kg of Au and Ru were also bioleached from MT1; in MT2, bioleaching of Zn was observed at 55.7±1.3 mg/kg, and removal of < 1.5 mg/kg was observed for As, Ir, and Li, and of 0.6 mg/kg for Os. These results show the potential of strain DSM 26636 for the bioleaching of metals from different mine tailings.
Keywords: A. thiooxidans, bioleaching, metals, mine tailings.