Search results for: data integrity
25485 Exploring Attachment Mechanisms of Sulfate-Reducing Bacteria Biofilm to X52 Carbon Steel and Effective Mitigation Through Moringa Oleifera Extract
Authors: Hadjer Didouh, Mohammed Hadj Melliani, Izzeddine Sameut Bouhaik
Abstract:
Corrosion is a serious problem in industrial installations and metallic transport pipes. Corrosion is an interfacial process controlled by several parameters, and the presence of microorganisms affects its kinetics. This type of corrosion is often referred to as bio-corrosion or microbiologically influenced corrosion (MIC). A microorganism or bacterium acts through the formation of a biofilm following its attachment to the metal surface. The biofilm isolates the metal surface from its environment and allows the bacteria to control the parameters of the metal/bacteria interface. Biofilm formation by sulfate-reducing bacteria (SRB) on X52 steel poses substantial challenges in the oil and gas industry, notably for SONATRACH of Algeria. This research delves into the complex attachment mechanisms employed by SRB biofilm on X52 carbon steel and investigates strategies for effective mitigation using biocides. The exploration commences by elucidating the underlying mechanisms facilitating SRB biofilm adhesion to X52 carbon steel, considering factors such as surface morphology, electrostatic interactions, and microbial extracellular substances. Advanced microscopy and spectroscopic techniques provide support for characterizing the attachment processes, laying the foundation for targeted mitigation strategies. The use of 100 ppm of Moringa oleifera extract is evaluated as a promising biocide to control and prevent SRB biofilm formation on X52 carbon steel surfaces. The green extract undergoes evaluation for its effectiveness in disrupting biofilm development while ensuring the integrity of the steel substrate. Systematic analysis is conducted on the biocide's impact on the biofilm's structural integrity, microbial viability, and overall attachment strength.
This two-pronged investigation aims to deepen our comprehension of SRB biofilm dynamics and contribute to the development of effective strategies for mitigating its impact on X52 carbon steel.
Keywords: bio-corrosion, biofilm, attachment, metal/bacteria interface
Procedia PDF Downloads 25
25484 Exploring Attachment Mechanisms of Sulfate-Reducing Bacteria Biofilm to X52 Carbon Steel and Effective Mitigation Through Moringa Oleifera Extract
Authors: Hadjer Didouh, Mohammed Hadj Melliani, Izzeddine Sameut Bouhaik
Abstract:
Corrosion is a serious problem in industrial installations and metallic transport pipes. Corrosion is an interfacial process controlled by several parameters, and the presence of microorganisms affects its kinetics. This type of corrosion is often referred to as bio-corrosion or microbiologically influenced corrosion (MIC). A microorganism or bacterium acts through the formation of a biofilm following its attachment to the metal surface. The biofilm isolates the metal surface from its environment and allows the bacteria to control the parameters of the metal/bacteria interface. Biofilm formation by sulfate-reducing bacteria (SRB) on X52 steel poses substantial challenges in the oil and gas industry, notably for SONATRACH of Algeria. This research delves into the complex attachment mechanisms employed by SRB biofilm on X52 carbon steel and investigates innovative strategies for effective mitigation using biocides. The exploration commences by elucidating the underlying mechanisms facilitating SRB biofilm adhesion to X52 carbon steel, considering factors such as surface morphology, electrostatic interactions, and microbial extracellular substances. Advanced microscopy and spectroscopic techniques provide support for characterizing the attachment processes, laying the foundation for targeted mitigation strategies. The use of 100 ppm of Moringa oleifera extract is evaluated as a promising biocide to control and prevent SRB biofilm formation on X52 carbon steel surfaces. Green extracts undergo evaluation for their effectiveness in disrupting biofilm development while ensuring the integrity of the steel substrate. Systematic analysis is conducted on the biocide's impact on the biofilm's structural integrity, microbial viability, and overall attachment strength.
This two-pronged investigation aims to deepen our comprehension of SRB biofilm dynamics and contribute to the development of effective strategies for mitigating its impact on X52 carbon steel.
Keywords: attachment, bio-corrosion, biofilm, metal/bacteria interface
Procedia PDF Downloads 73
25483 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity
Authors: Hoda A. Abdel Hafez
Abstract:
Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance, and privacy. In the telecommunication industry, mining big data is like mining for gold; it represents a big opportunity for maximizing the revenue streams in this industry. This paper discusses the characteristics of big data (volume, variety, velocity, and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.
Keywords: mining big data, big data, machine learning, telecommunication
Procedia PDF Downloads 410
25482 Exploring Weld Rejection Rate Limits and Tracers Effects in Construction Projects
Authors: Abdalaziz M. Alsalhabi, Loai M. Alowa
Abstract:
This paper investigates Weld Rejection Rate (WRR) limits and tracer effects in construction projects, with a specific focus on a Gas Plant Project, a mega-project held by Saudi Aramco (SA) in Saudi Arabia. The study included a comprehensive examination of various factors impacting WRR limits. It commenced by comparing the Company practices with ASME standards, followed by an in-depth analysis of both weekly and cumulative projects' historical WRR data, evaluation of Radiographic Testing (RT) reports for rejected welds, and proposal of mitigation methods to eliminate future rejections. Additionally, the study revealed the causes of fluctuation in WRR data and benchmarked with the industry practices. Furthermore, a case study was conducted to explore the impact of tracers on WRR, providing insights into their influence on the welding process. This paper aims to achieve three primary objectives. Firstly, it seeks to validate the existing practices of WRR limits and advocate for their inclusion within relevant International Industry Standards. Secondly, it aims to validate the effectiveness of the WRR formula that incorporates tracer effects, ensuring its reliability in assessing weld quality. Lastly, this study aims to identify opportunities for process improvement in WRR control, with the ultimate goal of enhancing project processes and ensuring the integrity, safety, and efficiency of constructed assets.
Keywords: weld rejection rate, weld repair rate in joint and linear basis, tracers effects, construction projects
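The abstract does not publish the WRR formula it validates, but the joint-basis and linear-basis rates named in the keywords can be sketched as simple ratios. The function names and example figures below are illustrative assumptions, not values from the study:

```python
# Hypothetical sketch of weld rejection rate (WRR) on a joint and a linear
# basis; the paper does not publish its exact formula, so these definitions
# are illustrative assumptions only.

def wrr_joint_basis(rejected_joints, total_joints):
    """WRR as a percentage of welded joints rejected by RT."""
    if total_joints == 0:
        raise ValueError("no joints inspected")
    return 100.0 * rejected_joints / total_joints

def wrr_linear_basis(rejected_length_mm, total_length_mm):
    """WRR as a percentage of inspected weld length rejected."""
    if total_length_mm == 0:
        raise ValueError("no weld length inspected")
    return 100.0 * rejected_length_mm / total_length_mm

# Example: 12 of 480 joints rejected; 300 mm of 18,000 mm weld length rejected.
print(round(wrr_joint_basis(12, 480), 2))      # 2.5
print(round(wrr_linear_basis(300, 18000), 2))  # 1.67
```

Tracking both bases matters because a few long rejected welds can make the linear-basis rate diverge from the joint-basis rate.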
Procedia PDF Downloads 45
25481 Predicting Low Birth Weight Using Machine Learning: A Study on 53,637 Ethiopian Birth Data
Authors: Kehabtimer Shiferaw Kotiso, Getachew Hailemariam, Abiy Seifu Estifanos
Abstract:
Introduction: Although low birth weight (LBW) accounts for the largest share of neonatal mortality and morbidity, predicting births with LBW for better intervention preparation is challenging. This study aims to predict LBW using a dataset encompassing 53,637 birth cohorts collected from 36 primary hospitals across seven regions in Ethiopia from February 2022 to June 2024. Methods: We identified ten explanatory variables related to maternal and neonatal characteristics, including maternal education, age, residence, history of miscarriage or abortion, history of preterm birth, type of pregnancy, number of livebirths, number of stillbirths, antenatal care frequency, and sex of the fetus, to predict LBW. Using WEKA 3.8.2, we developed and compared seven machine learning algorithms. Data preprocessing included handling missing values, outlier detection, and ensuring data integrity in birth weight records. Model performance was evaluated through metrics such as accuracy, precision, recall, F1-score, and area under the Receiver Operating Characteristic curve (ROC AUC) using 10-fold cross-validation. Results: The results demonstrated that the decision tree, J48, logistic regression, and gradient boosted trees models achieved the highest accuracy (94.5% to 94.6%), with a precision of 93.1% to 93.3%, an F1-score of 92.7% to 93.1%, and a ROC AUC of 71.8% to 76.6%. Conclusion: This study demonstrates the effectiveness of machine learning models in predicting LBW. The high accuracy and recall rates achieved indicate that these models can serve as valuable tools for healthcare policymakers and providers in identifying at-risk newborns and implementing timely interventions to achieve the Sustainable Development Goal (SDG) targets related to neonatal mortality.
Keywords: low birth weight, machine learning, classification, neonatal mortality, Ethiopia
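The evaluation protocol above, comparing several classifiers under 10-fold cross-validation, can be sketched as follows. This is an illustrative reconstruction using scikit-learn and synthetic data, not the authors' WEKA 3.8.2 pipeline, and the scores it prints are not the study's results:

```python
# Illustrative sketch (not the authors' WEKA 3.8.2 pipeline): comparing
# classifiers with 10-fold cross-validation on synthetic, imbalanced data
# standing in for the ten maternal/neonatal explanatory variables.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in: 10 features, ~10% positive (LBW) class.
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.9, 0.1], random_state=0)

models = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "logistic_regression": LogisticRegression(max_iter=1000),
    "gradient_boosting": GradientBoostingClassifier(random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=10, scoring="roc_auc")
    print(f"{name}: mean ROC AUC = {scores.mean():.3f}")
```

Reporting ROC AUC alongside accuracy, as the authors do, is important here because the class imbalance makes raw accuracy optimistic.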
Procedia PDF Downloads 25
25480 Factors Affecting Citizens’ Behavioural Intention to Use E-voter Registration and Verification System Towards the Electoral Process in Nigeria
Authors: Aishatu Shuaibu
Abstract:
It is expected that electronic voter registration and verification in Nigeria will enhance the integrity of elections, which is vital for democratic development; it is also expected to enhance efficiency, transparency, and security. However, citizens' behavioural intentions to use such platforms have received little attention in the literature. This paper, therefore, intends to look into the significant characteristics affecting the acceptance and use of e-voter technology among Nigerian residents. Data will be collected using a structured questionnaire from several local government areas (LGAs) around Nigeria to evaluate the influence of demographic characteristics, technology usability, security perceptions, and governmental education on the intention to adopt e-voter systems. The results will offer vital insights into the barriers and drivers of voter technology acceptance, aiding in policy suggestions to enhance voter registration and verification processes within Nigeria's electoral framework. This study is designed to aid electoral stakeholders in devising successful strategies for encouraging the broad deployment of e-voter systems in Nigeria.
Keywords: e-governance, e-voting, e-democracy, INEC, Nigeria
Procedia PDF Downloads 23
25479 Higher Education Leadership and Creating Sites of Institutional Belonging: A Global Case Study
Authors: Lisa M. Coleman
Abstract:
The focus on disability, LGBTQ+, and internationalization has certainly been the subject of much research and programmatic work across higher education. Many universities have entered into global partnerships with varying success and challenges across these areas, including laws and policies. Attentiveness to the specific nuances of global inclusion, diversity, equity, belonging, and access (GIDBEA), and the leadership to support these efforts, is crucial to the long-standing success of such programs. There have been a number of shifts related to diversification across student and alumni bodies. These shifts include, but are not limited to, how people identify gender, race, and sexuality (and the intersections across such identities), as well as trends across emerging and diverse disability communities. NYU is the most international campus in the United States, with more campuses and sites outside its country of origin and more international students and exchange programs than any other university. As a result, the ongoing work related to GIDBEA is at the center of much of the leadership, administrative, and research efforts. Climate assessment work across NYU’s diverse global campus landscape serves as the foundation to exemplify best practices related to data collection and dissemination, community and stakeholder engagement, and effective implementation of innovative strategies to close identified gap areas. The data (quantitative and qualitative) and related research findings represent data collected from close to 22,000 stakeholders across the NYU campuses. The case study centers on specific methodological considerations, data integrity, stakeholder engagement from across student, faculty, staff, and alumni constituencies, and tactics to advance specific GIDBEA initiatives related to navigating shifting landscapes.
Design thinking, incubation, and co-creation strategies have been employed to expand, leverage, actualize, and implement GIDBEA strategies that are concrete, measurable, differentiated, and specific to global sites, regions, and emerging trends.
Keywords: disability, LGBTQ+, DEI, research, case studies
Procedia PDF Downloads 106
25478 JavaScript Object Notation Data against eXtensible Markup Language Data in Software Applications a Software Testing Approach
Authors: Theertha Chandroth
Abstract:
This paper presents a comparative study on how to check JSON (JavaScript Object Notation) data against XML (eXtensible Markup Language) data from a software testing point of view. JSON and XML are widely used data interchange formats, each with its unique syntax and structure. The objective is to explore various techniques and methodologies for validating, comparing, and integrating JSON data with XML data and vice versa. By understanding the process of checking JSON data against XML data, testers, developers, and data practitioners can ensure accurate data representation, seamless data interchange, and effective data validation.
Keywords: XML, JSON, data comparison, integration testing, Python, SQL
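One simple way to check JSON data against XML data, as described above, is to parse both into a common in-memory representation and compare. A minimal sketch using only the Python standard library, with illustrative field names and a flat document shape:

```python
# Minimal sketch of checking JSON data against XML data: parse both,
# normalize the XML element tree into a dict of strings, and compare.
# The field names and flat record shape are illustrative assumptions.
import json
import xml.etree.ElementTree as ET

def xml_to_dict(element):
    """Flatten an XML element with leaf children into a dict of strings."""
    return {child.tag: child.text for child in element}

json_doc = '{"id": "42", "name": "Ada", "role": "tester"}'
xml_doc = "<user><id>42</id><name>Ada</name><role>tester</role></user>"

json_data = json.loads(json_doc)
xml_data = xml_to_dict(ET.fromstring(xml_doc))

print(json_data == xml_data)  # True: both represent the same record
```

Nested documents, attributes, and repeated elements need a richer normalization than this flat mapping, which is where the testing techniques the paper surveys come in.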
Procedia PDF Downloads 140
25477 Using Machine Learning Techniques to Extract Useful Information from Dark Data
Authors: Nigar Hussain
Abstract:
Dark data is a subset of big data: data that organizations collect but fail to use in future decisions. Existing work leaves many issues open, and powerful tools are needed for utilizing dark data. Sufficient techniques are required to deal with dark data, enabling users to exploit its excellence, adaptability, speed, reduced time utilization, execution, and accessibility. Another issue is how to utilize dark data to extract helpful information and make better choices. In this paper, we propose strategies to remove the dark side from dark data. Using a supervised model and machine learning techniques, we utilized dark data and achieved an F1 score of 89.48%.
Keywords: big data, dark data, machine learning, heatmap, random forest
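A supervised random-forest workflow with an F1-score evaluation, of the kind the abstract reports, can be sketched as follows. The data here is synthetic and the printed score will differ from the paper's 89.48%:

```python
# Illustrative sketch (not the paper's actual pipeline): train a random
# forest on labeled records recovered from a "dark" archive and report the
# F1 score. The dataset is synthetic stand-in data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
f1 = f1_score(y_test, clf.predict(X_test))
print(f"F1 score: {f1:.4f}")
```

F1 is a sensible headline metric here because it balances precision and recall, both of which matter when only a fraction of archived records are actually useful.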
Procedia PDF Downloads 31
25476 Research Study on the Concept of Unity of Ummah and Its Sources in the Light of Islamic Teachings
Authors: Ghazi Abdul Rehman Qasmi
Abstract:
Islam is the preacher and torch-bearer of unity and solidarity. All the followers of Islam are advised to be united, and Islam strongly condemns those elements which disrupt the unity of the Muslim Ummah. Like pearls in a rosary, Islam has united the Muslims of the whole world in the wreath of unity and has commanded them to avoid separation and disintegration. The aspect of unity is prominent in all divine injunctions and acts of worship. By offering the five daily obligatory congregational prayers, the passion of mutual love and affection is increased, and on auspicious days such as Friday, Eid-ul-Fitr, and Eid-ul-Azha, the majority of Muslims come together at central places to offer congregational prayers. Thus unity and harmony among the Muslims can be seen. Similarly, Muslim pilgrims from all over the world set aside every kind of worldly discrimination to perform the rituals of pilgrimage while wearing white cloth as a dress. Pilgrimage is a demonstration of Islamic strength, in which Muslims from all over the world perform the same activities together and offer their prayers under the leadership of one leader (Imam). Muslims come together on the occasion of pilgrimage to perform Tawaf (seven circuits around the Holy Kaabah, the first three at a hurried pace (Ramal) and the remaining four more closely, at a leisurely pace), Saee (running or walking briskly seven times between the two small hills of Safa and Marwa), and Ramy al-Jamarat (throwing pebbles at the stone pillars symbolizing the devil). In this way the dignity and sublimity of Islam is increased, and the unity and integrity of the Muslim Ummah is promoted. By studying the life history of Hazrat Muhammad (P.B.U.H), we come to know that the Holy Prophet (P.B.U.H) put great emphasis on unity and integrity. We have to follow the Islamic teachings to create awareness among the members of the Muslim Ummah.
In the light of the Holy Quran and Sunnah, we have to utilize all the sources and potential for this noble cause.
Keywords: unity, Ummah, sources, Islamic teaching
Procedia PDF Downloads 295
25475 Multi-Source Data Fusion for Urban Comprehensive Management
Authors: Bolin Hua
Abstract:
In city governance, various data are involved, including city component data, demographic data, housing data, and all kinds of business data. These data reflect different aspects of people, events, and activities. Data generated by various systems differ in form, and their sources differ because they may come from different sectors. In order to reflect one or several facets of an event or rule, data from multiple sources need to be fused together. Data from different sources, collected in different ways, raise several issues that need to be resolved. Problems in data fusion include data update and synchronization, data exchange and sharing, file parsing and entry, duplicate data and its comparison, and resource catalogue construction. Governments adopt statistical analysis, time series analysis, extrapolation, monitoring analysis, value mining, and scenario prediction in order to achieve pattern discovery, law verification, root cause analysis, and public opinion monitoring. The result of multi-source data fusion is a uniform central database, which includes people data, location data, object data, institution data, business data, and space data. We need metadata to be referred to and read when an application needs to access, manipulate, and display the data. Uniform metadata management ensures the effectiveness and consistency of data in the processes of data exchange, data modeling, data cleansing, data loading, data storing, data analysis, data search, and data delivery.
Keywords: multi-source data fusion, urban comprehensive management, information fusion, government data
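One fusion step described above, merging records from multiple source systems into a uniform central database while handling duplicates, can be sketched as follows. The record fields, the shared key, and the "first value wins" conflict rule are illustrative assumptions:

```python
# Minimal sketch of one data-fusion step: merge person records from two
# source systems into a uniform central store, detecting duplicates by a
# shared key. Field names and the conflict rule are illustrative.
def fuse(sources, key="citizen_id"):
    """Merge record lists from multiple systems; later sources fill in
    missing fields of earlier ones rather than creating duplicates."""
    central = {}
    for records in sources:
        for rec in records:
            entry = central.setdefault(rec[key], {})
            for field, value in rec.items():
                entry.setdefault(field, value)  # first value wins
    return central

housing = [{"citizen_id": "C1", "address": "12 Elm St"}]
census = [{"citizen_id": "C1", "age": 34}, {"citizen_id": "C2", "age": 57}]

db = fuse([housing, census])
print(db["C1"])  # {'citizen_id': 'C1', 'address': '12 Elm St', 'age': 34}
print(len(db))   # 2
```

A real system would replace the dict with a database keyed by the resource catalogue, and the conflict rule would be driven by the metadata the abstract describes.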
Procedia PDF Downloads 394
25474 How Trust Functions in Fostering Innovation and Technology Development
Authors: Obidimma Ezezika
Abstract:
In light of the increasing importance of trust in development programs, the purpose of this study was to identify how trust functions as an essential determinant of successful innovation and technology development programs. Using projects in the agricultural sector as case studies, we determined how the concept of trust is understood. Our data collection relied on semi-structured, face-to-face interviews conducted as part of a larger study investigating the role of trust in development programs. Interview transcripts were analyzed to create a narrative on how trust is understood by the study’s participants and how trust functions in fostering innovation. We identified six themes and showed how trust plays an important role in innovation. These themes included the practice of integrity and honesty; delivery of results in an accountable manner; capability and competency; sharing of the same objectives and interests; transparency about actions and intentions through clear communication; and the targeting of services toward the interests of the public. The results of this study can provide guidance on how to enhance implementation mechanisms and provide impetus for organizations to implement trust-building activities in fostering effective innovation.
Keywords: trust, research, innovation, technology
Procedia PDF Downloads 483
25473 AI Applications in Accounting: Transforming Finance with Technology
Authors: Alireza Karimi
Abstract:
Artificial Intelligence (AI) is reshaping various industries, and accounting is no exception. With the ability to process vast amounts of data quickly and accurately, AI is revolutionizing how financial professionals manage, analyze, and report financial information. In this article, we will explore the diverse applications of AI in accounting and its profound impact on the field.
Automation of Repetitive Tasks: One of the most significant contributions of AI in accounting is automating repetitive tasks. AI-powered software can handle data entry, invoice processing, and reconciliation with minimal human intervention. This not only saves time but also reduces the risk of errors, leading to more accurate financial records.
Pattern Recognition and Anomaly Detection: AI algorithms excel at pattern recognition. In accounting, this capability is leveraged to identify unusual patterns in financial data that might indicate fraud or errors. AI can swiftly detect discrepancies, enabling auditors and accountants to focus on resolving issues rather than hunting for them.
Real-Time Financial Insights: AI-driven tools, using natural language processing and computer vision, can process documents faster than ever. This enables organizations to have real-time insights into their financial status, empowering decision-makers with up-to-date information for strategic planning.
Fraud Detection and Prevention: AI is a powerful tool in the fight against financial fraud. It can analyze vast transaction datasets, flagging suspicious activities and reducing the likelihood of financial misconduct going unnoticed. This proactive approach safeguards a company's financial integrity.
Enhanced Data Analysis and Forecasting: Machine learning, a subset of AI, is used for data analysis and forecasting. By examining historical financial data, AI models can provide forecasts and insights, aiding businesses in making informed financial decisions and optimizing their financial strategies.
Artificial Intelligence is fundamentally transforming the accounting profession. From automating mundane tasks to enhancing data analysis and fraud detection, AI is making financial processes more efficient, accurate, and insightful. As AI continues to evolve, its role in accounting will only become more significant, offering accountants and finance professionals powerful tools to navigate the complexities of modern finance. Embracing AI in accounting is not just a trend; it's a necessity for staying competitive in the evolving financial landscape.
Keywords: artificial intelligence, accounting automation, financial analysis, fraud detection, machine learning in finance
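The anomaly-detection application described above can be sketched with an isolation forest over transaction amounts. This is a hedged illustration, not a production fraud model; the amounts and contamination rate are assumptions:

```python
# A minimal sketch of flagging unusual transactions with an isolation
# forest. Real systems use many features (counterparty, timing, account),
# not just the amount; these values are illustrative only.
from sklearn.ensemble import IsolationForest

# Mostly routine transaction amounts, plus two large outliers.
amounts = [[120.0], [95.5], [130.0], [110.0], [99.0], [105.0],
           [98.5], [125.0], [9500.0], [102.0], [88.0], [7200.0]]

detector = IsolationForest(contamination=0.2, random_state=0)
labels = detector.fit_predict(amounts)  # -1 = anomaly, 1 = normal

flagged = [amt[0] for amt, lab in zip(amounts, labels) if lab == -1]
print(sorted(flagged))
```

The flagged amounts then go to an auditor for review, matching the "focus on resolving issues rather than hunting for them" workflow described above.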
Procedia PDF Downloads 63
25472 Reviewing Privacy Preserving Distributed Data Mining
Authors: Sajjad Baghernezhad, Saeideh Baghernezhad
Abstract:
Nowadays, given the ever-increasing volume of data generated by human activity, methods such as data mining for extracting knowledge are unavoidable. One issue in data mining is the inherently distributed nature of the data: the parties creating or receiving such data, whether corporate or non-corporate, do not give their information freely to others. Yet there is no guarantee that particular data can be mined without intruding on the owner's privacy. Sending data and then gathering it, whether partitioned vertically or horizontally, depends on the preservation method used and is carried out to improve data privacy. This study attempts a comprehensive comparison of privacy-preserving data mining methods; general techniques such as data randomization and encoding are examined, along with the strong and weak points of each.
Keywords: data mining, distributed data mining, privacy protection, privacy preserving
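The data-randomization technique surveyed above can be sketched as additive-noise perturbation: each party shares noisy values, so individual records are masked while aggregates remain approximately correct. The noise scale and the salary figures below are illustrative assumptions:

```python
# Minimal sketch of randomization for privacy-preserving mining: perturb
# each value with uniform random noise before sharing. Individual values
# are masked, but aggregates such as the mean survive approximately.
import random

def randomize(values, noise_scale=10.0, seed=0):
    """Additive-noise perturbation of a numeric column."""
    rng = random.Random(seed)
    return [v + rng.uniform(-noise_scale, noise_scale) for v in values]

salaries = [52000, 61000, 47000, 58000, 49500]
shared = randomize(salaries)

true_mean = sum(salaries) / len(salaries)
noisy_mean = sum(shared) / len(shared)
print(abs(true_mean - noisy_mean) < 10.0)  # aggregate preserved approximately
```

The trade-off the survey compares is exactly this one: larger noise gives stronger privacy but noisier aggregates, while encoding-based methods avoid the noise at the cost of heavier cryptographic machinery.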
Procedia PDF Downloads 526
25471 Causes of Institutionalization of Children and Adolescents in a Shelter in Brazil
Authors: Eduardo Guilherme, Sabrina Duarte
Abstract:
Shelters or orphanages are institutions responsible for ensuring the physical and mental integrity of children and adolescents whose rights have been violated or neglected, whether owing to social abandonment, the personal risk to which they were exposed, or the negligence of their parents. In Brazil, about twenty thousand children and adolescents live in about five hundred registered shelters that receive funds from the federal government. We evaluated the records of institutionalized children and adolescents at the municipal shelter in Rio Negro, Parana State, Brazil, from its foundation in June 2000 to February 2015. The causes of institutionalization cited were: lack of family/guardian material resources, abandonment by parents/guardians, domestic violence, substance abuse by parents/guardians, street experience, orphanhood, and others. In Brazil, poverty and extreme poverty are closely related to the causes of institutionalization of children and adolescents. Data from the 2010 census of the Brazilian Institute of Geography and Statistics (IBGE) indicate that 40% of Brazilians living in poverty are girls and boys up to 14 years of age, a total of approximately 23 million individuals. Poverty denies children and adolescents their rights, representing a vulnerability that predisposes them to some causes of sheltering.
Keywords: Brazil, shelter, orphanages, institutionalization
Procedia PDF Downloads 488
25470 The Control of Wall Thickness Tolerance during Pipe Purchase Stage Based on Reliability Approach
Authors: Weichao Yu, Kai Wen, Weihe Huang, Yang Yang, Jing Gong
Abstract:
Metal-loss corrosion is a major threat to the safety and integrity of gas pipelines, as it may result in burst failures, which can cause severe consequences including enormous economic losses as well as personnel casualties. Therefore, it is important to ensure the integrity and efficiency of corroding pipelines, considering the value of wall thickness, which plays an important role in the failure probability of a corroding pipeline. In practice, the wall thickness is controlled during the pipe purchase stage. For example, the API SPEC 5L standard regulates the allowable tolerance of the wall thickness from the specified value during pipe purchase. The allowable wall thickness tolerance determines the wall thickness distribution characteristics, such as the mean value, standard deviation, and distribution type. Taking the uncertainties of the input variables in the burst limit-state function into account, the reliability approach, rather than the deterministic approach, is used to evaluate the failure probability. Moreover, the cost of pipe purchase is influenced by the allowable wall thickness tolerance: stricter control of the wall thickness usually corresponds to a higher pipe purchase cost. Therefore, changing the wall thickness tolerance varies both the probability of a burst failure and the cost of the pipe. This paper describes an approach to optimize the wall thickness tolerance considering both the safety and economy of corroding pipelines. The corrosion burst limit-state function in Annex O of CSA Z662-7 is employed to evaluate the failure probability using the Monte Carlo simulation technique. By changing the allowable wall thickness tolerance, the parameters of the wall thickness distribution in the limit-state function are changed, and the reliability approach shows the corresponding variations in the burst failure probability.
On the other hand, changing the wall thickness tolerance leads to a change in pipe purchase cost. Using the variations in failure probability and pipe cost caused by changing the wall thickness tolerance specification, the optimal allowable tolerance can be obtained and used to define pipe purchase specifications.
Keywords: allowable tolerance, corroding pipeline segment, operation cost, production cost, reliability approach
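The reliability approach described above can be sketched with a simple Monte Carlo loop. This is not the Annex O limit-state function of CSA Z662; it uses an assumed Barlow-type burst estimate, and every parameter value below is an illustrative assumption, not data from the paper:

```python
# Illustrative Monte Carlo sketch of the reliability approach (NOT the
# CSA Z662 Annex O limit state): sample wall thickness from a distribution
# set by the purchase tolerance and estimate the probability that a
# simplified burst limit state is violated. All parameters are assumed.
import random

def burst_failure_probability(t_nominal, t_tol, n=100_000, seed=1):
    rng = random.Random(seed)
    d = 914.0      # pipe outside diameter, mm (assumed)
    smys = 359.0   # specified minimum yield strength, MPa (X52)
    p_op = 6.0     # operating pressure, MPa (assumed)
    defect = 0.3   # assumed corrosion depth as a fraction of wall
    failures = 0
    for _ in range(n):
        # Tolerance band approximated as +/- 2 standard deviations.
        t = rng.gauss(t_nominal, t_tol / 2.0)
        t_remaining = t * (1.0 - defect)
        p_burst = 2.0 * smys * t_remaining / d  # Barlow-type estimate
        if p_burst < p_op:
            failures += 1
    return failures / n

# A tighter purchase tolerance lowers the estimated failure probability.
loose = burst_failure_probability(t_nominal=12.7, t_tol=1.5)
tight = burst_failure_probability(t_nominal=12.7, t_tol=0.5)
print(loose > tight)  # True
```

Pairing each tolerance's failure probability with its purchase cost, as the paper proposes, turns this into the safety-versus-economy optimization described above.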
Procedia PDF Downloads 396
25469 The Right to Data Portability and Its Influence on the Development of Digital Services
Authors: Roman Bieda
Abstract:
The General Data Protection Regulation (GDPR) will come into force on 25 May 2018, creating a new legal framework for the protection of personal data in the European Union. Article 20 of the GDPR introduces a right to data portability. This right allows data subjects to receive the personal data which they have provided to a data controller in a structured, commonly used and machine-readable format, and to transmit this data to another data controller. The right to data portability, by facilitating the transfer of personal data between IT environments (e.g., applications), will also facilitate changing the provider of services (e.g., changing a bank or a cloud computing service provider). Therefore, it will contribute to the development of competition and the digital market. The aim of this paper is to discuss the right to data portability and its influence on the development of new digital services.
Keywords: data portability, digital market, GDPR, personal data
Procedia PDF Downloads 475
25468 Learning the Most Common Causes of Major Industrial Accidents and Apply Best Practices to Prevent Such Accidents
Authors: Rajender Dahiya
Abstract:
Investigation outcomes of major process incidents have been consistent for decades and confirm that the causes and consequences are often identical. The debate remains: why do we continue to experience similar process incidents despite the enormous development of new tools, technologies, industry standards, codes, regulations, and learning processes? The objective of this paper is to investigate the most common causes of major industrial incidents and reveal industry challenges and best practices to prevent such incidents. The author, in his current role, performs audits and inspections of a variety of high-hazard industries in North America, including petroleum refineries, chemicals, petrochemicals, and manufacturing. In this paper, he shares real-life scenarios, examples, and case studies from high-hazard operating facilities, including key challenges and best practices. One case study illustrates the importance of near-miss incident investigation; the incident was a safe operating limit excursion. The case describes deficiencies in management programs, the competency of employees, and the culture of the corporation, covering hazard identification and risk assessment, maintaining the integrity of safety-critical equipment, operating discipline, learning from process safety near misses, process safety competency, process safety culture, audits, and performance measurement. Failure to identify the hazards and manage the risks of highly hazardous materials and processes is one of the primary root causes of an incident, and failure to learn from past incidents is the leading cause of their recurrence. Several investigations of major incidents discovered that each showed several warning signs before occurring and, most importantly, that all were preventable. The author will discuss why preventable incidents were not prevented and review the common causes of failure to learn from past major incidents.
The leading causes of past incidents are summarized below. The first is management failure to identify the hazard and/or mitigate the risk of hazardous processes or materials. This process starts early in the project stage and continues throughout the life cycle of the facility; for example, a poorly done hazard study such as a HAZID, PHA, or LOPA is one of the leading causes of failure. If this step is performed correctly, then the next potential cause is management failure to maintain the integrity of safety-critical systems and equipment. In most of the incidents, the mechanical integrity of the critical equipment was not maintained, and safety barriers were bypassed, disabled, or not maintained. The third major cause is management failure to learn from and/or apply the lessons of past incidents. There were several precursors before those incidents, and they were either ignored altogether or not taken seriously. This paper will conclude by sharing how a well-implemented operating management system, a good process safety culture, and competent leaders and staff contribute to managing the risks that prevent major incidents.
Keywords: incident investigation, risk management, loss prevention, process safety, accident prevention
Procedia PDF Downloads 57
25467 Integrating Heritage Conservation and Sustainable Development: The Role of Buffer Zones in Safeguarding the Tentative World Heritage Sites and Empowering Local Communities in India
Authors: Shweta Vardia
Abstract:
The 2021 decision by the World Heritage Center to align buffer zones with the 2015 Strategy for Sustainable Development marks a significant advancement in the protection of cultural and natural heritage sites. Buffer zones play a critical role in preserving the outstanding universal value, authenticity, and integrity of heritage sites, shielding them from threats such as urbanization, industrialization, and tourism. The 2015 Strategy emphasizes the integration of culture and heritage into sustainable development policies, highlighting the importance of community participation, traditional knowledge, and effective management in the conservation of heritage sites. This paper examines the implications of this strategic alignment for tentative World Heritage Sites in India. It explores how buffer zones can serve as tools for sustainable tourism, economic growth, and environmental protection while also addressing the socio-economic needs of local communities. By adopting a people-centered approach, the study underscores the need for active community involvement in heritage conservation, recognizing local residents as long-term custodians of cultural heritage. The role of buffer zones in promoting sustainable livelihoods, enhancing resilience to environmental changes, and fostering a sense of belonging among communities is also discussed. The challenges associated with buffer zones, including restrictive boundaries, unclear legislative frameworks, and potential disconnection from sociocultural contexts, are critically analyzed. The paper advocates for a holistic and integrated approach to buffer zone management, ensuring that policies are not only theoretically sound but also practically feasible. 
It concludes by emphasizing the need for collaborative efforts among conservation professionals, local communities, and policymakers to achieve sustainable development goals that respect both the heritage site's integrity and the well-being of surrounding populations.Keywords: buffer zones, India, local communities, urbanization, world heritage sites
Procedia PDF Downloads 31
25466 Experimental and Numerical Investigation of “Machining Induced Residual Stresses” during Orthogonal Machining of Alloy Steel AISI 4340
Authors: Theena Thayalan, K. N. Ramesh Babu
Abstract:
Machining-induced residual stress (RS) is one of the most important surface integrity parameters characterizing the near-surface layer of a mechanical component, and it plays a crucial role in controlling performance, especially fatigue life. Since experimental determination of RS is expensive and time-consuming, it would be of great benefit if it could be predicted. In such a case, it would be possible to select the cutting parameters required to produce a favorable RS profile. In the present study, an effort has been made to develop a two-dimensional finite element model (FEM) to simulate the orthogonal cutting process and to predict surface and sub-surface RS using the commercial FEA software DEFORM-2D. The developed finite element model has been validated through experimental investigation of RS. In the experiments, orthogonal cutting tests were carried out on AISI 4340 by varying the cutting speed (Vc) and uncut chip thickness (f) at three levels, and the surface and sub-surface RS were measured using XRD and electropolishing techniques. The comparison showed that the RS obtained using the developed numerical model is in reasonable agreement with the experimental data.Keywords: FEM, machining, residual stress, XRD
Procedia PDF Downloads 348
25465 Resveratrol-Phospholipid Complex for Sustained Delivery of Resveratrol via the Skin for the Treatment of Inflammatory Diseases
Authors: Malay K. Das, Bhupen Kalita
Abstract:
The poor oral bioavailability of resveratrol (RSV) due to presystemic metabolism can be avoided via the dermal route of administration. The hydrophilic-lipophilic nature of the resveratrol-phospholipid complex (RSVPs) favors the delivery of resveratrol via the skin. An RSVPs-embedded polymeric patch with moderate adhesiveness was developed for dermal application for a sustained anti-inflammatory effect. The prepared patches were evaluated for various physicochemical properties, for surface morphology by SEM and TEM, and for compatibility of patch components by FT-IR and DSC studies. The dermal flux of the optimized patch formulation was found to be 4.28 ± 0.48 mg/cm2/24 h. The analysis of skin extract after the permeation study revealed the presence of resveratrol, confirming the localization of RSVPs in the skin. The stability of RSVPs in the polymeric patch and in the physiological environment was confirmed by FE-SEM studies on the patches after drug release and skin permeation studies. The RSVPs particles released from the polymer matrix maintain their structural integrity and permeate the keratinized horny layer of the skin. The optimized patch formulation showed a sustained anti-inflammatory effect (84.10% inhibition of inflammation at 24 h) in the carrageenan-induced rat paw edema model compared to a marketed diclofenac sodium gel (39.58% inhibition at 24 h). The CLSM study confirmed the localization of RSVPs for a longer period, thus enabling drug targeting to the dermis for a sustained anti-inflammatory effect. Histological studies with a phase-contrast trinocular microscope suggested no alteration of skin integrity and no evidence of inflammatory cells after exposure to the permeants. The patch was found to be safe for skin application as evaluated by the Draize method for skin irritation scoring in a rabbit model. 
These results suggest the therapeutic efficacy of the developed patch in both acute and chronic inflammatory diseases.Keywords: resveratrol-phospholipid complex, skin delivery, sustained anti-inflammatory effect, inflammatory diseases, dermal patch
Procedia PDF Downloads 232
25464 Exploring the Intersection of Accounting, Business, and Economics: Bridging Theory and Practice for Sustainable Growth
Authors: Stephen Acheampong Amoafoh
Abstract:
In today's dynamic economic landscape, businesses face multifaceted challenges that demand strategic foresight and informed decision-making. This abstract explores the pivotal role of financial analytics in driving business performance amidst evolving market conditions. By integrating accounting principles with economic insights, organizations can harness the power of data-driven strategies to optimize resource allocation, mitigate risks, and capitalize on emerging opportunities. This presentation will delve into the practical applications of financial analytics across various sectors, highlighting case studies and empirical evidence to underscore its efficacy in enhancing operational efficiency and fostering sustainable growth. From predictive modeling to performance benchmarking, attendees will gain invaluable insights into leveraging advanced analytics tools to drive profitability, streamline processes, and adapt to changing market dynamics. Moreover, this abstract will address the ethical considerations inherent in financial analytics, emphasizing the importance of transparency, integrity, and accountability in data-driven decision-making. By fostering a culture of ethical conduct and responsible stewardship, organizations can build trust with stakeholders and safeguard their long-term viability in an increasingly interconnected global economy. Ultimately, this abstract aims to stimulate dialogue and collaboration among scholars, practitioners, and policymakers, fostering knowledge exchange and innovation in the realms of accounting, business, and economics. Through interdisciplinary insights and actionable recommendations, participants will be equipped to navigate the complexities of today's business environment and seize opportunities for sustainable success.Keywords: financial analytics, business performance, data-driven strategies, sustainable growth
Procedia PDF Downloads 55
25463 Recent Advances in Data Warehouse
Authors: Fahad Hanash Alzahrani
Abstract:
This paper describes some recent advances in the quickly developing area of data storing and processing based on data warehouses and data mining techniques, covering the software, hardware, data mining algorithms, and visualisation techniques that share common features across the specific problems and tasks of their implementation.Keywords: data warehouse, data mining, knowledge discovery in databases, on-line analytical processing
Procedia PDF Downloads 404
25462 Curative Role of Bromoenol Lactone, an Inhibitor of Phospholipase A2 Enzyme, during Cigarette Smoke Condensate Induced Anomalies in Lung Epithelium
Authors: Subodh Kumar, Sanjeev Kumar Sharma, Gaurav Kaushik, Pramod Avti, Phulen Sarma, Bikash Medhi, Krishan Lal Khanduja
Abstract:
Background: It is well known that cigarette smoke is one of the causative factors in various lung diseases, especially cancer. Carcinogens and oxidant molecules present in cigarette smoke not only damage cellular constituents (lipids, proteins, DNA) but may also regulate the molecular pathways involved in inflammation and cancer. Continuous oxidative stress caused by the constituents of cigarette smoke leads to higher phospholipase A₂ (PLA₂) activity, resulting in elevated levels of secondary metabolites whose role in cancer is well defined. To reduce the burden of chronic inflammation and oxidative stress, as well as the elevated levels of secondary metabolites, we checked the curative potential of the PLA₂ inhibitor bromoenol lactone (BEL) during continuous exposure to cigarette smoke condensate (CSC). Aim: To check the therapeutic potential of bromoenol lactone (BEL), an inhibitor of phospholipase A₂s, in pathways of CSC-induced changes in type I and type II alveolar epithelial cells. Methods: The effect of BEL on CSC-induced PLA₂ activity was checked using a colorimetric assay, cellular toxicity using a cell viability assay, membrane integrity using a fluorescein diacetate (FDA) uptake assay, reactive oxygen species (ROS) levels and apoptosis markers through flow cytometry, and cellular regulation through MAP kinase levels, in lung epithelium. Results: BEL significantly attenuated CSC-induced PLA₂ activity, ROS levels, apoptosis, and kinase levels, while improving cellular viability and membrane integrity. Conclusions: The current observations revealed that BEL may be a potential therapeutic agent against cigarette smoke-induced anomalies in lung epithelium.Keywords: cigarette smoke condensate, phospholipase A₂, oxidative stress, alveolar epithelium, bromoenol lactone
Procedia PDF Downloads 189
25461 How to Use Big Data in Logistics Issues
Authors: Mehmet Akif Aslan, Mehmet Simsek, Eyup Sensoy
Abstract:
Big Data is one of today’s cutting-edge technologies. As the technology becomes widespread, so does data. Utilizing massive data sets enables companies to gain competitive advantages over their adversaries. Among the many areas of Big Data usage, logistics plays a significant role in both the commercial sector and the military. This paper lays out what Big Data is and how it is used in both military and commercial logistics.Keywords: big data, logistics, operational efficiency, risk management
Procedia PDF Downloads 642
25460 Multisignature Schemes for Reinforcing Trust in Cloud Software-As-A-Service Services
Authors: Mustapha Hedabou, Ali Azougaghe, Ahmed Bentajer, Hicham Boukhris, Mourad Eddiwani, Zakaria Igarramen
Abstract:
Software-as-a-service (SaaS) is emerging as a dominant approach to delivering software. It encompasses a range of business and technical opportunities, issues, and challenges. Trust in cloud services regarding the security and privacy of the delivered data is the most critical issue with the SaaS model. In this paper, we survey the security concerns related to the SaaS model, and we propose the design of a trusted SaaS model that gives users more confidence in SaaS services by leveraging trust in a neutral source-code certifying authority. The proposed design is based on the use of a multisignature mechanism for signing the source code of the application service. In our model, the cloud provider acts as a root of trust by ensuring the integrity of the application service while it is running on its platform. The proposed design prevents insider attacks from tampering with the application service before and after it is launched on the cloud provider's platform.Keywords: cloud computing, SaaS platform, TPM, trustiness, source code certification, multi-signature schemes
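The co-signing idea behind the proposed model can be illustrated with a deliberately simplified sketch: the application service is trusted only if both the certifying authority and the cloud provider have signed the same source code. The key names and the HMAC-based "signatures" below are toy assumptions for illustration; a real multisignature scheme would use public-key aggregation (e.g., Schnorr or BLS), not shared-secret HMACs.

```python
import hashlib
import hmac

def sign(secret: bytes, code: bytes) -> str:
    """Toy 'signature': HMAC-SHA256 over the digest of the source code."""
    digest = hashlib.sha256(code).digest()
    return hmac.new(secret, digest, hashlib.sha256).hexdigest()

def multisign(secrets: list[bytes], code: bytes) -> list[str]:
    """Collect one signature per party (certifying authority, cloud provider, ...)."""
    return [sign(s, code) for s in secrets]

def verify(secrets: list[bytes], code: bytes, sigs: list[str]) -> bool:
    """The service is trusted only if every party's signature checks out."""
    return len(sigs) == len(secrets) and all(
        hmac.compare_digest(sign(s, code), sig) for s, sig in zip(secrets, sigs)
    )

# Hypothetical keys and service code, for illustration only
authority_key = b"certifying-authority-secret"
provider_key = b"cloud-provider-secret"
code = b"def handler(request): return 'ok'"

sigs = multisign([authority_key, provider_key], code)
assert verify([authority_key, provider_key], code, sigs)
# Any tampering with the source code invalidates the combined signature.
assert not verify([authority_key, provider_key], code + b"# backdoor", sigs)
```

The point of the sketch is the verification rule: a single missing or stale signature is enough to reject the service, which is how the design would detect tampering between certification and launch.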
Procedia PDF Downloads 276
25459 In Vitro Intestine Tissue Model to Study the Impact of Plastic Particles
Authors: Ashleigh Williams
Abstract:
Micro- and nanoplastics’ (MNLPs) omnipresence and ecological accumulation are evident when surveying recent environmental impact studies. For example, in 2014 it was estimated that at least 52.3 trillion plastic microparticles are floating at sea, and scientists have even found plastics in remote Arctic ice and snow (5,6). Plastics have even found their way into precipitation, with more than 1000 tons of microplastic rain precipitating onto the Western United States in 2020. Even more recent studies evaluating the chemical safety of reusable plastic bottles found that hundreds of chemicals leached into the control liquid in the bottle (ddH2O, pH = 7) during a 24-hour period. A consequence of the increasing abundance of plastic waste in the air, land, and water every year is the bioaccumulation of MNLPs in ecosystems and trophic niches of the animal food chain, which could potentially increase direct and indirect exposure of humans to MNLPs via inhalation, ingestion, and dermal contact. Though the detrimental, toxic effects of MNLPs have been established in marine biota, much less is known about the potentially hazardous health effects of chronic MNLP ingestion in humans. Recent data indicate that long-term exposure to MNLPs could cause inflammatory and dysbiotic effects; however, toxicity seems to be largely dose- as well as size-dependent. In addition, the mechanisms of transcytotic uptake of MNLPs through the intestinal epithelium in humans remain relatively unknown. To this point, the goal of the current study was to investigate the mechanisms of uptake and transcytosis of polystyrene (PS) micro- and nanoplastics in human stem-cell-derived, physiologically relevant in vitro intestinal model systems, and to compare the relative effects of particle size (30 nm, 100 nm, 500 nm and 1 µm) and concentration (0 µg/mL, 250 µg/mL, 500 µg/mL, 1000 µg/mL) on polystyrene MNLP uptake, transcytosis, and intestinal epithelial model integrity. 
Observational and quantitative data obtained from confocal microscopy, immunostaining, transepithelial electrical resistance (TEER) measurements, cryosectioning, and ELISA assays of the proinflammatory cytokines interleukin-6 and interleukin-8 were used to evaluate the localization and transcytosis of polystyrene MNPs and their impact on epithelial integrity in human-derived intestinal in vitro model systems. The effect of Microfold (M) cell induction on polystyrene micro- and nanoparticle (MNP) uptake, transcytosis, and potential inflammation was also assessed and compared to samples grown under standard conditions. Microfold (M) cells link the human intestinal system to the immune system and are the primary cells in the epithelium responsible for sampling and transporting foreign matter of interest from the lumen of the gut to underlying immune cells. Given the capability of Microfold cells to interact both specifically and nonspecifically with abiotic and biotic materials, it was expected that M-cell-induced in vitro samples would show increased binding, localization, and potentially transcytosis of polystyrene MNLPs across the epithelial barrier. The experimental results of this study would not only help in the evaluation of plastic toxicity but would also allow for more detailed modeling of gut inflammation and the intestinal immune system.Keywords: nanoplastics, enteroids, intestinal barrier, tissue engineering, microfold (M) cells
Procedia PDF Downloads 85
25458 Statistical Analysis of Surface Roughness and Tool Life Using (RSM) in Face Milling
Authors: Mohieddine Benghersallah, Lakhdar Boulanouar, Salim Belhadi
Abstract:
Currently, a higher production rate with the required quality at low cost is the basic principle in the competitive manufacturing industry. This is mainly achieved by using high cutting speeds and feed rates. Elevated temperatures in the cutting zone under these conditions shorten tool life and adversely affect the dimensional accuracy and surface integrity of the component. Thus it is necessary to find optimum cutting conditions (cutting speed, feed rate, machining environment, tool material and geometry) that can produce components in accordance with the project requirements while maintaining a relatively high production rate. Response surface methodology is a collection of mathematical and statistical techniques that are useful for modelling and analysis of problems in which a response of interest is influenced by several variables and the objective is to optimize this response. The work presented in this paper examines the effects of the cutting parameters (cutting speed, feed rate, and depth of cut) on surface roughness through a mathematical model developed using data gathered from a series of milling experiments.Keywords: statistical analysis (RSM), bearing steel, coated inserts, tool life, surface roughness, end milling
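The response-surface fitting step described above can be sketched in a few lines: a least-squares fit of a first-order surface Ra = b0 + b1*Vc + b2*f + b3*ap to milling runs, solved via the normal equations. All numbers and factor ranges below are illustrative assumptions, not data from the study; a full RSM analysis would typically use a second-order model with ANOVA.

```python
def fit_first_order(X, y):
    """Least-squares fit of y = b0 + b1*x1 + ... + bk*xk via the normal equations."""
    A = [[1.0] + list(row) for row in X]          # design matrix with intercept column
    m, n = len(A), len(A[0])
    # Normal equations: (A^T A) b = A^T y
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
           for i in range(n)]
    Aty = [sum(A[k][i] * y[k] for k in range(m)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(AtA[r][col]))
        AtA[col], AtA[piv] = AtA[piv], AtA[col]
        Aty[col], Aty[piv] = Aty[piv], Aty[col]
        for r in range(col + 1, n):
            factor = AtA[r][col] / AtA[col][col]
            for c in range(col, n):
                AtA[r][c] -= factor * AtA[col][c]
            Aty[r] -= factor * Aty[col]
    # Back substitution
    b = [0.0] * n
    for i in reversed(range(n)):
        b[i] = (Aty[i] - sum(AtA[i][j] * b[j] for j in range(i + 1, n))) / AtA[i][i]
    return b

# Illustrative runs: (cutting speed m/min, feed mm/tooth, depth of cut mm) -> Ra (um)
X = [(100, 0.10, 0.5), (200, 0.10, 0.5), (100, 0.20, 0.5),
     (100, 0.10, 1.0), (200, 0.20, 1.0), (150, 0.15, 0.75)]
y = [0.8, 0.6, 1.4, 0.9, 1.3, 1.05]
b0, b1, b2, b3 = fit_first_order(X, y)
predicted = b0 + b1 * 150 + b2 * 0.15 + b3 * 0.75
```

The signs of the fitted coefficients are what an analyst reads off first: here roughness falls with cutting speed and rises with feed and depth of cut, mirroring the qualitative trends the abstract describes.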
Procedia PDF Downloads 432
25457 Development of Solid Electrolytes Based on Networked Cellulose
Authors: Boor Singh Lalia, Yarjan Abdul Samad, Raed Hashaikeh
Abstract:
Three different kinds of solid polymer electrolytes were prepared using polyethylene oxide (PEO) as the base polymer, networked cellulose (NC) as a physical support, and LiClO4 as the conductive salt. Networked cellulose, a modified form of cellulose, is a biodegradable and environmentally friendly additive which provides a strong fibrous network for the structural stability of the electrolytes. Although the PEO/NC/LiClO4 electrolyte retains its structural integrity and mechanical properties at 100 °C compared to pristine PEO-based polymer electrolytes, it suffers from poor ionic conductivity. To improve the room-temperature conductivity of the electrolyte, PEO was replaced by polyethylene glycol (PEG), a liquid phase that provides high mobility for Li⁺ ion transport in the electrolyte. PEG/NC/LiClO4 shows improved ionic conductivity compared to PEO/NC/LiClO4 at room temperature, but it is brittle and tends to form cracks during processing. An advanced solid polymer electrolyte with optimum ionic conductivity and mechanical properties was developed using a ternary system, TEGDME/PEO/NC+LiClO4. At room temperature, this electrolyte exhibits an ionic conductivity on the order of 10⁻⁵ S/cm, which is very high compared to that of the PEO/LiClO4 electrolyte. Pristine PEO electrolytes start melting at 65 °C and completely lose their mechanical strength. Dynamic mechanical analysis of TEGDME:PEO:NC (70:20:10 wt%) showed an improvement in storage modulus compared to pristine PEO in the 60-120 °C temperature range. Also, with the addition of NC, the electrolyte retains its mechanical integrity at 100 °C, which is beneficial for Li-ion battery operation at high temperatures. Differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) studies revealed that the ternary polymer electrolyte is thermally stable over the lithium-ion battery operational temperature range. 
As-prepared polymer electrolyte was used to assemble LiFePO4/ TEGDME/PEO/NC+LiClO4/Li half cells and their electrochemical performance was studied via cyclic voltammetry and charge-discharge cycling.Keywords: solid polymer electrolyte, ionic conductivity, mechanical properties, lithium ion batteries, cyclic voltammetry
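The conductivity figure quoted above (on the order of 10⁻⁵ S/cm) is conventionally derived from the bulk resistance measured by impedance spectroscopy via sigma = t / (R * A), where t is the electrolyte thickness, R the bulk resistance, and A the electrode contact area. A minimal sketch with illustrative cell dimensions (the numbers are assumptions, not values from the study):

```python
def ionic_conductivity(thickness_cm: float, resistance_ohm: float, area_cm2: float) -> float:
    """sigma = t / (R * A), returned in S/cm."""
    return thickness_cm / (resistance_ohm * area_cm2)

# Hypothetical cell: 100 um thick film, 2 kOhm bulk resistance, 1 cm^2 electrodes
sigma = ionic_conductivity(thickness_cm=0.01, resistance_ohm=2000.0, area_cm2=1.0)
print(f"{sigma:.1e} S/cm")   # a value on the order of 10^-5 S/cm
```

This is why thinner films or lower bulk resistance (e.g., from the mobile TEGDME/PEG phase) directly raise the reported conductivity.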
Procedia PDF Downloads 429
25456 Validation of Mapping Historical Linked Data to International Committee for Documentation (CIDOC) Conceptual Reference Model Using Shapes Constraint Language
Authors: Ghazal Faraj, András Micsik
Abstract:
Shapes Constraint Language (SHACL), a World Wide Web Consortium (W3C) language, expresses constraints as well-defined RDF graphs called "shape graphs". These shape graphs validate other resource description framework (RDF) graphs, which are called "data graphs". The structural features of SHACL permit generating a variety of conditions to evaluate string-matching patterns, value types, and other constraints. Moreover, the SHACL framework supports high-level validation by expressing more complex conditions in the SPARQL Protocol and RDF Query Language (SPARQL). SHACL has two parts: SHACL Core and SHACL-SPARQL. SHACL Core includes all shapes that cover the most frequent constraint components, while SHACL-SPARQL is an extension that allows SHACL to express more complex, customized constraints. Validating the efficacy of dataset mapping is an essential component of data reconciliation mechanisms, as the linking of different datasets is an ongoing process. The conventional validation methods are the semantic reasoner and SPARQL queries. The former checks formalization errors and data type inconsistencies, while the latter detects data contradictions. After executing SPARQL queries, the retrieved information needs to be checked manually by an expert. However, this methodology is time-consuming and inaccurate, as it does not test the mapping model comprehensively. Therefore, there is a serious need for a new methodology that covers all validation aspects for linking and mapping diverse datasets. Our goal is to devise a new approach to achieve optimal validation outcomes. The first step towards this goal is implementing SHACL to validate the mapping between the International Committee for Documentation (CIDOC) conceptual reference model (CRM) and one of its ontologies. To initiate this project successfully, a thorough understanding of both source and target ontologies was required. 
Subsequently, the proper environment to run SHACL and its shape graphs was determined. As a case study, we applied SHACL to a CIDOC-CRM dataset after running the Pellet reasoner via the Protégé program. The applied validation falls under multiple categories: a) data type validation, which checks whether the source data is mapped to the correct data type, for instance, whether a birthdate is assigned to xsd:datetime and linked to a Person entity via the crm:P82a_begin_of_the_begin property; b) data integrity validation, which detects inconsistent data, for instance, by inspecting whether a person's birthdate occurred before any of the linked event creation dates. The expected results of our work are: 1) highlighting validation techniques and categories, and 2) selecting the most suitable techniques for the various categories of validation tasks. The next step is to establish a comprehensive validation model and generate SHACL shapes automatically.Keywords: SHACL, CIDOC-CRM, SPARQL, validation of ontology mapping
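The two validation categories described in the abstract can be sketched as SHACL shapes in Turtle. The property paths follow common CIDOC-CRM conventions (E21_Person, P98i_was_born, P4_has_time-span, P82a_begin_of_the_begin), but the exact shapes are illustrative assumptions, not the shapes used in the study; the prefix wiring for the SPARQL body (sh:prefixes) is omitted for brevity.

```turtle
@prefix sh:  <http://www.w3.org/ns/shacl#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix crm: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix ex:  <http://example.org/shapes/> .

# (a) Data type validation: a birthdate reached via P82a must be xsd:dateTime
ex:BirthDateShape
    a sh:NodeShape ;
    sh:targetClass crm:E21_Person ;
    sh:property [
        sh:path ( crm:P98i_was_born crm:P4_has_time-span crm:P82a_begin_of_the_begin ) ;
        sh:datatype xsd:dateTime ;
        sh:maxCount 1 ;
    ] .

# (b) Data integrity validation (SHACL-SPARQL): a linked creation event
# must not predate the creator's birth
ex:BirthBeforeCreationShape
    a sh:NodeShape ;
    sh:targetClass crm:E21_Person ;
    sh:sparql [
        sh:message "Creation date precedes the creator's birthdate" ;
        sh:select """
            SELECT $this WHERE {
                $this crm:P98i_was_born/crm:P4_has_time-span/crm:P82a_begin_of_the_begin ?birth .
                ?event crm:P14_carried_out_by $this ;
                       crm:P4_has_time-span/crm:P82a_begin_of_the_begin ?created .
                FILTER (?created < ?birth)
            }
        """ ;
    ] .
```

Shape (a) corresponds to the SHACL Core constraint components mentioned earlier, while shape (b) shows why SHACL-SPARQL is needed: the cross-entity date comparison cannot be expressed in Core alone.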
Procedia PDF Downloads 253