Search results for: levelized cost of analysis
29845 Process Integration of Natural Gas Hydrate Production by CH₄-CO₂/H₂ Replacement Coupling Steam Methane Reforming
Authors: Mengying Wang, Xiaohui Wang, Chun Deng, Bei Liu, Changyu Sun, Guangjin Chen, Mahmoud El-Halwagi
Abstract:
Significant amounts of natural gas hydrates (NGHs) are considered a potential new sustainable energy resource for the future. However, commonly used methods for methane recovery from hydrate sediments require high investment, offer low gas production efficiency, and may cause environmental and safety problems. Therefore, there is a need for effective gas production from hydrates. The natural gas hydrate production method by CO₂/H₂ replacement coupled with steam methane reforming can improve the replacement effect and reduce the cost of gas separation. This paper develops a simulation model of the gas production process integrated with steam reforming and membrane separation. The process parameters (i.e., reactor temperature, pressure, H₂O/CH₄ ratio) and the composition of CO₂ and H₂ in the feed gas are analyzed. Energy analysis is also conducted. Two design scenarios with different compositions of CO₂ and H₂ in the feed gas are proposed and evaluated to assess the energy efficiency of the novel system. Results show that when the composition of CO₂ in the feed gas is between 43 % and 72 %, there is a composition at which the flow rate of recycled gas equals that of the feed gas, ensuring that the subsequent production process needs neither additional feed gas nor discharge of recycled gas. The energy efficiency at CO₂ feed-gas compositions of 43 % and 72 % is greater than 1, and it is relatively higher when the CO₂ mole fraction in the feed gas is 72 %.
Keywords: gas production, hydrate, process integration, steam reforming
Procedia PDF Downloads 186
29844 Electrochemical Growth and Properties of Cu2O Nanostructures
Authors: A. Azizi, S. Laidoudi, G. Schmerber, A. Dinia
Abstract:
Cuprous oxide (Cu2O) is a well-known oxide semiconductor with a band gap of 2.1 eV and a natural p-type conductivity, which makes it an attractive material for device applications because of its abundant availability, non-toxicity, and low production cost. It has a high absorption coefficient in the visible region, and its minority carrier diffusion length is suitable for use as a solar cell absorber layer; it has been explored in junction with n-type ZnO for photovoltaic applications. Cu2O nanostructures have been made by a variety of techniques; the electrodeposition method has emerged as one of the most promising processing routes, as it provides advantages such as low cost, low temperature, and a high level of purity in the products. In this work, Cu2O nanostructures prepared by electrodeposition from an aqueous cupric sulfate solution with citric acid at 65°C onto fluorine-doped tin oxide (FTO)-coated glass substrates were investigated. The effects of the deposition potential on the electrochemical, surface morphology, structural and optical properties of the Cu2O thin films were investigated. During cyclic voltammetry experiments, the potential interval in which the electrodeposition of Cu2O takes place was established. The Mott–Schottky (M-S) plot demonstrates that all the films are p-type semiconductors, and the flat-band potential and the acceptor density for the Cu2O thin films are determined. AFM images reveal that the applied potential has a very significant influence on the surface morphology and crystallite size of the Cu2O thin films. The XRD measurements indicated that all the obtained films display a Cu2O cubic structure with a strong preferential orientation along the (111) direction. The optical transmission spectra in the UV-visible domain revealed a highest transmission of 75 %, and the calculated band gap values increased from 1.93 to 2.24 eV with increasing potential.
Keywords: Cu2O, electrodeposition, Mott–Schottky plot, nanostructure, optical properties, XRD
Procedia PDF Downloads 358
29843 Prediction of the Heat Transfer Characteristics of Tunnel Concrete
Authors: Seung Cho Yang, Jae Sung Lee, Se Hee Park
Abstract:
This study proposes an analysis method to predict the damage to tunnel concrete caused by fires. The concrete temperatures at a tunnel fire computed with ABAQUS were compared with the test result. After the reliability of the analysis method was verified, the temperatures of a tunnel at a real fire and those of the concrete during the fire were estimated to predict fire damage. The temperatures inside the tunnel were estimated by FDS, a CFD model. It was deduced that the fire performance of the tunnel lining and the fire damage to the structure at an actual fire could be estimated by the analysis method.
Keywords: fire resistance, heat transfer, numerical analysis, tunnel fire
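For reference, the temperature field in the concrete lining in this type of analysis is governed by the standard transient heat conduction equation with a fire boundary condition (textbook forms; the specific boundary model used in the paper is not stated here):

\rho c_p \frac{\partial T}{\partial t} = \nabla \cdot \left( k \, \nabla T \right), \qquad -k \left. \frac{\partial T}{\partial n} \right|_{\text{surface}} = h \left( T_g(t) - T_s \right) + \varepsilon \sigma \left( T_g^4(t) - T_s^4 \right)

where T_g(t) is the gas temperature history inside the tunnel (here supplied by the FDS simulation) and T_s is the lining surface temperature.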
Procedia PDF Downloads 440
29842 Optimization of Structures with Mixed Integer Non-linear Programming (MINLP)
Authors: Stojan Kravanja, Andrej Ivanič, Tomaž Žula
Abstract:
This contribution focuses on structural optimization in civil engineering using mixed integer non-linear programming (MINLP). MINLP is characterized as a versatile method that can handle both continuous and discrete optimization variables simultaneously. Continuous variables are used to optimize parameters such as dimensions, stresses, masses, or costs, while discrete variables represent binary decisions that determine the presence or absence of structural elements within a structure and also select discrete materials and standard sections. The optimization process is divided into three main steps. First, a mechanical superstructure with a variety of topology, material, and dimensional alternatives is generated. Next, a MINLP model is formulated to encapsulate the optimization problem. Finally, an optimal solution is searched in the direction of the defined objective function while respecting the structural constraints. The economic or mass objective function of the material and labor costs of a structure is subjected to the constraints known from structural analysis. These constraints include equations for the calculation of internal forces and deflections, as well as equations for the dimensioning of structural components (in accordance with the Eurocode standards). Given the complex, non-convex and highly non-linear nature of optimization problems in civil engineering, the Modified Outer-Approximation/Equality-Relaxation (OA/ER) algorithm is applied. This algorithm alternately solves subproblems of non-linear programming (NLP) and main problems of mixed-integer linear programming (MILP), thereby gradually refining the solution space toward the optimal solution. The NLP corresponds to the continuous optimization of parameters (with fixed topology, discrete materials and standard dimensions, all determined in the previous MILP), while the MILP involves a global approximation to the superstructure of alternatives, where a new topology, materials, and standard dimensions are determined. The optimization of a convex problem is stopped when the MILP solution becomes better than the best NLP solution; otherwise, it is terminated when the NLP solution can no longer be improved. While the OA/ER algorithm, like all other algorithms, does not guarantee global optimality due to the presence of non-convex functions, various modifications, including convexity tests, are implemented in OA/ER to mitigate these difficulties. The effectiveness of the proposed MINLP approach is demonstrated by its application to various structural optimization tasks, such as mass optimization of steel buildings, cost optimization of timber halls, composite floor systems, etc. Special optimization models have been developed for the optimization of these structures. The MINLP optimizations, facilitated by the user-friendly software package MIPSYN, provide insights into mass- or cost-optimal solutions, optimal structural topologies, and optimal material and standard cross-section choices, confirming MINLP as a valuable method for the optimization of structures in civil engineering.
Keywords: MINLP, mixed-integer non-linear programming, optimization, structures
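As a rough illustration of the alternating structure described above (a toy sketch only: the discrete "master" step is reduced to enumerating a few hypothetical standard sections, whereas the real OA/ER master problem builds linear outer approximations of the non-linear functions; the cost factors and constraint are invented placeholders, not MIPSYN code):

from scipy.optimize import minimize

# Hypothetical standard sections with assumed cost factors (placeholders, not Eurocode data)
standard_sections = {"IPE200": 28.5, "IPE240": 30.7, "IPE300": 53.8}

def nlp_subproblem(cost_factor):
    # Continuous sizing for a fixed discrete alternative: minimize cost subject to a stand-in design check
    objective = lambda x: cost_factor * x[0]
    constraints = [{"type": "ineq", "fun": lambda x: x[0] - 1.0}]  # x >= 1.0 stands in for stress/deflection limits
    result = minimize(objective, x0=[2.0], constraints=constraints)
    return result.fun, result.x[0]

best = None
for name, factor in standard_sections.items():  # stands in for the MILP master proposing alternatives
    cost, size = nlp_subproblem(factor)
    if best is None or cost < best[0]:
        best = (cost, name, size)

print("best alternative:", best[1], "with cost", round(best[0], 2))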
Procedia PDF Downloads 51
29841 The Economic Impact of Mediation: An Analysis in Time of Crisis
Authors: C. M. Cebola, V. H. Ferreira
Abstract:
Over the past decade, mediation has been legally implemented in European legal systems, especially after the publication by the European Union of Directive 2008/52/EC on certain aspects of mediation in civil and commercial matters. Developments in international trade and globalization in this new century have led to an increase in the number of disputes, often cross-border, to which the courts have failed to respond adequately. We do not advocate that mediation should be promoted as the solution for all justice problems, but as a means with its own specificities that the parties may choose as the best way to resolve their disputes. Thus, the implementation of mediation should be based on the advantages of its application. From the economic point of view, competitive negotiation can generate negative external effects in social terms. A solution reached in a court of law is not always the most efficient one when all elements of society are considered (economic and social benefit). On the other hand, the administration of justice adds transaction costs in economic terms, which can be mitigated by the application of other forms of conflict resolution, such as mediation. In this paper, the economic benefits of mediation are analysed in the light of various studies on the functioning of justice. Several theoretical arguments are confronted with empirical studies to demonstrate that mediation has significant positive economic effects. The objective is to contribute to the dissemination of mediation between companies and citizens, to demonstrate the cost to governments and states of the still-limited use of mediation, particularly in the current economic crisis, and to propose actions to develop the application of mediation.
Keywords: economic impact, litigation costs, mediation, solutions
Procedia PDF Downloads 285
29840 Experimental Analysis of the Origins of the Anisotropy Behavior in the 2017 AA Aluminum Alloy
Authors: May Abdelghani
Abstract:
The present work is devoted to the study of the microstructural anisotropy in the cyclic mechanical behavior of the 2017AA aluminum alloy, which is widely used in the aerospace industry. The main purpose of the study is to investigate the microstructural origins of this anisotropy, already confirmed in our previous work on the 2017AA aluminum alloy. To do this, we used microstructural analysis resources such as scanning electron microscopy (SEM) to observe the differences between fracture surfaces from different directions of cyclic loading. Another investigation technique used in this study is the EBSD method, which allows us to obtain a map of the crystallographic texture of the material. According to the results of the microscopic analysis, we are able to identify the origins of the anisotropic behavior at the macroscopic scale.
Keywords: fatigue damage, cyclic behavior, anisotropy, microstructural analysis
Procedia PDF Downloads 417
29839 Principles and Practice of Therapeutic Architecture
Authors: Umedov Mekhroz, Griaznova Svetlana
Abstract:
The quality of life and well-being of patients, staff and visitors are central to the delivery of health care. Architecture and design are becoming an integral part of the healing and recovery approach. The most significant point that can be implemented in hospital buildings is the therapeutic value of the artificial environment, including the design and integration of plants to bring the natural world into the healthcare environment. The hospital environment should feel like home comfort. The techniques that therapeutic architecture uses are very cheap but provide real benefit to patients, staff and visitors, demonstrating that the difference is not in cost but in design quality. The best environment is not necessarily more expensive: it is about the special use of light and color, the rational use of materials, and the flexibility of premises. All this forms innovative concepts in modern hospital architecture, in new construction, renovation or expansion projects. The aim of the study is to identify the methods and principles of therapeutic architecture. The research methodology consists of studying and summarizing international experience in scientific research, literature, standards, methodological manuals and project materials on the research topic. The result of the research is the development of graphic-analytical tables based on a systematic analysis of the processed information, and 3D visualizations of hospital interiors based on that information.
Keywords: therapeutic architecture, healthcare interiors, sustainable design, materials, color scheme, lighting, environment
Procedia PDF Downloads 127
29838 Risks beyond Cyber in IoT Infrastructure and Services
Authors: Mattias Bergstrom
Abstract:
Significance of the Study: This research will provide new insights into the risks of digital embedded infrastructure. Through this research, we will analyze each risk and its potential negation strategies, especially for AI and autonomous automation. Moreover, the analysis presented in this paper will convey valuable information for future research that can create more stable, secure, and efficient autonomous systems. To learn and understand the risks, a large IoT system was envisioned, and risks with hardware, tampering, and cyberattacks were collected, researched, and evaluated to create a comprehensive understanding of the potential risks. Potential solutions have then been evaluated on an open-source IoT hardware setup. The following list shows the identified passive and active risks evaluated in the research. Passive risks: (1) Hardware failures – Critical systems relying on high-rate, high-quality data are growing; SCADA systems for infrastructure are good examples of such systems. (2) Hardware delivers erroneous data – Sensors break, and when they do, they don’t always go silent; they can keep going, except that the data they deliver is garbage, and if that data is not filtered out, it becomes disruptive noise in the system. (3) Bad hardware injection – Erroneously generated sensor data can be pumped into a system by malicious actors with the intent to create disruptive noise in critical systems. (4) Data gravity – The weight of the data collected will affect data mobility. (5) Cost inhibitors – Running services that need huge centralized computing is cost-inhibiting; large, complex AI can be extremely expensive to run. Active risks: Denial of service – one of the simplest attacks, where an attacker just overloads the system with bogus requests so that valid requests disappear in the noise. Malware – anything from simple viruses to complex botnets created with specific goals, where the creator is stealing computing power and bandwidth from you to attack someone else. Ransomware – a kind of malware, but so different in its implementation that it is worth its own mention; the goal of these pieces of software is to encrypt your system so that it can only be unlocked with a key that is held for ransom. DNS spoofing – by spoofing DNS calls, valid requests and data dumps can be sent to bad destinations, where the data can be extracted for extortion or corrupted and re-injected into a running system, creating a data echo noise loop. After testing multiple potential solutions, we found that the most prominent solution to these risks was to use a peer-to-peer consensus algorithm over a blockchain to validate the data and behavior of the devices (sensors, storage, and computing) in the system. By the devices autonomously policing themselves for deviant behavior, all risks listed above can be negated. In conclusion, an Internet middleware that provides these features would be an easy and secure solution to any future autonomous IoT deployments, as it provides separation from the open Internet while, at the same time, remaining accessible via the blockchain keys.
Keywords: IoT, security, infrastructure, SCADA, blockchain, AI
Procedia PDF Downloads 108
29837 Exergy Analysis of Reverse Osmosis for Potable Water and Land Irrigation
Authors: M. Sarai Atab, A. Smallbone, A. P. Roskilly
Abstract:
A thermodynamic study is performed on the Reverse Osmosis (RO) desalination process for brackish water. A detailed RO model of thermodynamic properties, with and without an energy recovery device, was built in Simulink/MATLAB and validated against reported measurement data. The efficiency of desalination plants can be estimated by both the first and second laws of thermodynamics. While the first law focuses on the quantity of energy, the second-law analysis (i.e., exergy analysis) introduces quality. This paper used the Main Outfall Drain in Iraq as a case study to conduct energy and exergy analyses of the RO process. The results show that it is feasible to use an energy recovery method for reverse osmosis at salinities below 15,000 ppm, as the exergy efficiency roughly doubles. Moreover, the analysis shows that the highest exergy destruction occurs in the rejected water and the lowest in the permeate stream, accounting for 37% and 4.3%, respectively.
Keywords: brackish water, exergy, irrigation, reverse osmosis (RO)
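For reference, the standard second-law quantities used in this type of analysis (textbook definitions, not taken from the paper) are the specific flow exergy, the exergy destruction rate, and the second-law (exergy) efficiency of the desalination unit:

\psi = (h - h_0) - T_0 (s - s_0), \qquad \dot{X}_{\mathrm{dest}} = T_0 \, \dot{S}_{\mathrm{gen}}, \qquad \eta_{II} = \frac{\dot{W}_{\mathrm{min,\,separation}}}{\dot{W}_{\mathrm{actual}}}

where the subscript 0 denotes the dead (ambient) state and \dot{W}_{\mathrm{min,\,separation}} is the minimum (reversible) work of separation.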
Procedia PDF Downloads 178
29836 Optimization of Enzymatic Hydrolysis of Cooked Porcine Blood to Obtain Hydrolysates with Potential Biological Activities
Authors: Miguel Pereira, Lígia Pimentel, Manuela Pintado
Abstract:
Animal blood is a major by-product of slaughterhouses and still represents a cost and environmental problem in some countries. To be eliminated, blood must be stabilised by cooking, and afterwards the slaughterhouses must pay for its incineration. In order to reduce elimination costs and valorise the high protein content, the aim of this study was to optimize the hydrolysis conditions, in terms of enzyme ratio and time, to obtain hydrolysates with biological activity. Two enzymes were tested in this assay: pepsin and proteases from Cynara cardunculus (cardosins). The latter has the advantage of being widely used in the Portuguese dairy industry and has a low price. The screening assays were carried out over a time range of 0 to 10 h and with an enzyme/reaction volume ratio between 0 and 5%. The assays were performed at the optimal pH and temperature for each enzyme: 55 °C at pH 5.2 for cardosins and 37 °C at pH 2.0 for pepsin. After reaction, the hydrolysates were evaluated by FPLC (Fast Protein Liquid Chromatography) and tested for their antioxidant activity by the ABTS method. FPLC chromatograms showed different profiles when comparing the enzymatic reactions with the control (no enzyme added). The chromatograms exhibited new peaks with lower MW that were not present in the control samples, demonstrating hydrolysis by both enzymes. Regarding the antioxidant activity, the best results for both enzymes were obtained using an enzyme/reaction volume ratio of 5% and 5 h of hydrolysis. However, extending the reaction did not significantly affect the antioxidant activity, which is industrially relevant with respect to process cost. In conclusion, enzymatic blood hydrolysis can be a better alternative to the current elimination process, allowing the industry to reuse an ingredient with biological properties and economic value.
Keywords: antioxidant activity, blood, by-products, enzymatic hydrolysis
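The ABTS antioxidant result is normally reported as a radical scavenging percentage computed from the absorbance readings (a standard relation, not stated in the abstract):

\text{Scavenging activity } (\%) = \frac{A_{\mathrm{control}} - A_{\mathrm{sample}}}{A_{\mathrm{control}}} \times 100

where A_{\mathrm{control}} is the absorbance of the ABTS radical solution without hydrolysate and A_{\mathrm{sample}} is the absorbance after reaction with the hydrolysate.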
Procedia PDF Downloads 514
29835 Research on Intercity Travel Mode Choice Behavior Considering Traveler’s Heterogeneity and Psychological Latent Variables
Authors: Yue Huang, Hongcheng Gan
Abstract:
The new urbanization pattern has led to rapid growth in demand for short-distance intercity travel, and the emergence of new travel modes has also increased the variety of intercity travel options. In previous studies on intercity travel mode choice behavior, the impact of the functional amenities of a travel mode and travelers’ long-term personality characteristics has rarely been considered, and empirical results have typically been calibrated using revealed preference (RP) or stated preference (SP) data. This study designed a questionnaire combining RP and SP experiments from the perspective of a trip chain linking inner-city and intercity mobility, with consideration for the actual conditions of the Huainan-Hefei traffic corridor. On the basis of RP/SP fusion data, a hybrid choice model considering both random taste heterogeneity and psychological characteristics was established to investigate travelers’ mode choice behavior for traditional train, high-speed rail, intercity bus, private car, and intercity online car-hailing. The findings show that intercity time and cost exert the greatest influence on mode choice, with significant heterogeneity across the population. Although inner-city cost does not demonstrate a significant influence, inner-city time plays an important role. Service attributes of a travel mode, such as catering and hygiene services, as well as free wireless network supply, play only a minor role in mode selection. Finally, our study demonstrates that safety-seeking tendency, hedonism, and introversion all have differential and significant effects on intercity travel mode choice.
Keywords: intercity travel mode choice, stated preference survey, hybrid choice model, RP/SP fusion data, psychological latent variable, heterogeneity
Procedia PDF Downloads 114
29834 Sustainable Material Selection for Buildings: Analytic Network Process Method and Life Cycle Assessment Approach
Authors: Samira Mahmoudkelayeh, Katayoun Taghizade, Mitra Pourvaziri, Elnaz Asadian
Abstract:
Over recent decades, the depletion of resources and environmental concerns have led researchers and practitioners to propose sustainable approaches. Since the construction process consumes a great deal of both renewable and non-renewable resources, it is of great significance regarding environmental impacts. Choosing sustainable construction materials is a remarkable strategy presented in many studies and has a significant effect on a building’s environmental footprint. This paper presents an assessment framework for selecting the best sustainable materials for the exterior enclosure in the city of Tehran, based on sustainability principles (eco-friendly, cost-effective and socio-culturally viable solutions). To perform a comprehensive analysis of environmental impacts, life cycle assessment, a cradle-to-grave approach, is used. A questionnaire survey of construction experts was conducted to determine the relative importance of the criteria. The Analytic Network Process (ANP) is applied as a multi-criteria decision-making method to choose sustainable materials, as it considers the interdependencies of criteria and sub-criteria. Finally, it prioritizes and aggregates the relevant criteria into an ultimate assessed score.
Keywords: sustainable materials, building, analytic network process, life cycle assessment
Procedia PDF Downloads 245
29833 Sustainable Solutions for Enhancing Efficiency, Safety, and Quality of Construction Value Chain Services Integration
Authors: Lo Kar Yin
Abstract:
In view of the increasing speed and quantity of housing supply, building, and civil engineering infrastructure works triggered by the pandemic across the globe, contractors, professional services providers (PSP), including consultants (e.g., architect, project manager, civil/geotechnical/structural engineer, building services engineer, quantity surveyor/cost manager, etc.), and suppliers have faced tremendous challenges from a fierce market and limited manpower and resources, under fluctuating contract prices and competitive fees. Using qualitative analysis, this paper reviews the available information from industry stakeholders with a view to finding solutions for enhancing the efficiency, safety, and quality of construction value chain services for the sustainable growth of public and private organizations/companies, not limited to checking the deliverables and data transfer from multi-disciplinary parties. Technology, contracts, and people are the key requirements for shaping the construction industry. With the integration of a modern engineering contract (e.g., NEC) collaborative approach, practical workflows are designed to address loopholes, together with different levels of people employment/retention and technology adoption, to achieve the best value for money.
Keywords: efficiency, safety, quality, technology, contract, people, sustainable solutions, construction, services, integration
Procedia PDF Downloads 139
29832 Produce Large Surface Area Activated Carbon from Biomass for Water Treatment
Authors: Rashad Al-Gaashani
Abstract:
The physicochemical activation method was used to produce high-quality activated carbon (AC) with a large surface area of about 2000 m2/g from low-cost and abundant biomass wastes in Qatar, namely date seeds. X-ray diffraction (XRD), scanning electron microscopy (SEM), energy dispersive X-ray spectroscopy (EDS), and Brunauer-Emmett-Teller (BET) surface area analysis were used to evaluate the AC samples. AC produced from date seeds has a wide range of pores, including micro- and nano-pores. This type of AC with a well-developed pore structure may be very attractive for different applications, including air and water purification from micro- and nano-pollutants. Iron (III) and copper (II) heavy metal ions were removed from wastewater using the produced AC in a batch adsorption technique. The AC produced from date seed biomass wastes shows high removal of heavy metals such as iron (III) ions (100%) and copper (II) ions (97.25%). The highest removal of copper (II) ions (100%) with AC produced from date seeds was found at pH 8, whereas the lowest removal (22.63%) occurred at pH 2. The effects of adsorption time, adsorbent dose, and pH on the removal of heavy metals were studied.
Keywords: activated carbon, date seeds, biomass, heavy metals removal, water treatment
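The batch adsorption results are typically quantified with the standard removal efficiency and equilibrium uptake relations (textbook forms, not given in the abstract):

R\,(\%) = \frac{C_0 - C_e}{C_0} \times 100, \qquad q_e = \frac{(C_0 - C_e)\, V}{m}

where C_0 and C_e are the initial and equilibrium metal concentrations (mg/L), V the solution volume (L), and m the adsorbent mass (g).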
Procedia PDF Downloads 81
29831 Effects of Inlet Filtration Pressure Loss on Single and Two-Spool Gas Turbine
Authors: Enyia James Diwa, Dodeye Ina Igbong, Archibong Archibong Eso
Abstract:
Gas turbine operators have faced dramatic financial setbacks resulting from compressor fouling. A highly deregulated power industry with stiff market competition has made it imperative to devise means of reducing maintenance costs in order to yield maximum profit. Compressor fouling results from the deposition of contaminants, in the presence of oil and moisture, on the compressor blade or annulus surfaces, which leads to a loss in flow capacity and compressor efficiency. These combined effects reduce power output, increase heat rate and cause creep life reduction. This paper also contains a model of two gas turbine engines built in TURBOMATCH, Cranfield University simulation software for assessing the engine fouling rate. The modelled engines are of different configurations and capacities and are operated in two different modes, constant output power and constant turbine inlet temperature, for two- and three-stage filter systems. The idea is to investigate which filtration system is more economically viable for gas turbine users, based on performance alone. The results demonstrate that the two-spool engine is slightly more beneficial than the single-spool engine. This is a result of the higher pressure ratio of the two-spool configuration, as well as the deceleration of the high-pressure compressor and high-pressure turbine speed at constant TET. Meanwhile, the inlet filtration system was properly designed and balanced with a well-timed and economical compressor washing regime/scheme to control compressor fouling. The different technologies of inlet air filtration and compressor washing are considered, and an attempt at optimization with respect to the cost of a combination of both control measures is made.
Keywords: inlet filtration, pressure loss, single spool, two spool
Procedia PDF Downloads 326
29830 Assessing the Adaptive Re-Use Potential of Buildings as Part of the Disaster Management Process
Authors: A. Esra İdemen, Sinan M. Şener, Emrah Acar
Abstract:
The technological paradigm of the disaster management field, especially in the case of governmental intervention strategies, is generally based on rapid and flexible accommodation solutions. Among the various technical solution patterns used to address the immediate housing needs of disaster victims, the adaptive re-use of existing buildings can be considered both low-cost and practical. However, there is a scarcity of analytical methods to screen, select and adapt buildings to help decision makers in cases of emergency. Following an extensive literature review, this paper aims to highlight key points and problem areas associated with the adaptive re-use of buildings within the disaster management context. In other disciplines, such as real estate management, the adaptive re-use potential (ARP) of existing buildings is typically based on the prioritization of a set of technical and non-technical criteria which are then weighted to arrive at an economically viable investment decision. After a disaster, however, the assessment of the ARP of buildings requires consideration of different and additional layers of analysis which stem from general disaster management principles and the peculiarities of different types of disasters, as well as of their victims. In this paper, a discussion of the development of an adaptive re-use potential (ARP) assessment model is presented. It is thought that governmental and non-governmental decision makers who are required to take quick decisions to accommodate displaced masses following disasters are likely to benefit from the implementation of such a model.
Keywords: adaptive re-use of buildings, disaster management, temporary housing, assessment model
Procedia PDF Downloads 336
29829 Polystyrene Paste as a Substitute for a Portland Cement: A Solution to the Nigerian Dilemma
Authors: Lanre Oluwafemi Akinyemi
Abstract:
The reduction of limestone to cement in Nigeria is expensive and requires huge amounts of energy. This significantly affects the cost of cement. Concrete is heavy: a cubic foot of it weighs about 150 lbs. and a cubic yard about 4000 lbs. Thus a ready-mix truck carrying 9 cubic yards is carrying 36,000 lbs excluding the weight of the truck itself, thereby also accumulating cost for manufacturers. Therein lies the need to find a substitute for cement by using a polystyrene paste that benefits both manufacturers and consumers. Polystyrene Paste Constructional Cement (PPCC), a patented material obtained by dissolving waste EPS in a volatile organic solvent, has recently been identified as a suitable binder/cement for construction and building material production. This paper illustrates the procedures of a test experiment undertaken to determine the splitting tensile strength of PPCC mortar compared to that of OPC (Ordinary Portland Cement). Expanded polystyrene was dissolved in gasoline to form a paste referred to as Polystyrene Paste Constructional Cement (PPCC). Mortars of mix ratios 1:4, 1:5, 1:6, and 1:7 (PPCC : fine aggregate), batched by volume, were used to produce 50 mm x 100 mm cylindrical PPCC mortar splitting tensile strength specimens. The control experiment was done by creating another series of cylindrical OPC mortar splitting tensile strength specimens following the same mix ratios used earlier. The PPCC splitting tensile strength specimens were left to air-set, and the ones made with Ordinary Portland Cement (OPC) were demolded after 24 hours and cured in water. The cylindrical PPCC splitting tensile strength specimens were tested at 28 days and compared with the Ordinary Portland Cement splitting tensile strength specimens. The results show that, for these two mixes, PPCC exhibits better binding properties than OPC. With this new invention, I recommend the use of PPCC as a substitute for Portland cement.
Keywords: polystyrene paste, Portland cement, construction, mortar
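For the 50 mm x 100 mm cylinders described here, the splitting tensile strength is normally computed from the peak load with the standard relation (a textbook formula, not quoted in the abstract):

f_t = \frac{2P}{\pi \, L \, D}

where P is the maximum applied load, L the specimen length (100 mm), and D the diameter (50 mm).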
Procedia PDF Downloads 165
29828 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment
Authors: Shishen Xie, Yingda L. Xie
Abstract:
Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The Interferon Gamma Release Assay (IGRA) is a blood test to determine whether an individual is tuberculosis positive or negative. This study applies statistical analysis to clinical data on the interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB and undergoing anti-tuberculous treatment. Data analysis is performed to determine whether there is a significant decline in interferon-gamma levels for the subjects over a period of six months, and to infer whether the anti-tuberculous treatment is effective.
Keywords: data analysis, interferon gamma release assay, statistical methods, tuberculosis infection
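A minimal sketch of the paired comparison implied here, baseline versus six-month interferon-gamma levels, using invented values rather than the study's clinical data:

# Paired t-test on baseline vs. 6-month interferon-gamma levels (illustrative values, not study data)
import numpy as np
from scipy import stats

baseline = np.array([12.4, 8.7, 15.1, 9.9, 11.3, 14.8, 10.2, 13.6])  # IU/mL, hypothetical
month_6  = np.array([ 9.8, 7.9, 12.0, 9.1, 10.5, 11.7,  9.6, 11.2])  # IU/mL, hypothetical

t_stat, p_value = stats.ttest_rel(baseline, month_6)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 would indicate a significant decline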
Procedia PDF Downloads 308
29827 Multivariate Analysis of Spectroscopic Data for Agriculture Applications
Authors: Asmaa M. Hussein, Amr Wassal, Ahmed Farouk Al-Sadek, A. F. Abd El-Rahman
Abstract:
In this study, a multivariate analysis of potato spectroscopic data is presented to detect the presence or absence of brown rot disease. Near-infrared (NIR) spectroscopy (1,350-2,500 nm) combined with multivariate analysis was used as a rapid, non-destructive technique for the detection of brown rot disease in potatoes. Spectral measurements were performed on 565 samples, chosen randomly at the infection site on the potato slice. In this study, 254 infected and 311 uninfected (brown rot-free) samples were analyzed using different advanced statistical analysis techniques. The discrimination performance of different multivariate analysis techniques, including classification, pre-processing, and dimension reduction, was compared. Applying a random forest classifier with different pre-processing techniques to the raw spectra gave the best performance, with a total classification accuracy of 98.7% achieved in discriminating infected potatoes from controls.
Keywords: brown rot disease, NIR spectroscopy, potato, random forest
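A minimal sketch of such a discrimination pipeline (synthetic arrays standing in for the NIR spectra; the actual pre-processing, dimension-reduction settings and hyper-parameters of the study are assumptions here):

# Illustrative NIR classification pipeline: scaling + PCA + random forest (synthetic data)
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(565, 600))        # 565 spectra x 600 wavelengths (placeholder values)
y = np.array([1] * 254 + [0] * 311)    # 254 infected, 311 uninfected, as in the study

model = make_pipeline(StandardScaler(), PCA(n_components=20),
                      RandomForestClassifier(n_estimators=200, random_state=0))
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy:", scores.mean().round(3))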
Procedia PDF Downloads 193
29826 Impact of Transitioning to Renewable Energy Sources on Key Performance Indicators and Artificial Intelligence Modules of Data Center
Authors: Ahmed Hossam ElMolla, Mohamed Hatem Saleh, Hamza Mostafa, Lara Mamdouh, Yassin Wael
Abstract:
Artificial intelligence (AI) is reshaping industries, and its potential to revolutionize renewable energy and data center operations is immense. By harnessing AI's capabilities, we can optimize energy consumption, predict fluctuations in renewable energy generation, and improve the efficiency of data center infrastructure. This convergence of technologies promises a future where energy is managed more intelligently, sustainably, and cost-effectively. The integration of AI into renewable energy systems unlocks a wealth of opportunities. Machine learning algorithms can analyze vast amounts of data to forecast weather patterns, solar irradiance, and wind speeds, enabling more accurate energy production planning. AI-powered systems can optimize energy storage and grid management, ensuring a stable power supply even during intermittent renewable generation. Moreover, AI can identify maintenance needs for renewable energy infrastructure, preventing costly breakdowns and maximizing system lifespan. Data centers, which consume substantial amounts of energy, are prime candidates for AI-driven optimization. AI can analyze energy consumption patterns, identify inefficiencies, and recommend adjustments to cooling systems, server utilization, and power distribution. Predictive maintenance using AI can prevent equipment failures, reducing energy waste and downtime. Additionally, AI can optimize data placement and retrieval, minimizing energy consumption associated with data transfer. As AI transforms renewable energy and data center operations, modified Key Performance Indicators (KPIs) will emerge. Traditional metrics like energy efficiency and cost-per-megawatt-hour will continue to be relevant, but additional KPIs focused on AI's impact will be essential. These might include AI-driven cost savings, predictive accuracy of energy generation and consumption, and the reduction of carbon emissions attributed to AI-optimized operations. By tracking these KPIs, organizations can measure the success of their AI initiatives and identify areas for improvement. Ultimately, the synergy between AI, renewable energy, and data centers holds the potential to create a more sustainable and resilient future. By embracing these technologies, we can build smarter, greener, and more efficient systems that benefit both the environment and the economy.
Keywords: data center, artificial intelligence, renewable energy, energy efficiency, sustainability, optimization, predictive analytics, energy consumption, energy storage, grid management, data center optimization, key performance indicators, carbon emissions, resiliency
Procedia PDF Downloads 38
29825 Onco@Home: Comparing the Costs, Revenues, and Patient Experience of Cancer Treatment at Home with the Standard of Care
Authors: Sarah Misplon, Wim Marneffe, Johan Helling, Jana Missiaen, Inge Decock, Dries Myny, Steve Lervant, Koen Vaneygen
Abstract:
The aim of this study was twofold. First, we investigated whether the current funding from the national health insurance (NHI) of home hospitalization (HH) for oncological patients is sufficient in Belgium. Second, we compared patients' experiences and preferences of HH with the standard of care (SOC). Two HH models were examined in three Belgian hospitals and three home nursing organizations. In the first HH model, the blood draw and monitoring prior to intravenous therapy were performed by a trained home nurse at the patient's home the day before the visit to the day hospital. In the second HH model, the administration of two subcutaneous treatments was partly provided at home instead of in the hospital. Therefore, we conducted (1) a bottom-up micro-costing study to compare the costs and revenues for the providers (hospitals and home care organizations), and (2) a cross-sectional survey to compare patients' experiences and preferences in the SOC group and the HH group. Our results show that HH patients prefer HH and that none of them wanted to return to the SOC, although patient satisfaction was not significantly different between the two groups. At the same time, we find that the costs associated with HH are higher overall. Comparing revenues with costs, we conclude that the current funding from the NHI of HH for oncological patients is insufficient.
Keywords: cost analysis, health insurance, preference, home hospitalization
Procedia PDF Downloads 126
29824 Modeling and Statistical Analysis of a Soap Production Mix in Bejoy Manufacturing Industry, Anambra State, Nigeria
Authors: Okolie Chukwulozie Paul, Iwenofu Chinwe Onyedika, Sinebe Jude Ebieladoh, M. C. Nwosu
Abstract:
The research work is based on the statistical analysis of the processing data. The essence is to analyze the data statistically and to generate a design model for the production mix of soap manufacturing products in the Bejoy manufacturing company, Nkpologwu, Aguata Local Government Area, Anambra State, Nigeria. The statistical analysis shows the distribution and correlation of the data. T-tests, partial correlation, and bivariate correlation were used to understand what the data portray. The design model developed was used to model the production yield, and the correlation of the variables shows that the R² is 98.7%. The results confirm that the data are fit for further analysis and modeling, as proved by the correlation and the R-squared value.
Keywords: general linear model, correlation, variables, Pearson, significance, T-test, soap, production mix, statistics
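A minimal sketch of this type of analysis, an ordinary-least-squares production-mix model with an R-squared and a Pearson correlation check, using invented mix data rather than the company's records:

# Linear model of production yield vs. two mix variables (illustrative data, not plant records)
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(1)
oil = rng.uniform(40, 60, 30)             # % oil in the mix (hypothetical)
lye = rng.uniform(10, 20, 30)             # % lye in the mix (hypothetical)
yield_ = 0.8 * oil + 1.5 * lye + rng.normal(0, 1.0, 30)

X = sm.add_constant(np.column_stack([oil, lye]))
model = sm.OLS(yield_, X).fit()
print("R-squared:", round(model.rsquared, 3))
print("Pearson r (oil vs. yield):", round(stats.pearsonr(oil, yield_)[0], 3))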
Procedia PDF Downloads 451
29823 New Media and the Personal Vote in General Elections: A Comparison of Constituency Level Candidates in the United Kingdom and Japan
Authors: Sean Vincent
Abstract:
Within the academic community, there is a consensus that political parties in established liberal democracies are facing a myriad of organisational challenges as a result of falling membership, weakening links to grass-roots support and rising voter apathy. During the same period of party decline and growing public disengagement, political parties have become increasingly professionalised. The professionalisation of political parties owes much to changes in technology, with television becoming the dominant medium for political communication. In recent years, however, it has become clear that a new medium of communication is being utilised by political parties and candidates: New Media. New Media, a term hard to define but related to internet-based communication, offers a potential revolution in political communication. It can be utilised by anyone with access to the internet, and its most widely used communication platforms, such as Facebook and Twitter, are free to use. The advent of Web 2.0 has dramatically changed what can be done with the Internet. Websites now allow candidates at the constituency level to fundraise, organise and set out personalised policies. Social media allows them to communicate with supporters and potential voters practically cost-free. As such, candidate dependency on the national party for resources and image now lies open to debate. Arguing that greater candidate independence may be a natural next step in light of the contemporary challenges faced by parties, this paper examines how New Media is being used by candidates at the constituency level to increase their personal vote. The paper presents findings from research carried out during two elections: the Japanese Lower House election of 2014 and the UK general election of 2015. During these elections, a sample totalling 150 candidates from the three biggest parties in each country was selected, and their new media output (specifically candidate websites, Twitter and Facebook) was subjected to content analysis. The analysis examines how candidates are using new media to become more functionally independent from the national party, through fundraising and volunteer mobilisation, and more politically independent, through the promotion of personal/local policies. In order to validate the results of the content analysis, this paper also presents evidence from interviews carried out with 17 candidates who stood in the 2014 Japanese Lower House election or the 2015 UK general election. With a combination of statistical analysis and interviews, several conclusions can be made about the use of New Media at the constituency level. The findings show not just a clear difference in the way candidates from each country are using New Media but also differences within countries based upon the particular circumstances of each constituency. While it has not yet replaced traditional methods of fundraising and activist mobilisation, New Media is also becoming increasingly important in campaign organisation, and the general consensus amongst candidates is that its importance will continue to grow as politics in both countries becomes more diffuse.
Keywords: political campaigns, elections, new media, political communication
Procedia PDF Downloads 231
29822 Retrofitting Cement Plants with Oxyfuel Technology for Carbon Capture
Authors: Peloriadi Konstantina, Fakis Dimitris, Grammelis Panagiotis
Abstract:
Methods for carbon capture and storage (CCS) can play a key role in the reduction of industrial CO₂ emissions, especially in the cement industry, which accounts for 7% of global emissions. Cement industries around the world have committed to addressing this problem by reaching carbon neutrality by the year 2050. The aim of the work presented was to contribute to the decarbonization strategy by integrating first-generation oxyfuel technology into cement production plants. This technology has been shown to improve fuel efficiency while providing one of the most cost-effective solutions when compared to other capture methods. A validated simulation of the cement plant was thus used as a basis to develop an oxyfuel-retrofitted cement process. The process model for the oxyfuel technology was developed in the ASPEN (Advanced System for Process Engineering) PLUS™ simulation software. The process consists of an Air Separation Unit (ASU), an oxyfuel cement plant with coal and alternative solid fuel (ASF) as feedstock, and a carbon dioxide processing unit (CPU). A detailed description and analysis of the CPU will be presented, including the findings of a literature review and simulation results regarding the effects of flue gas impurities during operation. Acknowledgment: This research has been conducted in the framework of the EU-funded AC2OCEM project, which investigates first- and second-generation oxyfuel concepts.
Keywords: oxyfuel technology, carbon capture and storage, CO₂ processing unit, cement, aspen plus
Procedia PDF Downloads 199
29821 Same-Day Detection Method of Salmonella Spp., Shigella Spp. and Listeria Monocytogenes with Fluorescence-Based Triplex Real-Time PCR
Authors: Ergun Sakalar, Kubra Bilgic
Abstract:
Faster detection and characterization of pathogens are the basis for avoiding foodborne illness. Salmonella spp., Shigella spp. and Listeria monocytogenes are common foodborne bacteria that are among the most life-threatening. Rapid and accurate detection of these pathogens is important to prevent food poisoning and outbreaks and to manage food chains. The present work promises to develop a sensitive, species-specific and reliable PCR-based detection system for the simultaneous detection of Salmonella spp., Shigella spp. and Listeria monocytogenes. For this purpose, three genes were selected: ompC for Salmonella spp., ipaH for Shigella spp. and hlyA for L. monocytogenes. After a short pre-enrichment, the milk was passed through a vacuum filter and bacterial DNA was extracted using the commercially available kit GIDAGEN® (Istanbul, Turkey). Detection of the amplicons was verified by examination of their melting temperatures (Tm), which are 72 °C, 78 °C and 82 °C for Salmonella spp., Shigella spp. and L. monocytogenes, respectively. The specificity of the method was checked against a group of bacterial strains, and a sensitivity test was also carried out, yielding detection limits below 10² CFU mL⁻¹ of milk for each bacterial strain. Our results show that the fluorescence-based triplex qPCR method can be used routinely to detect Salmonella spp., Shigella spp. and L. monocytogenes during milk processing procedures in order to reduce the cost and time of analysis and the risk of foodborne disease outbreaks.
Keywords: EvaGreen, food-borne bacteria, pathogen detection, real-time PCR
Procedia PDF Downloads 245
29820 Developing an Automated Protocol for the Wristband Extraction Process Using Opentrons
Authors: Tei Kim, Brooklynn McNeil, Kathryn Dunn, Douglas I. Walker
Abstract:
To better characterize the relationship between complex chemical exposures and disease, our laboratory uses an approach that combines low-cost polydimethylsiloxane (silicone) wristband samplers, which absorb many of the chemicals we are exposed to, with untargeted high-resolution mass spectrometry (HRMS) to characterize thousands of chemicals at a time. In studies with human populations, these wristbands can provide an important measure of our environment; however, there is a need to use this approach in large cohorts to study exposures associated with disease. To facilitate the use of silicone samplers in large-scale population studies, the goal of this research project was to establish automated sample preparation methods that improve the throughput, robustness, and scalability of analytical methods for silicone wristbands. Using the Opentrons OT-2 automated liquid handling platform, which provides a low-cost and open-source framework for automated pipetting, we created two separate workflows that translate the manual wristband preparation method into a fully automated protocol requiring only minor intervention by the operator. These protocols include a sequence generation step, which defines the location of all plates and labware according to user-specified settings, and a transfer protocol that includes all necessary instrument parameters and instructions for automated solvent extraction of wristband samplers. These protocols were written in Python and uploaded to GitHub for use by others in the research community. Results from this project show it is possible to establish automated, open-source methods for the preparation of silicone wristband samplers to support profiling of many environmental exposures. Ongoing studies include deployment in longitudinal cohort studies to investigate the relationship between personal chemical exposure and disease.
Keywords: bioinformatics, automation, opentrons, research
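A minimal sketch of what such a transfer protocol looks like in the Opentrons Python API (the labware, deck slots, solvent and volumes below are illustrative assumptions, not the authors' published workflow):

# Illustrative OT-2 protocol: dispense extraction solvent onto wristband pieces in a deep-well plate.
# Labware, slots, and volumes are placeholders, not the authors' published settings.
from opentrons import protocol_api

metadata = {"apiLevel": "2.13", "protocolName": "Wristband solvent extraction (sketch)"}

def run(protocol: protocol_api.ProtocolContext):
    tips = protocol.load_labware("opentrons_96_tiprack_1000ul", "1")
    plate = protocol.load_labware("nest_96_wellplate_2ml_deep", "2")   # wristband pieces
    solvent = protocol.load_labware("nest_12_reservoir_15ml", "3")     # extraction solvent reservoir
    p1000 = protocol.load_instrument("p1000_single_gen2", "left", tip_racks=[tips])

    # Add 1 mL of solvent to the first 24 sample wells, one fresh tip per sample.
    for well in plate.wells()[:24]:
        p1000.transfer(1000, solvent["A1"], well, new_tip="always")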
Procedia PDF Downloads 119
29819 Utilizing Temporal and Frequency Features in Fault Detection of Electric Motor Bearings with Advanced Methods
Authors: Mohammad Arabi
Abstract:
The development of advanced technologies in the field of signal processing and vibration analysis has enabled more accurate analysis and fault detection in electrical systems. This research investigates the application of temporal and frequency features in detecting faults in electric motor bearings, aiming to enhance fault detection accuracy and prevent unexpected failures. The use of methods such as deep learning algorithms and neural networks in this process can yield better results. The main objective of this research is to evaluate the efficiency and accuracy of methods based on temporal and frequency features in identifying faults in electric motor bearings to prevent sudden breakdowns and operational issues. Additionally, the feasibility of using techniques such as machine learning and optimization algorithms to improve the fault detection process is also considered. This research employed an experimental method and random sampling. Vibration signals were collected from electric motors under normal and faulty conditions. After standardizing the data, temporal and frequency features were extracted. These features were then analyzed using statistical methods such as analysis of variance (ANOVA) and t-tests, as well as machine learning algorithms like artificial neural networks and support vector machines (SVM). The results showed that using temporal and frequency features significantly improves the accuracy of fault detection in electric motor bearings. ANOVA indicated significant differences between normal and faulty signals. Additionally, t-tests confirmed statistically significant differences between the features extracted from normal and faulty signals. Machine learning algorithms such as neural networks and SVM also significantly increased detection accuracy, demonstrating high effectiveness in timely and accurate fault detection. This study demonstrates that using temporal and frequency features combined with machine learning algorithms can serve as an effective tool for detecting faults in electric motor bearings. This approach not only enhances fault detection accuracy but also simplifies and streamlines the detection process. However, challenges such as data standardization and the cost of implementing advanced monitoring systems must also be considered. Utilizing temporal and frequency features in fault detection of electric motor bearings, along with advanced machine learning methods, offers an effective solution for preventing failures and ensuring the operational health of electric motors. Given the promising results of this research, it is recommended that this technology be more widely adopted in industrial maintenance processes.
Keywords: electric motor, fault detection, frequency features, temporal features
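A minimal sketch of the feature-based approach described above (synthetic vibration records in place of measured signals; RMS and kurtosis as temporal features, a spectral peak as the frequency feature, and an SVM classifier):

# Illustrative bearing-fault pipeline: time/frequency features + SVM (synthetic signals, not measured data)
import numpy as np
from scipy.stats import kurtosis
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs, n = 10_000, 4096                        # sampling rate (Hz) and samples per record (assumed)

def make_signal(faulty):
    t = np.arange(n) / fs
    sig = np.sin(2 * np.pi * 50 * t) + 0.3 * rng.normal(size=n)
    if faulty:                              # add periodic impacts typical of a bearing defect
        sig += 0.8 * (np.sin(2 * np.pi * 157 * t) > 0.99)
    return sig

def features(sig):
    spectrum = np.abs(np.fft.rfft(sig))
    return [np.sqrt(np.mean(sig ** 2)),     # RMS (temporal)
            kurtosis(sig),                  # kurtosis (temporal)
            spectrum[100:].max()]           # dominant higher-frequency peak (frequency domain)

X = np.array([features(make_signal(faulty)) for faulty in [0, 1] * 50])
y = np.array([0, 1] * 50)
print("CV accuracy:", cross_val_score(SVC(), X, y, cv=5).mean().round(3))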
Procedia PDF Downloads 56
29818 Exploring Intercultural Communication and Organizational Challenges of Women's Stereotypes: Gendered Expectancies
Authors: Andrew Enaifoghe
Abstract:
Women's roles in both past and modern society have typically been subordinate to men's. This form of discrimination against women prevented them from taking on leadership roles, as these were considered male roles. However, some theories, like social thought, suggest that human minds form a map during socialization, in which each category of things/objects is represented in schemata or nodes. These representations or nodes are interrelated, subject to the probability of their developing together, and formed on the basis of previous experiences. The consequences of gender roles and the threat of stereotyping in the workplace are debated by the researcher. The study also looks at the effects of stereotypes beyond test performance, and at brief, low-cost socio-cultural interventions in the working environment through organizational and intercultural communication. This study adopted a qualitative research method with a systematic document analysis, which allows researchers to conduct studies by consulting and making sense of written materials available in the public or private domain. The study employed Social Identity Theory (SIT) and Organizational Control Theory to conceptualize the paper. The study found that when women use an interpersonally oriented leadership style in male-dominated industries, they suffer from high levels of mental ill-health and continue to endure significant pressure from their professions.
Keywords: gender roles, stereotyping, organizational, intercultural communication
Procedia PDF Downloads 24
29817 Treatment with Triton-X 100: An Enhancement Approach for Cardboard Bioprocessing
Authors: Ahlam Said Al Azkawi, Nallusamy Sivakumar, Saif Nasser Al Bahri
Abstract:
Diverse approaches and pathways are under development with the aim of eventually producing cellulosic biofuels and other bio-products at commercial scale in “bio-refineries”; however, the key challenge is the high complexity of feedstock processing, which is complicated and energy-consuming. To overcome the complications of utilizing naturally occurring lignocellulosic biomass, waste paper may be used as a feedstock for bio-production. Besides the material being abundant and cheap, bioprocessing of waste paper has evolved in response to public concern over rising landfill costs due to shrinking landfill capacity. Cardboard (CB) is one of the major components of municipal solid waste and one of the most important items to recycle. Although 50-70% of cardboard is known to consist of cellulose and hemicellulose, the lignin surrounding them causes hydrophobic cross-links that physically obstruct hydrolysis by rendering the material resistant to enzymatic cleavage. Therefore, pretreatment is required to disrupt this resistance and to enhance the exposure of the targeted carbohydrates to the hydrolytic enzymes. Several pretreatment approaches have been explored, and the best ones are those that can improve cellulose conversion rates and hydrolytic enzyme performance with minimal cost and downstream processing. One of the promising strategies in this field is the application of surfactants, especially non-ionic surfactants. In this study, Triton-X 100 was used as a surfactant to treat cardboard prior to enzymatic hydrolysis, and the treatment was compared with acid treatment using 0.1% H2SO4. The effect of the surfactant enhancement was evaluated through the hydrolysis rate over time, through structural changes and modifications observed by scanning electron microscopy (SEM) and X-ray diffraction (XRD), and through compositional analysis. Further work was performed to produce ethanol from CB treated with Triton-X 100 via separate hydrolysis and fermentation (SHF) and simultaneous saccharification and fermentation (SSF). The hydrolysis studies demonstrated an enhancement in saccharification of 35%. After 72 h of hydrolysis, a saccharification rate of 98% was achieved from CB enhanced with Triton-X 100, while only 89% saccharification was achieved from acid-pretreated CB. At 120 h, the saccharification exceeded 100% as reducing sugars continued to increase with time. This enhancement was not accompanied by any significant changes in the cardboard composition, as the cellulose, hemicellulose and lignin contents remained the same after treatment, but obvious structural changes were observed in SEM images. The cellulose fibers were clearly exposed, with much less debris and fewer deposits compared to cardboard without Triton-X 100. The XRD pattern also revealed the ability of the surfactant to remove calcium carbonate, a filler found in waste paper that is known to have a negative effect on enzymatic hydrolysis. The cellulose crystallinity without surfactant was 73.18% and was reduced to 66.68%, rendering the material more amorphous and susceptible to enzymatic attack. Triton-X 100 proved to effectively enhance CB hydrolysis and ultimately had a positive effect on the ethanol yield via SSF. Treating cardboard with Triton-X 100 alone was sufficient to enhance enzymatic hydrolysis and ethanol production.
Keywords: cardboard, enhancement, ethanol, hydrolysis, treatment, Triton-X 100
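The cellulose crystallinity values quoted above are commonly obtained from the XRD intensities with the Segal crystallinity index (a standard relation; the abstract does not state which method the authors used):

\mathrm{CrI}\,(\%) = \frac{I_{002} - I_{\mathrm{am}}}{I_{002}} \times 100

where I_{002} is the intensity of the (002) crystalline reflection (around 22.5° 2θ for cellulose I) and I_{\mathrm{am}} is the amorphous scattering intensity (around 18° 2θ).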
Procedia PDF Downloads 155
29816 The Nexus between Socio-Economic Inequalities and the Talibanization in Pakistan’s Federally Administrated Tribal Areas
Authors: Sajjad Ahmed
Abstract:
Since September 2001, the Federally Administered Tribal Areas (FATA) have become a hotbed of Talibanization. The eruption of Talibanization has imposed catastrophic human and socio-economic costs on Pakistan ever since. The vast majority of extant studies have tended to focus on assessing the current dire and destructive condition of FATA as a product of the notorious 'Global War on Terrorism' and its consequences in the form of the Afghan war and the rising socio-political unrest in the region. This, however, is not the case. This study argues that Talibanization has not happened overnight; the magma of the current militant volcanic outburst has been building up since the inception of Pakistan in 1947. The study claims that Talibanization is the expression of the conflict between the privileged and the underprivileged. The prevailing situation in FATA warrants an in-depth analysis of the problem. Using qualitative and quantitative research principles, this paper attempts to critically examine the question: 'How is Talibanization in Pakistan connected with the political, social, and economic conditions in FATA?' The critical analyses of this study should assist policymakers in formulating all-encompassing anti-radicalization policies to effectively root out Talibanization in FATA. This research intends to explore the undiscovered root causes of the problem and to suggest remedial measures.
Keywords: exclusion, FATA (Federally Administrated Tribal Areas), inequalities, marginalization, Pakistan, socio-economic, talibanization
Procedia PDF Downloads 142