Search results for: nitroglycerine recovery and purge system
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18929

12899 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Because the electrocardiogram (ECG) is relatively simple to record, it is a convenient tool for assessing the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system that automatically distinguishes a normal electrocardiogram signal from an abnormal one. Using this diagnostic system, a person's heart condition can be identified in a very short time and with high accuracy. The data used in this article come from the PhysioNet database, made available in 2016 so that researchers could propose the best method for separating normal signals from abnormal ones. The recordings include subjects of both genders, the recording time varies from several seconds to several minutes, and every record is labeled normal or abnormal. Due to the low positional accuracy and limited duration of the ECG signal, and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart and to differentiate types of heart failure from one another is of great interest to experts. In the preprocessing stage, after noise cancellation with an adaptive Kalman filter and extraction of the R wave with the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. In the processing stage, a new idea was presented: in addition to using the statistical characteristics of the signal, a return map was constructed and nonlinear characteristics of the HRV signal were extracted, owing to the nonlinear nature of the signal. Finally, artificial neural networks, which are widely used in ECG signal processing, together with the distinctive features were used to classify normal signals from abnormal ones. The area under the ROC curve (AUC) was used to evaluate the efficiency of the proposed classifiers. Simulation results in the MATLAB environment showed that the AUC of the MLP neural network and of the SVM was 0.893 and 0.947, respectively. The results of the proposed algorithm also indicated that greater use of nonlinear characteristics in classifying normal and patient signals yielded better performance. Today, research aims to quantitatively analyze the linear and nonlinear, or deterministic and random, nature of the heart rate variability signal, because it has been shown that the magnitude of these properties can indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven further research in this field. The ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease; however, because of its limited recording time and because some of its information is hidden from the physician's view, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy and can be used as a complementary system in treatment centers.
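
As a rough illustration of the pipeline the abstract describes (R-R intervals, statistical and return-map features, MLP/SVM classification, AUC evaluation), the Python sketch below uses synthetic R-R series; the feature set, toy data and scikit-learn models are illustrative assumptions, not the authors' MATLAB implementation.

```python
# Minimal sketch of HRV feature extraction and classification (synthetic data).
import numpy as np
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

def hrv_features(rr):
    """Statistical and return-map (Poincare) descriptors of one R-R series (seconds)."""
    diff = np.diff(rr)
    sdnn = np.std(rr)                      # standard deviation of R-R intervals
    rmssd = np.sqrt(np.mean(diff ** 2))    # root mean square of successive differences
    sd1 = np.sqrt(0.5) * np.std(diff)      # short-term return-map width
    sd2 = np.sqrt(max(2 * sdnn ** 2 - sd1 ** 2, 0.0))  # long-term return-map length
    return [np.mean(rr), sdnn, rmssd, sd1, sd2]

rng = np.random.default_rng(0)
labels = rng.integers(0, 2, 300)           # 0 = normal, 1 = abnormal (synthetic)
X = np.array([hrv_features(0.8 + 0.05 * (1 + lab) * rng.standard_normal(200))
              for lab in labels])
y = labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("MLP", MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)),
                  ("SVM", SVC(probability=True))]:
    clf.fit(X_tr, y_tr)
    print(name, "AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))
```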

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 262
12898 Fault Tree Analysis (FTA) of CNC Turning Center

Authors: R. B. Patil, B. S. Kothavale, L. Y. Waghmode

Abstract:

Today, the CNC turning center has become an important machine tool for the manufacturing industry worldwide. However, the breakdown of a single CNC turning center may result in the production of an entire plant being halted. For this reason, downtime for operations and preventive maintenance has to be minimized to ensure availability of the system. Indeed, improving the availability of the CNC turning center as a whole objectively leads to a substantial reduction in production loss, operating, maintenance and support cost. In this paper, the fault tree analysis (FTA) method is used for reliability analysis of a CNC turning center. The major faults associated with the system and the causes of the faults are presented graphically. Boolean algebra is used for evaluating the fault tree (FT) diagram and for deriving the governing reliability model for the CNC turning center. Failure data over a period of six years has been collected and used for evaluating the model. Qualitative and quantitative analysis is also carried out to identify critical sub-systems and components of the CNC turning center. It is found that, at the end of the warranty period (one year), the reliability of the CNC turning center as a whole is around 0.61628.
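
A minimal sketch of the Boolean-algebra evaluation step the abstract refers to is shown below; the gate structure and the constant failure rates are illustrative assumptions, not the fault tree or reliability model derived in the paper.

```python
# Toy fault-tree evaluation: OR/AND gates over independent basic events,
# with exponential failure models for a few assumed sub-systems.
import math

def p_or(*probs):   # P(A or B) = 1 - prod(1 - Pi) for independent events
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

def p_and(*probs):  # P(A and B) = prod(Pi) for independent events
    out = 1.0
    for p in probs:
        out *= p
    return out

# Assumed constant failure rates (per year) for a few sub-systems
rates = {"spindle": 0.12, "turret": 0.10, "hydraulics": 0.08, "control": 0.18}
t = 1.0  # warranty period, years
p_fail = {k: 1.0 - math.exp(-lam * t) for k, lam in rates.items()}

# Top event: machine down if any critical sub-system fails (series system, OR gate)
top = p_or(*p_fail.values())
print("Reliability at t = 1 year:", round(1.0 - top, 5))
```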

Keywords: fault tree analysis (FTA), reliability analysis, risk assessment, hazard analysis

Procedia PDF Downloads 414
12897 Downward Vertical Evacuation of People with Disabilities from Tsunami Using Escape Bunker Technology

Authors: Febrian Tegar Wicaksana, Niqmatul Kurniati, Surya Nandika

Abstract:

Indonesia is one of the countries with the greatest number of disaster occurrences and threats, because it lies not only at the junction of three tectonic plates, the Eurasian, Indo-Australian and Pacific plates, but also on the Ring of Fire, and is therefore exposed to earthquakes, tsunamis, volcanic eruptions and more. Recent research shows that there are areas on the southern coast of Java that could be devastated by a tsunami. A tsunami is a series of waves in a body of water caused by the displacement of a large volume of water, generally in an ocean. When the waves enter shallow water, they may rise to several feet or, in rare cases, tens of feet, striking the coast with devastating force. The reference parameters include the magnitude, the depth of the epicentre, the distance between the epicentre and land, the water depth at every point, the arrival time at the shore and the growth of the waves. The interaction between these parameters produces a large variance in tsunami waves. Based on this, we can formulate the preparation needed for disaster mitigation strategies. Mitigation strategies play an important role in efforts to reduce the number of victims and the damage in the affected area. The reduction effort is directed at the casualties who are most difficult to mobilize in a tsunami disaster area, such as elderly people, sick people and people with disabilities. Until now, the method used for rescuing people from a tsunami has been basic horizontal evacuation. This evacuation system is not optimal because it takes a long time and cannot be used by people with disabilities. The writers propose a vertical evacuation model with an escape bunker system. The bunker system was chosen because downward vertical evacuation is considered more efficient and faster, especially in coastal areas without any surrounding highlands. The downward evacuation system is better than upward evacuation because it avoids the risk of erosion of the ground around the structure, which can affect the building. The structure of the bunker and the evacuation process during, and even after, the disaster are the main priorities to be considered. The bunker must have earthquake resistance, durability against the water stream, suitability to various ground interactions and a waterproof design. When the situation returns to normal, victims and casualties can move to a safer place. The bunker will be located near hospitals and public places and will have a wide entrance equipped with a large slide so that it is easy for people with disabilities to use. The escape bunker technology is expected to reduce the number of victims with low mobility in a tsunami.

Keywords: escape bunker, tsunami, vertical evacuation, mitigation, disaster management

Procedia PDF Downloads 492
12896 Design of DNA Origami Structures Using LAMP Products as a Combined System for the Detection of Extended Spectrum Beta-Lactamases

Authors: Kalaumari Mayoral-Peña, Ana I. Montejano-Montelongo, Josué Reyes-Muñoz, Gonzalo A. Ortiz-Mancilla, Mayrin Rodríguez-Cruz, Víctor Hernández-Villalobos, Jesús A. Guzmán-López, Santiago García-Jacobo, Iván Licona-Vázquez, Grisel Fierros-Romero, Rosario Flores-Vallejo

Abstract:

The group of β-lactam antibiotics includes some of the small drug molecules most frequently used against bacterial infections. Nevertheless, an alarming decrease in their efficacy has been reported due to the emergence of antibiotic-resistant bacteria. Infections caused by bacteria expressing extended-spectrum β-lactamases (ESBLs) are difficult to treat and account for higher morbidity and mortality rates, delayed recovery, and a high economic burden. According to the Global Report on Antimicrobial Resistance Surveillance, it is estimated that mortality due to resistant bacteria will rise to 10 million cases per year worldwide. These facts highlight the importance of developing low-cost and readily accessible detection methods for drug-resistant ESBL bacteria to prevent their spread and promote accurate and fast diagnosis. Bacterial detection is commonly done using molecular diagnostic techniques, where PCR stands out for its high performance. However, this technique requires specialized equipment not available everywhere, is time-consuming, and has a high cost. Loop-Mediated Isothermal Amplification (LAMP) is an alternative technique that works at a constant temperature, significantly decreasing the equipment cost. It yields double-stranded DNA of several lengths, with repetitions of the target DNA sequence, as a product. Although positive and negative LAMP results can be discriminated by colorimetry, fluorescence, and turbidity, there is still large room for improvement in point-of-care implementation. DNA origami is a technique that allows the formation of 3D nanometric structures by folding a long single-stranded DNA (scaffold) into a determined shape with the help of short DNA sequences (staples), which hybridize with the scaffold. This research aimed to generate DNA origami structures using LAMP products as scaffolds to improve the sensitivity of ESBL detection in point-of-care diagnosis. For this study, the coding sequence of the CTX-M-15 ESBL of E. coli was used to generate the LAMP products. The set of LAMP primers was designed using PrimerExplorer V5. As a result, a target sequence of 200 nucleotides from the CTX-M-15 ESBL was obtained. Afterward, eight different DNA origami structures were designed from the target sequence in caDNAno and analyzed with CanDo to evaluate the stability of the 3D structures. The designs were constructed minimizing the total number of staples to reduce cost and complexity for point-of-care applications. After analyzing the DNA origami designs, two structures were selected. The first was a zig-zag flat structure, while the second had a wall-like shape. Given the sequence repetitions in the scaffold, both could be assembled with only six different staples each, ranging between 18 and 80 nucleotides. Simulations of both structures were performed using scaffolds of different sizes, yielding stable structures in all cases. The generation of the LAMP products was verified by colorimetry and electrophoresis, and the formation of the DNA structures was analyzed using electrophoresis and colorimetry. The modeling of novel detection methods through bioinformatics tools allows reliable control and prediction of results. To our knowledge, this is the first study that uses LAMP products and DNA origami in combination to detect ESBL-producing bacterial strains, which represents a promising methodology for diagnosis at the point of care.
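
The sketch below illustrates only the core idea behind staple design that the abstract relies on; the sequences, region choices and helper function are toy examples, not the authors' caDNAno workflow.

```python
# Each staple is the reverse complement of one or more short scaffold regions,
# so hybridization folds the scaffold back on itself.
COMP = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    return "".join(COMP[b] for b in reversed(seq.upper()))

def staple_for(scaffold: str, region_a: slice, region_b: slice) -> str:
    """A staple that hybridizes two distant scaffold regions, folding them together."""
    return reverse_complement(scaffold[region_b]) + reverse_complement(scaffold[region_a])

# Toy 60-nt 'scaffold' standing in for the ~200-nt LAMP product used in the paper
scaffold = "ATGGTTAAAGTAAAACGTATTGGCTATCTGGCAGGCTGCCTGCTGCTGGGTATGACCCTG"
staple = staple_for(scaffold, slice(0, 16), slice(40, 56))
print(staple, len(staple))
```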

Keywords: beta-lactamases, antibiotic resistance, DNA origami, isothermal amplification, LAMP technique, molecular diagnosis

Procedia PDF Downloads 222
12895 Consideration of Failed Fuel Detector Location through Computational Fluid Dynamics Analysis on Primary Cooling System Flow with Two Outlets

Authors: Sanghoon Bae, Hanju Cha

Abstract:

The failed fuel detector (FFD) in a research reactor is a crucial instrument for detecting, at an early stage, anomalies from failed fuels around the primary cooling system (PCS) outlet prior to the decay tank. The FFD is considered a mandatory sensor to ensure the integrity of the fuel assemblies and to mitigate the consequences of a failed fuel accident. For the FFD to function effectively, its location should be determined by considering the effect of the coolant flow around the two outlets. For this, a computational fluid dynamics (CFD) analysis should first be performed to determine how the coolant outlet flow, including radioactive materials from failed fuels, is mixed and discharged through the outlet plenum within a certain number of seconds. The analysis result shows that the outlet flow is well mixed regardless of the position of the failed fuel and ultimately illustrates the effect of the detector location.

Keywords: computational fluid dynamics (CFD), failed fuel detector (FFD), fresh fuel assembly (FFA), spent fuel assembly (SFA)

Procedia PDF Downloads 240
12894 Automated System: Managing the Production and Distribution of Radiopharmaceuticals

Authors: Shayma Mohammed, Adel Trabelsi

Abstract:

Radiopharmacy is the art of preparing high-quality, radioactive, medicinal products for use in diagnosis and therapy. Unlike normal medicines, radiopharmaceuticals have a dual (radioactive and medicinal) nature that makes their management highly critical. One of the most convincing applications of modern technologies is the ability to delegate the execution of repetitive tasks to programming scripts. Automation has found its way into the most skilled jobs, improving a company's overall performance by allowing human workers to focus on more important tasks than document filling. This project aims to contribute to implementing a comprehensive system to ensure rigorous management of radiopharmaceuticals through a platform that links the Nuclear Medicine Service Management System to the Nuclear Radiopharmacy Management System, in accordance with the recommendations of the World Health Organization (WHO) and the International Atomic Energy Agency (IAEA). In this project, we build a web application that targets radiopharmacies; the platform is built atop the inherently compatible web stack, which allows it to work in virtually any environment. Different technologies are used in this project (PHP, Symfony, MySQL Workbench, Bootstrap, Angular 7, Visual Studio Code and TypeScript). The operating principle of the platform is mainly based on two parts: a Radiopharmaceutical Backoffice for the radiopharmacist, who is responsible for the preparation of radiopharmaceuticals and their delivery, and a Medical Backoffice for the doctor, who holds the authorization for the possession and use of radionuclides and is responsible for ordering radioactive products. The application consists of seven modules: Production, Quality Control/Quality Assurance, Release, General Management, References, Transport and Stock Management. It allows eight classes of users: the Production Manager (PM), Quality Control Manager (QCM), Stock Manager (SM), General Manager (GM), Client (Doctor), Parking and Transport Manager (PTM), Qualified Person (QP) and Technical and Production Staff. As a digital platform bringing together all players involved in the use of radiopharmaceuticals and integrating the stages of preparation, production and distribution, web technologies in particular promise to offer all the benefits of automation while requiring no more than a web browser to act as the user client, which is a strength because the web stack is by nature multi-platform. This platform will provide a traceability system for radiopharmaceutical products to ensure the safety and radioprotection of the actors and of patients. The new integrated platform is an alternative to writing all the boilerplate paperwork manually, which is a tedious and error-prone task. It will minimize manual human manipulation, which has proven to be the main source of error in nuclear medicine. A codified electronic transfer of information from radiopharmaceutical preparation to delivery will further reduce the risk of maladministration.
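
As an illustration only (the platform itself is written in PHP/Angular, and the record fields below are assumptions, not its data model), a traceability record for one batch could carry a decay-corrected activity so that the pharmacy and the ordering physician see the same value at delivery time.

```python
# Hypothetical batch record with decay-corrected activity (A = A0 * 2^(-t/T_half)).
from dataclasses import dataclass
from datetime import datetime
import math

@dataclass
class Batch:
    product: str
    half_life_h: float          # physical half-life of the radionuclide, hours
    activity_mbq: float         # calibrated activity at calibration_time
    calibration_time: datetime
    status: str = "produced"    # produced -> QC-released -> transported -> delivered

    def activity_at(self, when: datetime) -> float:
        dt_h = (when - self.calibration_time).total_seconds() / 3600.0
        return self.activity_mbq * math.exp(-math.log(2) * dt_h / self.half_life_h)

b = Batch("Tc-99m MDP", half_life_h=6.02, activity_mbq=800.0,
          calibration_time=datetime(2024, 1, 1, 8, 0))
print(round(b.activity_at(datetime(2024, 1, 1, 12, 0)), 1), "MBq at delivery")
```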

Keywords: automated system, management, radiopharmacy, technical papers

Procedia PDF Downloads 156
12893 Altasreef: Automated System of Quran Verbs for Urdu Language

Authors: Haq Nawaz, Muhammad Amjad Iqbal, Kamran Malik

Abstract:

"Altasreef" is an automated system available for Web and Android users which provide facility to the users to learn the Quran verbs. It provides the facility to the users to practice the learned material and also provide facility of exams of Arabic verbs variation focusing on Quran text. Arabic is a highly inflectional language. Almost all of its words connect to roots of three, four or five letters which approach the meaning of all their inflectional forms. In Arabic, a verb is formed by inserting the consonants into one of a set of verb patterns. Suffixes and prefixes are then added to generate the meaning of number, person, and gender. The active/passive voice and perfective aspect and other patterns are than generated. This application is designed for learners of Quranic Arabic who already have learn basics of Arabic conjugation. Application also provides the facility of translation of generated patterns. These translations are generated with the help of rule-based approach to give 100% results to the learners.

Keywords: NLP, Quran, computational linguistics, e-learning

Procedia PDF Downloads 167
12892 Facing Global Competition through Participation in Global Innovation Networks: The Case of Mechatronics District in the Veneto Region

Authors: Monica Plechero

Abstract:

Many firms belonging to Italian industrial districts faced a crisis starting in 2000 and worsening during 2008-2014. To remain competitive in the global market, these firms and their local systems need to renovate their traditional competitive advantages and strengthen their links with global flows of knowledge. This may be particularly relevant in sectors such as mechatronics, which combine a traditional knowledge domain with new knowledge domains (e.g. mechanics, electronics, and informatics). This sector is nowadays one of the key sectors within the so-called 'smart specialization strategy' that can lead part of the traditional Italian industry towards new economic development opportunities. By investigating the mechatronics district of the Veneto region, this paper sheds new light on how firms of a local system can gain from the globalization of innovation and innovation networks. Methodologically, the paper relies on primary data collected through a survey targeting firms of the local system, as well as on a number of qualitative case studies. The relevant role of medium-sized companies in the district clearly emerges, as they have wider opportunities to be involved in different processes of globalization of innovation. Indeed, compared with small companies, the size of medium firms allows them to strategically exploit international markets and globally distributed knowledge. Supporting medium firms' global innovation strategies, and incentivizing their role as district gatekeepers, may strengthen the competitive capability of the local system and provide new opportunities to face global competition successfully.

Keywords: global innovation network, industrial district, internationalization, innovation, mechatronics, Veneto region

Procedia PDF Downloads 230
12891 Sustainability of Heritage Management in Aksum: Focus on Heritage Conservation and Interpretation

Authors: Gebrekiros Welegebriel Asfaw

Abstract:

The management of fragile, unique and irreplaceable cultural heritage from different perspectives is becoming a major challenge, as important elements of culture are vanishing throughout the globe. The major purpose of this study is to assess how the cultural heritage of Aksum is managed for its future sustainability from the perspectives of heritage conservation and interpretation. A descriptive research design incorporating both quantitative and qualitative research methods is employed. Primary quantitative data were collected from 189 respondents (19 professionals, 88 tourism service providers and 82 tourists), and interviews were conducted with 33 targeted informants from heritage and related professions, security employees, the local community, service providers and church representatives, applying probability and non-probability sampling methods. The findings of the study reveal that the overall sustainable management status of the cultural heritage of Aksum is below average. It is found that the sustainability of cultural heritage management in Aksum faces many unfavorable factors, such as a lack of long-term planning, an incompatible system of heritage administration, the limited capacity and number of professionals, scant attention to community-based heritage and tourism development, dirtiness and drainage problems, problems with stakeholder involvement and cooperation, and the lack of organized interpretation and presentation systems, among others. Re-organization of the management system, the creation of a platform for coordination among stakeholders and the development of an appropriate interpretation system can therefore be good remedies. Introducing the concept of community-based heritage and tourism development is also recommended for long-term win-win success in Aksum.

Keywords: Aksum, conservation, interpretation, sustainable cultural heritage management

Procedia PDF Downloads 324
12890 Analysis of Elastic-Plastic Deformation of Reinforced Concrete Shear-Wall Structures under Earthquake Excitations

Authors: Oleg Kabantsev, Karomatullo Umarov

Abstract:

The engineering analysis of earthquake consequences demonstrates significantly different levels of damage to load-bearing systems of different types. Buildings with reinforced concrete columns and separate shear walls receive the highest level of damage. Traditional methods for predicting damage under earthquake excitations do not answer the question of why reinforced concrete frames with shear-wall bearing systems are more vulnerable. Thus, the study of the formation and accumulation of damage in reinforced concrete frame structures with shear walls requires new methods for assessing the stress-strain state, as well as new approaches to calculating the distribution of forces and stresses in the load-bearing system that take into account the various mechanisms of elastic-plastic deformation of reinforced concrete columns and walls. The results of research into the processes of nonlinear deformation of structures up to destruction (collapse) make it possible to substantiate the characteristics of the limit states of the various structures forming an earthquake-resistant load-bearing system. The research into the elastic-plastic deformation processes of reinforced concrete frames with shear walls is carried out on the basis of experimentally established parameters of the limit deformations of concrete and reinforcement under dynamic excitations. Limit values of deformations are defined for the conditions under which local damage of the maximum permissible level forms in the structures. The research is performed by numerical methods using the ETABS software. The results indicate that, under earthquake excitations, plastic deformations of various levels form in different groups of elements of the frame with the shear-wall load-bearing system. During the main period of the seismic excitation, the shear-wall elements of the load-bearing system develop insignificant plastic deformations, significantly lower than the permissible level; at the same time, plastic deformations form in the columns and do not exceed the permissible value. At the final stage of the seismic excitation, the level of plastic deformation in the shear walls reaches values corresponding to the plasticity coefficient of concrete, which is less than the maximum permissible value. This amount of plastic deformation leads to an increase in the overall deformations of the bearing system. With the specified deformation of the shear walls, plastic deformations exceeding the limit values develop in the concrete columns, which leads to the collapse of such columns. Based on the results presented in this study, it can be concluded that applying a seismic-force-reduction factor common to the whole load-bearing system does not correspond to the real conditions of formation and accumulation of damage in the elements of the load-bearing system. Using a single seismic-force-reduction factor leads to errors in predicting the seismic resistance of reinforced concrete load-bearing systems. In order to provide the required level of seismic resistance of buildings with reinforced concrete columns and separate shear walls, it is necessary to use values of the seismic-force-reduction factor differentiated by types of structural groups.

Keywords: reinforced concrete structures, earthquake excitation, plasticity coefficients, seismic-force-reduction factor, nonlinear dynamic analysis

Procedia PDF Downloads 206
12889 Data Refinement Enhances the Accuracy of Short-Term Traffic Latency Prediction

Authors: Man Fung Ho, Lap So, Jiaqi Zhang, Yuheng Zhao, Huiyang Lu, Tat Shing Choi, K. Y. Michael Wong

Abstract:

Nowadays, a tremendous amount of data is available in the transportation system, enabling the development of various machine learning approaches to make short-term latency predictions. A natural question is then the choice of relevant information to enable accurate predictions. Using traffic data collected from the Taiwan Freeway System, we consider the prediction of short-term latency of a freeway segment with a length of 17 km covering 5 measurement points, each collecting vehicle-by-vehicle data through the electronic toll collection system. The processed data include the past latencies of the freeway segment with different time lags, the traffic conditions of the individual segments (the accumulations, the traffic fluxes, the entrance and exit rates), the total accumulations, and the weekday latency profiles obtained by Gaussian process regression of past data. We arrive at several important conclusions about how data should be refined to obtain accurate predictions, which have implications for future system-wide latency predictions. (1) We find that the prediction of median latency is much more accurate and meaningful than the prediction of average latency, as the latter is plagued by outliers. This is verified by machine-learning prediction using XGBoost, which yields a 35% improvement in the mean square error of the 5-minute averaged latencies. (2) We find that the median latency of the segment 15 minutes ago is a very good baseline for performance comparison, and we have evidence that further improvement is achieved by machine learning approaches such as XGBoost and Long Short-Term Memory (LSTM). (3) By analyzing the feature importance score in XGBoost and calculating the mutual information between the inputs and the latencies to be predicted, we identify a sequence of inputs ranked in importance. It confirms that the past latencies are most informative of the predicted latencies, followed by the total accumulation, whereas inputs such as the entrance and exit rates are uninformative. It also confirms that the inputs are much less informative of the average latencies than the median latencies. (4) For predicting the latencies of segments composed of two or three sub-segments, summing up the predicted latencies of each sub-segment is more accurate than the one-step prediction of the whole segment, especially with the latency prediction of the downstream sub-segments trained to anticipate latencies several minutes ahead. The duration of the anticipation time is an increasing function of the traveling time of the upstream segment. The above findings have important implications for predicting the full set of latencies among the various locations in the freeway system.
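
The sketch below illustrates the comparison in point (2) on synthetic data; the column names, lag structure and XGBoost settings are assumptions for illustration, not the study's processed dataset or tuned model.

```python
# Predict 5-minute median latency with XGBoost and compare against the
# 'median latency 15 minutes ago' baseline.
import numpy as np
import pandas as pd
from xgboost import XGBRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(1)
n = 2000
base = 600 + 120 * np.sin(np.arange(n) * 2 * np.pi / 288)   # daily latency profile, s
df = pd.DataFrame({"median_latency": base + rng.normal(0, 20, n)})
for lag in (1, 2, 3):                                        # past latencies, 5-min lags
    df[f"lag_{lag}"] = df["median_latency"].shift(lag)
df["total_accumulation"] = 50 + 0.1 * df["lag_1"] + rng.normal(0, 2, n)
df["baseline"] = df["median_latency"].shift(3)               # value 15 minutes ago
df = df.dropna()

train, test = df.iloc[:1500], df.iloc[1500:]
features = ["lag_1", "lag_2", "lag_3", "total_accumulation"]
model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
model.fit(train[features], train["median_latency"])

mse_model = mean_squared_error(test["median_latency"], model.predict(test[features]))
mse_base = mean_squared_error(test["median_latency"], test["baseline"])
print("XGBoost MSE:", round(mse_model, 1), " baseline MSE:", round(mse_base, 1))
```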

Keywords: data refinement, machine learning, mutual information, short-term latency prediction

Procedia PDF Downloads 169
12888 Peril's Environment of Energetic Infrastructure Complex System, Modelling by the Crisis Situation Algorithms

Authors: Jiří F. Urbánek, Alena Oulehlová, Hana Malachová, Jiří J. Urbánek Jr.

Abstract:

Crisis situations are investigated and modelled within the complex system of the energetic critical infrastructure operating in a perilous environment. Every crisis situation and peril has its origin in the occurrence of an emergency/crisis event, and both require an assessment of the critical/crisis interfaces. An emergency event may be expected, in which case crisis scenarios can be pre-prepared by the pertinent organizational crisis management authorities for coping with it; or it may be unexpected, without a pre-prepared scenario. Both, however, need operational coping by means of crisis management. The operation, forms, characteristics, behaviour and utilization of crisis management have various qualities, depending on the actual perils of the critical infrastructure organization and on prevention and training processes. The aim is always better security and continuity of the organization, the successful attainment of which requires finding and investigating the critical/crisis zones and functions in models of the critical infrastructure organization operating in the pertinent perilous environment. Our DYVELOP (Dynamic Vector Logistics of Processes) method is available for this purpose. Here, it is necessary to derive and create an identification algorithm for the critical/crisis interfaces. The locations of the critical/crisis interfaces are the flags of a crisis situation in models of the critical infrastructure organization. The model of a crisis situation is then displayed for a real organization of the Czech energetic critical infrastructure in a real peril environment. These efficient measures are necessary for infrastructure protection. They will be derived for peril mitigation, crisis situation coping and for the environmentally friendly survival, continuity and advanced sustainable development of the organization.

Keywords: algorithms, energetic infrastructure complex system, modelling, peril's environment

Procedia PDF Downloads 402
12887 Evolving Paradigm of Right to Development in International Human Rights Law and Its Transformation into the National Legal System: Challenges and Responses in Pakistan

Authors: Naeem Ullah Khan, Kalsoom Khan

Abstract:

No state can be progressive and prosperous if a large number of its people are deprived of their basic economic rights and freedoms. In the contemporary world of globalization, the right to development has gained momentum in the domain of International Development Law (IDL) and has been integrated into the National Legal System (NLS) of the major developed states. International experts on human rights argue that the right to development (RTD) is a third-generation human right which tends to enhance the welfare and prosperity of individuals; thus, it is a right to a process whose outcomes are human rights, despite the controversy over the implications of RTD. In the Pakistani legal system, RTD has not been expressly stated in the Constitution of the Islamic Republic of Pakistan, 1973. However, there are some implied constitutional provisions which reflect the concept of RTD. The jurisprudence on RTD is still an evolving paradigm in the context of Pakistan, and the superior courts of diverse jurisdiction act as a catalyst for the protection and enforcement of RTD in the interest of the public at large. The case law shows the positive inclination of the courts in Pakistan towards RTD being incorporated as an express provision in the chapter on fundamental rights; in this scenario, the high courts of Pakistan under Article 199 and the Supreme Court of Pakistan under Article 184(3) have exercised jurisdiction over the enforcement of RTD. This paper inter alia examines the national dimensions of RTD from the standpoint of state practice in Pakistan, and it analyzes the experience of the judiciary in the protection and enforcement of RTD. Moreover, the paper highlights the social and cultural challenges Pakistan faces in the implementation of RTD and possible solutions to improve the condition of human rights in Pakistan. The paper also highlights the steps taken by Pakistan regarding the awareness, incorporation, and propagation of RTD at the national level.

Keywords: globalization, Pakistan, RTD, third-generation right

Procedia PDF Downloads 168
12886 Laser Line Detection for Autonomous Mapping Based on Color Segmentation

Authors: Pavel Chmelar, Martin Dobrovolny

Abstract:

Laser projection, or laser footprint detection, is today widely used in many fields of robotics, measurement, and electronics. The system accuracy strictly depends on precise detection of the laser footprint on target objects. This article deals with laser line detection based on RGB segmentation and component labeling. The measurement device used was the developed optical rangefinder, which is equipped with vertical sweeping of the laser beam and a high-quality camera. This system was developed mainly for the automatic exploration and mapping of unknown spaces. The first section presents the new detection algorithm. The second section presents the measurement results; the measurements were performed under variable light conditions in interiors. The last part of the article presents the achieved results and the differences between day and night measurements.
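
A minimal sketch of the two steps named above (RGB segmentation followed by connected-component labelling) is shown below; the thresholds and the choice of keeping the largest component are assumptions for illustration, not the authors' algorithm.

```python
# Keep pixels where the red channel clearly dominates, then keep the largest blob.
import cv2
import numpy as np

def detect_laser_line(bgr: np.ndarray) -> np.ndarray:
    img = bgr.astype(np.int16)
    b, g, r = img[..., 0], img[..., 1], img[..., 2]
    mask = ((r - np.maximum(b, g)) > 60) & (r > 120)       # red-dominant pixels
    mask = mask.astype(np.uint8)
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    if n <= 1:                                             # nothing but background
        return np.zeros_like(mask)
    largest = 1 + np.argmax(stats[1:, cv2.CC_STAT_AREA])   # skip background label 0
    return (labels == largest).astype(np.uint8) * 255

# Usage: line_mask = detect_laser_line(cv2.imread("frame.png"))
```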

Keywords: color segmentation, component labelling, laser line detection, automatic mapping, distance measurement, vector map

Procedia PDF Downloads 432
12885 A Statistical-Algorithmic Approach for the Design and Evaluation of a Fresnel Solar Concentrator-Receiver System

Authors: Hassan Qandil

Abstract:

Using a statistical algorithm implemented in MATLAB, four types of non-imaging Fresnel lenses are designed: spot-flat, linear-flat, dome-shaped and semi-cylindrical. The optimization employs a statistical ray-tracing methodology for the incident light, mainly considering the effects of chromatic aberration, varying focal lengths, solar inclination and azimuth angles, lens and receiver apertures, and the optimum number of prism grooves. While adopting an equal-groove-width assumption for the poly(methyl methacrylate) (PMMA) prisms, the main target is to maximize the ray intensity on the receiver's aperture and therefore achieve higher values of heat flux. The algorithm outputs prism angles and 2D sketches. 3D drawings are then generated via AutoCAD and linked to the COMSOL Multiphysics software to simulate the lenses under solar ray conditions, which provides optical and thermal analysis at both the lens's and the receiver's apertures, with conditions set according to the Dallas, TX weather data. Once the lens characterization is finalized, receivers are designed based on the optimized aperture size. Several cavity shapes, including triangular, arc-shaped and trapezoidal, are tested, coupled with a variety of receiver materials, working fluids, heat transfer mechanisms, and enclosure designs. A vacuum-reflective enclosure is also simulated for enhanced thermal absorption efficiency. Each receiver type is simulated via COMSOL coupled with the optimized lens. A lab-scale prototype of the optimum lens-receiver configuration is then fabricated for experimental evaluation. Application-based testing is also performed for the selected configuration, including a photovoltaic-thermal cogeneration system and a solar furnace system. Finally, some future research directions are pointed out, including coupling the collector-receiver system with an end-user power generator and using a multi-layered genetic algorithm for comparative studies.
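
The sketch below illustrates the kind of flat spot-Fresnel design and statistical ray-count step the abstract describes; the prism-angle formula, dimensions, refractive index and solar-spread model are assumptions used for illustration, not the paper's MATLAB algorithm or its results.

```python
# Equal-width grooves: each prism angle bends a normally incident ray toward the
# focus (n*sin(a) = sin(a + d) => tan(a) = sin(d) / (n - cos(d))); a simple ray
# count then gives the fraction of rays landing on the receiver aperture.
import numpy as np

n_pmma = 1.49            # approximate refractive index of PMMA
focal_len = 0.5          # m (assumed)
lens_radius = 0.25       # m (assumed)
groove_width = 0.005     # m, equal-width grooves
receiver_radius = 0.005  # m (assumed receiver aperture)

r_grooves = np.arange(groove_width / 2, lens_radius, groove_width)
deviation = np.arctan(r_grooves / focal_len)        # bend needed to reach the focus
prism_angle = np.arctan(np.sin(deviation) / (n_pmma - np.cos(deviation)))

# Crude statistical check: rays arrive at random radii with solar angular spread,
# get the deviation designed for their groove centre, and we count receiver hits.
rng = np.random.default_rng(0)
sun_half_angle = 4.65e-3                            # rad, angular radius of the sun
r_hit = rng.uniform(0.0, lens_radius, 20000)
idx = np.minimum((r_hit / groove_width).astype(int), len(r_grooves) - 1)
spread = focal_len * np.tan(sun_half_angle * rng.standard_normal(r_hit.size))
landing = np.abs(r_hit - focal_len * np.tan(deviation[idx]) + spread)
print("first prism angles [deg]:", np.degrees(prism_angle[:3]).round(2))
print("fraction of rays on receiver:", round(float(np.mean(landing <= receiver_radius)), 3))
```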

Keywords: COMSOL, concentrator, energy, fresnel, optics, renewable, solar

Procedia PDF Downloads 155
12884 Forced Vibration of an Auxetic Cylindrical Shell Containing Fluid Under the Influence of Shock Load

Authors: Korosh Khorshidi

Abstract:

Due to the increasing use of novel materials, such as auxetic structures, it is necessary to investigate mechanical phenomena, such as vibration, in structures made of these types of materials. This paper examines the forced vibrations of a three-layer cylindrical shell containing an inviscid fluid under shock load. All three layers are made of aluminum, and the core layer has a re-entrant honeycomb cell structure. Using higher-order shear deformation theories (HSDT) and Hamilton's principle, the governing equations of the system are derived and solved by the Galerkin weighted residual method. The outputs of the Abaqus finite element software are used to validate the results. The system is investigated for both simply supported and clamped boundary conditions. Finally, the study investigates the influence of the geometrical parameters of the shell and the auxetic structure, as well as the type, intensity, duration, and location of the load, and the effect of the fluid, on the dynamic and time responses.

Keywords: forced vibration, cylindrical shell, auxetic structure, inviscid fluid

Procedia PDF Downloads 43
12883 A Pervasive System Architecture for Smart Environments in Internet of Things Context

Authors: Patrick Santos, João Casal, João Santos Luis Varandas, Tiago Alves, Carlos Romeiro, Sérgio Lourenço

Abstract:

Nowadays, technology makes it possible, on the one hand, to communicate with various objects of daily life through the Internet and, on the other, to make these objects interact with each other through the same channel. Simultaneously, with the rise of smartphones as the most ubiquitous technology in people's lives, new agents emerge for these devices: intelligent personal assistants. These agents have the goal of helping the user manage and organize his or her information as well as supporting the user in day-to-day tasks. Another emergent concept is cloud computing, which allows computation and storage to move off the user's devices, bringing benefits in terms of performance, security, interoperability and more. Connecting these three paradigms, in this work we propose an architecture for an intelligent system which provides an interface that assists the user in smart environments, informing, suggesting actions and allowing the user to manage the objects of his or her daily life.

Keywords: internet of things, cloud, intelligent personal assistant, architecture

Procedia PDF Downloads 514
12882 A Theoretical Study of Multi-Leaf Spring in Seismic Response Control

Authors: M. Ezati Kooshki, H. Pourmohamad

Abstract:

Leaf spring dampers are used in commercial vehicles and heavy trucks. The main function of this damper in these vehicles is protection against damage and providing comfort for drivers by creating suspension between the road and the vehicle. This paper presents a new device, a circular leaf spring damper, based on the spring frequently used on vehicles, with the aim of providing seismic protection for structures. Finite element analyses were conducted on several one-story structures using finite element software (Abaqus, v6.10-1). Time history analyses were conducted on the records of the Kobe (1995) and San Fernando (1971) ground motions to demonstrate the advantages of using the leaf spring in structures compared with a simple bracing system. The paper also suggests extending the use of this damper in structures, considering its large control force, high-cycle fatigue properties and low price.
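
The sketch below shows the time-history analysis step in its simplest form; the single-degree-of-freedom model, the Newmark integration, the synthetic ground motion and the stiffness/damping values standing in for the device are all illustrative assumptions, not the paper's Abaqus models or the cited records.

```python
# Linear SDOF storey under ground acceleration, Newmark average-acceleration method.
import numpy as np

def newmark_sdof(m, c, k, ag, dt, beta=0.25, gamma=0.5):
    n = len(ag)
    u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
    a[0] = (-m * ag[0] - c * v[0] - k * u[0]) / m
    k_eff = k + gamma * c / (beta * dt) + m / (beta * dt ** 2)
    for i in range(n - 1):
        dp = (-m * (ag[i + 1] - ag[i])
              + (m / (beta * dt) + gamma * c / beta) * v[i]
              + (m / (2 * beta) + dt * c * (gamma / (2 * beta) - 1)) * a[i])
        du = dp / k_eff
        dv = gamma * du / (beta * dt) - gamma * v[i] / beta + dt * a[i] * (1 - gamma / (2 * beta))
        u[i + 1] = u[i] + du
        v[i + 1] = v[i] + dv
        a[i + 1] = (-m * ag[i + 1] - c * v[i + 1] - k * u[i + 1]) / m
    return u

dt = 0.01
t = np.arange(0, 20, dt)
ag = 3.0 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.15 * t)   # synthetic record, m/s^2

m, k = 2.0e4, 2.0e6                         # storey mass (kg) and stiffness (N/m)
c = 2 * 0.02 * np.sqrt(k * m)               # 2% inherent damping
k_dev, c_dev = 0.5e6, 2 * 0.10 * np.sqrt(k * m)   # assumed leaf-spring contribution

u_bare = newmark_sdof(m, c, k, ag, dt)
u_dev = newmark_sdof(m, c + c_dev, k + k_dev, ag, dt)
print("peak drift without device:", round(1e3 * np.abs(u_bare).max(), 1), "mm")
print("peak drift with device:   ", round(1e3 * np.abs(u_dev).max(), 1), "mm")
```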

Keywords: bracing system, finite element analysis, leaf spring, seismic protection, time history analysis

Procedia PDF Downloads 405
12881 The Importance of Jewish Influence on Foundation of Manichaean Philosophical and Religious System

Authors: Tatyana Suvorkina

Abstract:

It is indisputable that the problem of the origin of Manichaeism is very complex. Manichaeism is characterized as a syncretic religion, influenced by many teachings, yet it is difficult to identify one which can be called fundamental. The aim of this paper is to consider the Jewish apocalyptic tradition as one of the most defining sources of the formation of the Manichaean system. To realize this aim, a comparison of the Manichaean texts and the Jewish apocryphal literature is made. Consideration is given first to the Coptic Manichaean treatise Kephalaia, the Cologne Mani Codex and the books of Enoch. This article does not deny that Manichaeism was influenced by different doctrines and, passing through the centuries, could adapt and strengthen this influence at an even deeper level. But the fact that the Judeo-Christian environment in which Mani grew up, and in which the first sprouts of his teaching were formed, had an impact on the future prophet seems obvious. Nevertheless, attempts to analyze the system of Mani within the Jewish tradition are quite rare, although such studies have been carried out for Gnosticism. Manichaeism, however, despite the Gnostic features it contains, is not simply 'one of the Gnostics' to be placed under this term among the rest. Frequently, Gnostic currents are pointed out as the main sources of the formation of Mani's teachings. But it seems possible that Mani's interest in Gnosticism was motivated by the fact that he considered it close to the interpretation of Hebrew texts which he himself aspired to undertake. The question of understanding the Manichaean system is connected not only with Manichaeism but also with other dualistic teachings that were recognized by contemporaries as Manichaean. It appears that the polemics between Manicheans and a Hellenized Christianity, which had separated from Judaism and continued to separate with every century, were polemics between adherents of two initially different worldviews which nevertheless had a common source. Therefore, an analysis of the controversy in the context of the disputing parties' interpretations of this common source is seen as very important for further study.

Keywords: dualism, Jewish apocalypticism, Manichaeism, syncretism

Procedia PDF Downloads 186
12880 Storms Dynamics in the Black Sea in the Context of the Climate Changes

Authors: Eugen Rusu

Abstract:

The objective of the present work is to analyze the wave conditions in the Black Sea basin, with a special focus on the spatial and temporal occurrences and the dynamics of the most extreme storms in the context of climate change. A numerical modelling system, based on the spectral phase-averaged wave model SWAN, has been implemented and validated against both in situ measurements and remotely sensed data all along the sea. Moreover, a successive correction method for the assimilation of satellite data has been coupled with the wave modelling system; it is based on the optimal interpolation of the satellite data. Previous studies show that the data assimilation process considerably improves the reliability of the results provided by the modelling system. This especially concerns the cases that are most sensitive from the point of view of the accuracy of the wave predictions, such as extreme storm situations. Following this numerical approach, it has to be highlighted that the results provided by the wave modelling system described above are generally in line with those provided by similar wave prediction systems implemented in enclosed or semi-enclosed sea basins. Simulations with this wave modelling system and data assimilation have been performed for the 30-year period 1987-2016. Based on this database, the next step was to analyze the intensity and the dynamics of the strongest storms encountered in this period. According to the data resulting from the model simulations, the western side of the sea is considerably more energetic than the rest of the basin. In this western region, regular strong storms usually produce significant wave heights greater than 8 m, which may lead to maximum wave heights even greater than 15 m. Such regular strong storms may occur several times in one year, usually in wintertime or in late autumn, and it can be noticed that their frequency has become higher in the last decade. In the most extreme storms, significant wave heights greater than 10 m and maximum wave heights close to 20 m (and even greater) may occur. Such extreme storms, which in the past were noticed only once in four or five years, have more recently been faced almost every year in the Black Sea, and this seems to be a consequence of climate change. The analysis performed also included the dynamics of the monthly and annual significant wave height maxima, as well as the identification of the most probable spatial and temporal occurrences of the extreme storm events. Finally, it can be concluded that the present work provides valuable information on the characteristics of the storm conditions and their dynamics in the Black Sea. This environment is currently subject to heavy navigation traffic and intense offshore and nearshore activities, and the strong storms that systematically occur may produce accidents with very serious consequences.
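
The sketch below illustrates the successive-correction idea behind the assimilation step named above; the Gaussian weighting, influence radii and toy transect are assumptions for illustration, not the implemented optimal-interpolation scheme.

```python
# Nudge each model grid value toward nearby satellite observations with
# distance-dependent weights, over a few passes with a shrinking influence radius.
import numpy as np

def successive_correction(grid_xy, background, obs_xy, obs_val, radii=(200.0, 100.0, 50.0)):
    """grid_xy: (N,2) grid coordinates [km]; background: (N,) first-guess Hs [m];
    obs_xy: (M,2) observation coordinates; obs_val: (M,) observed Hs [m]."""
    analysis = background.copy()
    for R in radii:                                        # successive passes
        innov = obs_val - np.array([analysis[np.argmin(np.sum((grid_xy - o) ** 2, 1))]
                                    for o in obs_xy])      # obs minus nearest grid value
        d2 = np.sum((grid_xy[:, None, :] - obs_xy[None, :, :]) ** 2, axis=2)
        w = np.exp(-d2 / (2 * R ** 2))                     # Gaussian (Cressman-like) weights
        w_sum = w.sum(axis=1)
        update = (w * innov[None, :]).sum(axis=1) / np.maximum(w_sum, 1e-12)
        analysis += np.where(w_sum > 1e-3, update, 0.0)
    return analysis

# Tiny usage example on a 1D transect treated as (x, 0) points
x = np.linspace(0, 500, 51)
grid = np.column_stack([x, np.zeros_like(x)])
background = np.full_like(x, 4.0)                          # SWAN first guess: Hs = 4 m
obs = np.array([[150.0, 0.0], [320.0, 0.0]])
analysis = successive_correction(grid, background, obs, np.array([6.5, 5.2]))
print(analysis[::10].round(2))
```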

Keywords: Black Sea, extreme storms, SWAN simulations, waves

Procedia PDF Downloads 248
12879 Experimental Study of an Isobaric Expansion Heat Engine with Hydraulic Power Output for Conversion of Low-Grade-Heat to Electricity

Authors: Maxim Glushenkov, Alexander Kronberg

Abstract:

The isobaric expansion (IE) process is an alternative to conventional gas/vapor expansion accompanied by a pressure decrease, typical of all state-of-the-art heat engines. The elimination of the expansion stage accompanied by useful work means that the most critical and expensive parts of ORC systems (turbine, screw expander, etc.) are also eliminated. In many cases, IE heat engines can be more efficient than conventional expansion machines. In addition, IE machines have a very simple, reliable, and inexpensive design. They can also perform all the known operations of existing heat engines and provide usable energy in a very convenient hydraulic or pneumatic form. This paper reports measurements made with the engine operating as a heat-to-shaft-power or electricity converter and a comparison of the experimental results to a thermodynamic model. Experiments were carried out at heat source temperatures in the range 30–85 °C and a heat sink temperature around 20 °C; refrigerant R134a was used as the engine working fluid. The pressure difference generated by the engine varied from 2.5 bar at a heat source temperature of 40 °C to 23 bar at a heat source temperature of 85 °C. Using a differential piston, the generated pressure was quadrupled to pump hydraulic oil through a hydraulic motor that generates shaft power and is connected to an alternator. At a frequency of about 0.5 Hz, the engine operates with useful powers up to 1 kW and an oil pumping flowrate of 7 L/min. Depending on the temperature of the heat source, the obtained efficiency was 3.5–6 %. This efficiency looks very high, considering such a low temperature difference (10–65 °C) and low power (< 1 kW). The engine's observed performance is in good agreement with the predictions of the model. The results are very promising, showing that the engine is a simple and low-cost alternative to ORC plants and other known energy conversion systems, especially at low temperatures (< 100 °C) and in the low power range (< 500 kW) where other known technologies are not economic. Thus low-grade solar and geothermal energy, biomass combustion, and waste heat with a temperature above 30 °C can be involved in various energy conversion processes.
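
A back-of-the-envelope check of the reported operating point is sketched below; the heat input value is an assumption chosen only to show how the 3.5–6 % efficiency range arises, not a measured quantity from the paper.

```python
# Hydraulic output power from the quadrupled pressure and the oil flowrate,
# and the corresponding heat-to-power efficiency.
bar = 1.0e5                      # Pa
dp_engine = 23 * bar             # refrigerant-side pressure difference at 85 C source
dp_oil = 4 * dp_engine           # differential piston quadruples the pressure
q_oil = 7 / 1000 / 60            # 7 L/min in m^3/s

p_hydraulic = dp_oil * q_oil     # W
print("hydraulic power:", round(p_hydraulic), "W")         # ~ 1.07 kW, consistent with ~1 kW

q_heat = 20_000                  # W, assumed heat input from the hot source
print("efficiency:", round(100 * p_hydraulic / q_heat, 1), "%")   # ~5 %, within 3.5-6 %
```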

Keywords: isobaric expansion, low-grade heat, heat engine, renewable energy, waste heat recovery

Procedia PDF Downloads 226
12878 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet

Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel

Abstract:

Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a route to sustainable sanitation through the development of an innovative toilet system, called the Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. This technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. In order to evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) has been conducted in the SimaPro software employing the Ecoinvent v3.3 database. This study has determined the factors that contribute most to the environmental footprint of the NMT system. However, as the sensitivity analysis has identified certain operating parameters as critical to the robustness of the LCA results, adopting a stochastic approach to the Life Cycle Inventory (LCI) will comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, have been conducted for the input parameters of raw material, produced electricity, NOx emissions, amount of ash and transportation of fertilizer. The analysis has provided the distributions and the confidence intervals of the selected impact categories, and, in turn, more credible conclusions are drawn on the respective LCIA (Life Cycle Impact Assessment) profile of the NMT system. Last but not least, the study also yields essential insights into the methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.
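
The sketch below shows the general shape of the uncertainty-propagation step described above (Monte Carlo sampling of LCI inputs pushed through an ANN surrogate); the stand-in LCA function, parameter distributions and network size are assumptions, not the study's SimaPro model or trained surrogate.

```python
# Sample uncertain LCI inputs, evaluate an ANN surrogate of the LCA model,
# and report confidence intervals of an impact score.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

def lca_model(x):
    """Stand-in for the deterministic LCA: impact score from 5 inputs
    (raw material, electricity produced, NOx, ash, fertilizer transport)."""
    raw, elec, nox, ash, transport = x.T
    return 0.8 * raw - 0.5 * elec + 3.0 * nox + 0.2 * ash + 0.1 * transport

# Train an ANN surrogate on a modest design of experiments
X_train = rng.uniform(0.5, 1.5, size=(500, 5))
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
surrogate.fit(X_train, lca_model(X_train))

# Monte Carlo: sample the five uncertain inputs around their nominal values
n = 20_000
samples = np.column_stack([rng.normal(1.0, 0.10, n),   # raw material
                           rng.normal(1.0, 0.15, n),   # electricity produced
                           rng.normal(1.0, 0.20, n),   # NOx emissions
                           rng.normal(1.0, 0.10, n),   # amount of ash
                           rng.normal(1.0, 0.25, n)])  # fertilizer transport
impact = surrogate.predict(samples)
lo, med, hi = np.percentile(impact, [2.5, 50, 97.5])
print(f"impact score: median {med:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```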

Keywords: sanitation systems, nano-membrane toilet, LCA, stochastic uncertainty analysis, Monte Carlo simulations, artificial neural network

Procedia PDF Downloads 225
12877 Optimization of Switched Reluctance Motor for Drive System in Automotive Applications

Authors: A. Peniak, J. Makarovič, P. Rafajdus, P. Dúbravka

Abstract:

The purpose of this work is to optimize a switched reluctance motor (SRM) for an automotive application, specifically for a fully electric car. A new optimization approach is proposed. This unique approach transforms automotive customer requirements into an optimization problem, based on sound knowledge of SRM theory. The approach combines an analytical and a finite element analysis of the motor to quantify static nonlinear and dynamic performance parameters, such as the phase currents and motor torque maps, the output power and the power losses, in order to find the optimal motor as close to reality as possible within a reasonable time. The new approach yields an optimal motor which is competitive with other types of motors already proposed for automotive applications. This distinctive approach can also be used to optimize other types of electrical motors, when the parts specifically related to the SRM are adjusted accordingly.

Keywords: automotive, drive system, electric car, finite element method, hybrid car, optimization, switched reluctance motor

Procedia PDF Downloads 521
12876 Consumers' Awareness, Knowledge, and Perception towards Goods and Services Tax in India

Authors: Harjinder Kaur

Abstract:

GST was implemented by the government with the expectation of reforming the taxation system of India. This study therefore seeks to understand consumers' awareness, knowledge and perception of the implementation of GST. To conduct this study, 100 respondents of all demographic profiles were randomly selected from the Punjab region of India. To investigate the relationship between demographic profile and the level of awareness and knowledge about GST, a one-way ANOVA test was used, and it was found that there is a significant relationship between gender, age and qualification and the level of awareness and knowledge. Furthermore, due to the lack of information on GST, the respondents had a highly negative perception. The study also reveals that the implementation of GST has resulted in higher prices for goods and services and thus this tax may burden people. Also, after the implementation of GST, financial issues such as inflation, a rising cost of living and economic instability have affected many Indian consumers in terms of their spending. At the same time, it is also perceived that GST is designed to remove the burden of many indirect taxes and aims to develop a more efficient tax system that increases the revenue of the country.
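
The one-way ANOVA test mentioned above is illustrated in the small sketch below; the scores and group sizes are synthetic placeholders, not the survey data.

```python
# Test whether mean GST awareness scores differ across age groups (one-way ANOVA).
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(7)
# Awareness scores (e.g. on a 1-5 Likert scale) for three age groups of respondents
young  = rng.normal(3.6, 0.6, 35).clip(1, 5)
middle = rng.normal(3.2, 0.6, 40).clip(1, 5)
older  = rng.normal(2.9, 0.6, 25).clip(1, 5)

f_stat, p_value = f_oneway(young, middle, older)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Awareness differs significantly across age groups.")
```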

Keywords: goods and service tax, consumers awareness, knowledge, perception

Procedia PDF Downloads 192
12875 Infrared Lightbox and iPhone App for Improving Detection Limit of Phosphate Detecting Dip Strips

Authors: H. Heidari-Bafroui, B. Ribeiro, A. Charbaji, C. Anagnostopoulos, M. Faghri

Abstract:

In this paper, we report the development of a portable and inexpensive infrared lightbox for improving the detection limits of paper-based phosphate devices. Commercial paper-based devices utilize the molybdenum blue protocol to detect phosphate in the environment. Although these devices are easy to use and have a long shelf life, their main deficiency is their low sensitivity based on the qualitative results obtained via a color chart. To improve the results, we constructed a compact infrared lightbox that communicates wirelessly with a smartphone. The system measures the absorbance of radiation for the molybdenum blue reaction in the infrared region of the spectrum. It consists of a lightbox illuminated by four infrared light-emitting diodes, an infrared digital camera, a Raspberry Pi microcontroller, a mini-router, and an iPhone to control the microcontroller. An iPhone application was also developed to analyze images captured by the infrared camera in order to quantify phosphate concentrations. Additionally, the app connects to an online data center to present a highly scalable worldwide system for tracking and analyzing field measurements. In this study, the detection limits for two popular commercial devices were improved by a factor of 4 for the Quantofix devices (from 1.3 ppm using visible light to 300 ppb using infrared illumination) and a factor of 6 for the Indigo units (from 9.2 ppm to 1.4 ppm) with repeatability of less than or equal to 1.2% relative standard deviation (RSD). The system also provides more granular concentration information compared to the discrete color chart used by commercial devices and it can be easily adapted for use in other applications.
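
The sketch below illustrates the image-analysis step the app performs (absorbance of the strip relative to a blank, mapped to concentration through a calibration curve); the region-of-interest handling and calibration points are assumptions for illustration, not the app's code or the reported calibration data.

```python
# Absorbance from an infrared frame and a Beer-Lambert-style calibration fit.
import numpy as np

def strip_absorbance(img_gray: np.ndarray, roi, blank_intensity: float) -> float:
    """Mean absorbance A = -log10(I / I_blank) over the strip's reaction zone."""
    y0, y1, x0, x1 = roi
    intensity = float(np.mean(img_gray[y0:y1, x0:x1]))
    return -np.log10(max(intensity, 1.0) / blank_intensity)

# Assumed calibration points: phosphate standards (ppm) vs measured absorbance
standards_ppm = np.array([0.0, 0.3, 0.6, 1.2, 2.5, 5.0])
absorbances   = np.array([0.02, 0.05, 0.09, 0.17, 0.34, 0.66])
slope, intercept = np.polyfit(standards_ppm, absorbances, 1)

def ppm_from_absorbance(a: float) -> float:
    return (a - intercept) / slope

# Usage with a synthetic 8-bit infrared frame
frame = np.full((120, 160), 200, dtype=np.uint8)
frame[40:80, 60:100] = 120                     # darker reaction zone on the strip
a = strip_absorbance(frame, (40, 80, 60, 100), blank_intensity=200.0)
print(f"A = {a:.3f}  ->  {ppm_from_absorbance(a):.2f} ppm phosphate")
```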

Keywords: infrared lightbox, paper-based device, phosphate detection, smartphone colorimetric analyzer

Procedia PDF Downloads 123
12874 Function Study of IrMYB55 in Regulating Synthesis of Terpenoids in Isodon Rubescens

Authors: Qingfang Guo

Abstract:

Isodon rubescens is rich in a variety of terpenes, such as oridonin, and has important medicinal value. MYB transcription factors are involved in the regulation of plant secondary metabolic pathways. A combined transcriptomics and metabolomics analysis revealed that IrMYB55 might be involved in the regulation of terpene synthesis. The function of IrMYB55 was further verified by establishing a genetic transformation system based on CRISPR/Cas9 and by obtaining virus-mediated gene-silencing material of Isodon rubescens. The main research results are as follows. (1) Screening for IrMYB genes that can regulate the synthesis of terpenes: metabolomics and transcriptomics analyses of materials from high-content (TJ) and low-content (FL) populations revealed significant differences in terpene content and IrMYB55 expression, and correlation analysis showed that the expression level of IrMYB55 is significantly correlated with the terpene content. (2) Establishment of a genetic transformation system for Isodon rubescens: the IrPDS gene could be knocked out by injection of the Isodon rubescens cotyledon, and the transformed material showed an obvious albino phenotype; IrMYB55 transformation material was subsequently obtained by this method. (3) IrMYB55-silenced material was obtained. Subcellular localization indicated that IrMYB55 is located in the nucleus, suggesting that it may regulate the synthesis of terpenoids at the transcriptional level. In summary, IrMYB55, which may regulate the synthesis of oridonin, was identified from the transcriptome and metabolome data. In this study, a genetic transformation system for Isodon rubescens was successfully established. Further studies showed that IrMYB55 regulates the transcription level of genes related to the synthesis of terpenoids, thereby promoting the accumulation of oridonin.

Keywords: isodon rubescens, MYB, oridonin, CRISPR/Cas9

Procedia PDF Downloads 29
12873 Ophthalmic Services Covered by Albasar International Foundation in Sudan

Authors: Mohammad Ibrahim

Abstract:

The study was conducted at Albasar International Foundation ophthalmic hospitals in Sudan to assess the burden and patterns of ophthalmic disorders in the sector. A review of hospital records revealed that the total number of patients examined in the hospitals and in outreach camps conducted by the hospitals was 10,513,874, the total number of surgeries was 694,015, and the total number of pupils covered by the school program was 230,382. The organization operates with a strong management system, high standards, and quality result-based planning. The study showed that ophthalmic problems in Sudan are highly prevalent and that reversible (temporary) blindness disorders are common, since the majority of cases and surgeries were cataract (57.8%), followed by retinal problems (2.9%), glaucoma (2.4%), and orbit and oculoplastic disorders (2.2%); other disorders included refractive errors, squint and strabismus, corneal conditions, paediatric cases, and minor ophthalmic disorders.

Keywords: hospitals and outreach ophthalmic services, largest coverage of ophthalmic services, nonprofitable ophthalmic services, strong management system and standards

Procedia PDF Downloads 410
12872 Investigation of Correlation Between Radon Concentration and Metals in Produced Water from Oilfield Activities

Authors: Nacer Hamza

Abstract:

Natural radiation exposure arises from cosmic rays and from naturally occurring radioactive materials (NORMs) that originate in the earth's crust and are present everywhere in the environment (1). Significant concentrations of NORMs have been reported in the produced water that comes out during the oil extraction process, so the management of this produced water is a challenge for oil and gas companies. Management options include minimization of produced water, considered the best option from an environmental point of view since the less water produced, the lower the cost of treating it; recycling and reuse, by reinjecting produced water that fulfils certain requirements to enhance oil recovery; or disposal, when the produced water cannot be minimized or reused. For produced water management, determining the activity concentrations of the NORMs present in it is the main step towards a better understanding of radionuclide distribution. Many studies have reported the presence of NORMs in produced water and investigated the correlation between Ra-226 and the different metals present in produced water (2), including the cations and anions Na+, Cl-, Fe2+, and Ca2+; lead, nickel, zinc, cadmium, and copper commonly occur as heavy metals in oil and gas field produced water (3). However, little attention has been paid to the correlation between Rn-222 and the metals present in produced water. Regarding the methods used, the radon activity concentration in produced water samples is first measured with a RAD7, a radiometric instrument based on a solid-state detector (4), a type of semiconductor detector for alpha particles emitted by radon and its progenies; second, the concentrations of the different metals present in produced water are measured by atomic absorption spectrometry (AAS). The correlation between the Rn-222 activity concentration and the metal concentrations in produced water is then investigated with a statistical method, Pearson correlation analysis, based on the correlation coefficients obtained between Rn-222 and the metals. Such an investigation is important for a better understanding of how radionuclides behave in produced water based on their correlation with metals: first, Rn-222 decays through the sequence Po-218, Pb-214, Bi-214, Po-214, and Pb-210, and since these daughters are metals they will co-precipitate with the metals present in produced water; second, the short half-life of Rn-222 (3.82 days) leads to faster precipitation of its progenies with the metals in produced water.
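As a minimal illustration of the Pearson analysis described above, the sketch below computes correlation coefficients between Rn-222 activity and metal concentrations; all numerical values are hypothetical placeholders, not measured data.

```python
# Sketch: Pearson correlation between Rn-222 activity and metal
# concentrations in produced-water samples (values are hypothetical).
from scipy.stats import pearsonr

# Rn-222 activity concentrations (Bq/L) for a set of samples
rn222 = [12.4, 8.1, 15.3, 9.7, 11.0, 14.2]

# Corresponding metal concentrations (mg/L) measured by AAS
metals = {
    "Fe": [3.1, 2.0, 4.2, 2.5, 2.9, 3.8],
    "Pb": [0.21, 0.12, 0.30, 0.15, 0.19, 0.27],
    "Zn": [0.9, 0.6, 1.3, 0.7, 0.8, 1.1],
}

# Correlation coefficient (and p-value) between Rn-222 and each metal
for metal, conc in metals.items():
    r, p = pearsonr(rn222, conc)
    print(f"Rn-222 vs {metal}: r = {r:.2f}, p = {p:.3f}")
```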

Keywords: norms, radon concentration, produced water, heavy metals

Procedia PDF Downloads 147
12871 Predicting Seoul Bus Ridership Using Artificial Neural Network Algorithm with Smartcard Data

Authors: Hosuk Shin, Young-Hyun Seo, Eunhak Lee, Seung-Young Kho

Abstract:

Currently, in Seoul, users can avoid riding crowded buses thanks to the Bus Information System (BIS), which displays three levels of on-board ridership (spacious, normal, and crowded). However, because the system reports ridership in real time, it can provide incomplete information to the user. For example, when a bus approaches a station, the BIS may show that it is crowded, yet many passengers may alight at the stop where the user is waiting, so the information for that station should actually read normal or spacious. To address this problem, this study predicts the bus ridership level using smart card data in order to provide more accurate information about the passenger load on the bus. An Artificial Neural Network (ANN) is an interconnected group of nodes, modeled loosely on the human brain. Forecasting has been one of the major applications of ANNs owing to the data-driven, self-adaptive nature of the algorithm. According to the results, the ANN algorithm was stable and robust with a relatively small error ratio, so the results were rational and reasonable.
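A minimal sketch of such an ANN classifier is shown below; the feature set, network size, and data are illustrative assumptions, not the authors' actual smartcard dataset or architecture.

```python
# Sketch: ANN classifier for bus ridership level (spacious / normal /
# crowded) from smartcard-derived features. Data are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical features per (bus, stop, time window): boardings, alightings,
# hour of day, day of week, on-board count at the previous stop.
X = rng.random((1000, 5))
y = rng.integers(0, 3, size=1000)  # 0 = spacious, 1 = normal, 2 = crowded

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_train)

model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=0)
model.fit(scaler.transform(X_train), y_train)
print("test accuracy:", model.score(scaler.transform(X_test), y_test))
```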

Keywords: smartcard data, ANN, bus, ridership

Procedia PDF Downloads 167
12870 Feasibility Study for Implementation of Geothermal Energy Technology as a Means of Thermal Energy Supply for Medium Size Community Building

Authors: Sreto Boljevic

Abstract:

Heating systems based on geothermal energy sources are becoming increasingly popular for commercial and community buildings as building management looks for more efficient and environmentally friendly ways to run the heating system. The thermal energy supply of most European commercial/community buildings is currently provided mainly by energy extracted from natural gas. In order to reduce greenhouse gas emissions and achieve the climate change targets set by the EU, restructuring of the thermal energy supply is essential. At present, heating and cooling account for approximately 50% of the EU primary energy supply. Due to its physical characteristics, thermal energy cannot be distributed or exchanged over long distances, in contrast to the electricity and gas energy carriers. Compared with the electricity and gas sectors, heating remains largely a black box, with large unknowns for researchers and policymakers. In the literature, a number of documents address policies for promoting renewable energy technologies to provide heating for residential/community/commercial buildings and assess the balance between heat supply and heat savings. Ground source heat pump (GSHP) technology has been an extremely attractive alternative to the traditional electric and fossil fuel space heating equipment used to supply thermal energy to residential/community/commercial buildings. The main purpose of this paper is to create an algorithm, using an analytical approach, that enables a feasibility study of implementing GSHP technology in a community building with an existing fossil-fuelled heating system. The main results obtained by the algorithm will enable building management and GSHP system designers to define the optimal size of the system with regard to the technical, environmental, and economic impacts of its implementation, including the payback period. In addition, the algorithm is designed so that it can be used for feasibility studies of many different types of buildings. The algorithm is tested on a building that was built in 1930 and is used as a church, located in Cork city. Heating of the building is currently provided by a 105 kW gas boiler.
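A minimal sketch of the kind of economic and environmental comparison such a feasibility algorithm could perform is shown below; all input values (heat demand, prices, COP, capital cost, emission factors) are illustrative assumptions, not the paper's actual data for the Cork building.

```python
# Sketch: simple GSHP vs. gas-boiler feasibility calculation with a
# simple payback period. All inputs are illustrative assumptions.

annual_heat_demand_kwh = 120_000    # assumed annual heating demand
gas_boiler_efficiency = 0.85
gas_price_per_kwh = 0.08            # EUR per kWh of gas
elec_price_per_kwh = 0.25           # EUR per kWh of electricity
gshp_cop = 4.0                      # assumed seasonal coefficient of performance
gshp_capital_cost = 60_000          # EUR, assumed installed cost
gas_co2_per_kwh = 0.20              # kg CO2 per kWh of gas
elec_co2_per_kwh = 0.30             # kg CO2 per kWh of electricity

# Annual running cost of the existing gas boiler vs. the proposed GSHP
gas_cost = annual_heat_demand_kwh / gas_boiler_efficiency * gas_price_per_kwh
gshp_elec_kwh = annual_heat_demand_kwh / gshp_cop
gshp_cost = gshp_elec_kwh * elec_price_per_kwh

annual_saving = gas_cost - gshp_cost
simple_payback_years = gshp_capital_cost / annual_saving

# Annual CO2 emissions before and after the retrofit
gas_co2 = annual_heat_demand_kwh / gas_boiler_efficiency * gas_co2_per_kwh
gshp_co2 = gshp_elec_kwh * elec_co2_per_kwh

print(f"annual saving: {annual_saving:,.0f} EUR")
print(f"simple payback: {simple_payback_years:.1f} years")
print(f"CO2 reduction: {(gas_co2 - gshp_co2) / 1000:.1f} t/year")
```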

Keywords: GSHP, greenhouse gas emission, low-enthalpy, renewable energy

Procedia PDF Downloads 220