Search results for: petrochemical systems
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4408


178 Surface Thermodynamics Approach to Mycobacterium tuberculosis (M-TB) – Human Sputum Interactions

Authors: J. L. Chukwuneke, C. H. Achebe, S. N. Omenyi

Abstract:

This research work presents a surface thermodynamics approach to M-TB/HIV-human sputum interactions. It uses the Hamaker coefficient concept as a surface energetics tool for determining the interaction processes, with the surface interfacial energies explained using the van der Waals concept of particle interactions. The Lifshitz derivation for van der Waals forces was applied as an alternative to the contact angle approach that has been widely used in other biological systems. The methodology involved taking sputum samples from twenty infected persons and twenty uninfected persons for absorbance measurement using a digital ultraviolet-visible spectrophotometer. The variables required for the computations with the Lifshitz formula were derived from the absorbance data. Matlab was used for the mathematical analysis of the experimental data (absorbance values). The Hamaker constants and the combined Hamaker coefficients were obtained from the dielectric constant values together with the Lifshitz equation. The absolute combined Hamaker coefficients A132abs and A131abs for the infected and uninfected sputum samples gave A132abs = 0.21631 × 10⁻²¹ J for M-TB infected sputum and Ã132abs = 0.18825 × 10⁻²¹ J for M-TB/HIV infected sputum. The significance of this result is the positive value of the absolute combined Hamaker coefficient, which suggests the existence of net positive van der Waals forces demonstrating an attraction between the bacteria and the macrophage; this implies that infection can occur. It was also shown that in the presence of HIV, the interaction energy is reduced by 13%, confirming the adverse effects observed in HIV patients suffering from tuberculosis.
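
As a point of reference (a standard textbook approximation, not necessarily the exact Lifshitz route followed by the authors), the combined Hamaker coefficients for a bacterium (1) and a macrophage (2) interacting across the sputum medium (3) can be written in terms of the individual Hamaker constants:

```latex
% Standard combining approximations (1 = bacterium, 2 = macrophage, 3 = liquid medium); illustrative only.
A_{131} \approx \left(\sqrt{A_{11}}-\sqrt{A_{33}}\right)^{2},
\qquad
A_{132} \approx \left(\sqrt{A_{11}}-\sqrt{A_{33}}\right)\left(\sqrt{A_{22}}-\sqrt{A_{33}}\right)
```

A positive A132 then corresponds to a net attractive van der Waals interaction across the medium, which is the criterion used above to infer that infection can occur.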

Keywords: Absorbance, dielectric constant, Hamaker coefficient, Lifshitz formula, macrophage, Mycobacterium tuberculosis, Van der Waals forces.

177 Failure to React Positively to Flood Early Warning Systems: Lessons Learned by Flood Victims from Flash Flood Disasters: The Malaysia Experience

Authors: Mohamad Sukeri Khalid, Che Su Mustaffa, Mohd Najib Marzuki, Mohd Fo’ad Sakdan, Sapora Sipon, Mohd Taib Ariffin, Shazwani Shafiai

Abstract:

This paper examines the role of the flash flood early warning system provided by the Malaysian Government to communities in Malaysia, specifically during the flash flood disaster in the Cameron Highlands. Flash floods normally occur as a result of heavy rainfall in an area, when water causes flooding via streams or narrow channels. The focus of this study is the flash flood disaster of 23 October 2013 in the Cameron Highlands, in which the Sungai Bertam overflowed after the release of water from the Sultan Abu Bakar Dam. This release of water caused flash flooding that damaged property and killed residents and livestock in the area. The aim of this study is therefore to identify the perceptions of the flash flood victims of the role of the flash flood early warning system. Data were gathered through face-to-face interviews with those flood victims who were willing to participate in the study. This approach helped the researchers to glean in-depth information about their feelings and perceptions of the role of the flash flood early warning system offered by the government. The data were analysed descriptively, and the findings show that the 22 respondents strongly believe that the flash flood early warning system was confusing and dysfunctional and that communities failed to respond positively to it. Consequently, most of the communities were not well prepared for the release of water from the dam, which caused property damage, and three people were killed in the Cameron Highlands flash flood disaster.

Keywords: Communities affected, disaster management, early warning system, flash flood disaster.

176 Delineating Concern Ground in Block Caving – Underground Mine Using Ground Penetrating Radar

Authors: Eric Sitorus, Septian Prahastudhi, Turgod Nainggolan, Erwin Riyanto

Abstract:

Mining by block or panel caving is a mining method that takes advantage of fractures within an ore body, coupled with gravity, to extract material from a predetermined column of ore. The caving column is weakened from beneath through undercutting, after which the ore breaks up and is extracted from below in a continuous cycle. The nature of this method induces cyclical stresses on the pillars of excavations as stress is built up and released over time, which has a detrimental effect on both the installed ground support and the rock mass itself. Ground support, especially on the production level where the excavation void ratio is highest, is subjected to heavy loading, and strain beyond the elongation capacity of the support can cause yielding, resulting in damage to excavations. Geotechnical engineers must not only evaluate the remnant capacity of ground support systems but also investigate the depth of rock mass yield within pillars, backs and floors. Ground Penetrating Radar (GPR) is a geophysical method that can evaluate rock mass damage using electromagnetic waves. This paper illustrates a case study from the Grasberg mining complex where non-invasive information on the depth of damage and the condition of the remaining rock mass was required. GPR with a 100 MHz antenna was used to obtain images of the subsurface and determine rehabilitation requirements prior to recommencing production activities. The GPR surveys were used to calibrate the reflection coefficient response of varying rock mass conditions against known Rock Quality Designation (RQD) parameters observed at the mine. The calibrated GPR surveys allowed site engineers to map subsurface conditions and plan rehabilitation accordingly.
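
For orientation only (textbook relations with hypothetical permittivity values, not the site-specific calibration performed at Grasberg), the reflection coefficient and reflector depth used in this kind of GPR interpretation can be sketched as:

```python
import math

def reflection_coefficient(eps1: float, eps2: float) -> float:
    """Amplitude reflection coefficient for a wave passing from medium 1
    into medium 2 (low-loss, non-magnetic approximation)."""
    return (math.sqrt(eps1) - math.sqrt(eps2)) / (math.sqrt(eps1) + math.sqrt(eps2))

def reflector_depth(two_way_time_ns: float, eps_r: float) -> float:
    """Depth (m) of a reflector from two-way travel time (ns) and host permittivity."""
    c = 0.2998                      # speed of light in m/ns
    v = c / math.sqrt(eps_r)        # wave velocity in the host rock
    return v * two_way_time_ns / 2.0

# Hypothetical values: intact rock (eps_r ~ 6) overlying fractured, damaged rock (eps_r ~ 9)
print(reflection_coefficient(6.0, 9.0))   # negative sign -> increase in permittivity at the interface
print(reflector_depth(50.0, 6.0))         # depth of a reflection arriving at 50 ns
```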

Keywords: Block caving, ground penetrating radar, reflectivity, RQD.

175 Cascaded Transcritical/Supercritical CO2 Cycles and Organic Rankine Cycles to Recover Low-Temperature Waste Heat and LNG Cold Energy Simultaneously

Authors: Haoshui Yu, Donghoi Kim, Truls Gundersen

Abstract:

Low-temperature waste heat is abundant in the process industries, and large amounts of Liquefied Natural Gas (LNG) cold energy are discarded without being recovered properly in LNG terminals. Power generation is an effective way to utilize low-temperature waste heat and LNG cold energy simultaneously. Organic Rankine Cycles (ORCs) and CO2 power cycles are promising technologies for converting low-temperature waste heat and LNG cold energy into electricity. If waste heat and LNG cold energy are utilized simultaneously in one system, the combined system may outperform separate systems that utilize low-temperature waste heat and LNG cold energy, respectively. Low-temperature waste heat acts as the heat source and LNG regasification acts as the heat sink in the combined system. Due to the large temperature difference between the heat source and the heat sink, cascaded power cycle configurations are proposed in this paper. Cascaded power cycles can improve the energy efficiency of the system considerably. The cycle operating at a higher temperature to recover waste heat is called the top cycle, and the cycle operating at a lower temperature to utilize LNG cold energy is called the bottom cycle in this study. The top cycle condensation heat is used as the heat source in the bottom cycle. The top cycle can be an ORC, a transcritical CO2 (tCO2) cycle or a supercritical CO2 (sCO2) cycle, while the bottom cycle can only be an ORC due to its low temperature range. The thermodynamic paths of the tCO2 and sCO2 cycles differ from that of an ORC, and both perform better than an ORC for sensible waste heat recovery due to a better temperature match with the waste heat source. Different combinations of the tCO2 cycle, sCO2 cycle and ORC are compared to screen the best configurations of the cascaded power cycles. The influence of the working fluid and the operating conditions is also investigated. Each configuration is modeled and optimized in Aspen HYSYS. The results show that the cascaded tCO2/ORC performs better than the cascaded ORC/ORC and the cascaded sCO2/ORC for the case study.
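
A first-law, back-of-the-envelope sketch of why cascading helps (illustrative cycle efficiencies, not the Aspen HYSYS results of the paper): the bottom cycle is driven by the condensation heat rejected by the top cycle.

```python
def cascaded_efficiency(eta_top: float, eta_bottom: float) -> float:
    """First-law efficiency of a cascade in which the bottom cycle is driven
    entirely by the heat rejected by the top cycle."""
    return eta_top + (1.0 - eta_top) * eta_bottom

# Hypothetical cycle efficiencies: tCO2 top cycle at 15%, LNG-cooled ORC bottom cycle at 25%
eta_top, eta_bottom = 0.15, 0.25
print(f"combined efficiency = {cascaded_efficiency(eta_top, eta_bottom):.3f}")  # ~0.363, above either cycle alone
```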

Keywords: LNG cold energy, low-temperature waste heat, organic Rankine cycle, supercritical CO2 cycle, transcritical CO2 cycle.

174 A Compact Via-less Ultra-Wideband Microstrip Filter by Utilizing Open-Circuit Quarter Wavelength Stubs

Authors: Muhammad Yasir Wadood, Fatemeh Babaeian

Abstract:

With the development of ultra-wideband (UWB) systems, there is a high demand for UWB filters with low insertion loss and wide bandwidth that have a planar structure compatible with the other components of the UWB system. A microstrip interdigital filter is a good option for designing UWB filters. However, the presence of via holes in this structure creates difficulties in the fabrication of the filter. Especially in the higher frequency band, any misalignment of the drilled via hole with the microstrip stubs causes large errors in the measurement results compared to the desired results. Moreover, in high-frequency designs the line widths of the stubs are very narrow, so highly precise small via holes are required, which increases the cost of fabrication significantly and adds a risk of fabrication errors. To address this issue, a via-less UWB microstrip filter is proposed in this paper, designed as a modification of a conventional interdigital bandpass filter. The novel approaches in this filter design are: 1) replacing each via hole with a quarter-wavelength open-circuit stub to avoid manufacturing complexity, 2) using a bend structure to reduce unwanted coupling effects, and 3) minimising the size. Using the proposed structure, a UWB filter operating in the frequency band of 3.9-6.6 GHz (1-dB bandwidth) was designed and fabricated. The promising simulation and measurement results are presented in this paper. The selected substrate for these designs was Rogers RO4003 with a thickness of 20 mils, a common substrate in industrial projects. The compact size of the proposed filter is highly beneficial for applications that require very compact hardware.
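
A minimal sketch of how the physical length of each quarter-wavelength open-circuit stub would be estimated at the filter's centre frequency; the effective permittivity value below is an assumption for illustration, not a quoted design figure.

```python
import math

def quarter_wave_stub_length(f_hz: float, eps_eff: float) -> float:
    """Physical length (m) of a quarter-wavelength stub on a microstrip line
    with effective relative permittivity eps_eff."""
    c = 299_792_458.0                                   # free-space speed of light, m/s
    guided_wavelength = c / (f_hz * math.sqrt(eps_eff)) # wavelength on the line
    return guided_wavelength / 4.0

# Hypothetical values: 5.25 GHz (centre of the 3.9-6.6 GHz band), eps_eff ~ 2.6 on RO4003
length_m = quarter_wave_stub_length(5.25e9, 2.6)
print(f"stub length ~ {length_m * 1000:.2f} mm")
```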

Keywords: Band-pass filters, inter-digital filter, microstrip, via-less.

173 Integration of Big Data to Predict Transportation for Smart Cities

Authors: Sun-Young Jang, Sung-Ah Kim, Dongyoun Shin

Abstract:

Intelligent transportation systems are essential for building smarter cities, and machine-learning-based transportation prediction is a promising approach because it makes otherwise invisible aspects of the network visible. In this context, this research aims to build a prototype model that predicts transportation network behavior using big data and machine learning technology. Among urban transportation systems, this research focuses on the bus system. The research problem is that existing headway models cannot respond to dynamic transportation conditions, so bus delays occur frequently. To overcome this problem, a prediction model is presented that finds patterns of bus delay by applying machine learning to the following data sets: traffic, weather, and bus status. This research presents a flexible headway model to predict bus delay and analyzes the result. The prototype model is built from real-time bus data gathered through public data portals and real-time Application Programming Interfaces (APIs) provided by the government. These data are the fundamental resources for organizing interval pattern models of bus operations together with traffic environment factors (road speeds, station conditions, weather, and real-time bus operating information). The prototype model was designed in a machine learning tool (RapidMiner Studio) and tested for bus delay prediction. This research presents experiments to increase the prediction accuracy of bus headway by analyzing urban big data; such analysis is important for predicting the future and finding correlations by processing huge amounts of data. Based on this analysis method, the research demonstrates an effective use of machine learning and urban big data to understand urban dynamics.
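
The paper builds its model in RapidMiner Studio; purely as an illustrative equivalent with synthetic data, a scikit-learn sketch of the same kind of supervised bus-delay regression on traffic, weather and bus-status features:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 2000  # synthetic records standing in for the real-time bus/traffic/weather feed
X = np.column_stack([
    rng.uniform(5, 60, n),      # road speed (km/h)
    rng.uniform(0, 30, n),      # rainfall (mm/h)
    rng.integers(0, 24, n),     # hour of day
    rng.uniform(0, 100, n),     # passengers waiting at the station
])
# Hypothetical ground truth: delays grow with congestion, rain and crowding
y = 10 - 0.12 * X[:, 0] + 0.2 * X[:, 1] + 0.03 * X[:, 3] + rng.normal(0, 1, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("MAE (min):", mean_absolute_error(y_test, model.predict(X_test)))
```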

Keywords: Big data, bus headway prediction, machine learning, public transportation.

172 Adapting Tools for Text Monitoring and for Scenario Analysis Related to the Field of Social Disasters

Authors: Svetlana Cojocaru, Mircea Petic, Inga Titchiev

Abstract:

Humanity is confronted more and more often with different social disasters, which in turn can generate new accidents and catastrophes. To mitigate their consequences, it is important to obtain signals as early as possible about events which have occurred or may occur, and to prepare the corresponding scenarios that could be applied. Our research is focused on solving two problems in this domain: identifying signals that an accident has occurred or may occur, and mitigating some consequences of disasters. To solve the first problem, methods of selecting and processing texts from the Internet are developed; information in Romanian is of special interest to us. In order to obtain the mentioned tools, several steps are followed, divided into a preparatory stage and a processing stage. During the first stage, we manually collected over 724 news articles and classified them into 10 categories of social disasters, constituting more than 150 thousand words. Using this information, a controlled vocabulary of more than 300 keywords was elaborated, which will help in the classification and identification of texts related to the field of social disasters. To solve the second problem, the Petri net formalism is used. We deal with the problem of evacuating inhabitants within a useful time. Analysis methods such as the reachability or coverability tree and the invariant technique are used to determine dynamic properties of the modeled systems. To perform a case study of the properties of the evacuation system extended by adding time, the analysis modules of PIPE, such as Generalized Stochastic Petri Net (GSPN) Analysis, Simulation, State Space Analysis, and Invariant Analysis, have been used. These modules helped us to obtain the average number of persons situated in the rooms and other quantitative properties and characteristics related to the system dynamics.

Keywords: Lexicon of disasters, modelling, Petri nets, text annotation, social disasters.

171 Improved Thermal Comfort and Sensation with Occupant Control of Ceiling Personalized Ventilation System: A Lab Study

Authors: Walid Chakroun, Sorour Alotaibi, Nesreen Ghaddar, Kamel Ghali

Abstract:

This study aims to determine the extent to which occupant control of the microenvironment improves thermal sensation and comfort and saves energy in spaces equipped with a ceiling personalized ventilation (CPV) system assisted by chair fans (CFs) and desk fans (DFs). Two experiments were performed in a climatic chamber equipped with two-station CPV systems, one station allowing control of the fan flow rate and the other set to the fan speed chosen by the participant in control. Each experiment included two participants, each entering the cooled space from a transitional environment with conventional mixed ventilation (MV) at 24 °C. For the CPV diffuser, fresh air was delivered at a rate of 20 cubic feet per minute (CFM) and a temperature of 16 °C, while recirculated air was delivered at the same temperature but at a flow rate of 150 CFM. The macroclimate air of the space was at 26 °C. The full-speed flow rates of the CFs and DFs were 5 CFM and 20 CFM, respectively. Occupant 1 was allowed to operate the CFs or the DFs at one third of full speed, two thirds of full speed, or full speed, while occupant 2 had no control and used the fan speed selected by occupant 1. Furthermore, a parametric study was conducted on the effect of increasing the fresh air flow rate on the occupants' thermal comfort and whole-body sensations. The results showed that most occupants in the CPV+CF experiments who did not control the CF flow rate felt comfortable within 6 minutes, whereas the participants who controlled the CF speeds felt comfortable in around 24 minutes because they were preoccupied with the CFs. For the DF speed control experiments, most participants who did not control the DFs felt comfortable within the first 8 minutes; similarly, the participants who controlled the DF flow rates felt comfortable at around 26 minutes. When the CPV system was supported by either CFs or DFs, 93% of participants reached thermal comfort in both cases. Participants in the parametric study felt more comfortable when the fresh air flow rate was low and felt cold as the flow rate increased.

Keywords: Thermal comfort, thermal sensation, predicted mean vote, thermal environment.

170 Combined Source and Channel Coding for Image Transmission Using Enhanced Turbo Codes in AWGN and Rayleigh Channel

Authors: N. S. Pradeep, M. Balasingh Moses, V. Aarthi

Abstract:

Any signal transmitted over a channel is corrupted by noise and interference, and a host of channel coding techniques has been proposed to alleviate their effect. Among these, Turbo codes are recommended because of their increased capacity at higher transmission rates and their superior performance over convolutional codes. Multimedia elements, which involve large amounts of data, are best protected by Turbo codes. A Turbo decoder employs the Maximum A-posteriori Probability (MAP) and Soft Output Viterbi Algorithm (SOVA) decoding algorithms. Conventional Turbo-coded systems employ Equal Error Protection (EEP), in which all the data in an information message are protected uniformly. Some applications involve Unequal Error Protection (UEP), in which the level of protection is higher for important information bits than for other bits. In this work, the traditional Log-MAP decoding algorithm is enhanced by using optimized scaling factors for both decoders. The error-correcting performance with UEP in the Additive White Gaussian Noise (AWGN) channel and in Rayleigh fading is analyzed for the transmission of an image with the Discrete Cosine Transform (DCT) as the source coding technique. This paper compares the performance of the Log-MAP, Modified Log-MAP (MlogMAP) and Enhanced Log-MAP (ElogMAP) algorithms used for image transmission. The MlogMAP algorithm is found to be best for lower Eb/N0 values, but for higher Eb/N0 the ElogMAP performs better with optimized scaling factors. The performance comparison of the AWGN and fading channels indicates the robustness of the proposed algorithm. According to the performance of the three different message classes, class 3 is more protected than the other two classes. From the performance analysis, it is observed that the ElogMAP algorithm with UEP is best for the transmission of an image compared to the Log-MAP and MlogMAP decoding algorithms.
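
The enhancement described centres on the max* (Jacobian logarithm) operation inside Log-MAP decoding; a minimal sketch of a scaled correction term, where the scaling factor is a hypothetical stand-in for the optimized values found in the paper:

```python
import math

def max_star(a: float, b: float, scale: float = 1.0) -> float:
    """Jacobian logarithm used in Log-MAP decoding:
    exact Log-MAP for scale = 1.0, max-log-MAP for scale = 0.0,
    and a scaled/enhanced variant for values in between."""
    return max(a, b) + scale * math.log1p(math.exp(-abs(a - b)))

# Comparison of the variants on two example branch metrics
a, b = 1.2, 0.9
print(max_star(a, b, 1.0))   # exact Log-MAP
print(max_star(a, b, 0.0))   # max-log-MAP approximation
print(max_star(a, b, 0.75))  # scaled correction (0.75 is a hypothetical factor)
```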

Keywords: AWGN, BER, DCT, Fading, MAP, UEP.

169 Cybersecurity for Digital Twins in the Built Environment: Research Landscape, Industry Attitudes and Future Direction

Authors: Kaznah Alshammari, Thomas Beach, Yacine Rezgui

Abstract:

Technological advances in the construction sector are helping to make smart cities a reality by means of Cyber-Physical Systems (CPS). CPS integrate information and the physical world through the use of Information and Communication Technologies (ICT). An increasingly common goal in the built environment is to integrate Building Information Models (BIM) with the Internet of Things (IoT) and sensor technologies using CPS. Future advances could see the adoption of digital twins, creating new opportunities for CPS through monitoring, simulation and optimisation technologies. However, researchers often fail to fully consider the security implications. To date, it is not widely possible to assimilate BIM data and cybersecurity concepts, and security has therefore been overlooked. This paper reviews the empirical literature concerning IoT applications in the built environment and discusses real-world applications of the IoT intended to enhance construction practices, improve people's lives and bolster cybersecurity. Specifically, this research addresses two questions: (a) How suitable are the current IoT and CPS security stacks for addressing the cybersecurity threats facing digital twins in the context of smart buildings and districts? and (b) What are the current obstacles to tackling cybersecurity threats to built environment CPS? To answer these questions, this paper reviews the current state-of-the-art research concerning digital twins in the built environment, the IoT, BIM, urban cities and cybersecurity. The findings of this study confirm the importance of using digital twins in both IoT and BIM contexts. In addition, eight reference zones across Europe have gained special recognition for their contributions to the advancement of IoT science. Therefore, this paper evaluates the use of digital twins in CPS to arrive at recommendations for expanding BIM specifications to facilitate IoT compliance, bolster cybersecurity and integrate digital twin and city standards in the smart cities of the future.

Keywords: BIM, cybersecurity, digital twins, IoT, urban cities.

168 Bus Transit Demand Modeling and Fare Structure Analysis of Kabul City

Authors: Ramin Mirzada, Takuya Maruyama

Abstract:

Kabul is the heart of political, commercial, cultural, educational and social life in Afghanistan and the fifth fastest growing city in the world. Low incomes incline most Kabul residents to use public transport, especially buses, although there is no proper bus system and, due to the wars, no proper fare structure exists in Kabul city. During the civil wars from 1992 to 2001, Kabul suffered damage to and destruction of its transportation facilities, including pavements, sidewalks, traffic circles, drainage systems, traffic signs and signals, trolleybuses and almost all of the public transport system (e.g., the Millie bus). This research focuses on Kabul city's transportation system. The data used were gathered by the Japan International Cooperation Agency (JICA) in 2008 and are used to estimate demand and the fare structure; additionally, a survey was conducted in 2016 to find the satisfaction of Kabul residents with the fare structure. The aim of this research is to estimate the demand for large buses, compare it with the actual supply from the government, analyze the current fare structure and compare it with the proposed (distance-based) fare structure that has already been analyzed. The outcome of this research shows that the demand of Kabul city residents for public transport (large buses) exceeds the current supply, so that the current public transportation is not sufficient to serve the city; it is worth mentioning that overcoming this problem does not require building new roads or exclusive busways. This research proposes that the government change from a fixed fare to a distance-based fare, invest in public transportation and increase the number of large buses so that the current demand for public transport is met.
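
To make the fixed versus distance-based fare comparison concrete, a small sketch with hypothetical fare parameters (the actual values would come from the JICA data and the 2016 survey):

```python
def fixed_fare(_distance_km: float, flat: float = 10.0) -> float:
    """Flat fare regardless of trip length (hypothetical value, in Afghanis)."""
    return flat

def distance_based_fare(distance_km: float, base: float = 5.0, per_km: float = 1.5) -> float:
    """Fare that grows with trip length (hypothetical base and per-km rates, in Afghanis)."""
    return base + per_km * distance_km

for d in (2, 5, 10, 20):  # example trip lengths in km
    print(d, fixed_fare(d), distance_based_fare(d))
```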

Keywords: Transportation, planning, public transport, large buses, fixed fare, distance based fare, Kabul, Afghanistan.

167 Introductory Design Optimisation of a Machine Tool using a Virtual Machine Concept

Authors: Johan Wall, Johan Fredin, Anders Jönsson, Göran Broman

Abstract:

Designing modern machine tools is a complex task. A simulation tool to aid the design work, a virtual machine, has therefore been developed in earlier work. The virtual machine considers the interaction between the mechanics of the machine (including structural flexibility) and the control system. This paper exemplifies the usefulness of the virtual machine as a tool for product development. An optimisation study is conducted aiming at improving the existing design of a machine tool regarding weight and manufacturing accuracy at maintained manufacturing speed. The problem can be categorised as constrained multidisciplinary multiobjective multivariable optimisation. Parameters of the control system and geometric quantities of the machine are used as design variables. This results in a mix of continuous and discrete variables, and an optimisation approach using a genetic algorithm is therefore deployed. The accuracy objective is evaluated according to international standards. The complete system model shows non-deterministic behaviour, and a strategy to handle this based on statistical analysis is suggested. The weight of the main moving parts is reduced by more than 30 per cent and the manufacturing accuracy is improved by more than 60 per cent compared to the original design, with no reduction in manufacturing speed. It is also shown that interaction effects exist between the mechanics and the control, i.e. this improvement would most likely not have been possible with a conventional sequential design approach within the same time, cost and general resource frame. This indicates the potential of the virtual machine concept for contributing to improved efficiency of both complex products and the development process for such products. Companies incorporating such advanced simulation tools in their product development could thus improve their own competitiveness as well as contribute to improved resource efficiency of society at large.

Keywords: Machine tools, Mechatronics, Non-deterministic, Optimisation, Product development, Virtual machine

166 The Social Dynamics of Pandemics: A Clinical Sociological Analysis of Precautions and Risks

Authors: C. Ardil

Abstract:

The COVID-19 pandemic has revealed the complex and multifaceted relationship between societal structures and public health, emphasizing the need for a holistic approach to understanding pandemic responses. This study utilizes a clinical sociological perspective to analyze the social impacts of pandemics, with a particular focus on how social determinants such as income, education, race, and geographical location influence vulnerability and resilience. It explores the critical role of risk perception, communication strategies, and community dynamics in shaping public adherence to precautionary measures like mask-wearing, social distancing, and vaccination. By examining the ways in which social norms, structural inequalities, and trust in institutions affect public behavior, this study provides insights into the challenges of managing health crises in diverse communities. Comparative case studies and policy analysis are employed to highlight the variations in pandemic responses across different countries and regions, illustrating the importance of coordinated strategies and community-based interventions. The findings underscore that effective pandemic response requires addressing underlying social inequities, fostering community cohesion, and ensuring equitable access to healthcare and information. This study contributes to a deeper understanding of the broader societal implications of pandemics and offers recommendations for building more resilient, inclusive public health systems capable of mitigating the impact of future global health emergencies.

Keywords: Behavioral medicine, clinical sociology, community health, COVID-19, COVID-19 pandemic, epidemiology, infectious diseases, pandemics, precautions, psychology, public health, risks, social determinants, social dynamics, social psychiatry, social psychology, socioeconomic status, structural functionalism

165 Data Centers’ Temperature Profile Simulation Optimized by Finite Elements and Discretization Methods

Authors: José Alberto García Fernández, Zhimin Du, Xinqiao Jin

Abstract:

Nowadays, the data center industry faces strong challenges in increasing speed and data processing capacity while trying to keep devices at a suitable working temperature without penalizing that capacity. Consequently, the cooling systems of such facilities use a large amount of energy to dissipate the heat generated inside the servers, and developing new cooling techniques or perfecting those already existing would be a great advance for this industry. A matrix of temperature sensors distributed within the structure of each server would provide the data required to obtain an instantaneous temperature profile inside it. However, the number of temperature probes required to obtain temperature profiles with sufficient accuracy is very high and expensive. Therefore, other less intrusive techniques are employed, in which each point that characterizes the server temperature profile is obtained by solving differential equations through simulation methods, simplifying data collection but increasing the time needed to obtain results. In order to reduce these calculation times, complicated and slow computational fluid dynamics simulations are replaced by simpler and faster finite element method simulations, which solve the Burgers' equations by backward, forward and central discretization techniques after simplifying the energy and enthalpy conservation differential equations. The discretization methods employed for solving the first- and second-order derivatives of the resulting Burgers' equation after these simplifications are the key to obtaining results with greater or lesser accuracy, regardless of the characteristic truncation error.
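
A minimal sketch, under simplifying assumptions, of the type of discretization described: the 1D viscous Burgers' equation advanced explicitly with a backward (upwind) difference for the convective term and a central difference for the diffusive term.

```python
import numpy as np

# 1D viscous Burgers' equation u_t + u u_x = nu u_xx on [0, 1],
# explicit time stepping, upwind convection, central diffusion.
nx, nt = 101, 500
dx, dt, nu = 1.0 / (nx - 1), 1.0e-4, 0.07
x = np.linspace(0.0, 1.0, nx)
u = np.sin(np.pi * x)          # illustrative initial profile
u[0] = u[-1] = 0.0             # fixed boundary values

for _ in range(nt):
    un = u.copy()
    # backward (upwind) difference for u*u_x, central difference for u_xx
    u[1:-1] = (un[1:-1]
               - dt / dx * un[1:-1] * (un[1:-1] - un[:-2])
               + nu * dt / dx**2 * (un[2:] - 2.0 * un[1:-1] + un[:-2]))

print(u.max())  # peak of the advected, diffused profile after nt steps
```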

Keywords: Burgers’ equations, CFD simulation, data center, discretization methods, FEM simulation, temperature profile.

164 Modeling a Multinomial Logit Model of Intercity Travel Mode Choice Behavior for All Trips in Libya

Authors: Manssour A. Abdulsalam Bin Miskeen, Ahmed Mohamed Alhodairi, Riza Atiq Abdullah Bin O. K. Rahmat

Abstract:

From a planning point of view, mode choice modeling is essential, given the massive resources involved in transportation systems. Intercity travellers in Libya have distinct characteristics compared with travellers from other countries, including cultural and socioeconomic factors. Consequently, the goal of this study is to characterize intercity travel behavior using disaggregate models for projecting the demand for nation-level intercity travel in Libya. A multinomial logit model of all intercity trips was formulated to examine national-level intercity transportation in Libya. The multinomial logit model was calibrated using a nationwide revealed preference (RP) and stated preference (SP) survey. The model was developed for the different purposes of intercity trips (work, social and recreational). The model parameters were estimated using the maximum likelihood method. The data needed for model development were obtained from all major intercity corridors in Libya; the final sample size consisted of 1300 interviews. About two thirds of these data were used for model calibration and the remaining part for model validation. This study, which is the first of its kind in Libya, investigates intercity travelers' mode-choice behavior. The intercity travel mode-choice model was successfully calibrated and validated, and the outcomes indicate that the overall model is effective and yields a high precision of estimation. The proposed model is beneficial because it is sensitive to many variables and can be employed to determine the impact of changes in numerous characteristics on the demand for various travel modes. The model estimates may also be valuable to planners, who can estimate mode shares and determine the impact of specific policy changes on the demand for intercity travel.
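
A compact sketch of the multinomial logit likelihood that maximum likelihood estimation maximizes (synthetic data and made-up coefficients; the paper estimates its parameters from the RP/SP survey):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, k, n_modes = 500, 3, 3                     # travellers, attributes per mode, alternatives
X = rng.normal(size=(n, n_modes, k))          # e.g. travel time, cost, comfort (standardized)
true_beta = np.array([-1.0, -0.5, 0.3])       # made-up taste coefficients
util = X @ true_beta + rng.gumbel(size=(n, n_modes))
choice = util.argmax(axis=1)                  # observed chosen mode

def neg_log_likelihood(beta):
    v = X @ beta                                        # systematic utilities
    v -= v.max(axis=1, keepdims=True)                   # numerical stabilisation
    p = np.exp(v) / np.exp(v).sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(n), choice]).sum()

result = minimize(neg_log_likelihood, x0=np.zeros(k), method="BFGS")
print("estimated coefficients:", result.x)
```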

Keywords: Multinomial logit model, improved intercity transport, intercity mode-choice behavior, disaggregate analysis.

163 Turbine Follower Control Strategy Design Based on Developed FFPP Model

Authors: Ali Ghaffari, Mansour Nikkhah Bahrami, Hesam Parsa

Abstract:

In this paper a comprehensive model of a fossil-fueled power plant (FFPP) is developed in order to evaluate the performance of a newly designed turbine follower controller. Considering the drawbacks of previous works, an overall model is developed to minimize the error between each subsystem model output and the experimental data obtained at the actual power plant. The developed model is organized in two main subsystems, namely the boiler and the turbine. Considering the characteristics of each FFPP subsystem, different modeling approaches are developed. For the economizer, evaporator, superheater and reheater, first-order models are determined based on the principles of mass and energy conservation, and simulations verify the accuracy of the developed models. Due to the nonlinear characteristics of the attemperator, a new model based on a genetic-fuzzy system utilizing the Pittsburgh approach is developed, showing promising performance compared with models derived with other methods such as ANFIS. The optimization constraints are handled using penalty functions. The effect of increasing the number of rules and membership functions on the performance of the proposed model is also studied and evaluated. The turbine model is developed based on the equation of adiabatic expansion. The parameters of all evaluated models are tuned by means of evolutionary algorithms. Based on the developed model, a fuzzy PI controller is developed and successfully implemented in the turbine follower control strategy of the plant. In this control strategy, instead of keeping the controller parameters constant, they are adjusted on-line with regard to the error and the error rate. It is shown that the response of the system improves significantly and that fuel consumption decreases considerably.
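
A minimal sketch of the gain-scheduling idea behind the turbine follower controller: the PI gains are adjusted on-line from the error and error rate instead of being kept constant (the simple piecewise rule below is an illustrative stand-in for the paper's fuzzy rule base):

```python
def scheduled_pi_step(error: float, prev_error: float, integral: float, dt: float):
    """One step of a PI controller whose gains are scheduled on the error magnitude
    and error rate (illustrative rule, not the paper's fuzzy rules)."""
    error_rate = (error - prev_error) / dt
    # larger error -> stronger proportional action; fast changes -> gentler integral action
    kp = 2.0 + 3.0 * min(abs(error), 1.0)
    ki = 0.5 / (1.0 + abs(error_rate))
    integral += error * dt
    u = kp * error + ki * integral
    return u, integral

# Tiny usage example tracking a setpoint of 1.0 from an initial output of 0.0
integral, prev_e, y = 0.0, 0.0, 0.0
for _ in range(5):
    e = 1.0 - y
    u, integral = scheduled_pi_step(e, prev_e, integral, dt=0.1)
    prev_e = e
    y += 0.05 * u   # crude first-order plant response
    print(round(y, 3))
```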

Keywords: Attemperator, Evolutionary algorithms, Fossil fuelled power plant (FFPP), Fuzzy set theory, Gain scheduling

162 Estimating the Costs of Conservation in Multiple Output Agricultural Setting

Authors: T. Chaiechi, N. Stoeckl

Abstract:

Scarcity of resources for biodiversity conservation gives rise to the need for strategic investment with priority given to the cost of conservation. While the literature provides abundant methodological options for biodiversity conservation, estimation of the true cost of conservation remains abstract and simplistic, without recognising the dynamic nature of that cost. Some recent works demonstrate the value of economic theory in informing biodiversity decisions, particularly on the costs and benefits of biodiversity; however, integration of the concept of true cost into biodiversity actions and planning has been very slow, especially at the farm level. Conservation planning studies often use area as a proxy for cost, neglecting differences in land values as well as protected areas; such literature considers only heterogeneous benefits while treating land costs as homogeneous. Analysis under the assumption of cost homogeneity results in biased estimation, since it not only fails to address the true total cost of biodiversity actions and plans but also fails to screen out lands that are more (or less) expensive and/or more difficult (or more suitable) for biodiversity conservation purposes, hindering the validity and comparability of the results. "Economies of scope" is another of the most neglected aspects in the conservation literature. The concept of economies of scope introduces the existence of cost complementarities within a multiple-output production system, and it suggests a lower cost when a given farm produces multiple outputs concurrently. If there are indeed economies of scope, then a simplistic representation of costs will tend to overestimate the true cost of conservation, leading to suboptimal outcomes. The aim of this paper, therefore, is to provide a first broad review of the various theoretical ways in which economies of scope are likely to occur in conservation. The paper also addresses gaps that have to be filled in future analysis.
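
The degree of economies of scope referred to here is conventionally measured by comparing stand-alone and joint production costs; a standard two-output formulation (not specific to this paper) is:

```latex
% Degree of economies of scope for two outputs, e.g. agricultural output q_1 and conservation output q_2
S \;=\; \frac{C(q_1,0) + C(0,q_2) - C(q_1,q_2)}{C(q_1,q_2)},
\qquad S > 0 \ \Rightarrow\ \text{economies of scope (joint production is cheaper).}
```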

Keywords: Cost, biodiversity conservation, Multi-output production systems, Empirical techniques.

161 A Temporal QoS Ontology for ERTMS/ETCS

Authors: Marc Sango, Olimpia Hoinaru, Christophe Gransart, Laurence Duchien

Abstract:

Ontologies offer a means for representing and sharing information in many domains, particularly complex domains. For example, they can be used for representing and sharing information from the System Requirement Specification (SRS) of complex systems, such as the SRS of ERTMS/ETCS written in natural language. Since this is a real-time and critical system, generic ontologies such as OWL and generic ERTMS ontologies provide only minimal support for modeling the temporal information omnipresent in these SRS documents. To support the modeling of temporal information, one of the challenges is to enable the representation of dynamic features evolving in time within a generic ontology with minimal redesign of it. Separating temporal information from other information can help to predict system runtime operation and to properly design and implement it. In addition, it is helpful to provide reasoning and querying techniques over the temporal information represented in the ontology in order to detect potential temporal inconsistencies. To address this challenge, we propose a lightweight three-layer temporal Quality of Service (QoS) ontology for representing, reasoning and querying over temporal and non-temporal information in a complex domain ontology. Representing QoS entities in separate layers clarifies the distinction between non-QoS entities and QoS entities in an ontology. The upper generic layer of the proposed ontology provides an intuitive knowledge of domain components, especially ERTMS/ETCS components. The separation of the intermediate QoS layer from the lower QoS layer allows us to focus on specific QoS characteristics, such as temporal or integrity characteristics. In this paper, we focus on temporal information that can be used to predict system runtime operation. To evaluate our approach, an example of the proposed domain ontology for the handover operation, as well as a reasoning rule over temporal relations in this domain-specific ontology, is presented.

Keywords: System Requirement Specification, ERTMS/ETCS, Temporal Ontologies, Domain Ontologies.

160 Stress Analysis of Hexagonal Element for Precast Concrete Pavements

Authors: J. Novak, A. Kohoutkova, V. Kristek, J. Vodicka, M. Sramek

Abstract:

While the use of cast-in-place concrete for airfield and highway pavement overlays is very common, the application of precast concrete elements is very limited today, mainly because of high production costs and complex structural behavior. Despite that, several precast concrete systems have been developed and tested with the aim of providing a system allowing rapid construction. This contribution deals with the reinforcement design of a hexagonal element developed for a proposed airfield pavement system. The sub-base course of the system is composed of compacted recycled concrete aggregates, with fiber-reinforced concrete with recycled aggregates placed on top of it. The selected element belongs to a group of precast concrete elements being considered for the construction of the surface course. Both the high cost of full-scale experiments and the need to investigate various elements make it necessary to simulate their behavior in numerical analysis software using the finite element method instead of performing expensive experiments. The simulation of the selected element was conducted on a nonlinear model in order to obtain results that could substitute for experimental results. The main objective was to design the reinforcement of the precast concrete element subjected to quasi-static loading from airplanes with respect to geometrical imperfections, manufacturing imperfections, tensile stress in the reinforcement, compressive stress in the concrete and crack width. The obtained findings demonstrate that the position and presence of imperfections in a pavement highly affect the stress distribution in the precast concrete element. The precast concrete element should be heavily reinforced to fulfill all the demands; using under-reinforced concrete elements would lead to the formation of wide, permanently open cracks.

Keywords: Imperfection, numerical simulation, pavement, precast concrete element, reinforcement design, stress analysis.

159 Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite

Authors: F. Lazzeri, I. Reiter

Abstract:

Energy production optimization has traditionally been very important for utilities in order to improve resource consumption. However, load forecasting is a challenging task, as there are a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true in microgrids, where many elements have to adjust their performance depending on future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R and on web services built and deployed with different components of the Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that enables users to easily build, deploy, and share predictive analytics solutions; SQL Database, a Microsoft database service for app developers; and Power BI, a suite of business analytics tools to analyze data and share insights. Our results show that the Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful for predicting hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity and dew point) can play a crucial role in improving the accuracy of the forecasting solution. Data cleaning and feature engineering methods performed in R and different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile and ARIMA) are presented, and the results and performance metrics discussed.
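
Purely as a local, illustrative stand-in for the cloud workflow described (synthetic data; the paper's experiments are developed in R and deployed through Azure Machine Learning), hourly load can be regressed on calendar and weather features with a boosted-tree model:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(7)
idx = pd.date_range("2023-01-01", periods=24 * 200, freq="60min")
temp = 10 + 10 * np.sin(2 * np.pi * idx.dayofyear / 365) + rng.normal(0, 2, len(idx))
df = pd.DataFrame({
    "hour": idx.hour, "dow": idx.dayofweek, "temp": temp,
    "wind": rng.uniform(0, 12, len(idx)), "humidity": rng.uniform(30, 95, len(idx)),
})
# Hypothetical microgrid load: daily cycle plus heating demand during cold hours
df["load_kw"] = (50 + 20 * np.sin(2 * np.pi * df["hour"] / 24)
                 + 1.5 * (18 - df["temp"]).clip(lower=0) + rng.normal(0, 3, len(df)))

train, test = df.iloc[:-24 * 14], df.iloc[-24 * 14:]   # hold out the last two weeks
features = ["hour", "dow", "temp", "wind", "humidity"]
model = GradientBoostingRegressor(random_state=0).fit(train[features], train["load_kw"])
print("MAPE:", mean_absolute_percentage_error(test["load_kw"], model.predict(test[features])))
```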

Keywords: Time-series, features engineering methods for forecasting, energy demand forecasting, Azure machine learning.

158 Influence of Organic Modifier Loading on Particle Dispersion of Biodegradable Polycaprolactone/Montmorillonite Nanocomposites

Authors: O. I. H. Dimitry, N. A. Mansour, A. L. G. Saad

Abstract:

Natural sodium montmorillonite (NaMMT), Cloisite Na+, and two organophilic montmorillonites (OMMTs), Cloisites 20A and 15A, were used. Polycaprolactone (PCL)/MMT composites containing 1, 3, 5, and 10 wt% of Cloisite Na+ and PCL/OMMT nanocomposites containing 5 and 10 wt% of Cloisites 20A and 15A were prepared via a solution intercalation technique to study the influence of organic modifier loading on particle dispersion in PCL/NaMMT composites. The thermal stabilities of the obtained composites were characterized using a thermogravimetric analyzer (TGA), which showed that under nitrogen flow the incorporation of 5 and 10 wt% of filler brings some decrease in PCL thermal stability in the sequence Cloisite Na+ > Cloisite 15A > Cloisite 20A, while under air flow these fillers scarcely influenced the thermoxidative stability of PCL, slightly accelerating the process. The interaction between PCL and the silicate layers was studied by Fourier transform infrared (FTIR) spectroscopy, which confirmed moderate interactions between the nanometric silicate layers and the PCL segments. The electrical conductivity (σ), which describes the ionic mobility of the systems, was studied as a function of temperature; σ of PCL was enhanced by increasing the modifier loading at a filler content of 5 wt%, especially at higher temperatures, in the sequence Cloisite Na+ < Cloisite 20A < Cloisite 15A, and then decreased to some extent with a further increase to 10 wt%. The activation energy Eσ, obtained from the dependence of σ on temperature using the Arrhenius equation, was found to be lowest for the nanocomposite containing 5 wt% of Cloisite 15A. The dispersion of the clay in the PCL matrix was evaluated by X-ray diffraction (XRD) and scanning electron microscopy (SEM) analyses, which revealed partially intercalated structures in PCL/NaMMT composites and semi-intercalated/semi-exfoliated structures in PCL/OMMT nanocomposites containing 5 wt% of Cloisite 20A or Cloisite 15A.
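
The activation energy Eσ mentioned above follows from an Arrhenius fit of conductivity against reciprocal temperature; a short sketch with made-up (T, σ) points:

```python
import numpy as np

# Arrhenius relation: sigma = sigma0 * exp(-E_a / (k_B * T)), so
# ln(sigma) is linear in 1/T with slope -E_a / k_B.
k_B = 8.617e-5                                        # Boltzmann constant, eV/K
T = np.array([300.0, 320.0, 340.0, 360.0])            # temperatures, K (made-up)
sigma = np.array([1.0e-9, 2.3e-9, 4.8e-9, 9.2e-9])    # conductivities, S/cm (made-up)

slope, intercept = np.polyfit(1.0 / T, np.log(sigma), 1)
E_a = -slope * k_B                                    # activation energy in eV
print(f"activation energy ~ {E_a:.3f} eV, sigma0 ~ {np.exp(intercept):.2e} S/cm")
```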

Keywords: Polycaprolactone, organoclay, nanocomposite, montmorillonite, electrical conductivity, activation energy, exfoliation, intercalation.

157 Simulation and Analysis of Passive Parameters of Building in eQuest: A Case Study in Istanbul, Turkey

Authors: Mahdiyeh Zafaranchi

Abstract:

With the rapid development of urbanization and the improvement of living standards worldwide, the energy consumption and carbon emissions of the building sector are expected to increase in the near future; because of that, energy-saving issues have become more important to engineers. The building sector is a major contributor to energy consumption and carbon emissions, and the concept of the efficient building appeared as a response to the need to reduce energy demand in this sector, with the main purpose of shifting from standard buildings to low-energy buildings. Although energy saving should happen at all steps of a building's life cycle (material production, construction, demolition), the main concept of the energy-efficient building is to save energy during the life expectancy of the building by using passive and active systems, without sacrificing comfort and quality. The main aim of this study is to investigate passive strategies (which do not need energy consumption or use renewable energy) to achieve energy-efficient buildings. Energy retrofit measures were explored with the eQuest software using a case study as a base model. The study investigates the influence on the building's energy consumption of major factors such as the thermal transmittance (U-value) of materials, windows, shading devices, thermal insulation, the proportion of exposed envelope, the window-to-wall ratio, and the lighting system. The base model was located in Istanbul, Turkey, and the impact of eight passive parameters on energy consumption was examined. After analyzing the base model in eQuest, a final scenario with good energy performance was suggested. The results showed that decreases in the U-values of materials and windows and in the proportion of exposed envelope had a significant effect on energy consumption. Finally, annual savings of about 10.5% in electricity consumption and about 8.37% in gas consumption were achieved in the suggested model.

Keywords: Efficient building, electric and gas consumption, eQuest, passive parameters.

156 Hydrogen Production at the Forecourt from Off-Peak Electricity and Its Role in Balancing the Grid

Authors: Abdulla Rahil, Rupert Gammon, Neil Brown

Abstract:

The rapid growth of renewable energy sources and their integration into the grid have been motivated by the depletion of fossil fuels and environmental issues. Unfortunately, the grid is unable to cope with the predicted growth of renewable energy, which would lead to instability. To solve this problem, energy storage devices could be used. Electrolytic hydrogen production with an electrolyser is considered a promising option since it is a clean energy source (zero emissions). Flexible operation of the electrolyser (producing hydrogen during off-peak electricity periods and stopping at other times) could bring many benefits, such as reducing the cost of hydrogen and helping to balance the electricity system. This paper investigates the price of hydrogen under flexible operation compared with continuous operation, while serving the customer (a hydrogen filling station) without interruption. An optimization algorithm is applied to investigate the hydrogen station in both cases (flexible and continuous operation). Three different scenarios are tested to see whether the off-peak electricity price could enhance the reduction of the hydrogen cost: a standard single-tier tariff throughout the day (assumed 12 p/kWh) while still satisfying the demand for hydrogen; using off-peak electricity at a lower price (assumed 5 p/kWh) and shutting down the electrolyser at other times; and using lower-price electricity at off-peak times and higher-price electricity at other times. The study looks at Derna city, which is located on the coast of the Mediterranean Sea (32° 46′ 0 N, 22° 38′ 0 E) and has high wind resource potential. Hourly wind speed data collected over 24½ years, from 1990 to 2014, were used in addition to hourly radiation and hourly electricity demand data collected over a one-year period, together with the petrol station data.
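
A back-of-the-envelope sketch of the electricity-cost component of hydrogen under the three tariff scenarios; the specific-energy figure is a typical electrolysis value assumed for illustration, not taken from the paper:

```python
# Electricity cost per kg of hydrogen under different tariff scenarios.
SPECIFIC_ENERGY = 55.0   # kWh of electricity per kg of H2 (typical assumption, not from the paper)

def electricity_cost_per_kg(price_p_per_kwh: float) -> float:
    """Electricity cost in pence per kg of hydrogen at a given tariff."""
    return SPECIFIC_ENERGY * price_p_per_kwh

scenarios = {
    "standard tariff all day (12 p/kWh)": electricity_cost_per_kg(12.0),
    "off-peak only (5 p/kWh), idle otherwise": electricity_cost_per_kg(5.0),
    # mixed tariff: assume 40% of production at off-peak prices, 60% at peak prices
    "mixed off-peak/peak tariff": 0.4 * electricity_cost_per_kg(5.0) + 0.6 * electricity_cost_per_kg(12.0),
}
for name, cost in scenarios.items():
    print(f"{name}: {cost / 100:.2f} GBP per kg H2")
```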

Keywords: Hydrogen filling station, off-peak electricity, renewable energy, electrolytic hydrogen.

155 Multi-Objective Optimization of Gas Turbine Power Cycle

Authors: Mohsen Nikaein

Abstract:

Because of the importance of energy, the optimization of power generation systems is necessary. Gas turbine cycles are suitable for fast power generation, but their efficiency is relatively low. In order to achieve higher efficiencies, measures such as recovery of heat from the exhaust gases in a regenerator, use of an intercooler in a multistage compressor, and steam injection into the combustion chamber are preferred. However, thermodynamic optimization of the gas turbine cycle, even with the above components, is still necessary. In this article, multi-objective genetic algorithms are employed for Pareto-approach optimization of the Regenerative-Intercooling Gas Turbine (RIGT) cycle. In multi-objective optimization, a number of conflicting objective functions are optimized simultaneously. The important objective functions considered for optimization are the entropy generation of the RIGT cycle (Ns), derived using exergy analysis and the Gouy-Stodola theorem, the thermal efficiency, and the net output power of the RIGT cycle; these objectives usually conflict with each other. The design variables consist of thermodynamic parameters such as the compressor pressure ratio (Rp), excess air in combustion (EA), turbine inlet temperature (TIT) and inlet air temperature (T0). In the first stage, single-objective optimization was investigated, and the Non-dominated Sorting Genetic Algorithm (NSGA-II) was then used for multi-objective optimization. Optimization procedures are performed for two and three objective functions, and the results are compared for the RIGT cycle. In order to investigate the optimal thermodynamic behavior of two objectives, different sets, each including two of the output parameters as objectives, are considered individually, and the Pareto front is depicted for each set. The sets of decision variables selected on this Pareto front yield the best possible combinations of the corresponding objective functions; no point on the Pareto front is superior to another, but all are superior to any other point. In the case of three-objective optimization, the results are given in tables.
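
For orientation, a cold-air-standard sketch of the ideal regenerative Brayton cycle underlying the RIGT analysis (the paper's model additionally includes intercooling, irreversibilities and the entropy generation terms):

```python
# Ideal regenerative Brayton cycle, cold-air-standard assumptions.
GAMMA = 1.4  # ratio of specific heats for air

def regenerative_brayton_efficiency(pressure_ratio: float, T0: float, TIT: float) -> float:
    """Thermal efficiency of an ideal Brayton cycle with a perfect regenerator:
    eta = 1 - (T0 / TIT) * r_p^((gamma - 1) / gamma)."""
    return 1.0 - (T0 / TIT) * pressure_ratio ** ((GAMMA - 1.0) / GAMMA)

# Example design variables of the kind optimized in the paper (values are illustrative)
for rp in (6.0, 10.0, 14.0):
    eta = regenerative_brayton_efficiency(rp, T0=300.0, TIT=1400.0)
    print(f"r_p = {rp:4.1f}: eta = {eta:.3f}")
```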

Keywords: Exergy, Entropy Generation, Brayton Cycle, Design Parameters, Optimization, Genetic Algorithm, Multi-Objective.

154 Environmental Accounting Practice: Analyzing the Extent and Qualification of Environmental Disclosures of Turkish Companies Located in BIST-XKURY Index

Authors: Raif Parlakkaya, Mustafa Nihat Demirci, Mehmet Nuri Salur

Abstract:

Environmental pollution has detrimental effects on the quality of our lives, and its scope has reached such an extent that measures are being taken at both the national and international levels to reduce, prevent and mitigate its impact on the social, economic and political spheres. Awareness of environmental problems has therefore been increasing among stakeholders and, accordingly, among companies, and corporate reporting is expanding to cover environmental performance. The primary purpose of publishing an environmental report is to provide specific audiences with useful, meaningful information. This paper analyzes the extent and qualification of the environmental disclosures of Turkish publicly quoted firms and examines how they vary from one sector to another. The data for the study were collected from the annual activity reports of companies listed on the corporate governance index (BIST-XKURY) of the Istanbul Stock Exchange. Content analysis was the research methodology used to measure the extent of environmental disclosure. Accordingly, the 2015 annual activity reports of companies operating in particular fields were acquired from the Capital Markets Board, the Public Disclosure Platform website and the companies' own websites. These reports were categorized into five main aspects: environmental policies, environmental management systems, environmental protection and conservation activities, environmental awareness, and information on environmental lawsuits. Subsequently, each component was divided into several variables related to what each firm is supposed to disclose about environmental information. In this context, the nature and scope of the information disclosed on each item were assessed in five categories (N.I.: No Information; G.E.: General Explanations; Q.E.: Qualitative Detailed Explanations; N.E.: Quantitative (Numerical) Detailed Explanations; Q.&N.E.: Both Qualitative and Quantitative Explanations).

Keywords: Environmental accounting, disclosure, corporate governance, content analysis.

153 Effect of Impact Angle on Erosive Abrasive Wear of Ductile and Brittle Materials

Authors: Ergin Kosa, Ali Göksenli

Abstract:

Erosion and abrasion are wear mechanisms that reduce the lifetime of machine elements such as valves, pumps and pipe systems. Both wear mechanisms act at the same time, causing a "synergy" effect which leads to rapid damage of the surface. Different parameters affect the erosive-abrasive wear rate. In this study, the effect of the particle impact angle on the wear rate and wear mechanism of ductile and brittle materials was investigated. A new slurry pot was designed for the experimental investigation. Silica sand with a particle size ranging between 200 and 500 μm was used as the abrasive. All tests were carried out in a sand-water mixture of 20% concentration for four hours, with a particle impact velocity of 4.76 m/s. Steel St 37 with a Vickers hardness number (VHN) of 245 was used as the ductile material and quenched St 37 with a VHN of 510 as the brittle material. After the wear tests, the morphology of the eroded surfaces was investigated using a scanning electron microscope for a better understanding of the wear mechanisms acting at different impact angles. The results indicated that the wear rate of the ductile material was higher than that of the brittle material. The maximum wear rate of the ductile material was observed at a particle impact angle of 30° and decreased with a further increase in attack angle, whereas the maximum wear rate of the brittle material occurred at an impact angle of 45° and decreased up to 90°. Ploughing was the dominant wear mechanism for the ductile material, and microcracks, which are nucleation centers for crater formation, were detected on its surface; the number of craters decreased and their depth increased at attack angles higher than 30°. A deformation wear mechanism was observed for the brittle material, and the number and depth of pits decreased at impact angles higher than 45°. It is concluded that the wear rate cannot be related directly to the particle impact angle, due to the different responses of ductile and brittle materials.

Keywords: Erosive wear, particle impact angle, silica sand, wear rate, ductile-brittle material.

152 A Probabilistic Reinforcement-Based Approach to Conceptualization

Authors: Hadi Firouzi, Majid Nili Ahmadabadi, Babak N. Araabi

Abstract:

Conceptualization strengthens intelligent systems in generalization, effective knowledge representation, real-time inference and the handling of uncertain and indefinite situations, in addition to facilitating knowledge communication for learning agents situated in the real world. Concept learning introduces a form of abstraction by which the continuous state space is organized into entities called concepts; these concepts are connected to the action space and thus provide a compact view of a complex action space. Among computational concept-learning approaches, action-based conceptualization is favored because of its simplicity and its mirror-neuron foundations in neuroscience. In this paper, a new biologically inspired concept-learning approach based on a probabilistic framework is proposed. This approach exploits and extends the mirror neuron's role in conceptualization for a reinforcement learning agent in nondeterministic environments. In the proposed method, instead of building a huge numerical knowledge base, the concepts are learned gradually from rewards through interaction with the environment. Moreover, the probabilistic formation of the concepts is employed to cope with the uncertain and dynamic nature of real problems and to support generalization. Taken together, these characteristics distinguish the proposed learning algorithm from both a pure classification algorithm and typical reinforcement learning. Simulation results show the advantages of the proposed framework in terms of convergence speed, generalization and asymptotic behavior, because both successful and failed attempts are exploited through the received rewards. Experimental results, in turn, show the applicability and effectiveness of the proposed method in continuous and noisy environments for a real robotic task such as maze navigation, as well as the benefits of implementing an incremental learning scenario in artificial agents.
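The abstract does not spell out the update rules, so the sketch below is only a hedged illustration of the general idea: concepts as prototypes over a continuous state space, a probabilistic (soft) assignment of states to concepts, and reward-weighted updates of per-concept action preferences. All class, parameter and function names are assumptions rather than the authors' algorithm.

```python
# Illustrative sketch only: prototypes, soft membership and the reward-weighted
# preference update below are assumptions standing in for the paper's method.
import numpy as np

class ProbabilisticConceptLearner:
    def __init__(self, n_concepts, state_dim, n_actions, lr=0.1, temp=0.5):
        rng = np.random.default_rng(0)
        self.prototypes = rng.uniform(0, 1, size=(n_concepts, state_dim))  # concept centers
        self.action_pref = np.zeros((n_concepts, n_actions))               # reward-driven preferences
        self.lr, self.temp = lr, temp

    def _membership(self, state):
        # Probabilistic (soft) assignment of the continuous state to concepts.
        d = np.linalg.norm(self.prototypes - state, axis=1)
        w = np.exp(-d / self.temp)
        return w / w.sum()

    def act(self, state):
        # Mix per-concept action preferences according to concept membership.
        m = self._membership(state)
        prefs = m @ self.action_pref
        p = np.exp(prefs - prefs.max())
        p /= p.sum()
        return np.random.choice(len(p), p=p), m

    def update(self, state, action, reward, membership):
        # Both successes and failures shift preferences, weighted by membership.
        self.action_pref[:, action] += self.lr * reward * membership
        # Move the most responsible prototype slightly toward the visited state.
        k = membership.argmax()
        self.prototypes[k] += self.lr * membership[k] * (state - self.prototypes[k])

agent = ProbabilisticConceptLearner(n_concepts=4, state_dim=2, n_actions=3)
state = np.array([0.2, 0.7])
action, m = agent.act(state)
agent.update(state, action, reward=1.0, membership=m)
```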

Keywords: Concept learning, probabilistic decision making, reinforcement learning.

151 Towards an Enhanced Quality of IPTV Media Server Architecture over Software Defined Networking

Authors: Esmeralda Hysenbelliu

Abstract:

The aim of this paper is to present an enhanced QoE (Quality of Experience) architecture for an SDN-based IPTV media streaming server, for configuring, controlling, managing and provisioning improved delivery of the IPTV service application at low cost, with low bandwidth and high security. Furthermore, a virtual QoE IPTV SDN-based topology is presented to provide an improved IPTV service based on QoE control and management of multimedia service functionalities. Inside the OpenFlow SDN controller, two flexible and efficient service load-balancing systems are enabled: one based on a load-balance module and one based on a GeoIP service. These two load-balancing systems greatly improve the IPTV end-users' Quality of Experience (QoE) through optimal management of resources. Through the key functionalities of the OpenFlow SDN controller, this approach yields several important features and meets the critical QoE metrics for the IPTV service, such as achieving a fast zapping time (channel switching time) of less than 0.1 seconds. The approach also enables a simple and powerful transcoding system via the FFmpeg encoder, with the ability to customize stream dimensions, bitrates, latency management and maximum transfer rates, ensuring the delivery of IPTV streaming services (audio and video) with high flexibility, low bandwidth and the required performance. Unlike other architectures, this QoE IPTV SDN-based media streaming architecture provides the possibility of channel exchange between several IPTV service providers all over the world. This new functionality brings many benefits, such as increasing the number of TV channels received by end-users at low cost, decreasing the stream failure time (channel failure time < 0.1 seconds) and improving the quality of the streaming services.
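As a hedged illustration of the GeoIP-based load balancing described above, the sketch below picks a streaming server by the client's region and current load; the region lookup, the server names and the "in-region, least-loaded" rule are assumptions, not the paper's actual controller logic.

```python
# Illustrative sketch only: the selection policy, region lookup and server
# inventory below are assumptions standing in for the SDN controller's modules.
from dataclasses import dataclass

@dataclass
class StreamServer:
    name: str
    region: str           # geographic region served
    active_sessions: int  # current load

def lookup_region(client_ip: str) -> str:
    # Stand-in for a real GeoIP service lookup.
    return "EU" if client_ip.startswith("10.") else "US"

def pick_server(client_ip: str, servers: list[StreamServer]) -> StreamServer:
    region = lookup_region(client_ip)
    # Prefer servers in the client's region; fall back to the global pool.
    local = [s for s in servers if s.region == region] or servers
    # Least-loaded server wins: one simple load-balance rule.
    return min(local, key=lambda s: s.active_sessions)

servers = [StreamServer("edge-eu-1", "EU", 120),
           StreamServer("edge-eu-2", "EU", 80),
           StreamServer("edge-us-1", "US", 30)]
print(pick_server("10.0.0.7", servers).name)  # -> edge-eu-2
```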

Keywords: Improved QoE, OpenFlow SDN controller, IPTV service application, softwarization.

150 Information Security Risk Management in IT-Based Process Virtualization: A Methodological Design Based on Action Research

Authors: Jefferson Camacho Mejía, Jenny Paola Forero Pachón, Luis Carlos Gómez Flórez

Abstract:

Action research is a qualitative research methodology that leads the researcher to delve into the problems of a community in order to understand its needs in depth and, finally, to propose actions that lead to a change of social paradigm. Although this methodology had its beginnings in the human sciences, it has attracted increasing interest and acceptance in the field of information systems research since the 1990s. The countless possibilities offered nowadays by the use of Information Technologies (IT) in the development of different socio-economic activities have brought about a change of social paradigm and the emergence of the so-called information and knowledge society. Accordingly, governments, large corporations, small entrepreneurs and, in general, organizations of all kinds are using IT to virtualize their processes, taking them from the physical environment to the digital environment. However, there is a potential risk for organizations related to exposing valuable information without an appropriate framework for protecting it. This paper shows progress in the development of a methodological design to manage the information security risks associated with IT-based process virtualization by applying the principles of the action research methodology; the design is the result of a systematic review of the scientific literature. It consists of seven fundamental stages, distributed across the three stages described in the action research methodology: 1) observe, 2) analyze and 3) take actions. Finally, this paper aims to offer an alternative to traditional information security management methodologies, to be applied specifically in the planning stage of IT-based process virtualization in order to foresee risks and to establish security controls before formulating IT solutions in any type of organization.
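The abstract does not enumerate the seven stages, so the sketch below only illustrates the kind of planning-stage artifact such a design might produce: a small risk register in which each information asset exposed by process virtualization receives a likelihood-times-impact score before controls are formulated. The field names, scales and scoring rule are assumptions, not the authors' method.

```python
# Illustrative sketch only: a generic risk register (fields, 1-5 scales and
# likelihood x impact scoring) assumed for demonstration purposes.
from dataclasses import dataclass

@dataclass
class Risk:
    asset: str        # information asset exposed by process virtualization
    threat: str       # what could go wrong
    likelihood: int   # 1 (rare) .. 5 (almost certain)
    impact: int       # 1 (negligible) .. 5 (severe)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("customer records service", "unauthorized access", 3, 5),
    Risk("digitized invoice archive", "data loss", 2, 4),
]

# Rank risks so that security controls can be planned for the worst ones first.
for r in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{r.asset}: {r.threat} -> score {r.score}")
```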

Keywords: Action research, information security, information technology, methodological design, process virtualization, risk management.

149 The Genesis of the Anomalous Sernio Fan, Valtellina, Northern Italy

Authors: E. De Finis, P. Gattinoni, L. Scesi

Abstract:

Massive rock avalanches have formed some of the largest landslide deposits on Earth, and they represent one of the major geohazards in high-relief mountains. This paper interprets a very large sedimentary fan (the Sernio fan, Valtellina, Northern Italy), located 20 km SW of the Val Pola rock avalanche (1987), as the deposit of a partial collapse of a deep-seated gravitational slope deformation (DSGSD), afterwards eroded and buried by debris flows. The proposed emplacement sequence has been reconstructed from geomorphological, structural and mechanical evidence. The Sernio fan is considered anomalous because of the very high ratio (≈ 1.5) between the fan area (≈ 4.5 km²) and the basin area (≈ 3 km²). The morphology of the fan area is characterised by steep slopes (dip ≈ 20%), and the fan apex extends 1.8 km inside the small catchment basin. The fan originated from a landslide that affected part of a large deep-seated gravitational slope deformation involving a wide area of about 55 km². The main controlling factor is tectonic and is related to the proximity of regional fault systems and the consequent occurrence of weak fault rocks (GSI locally lower than 10, with compressive strength lower than 20 MPa). Moreover, the fan deposit shows sedimentary evidence of recent debris flow events. The best current explanation of the Sernio fan involves an initial failure of some hundreds of Mm³. The run-out was quite limited because of the morphology of Valtellina's valley floor, and the deposit filled the main valley, forming a landslide dam, as confirmed by the lacustrine deposits detected upstream of the fan. Nowadays, debris flow events represent the main hazard in the study area.

Keywords: Anomalous sedimentary fans, debris flow, deep seated gravitational slope deformation, Italy, rock avalanche.
