Search results for: event methodology
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6347


5777 Study of the Effect of the Continuous Electric Field on the RD Cancer Cell Line by Response Surface Methodology

Authors: Radia Chemlal, Salim Mehenni, Dahbia Leila Anes-boulahbal, Mohamed Kherat, Nabil Mameri

Abstract:

The application of an electric field is considered a very promising method in cancer therapy. Cancer cells are highly sensitive to electric fields, although the cellular response is not entirely understood. The tests carried out consisted of subjecting the RD cell line to a continuous electric field while varying selected parameters (voltage, exposure time, and cell concentration). Response surface methodology (RSM) was used to assess the effect of the chosen parameters, as well as the existence of interactions between them. The results showed that voltage, cell concentration, and the interaction between voltage and exposure time all influence the mortality rate of the RD cell line.
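As a purely illustrative sketch of the response-surface idea in this abstract, the snippet below fits a first-order model with two-factor interactions to a hypothetical three-factor design; all factor levels and mortality figures are invented for illustration and are not taken from the study.

```python
import numpy as np

# Hypothetical 2^3 factorial data: voltage (V), exposure time (min),
# cell concentration (10^5 cells/mL) -> mortality rate (%).
X_raw = np.array([
    [10, 5, 1], [10, 5, 3], [10, 15, 1], [10, 15, 3],
    [30, 5, 1], [30, 5, 3], [30, 15, 1], [30, 15, 3],
])
y = np.array([12.0, 18.0, 15.0, 22.0, 35.0, 48.0, 52.0, 70.0])

def design_matrix(X):
    """First-order RSM model with all two-factor interaction terms."""
    v, t, c = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), v, t, c, v * t, v * c, t * c])

# Least-squares fit of the response surface coefficients.
beta, *_ = np.linalg.lstsq(design_matrix(X_raw), y, rcond=None)

def predict(X):
    return design_matrix(np.atleast_2d(X)) @ beta

# Effect screening: coefficients far from zero suggest influential terms.
terms = ["intercept", "V", "t", "c", "V*t", "V*c", "t*c"]
for name, b in zip(terms, beta):
    print(f"{name:9s} {b:+.4f}")
```

In a real RSM study the fitted coefficients would be tested for significance (e.g., via ANOVA) before concluding which factors and interactions matter.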

Keywords: continuous electric field, RD cancer cell line, RSM, voltage

Procedia PDF Downloads 102
5776 ESW01: A Methodology for Approaching the Design of Interior Spaces

Authors: Eirini Krasaki

Abstract:

This paper addresses the problem of designing spaces in a constantly changing environment. Space is considered as a totality of forces that coexist in the same place. Forces form the identity of space and characterize the entities that coexist within the same totality. Interior space is considered as a totality of forces which develop within an envelope. This research focuses on the formation of the tripole space-forces-totality and studies the relation of this tripole to the interior space. The point of departure for this investigation is the historic center of Athens, a city center where the majority of the building mass is unused. The objective of the study is to connect the development of interior spaces to the alterations of the conceptions that form the built environment. The research focuses on Evripidou street, an axis around which both commercial and residential centers expand. Along Evripidou street, three case studies are elaborated: a) In case study 01, Evripidou street is examined as a megastructure in which totalities of interior spaces develop. b) In case study 02, a particular group of entities (polykatoikia) that extends along Evripidou street is investigated. c) In case study 03, a particular group of entities (apartment) that derives from a specific envelope is investigated. Through the studies and comparisons of different scales, a design methodology that addresses the design of interior space in relation to the dynamics of the built environment is developed.

Keywords: methodology, research by design, interior, envelope, dynamics

Procedia PDF Downloads 170
5775 Smart Water Main Inspection and Condition Assessment Using a Systematic Approach for Pipes Selection

Authors: Reza Moslemi, Sebastien Perrier

Abstract:

Water infrastructure deterioration can result in increased operational costs owing to increased repair needs and non-revenue water, and consequently cause a reduced level of service and lower customer satisfaction. Various water main condition assessment technologies have been introduced to the market in order to evaluate the level of pipe deterioration and to develop appropriate asset management and pipe renewal plans. One of the challenges for any condition assessment and inspection program is to determine the percentage of the water network, and the combination of pipe segments, to be inspected in order to obtain a meaningful representation of the status of the entire network with a desirable level of accuracy. Traditionally, condition assessment has been conducted by selecting pipes based on age or location. However, this is not necessarily the best approach, and it is believed that a smart sampling methodology can achieve a better and more reliable estimate of the condition of a water network. This research investigates three sampling methodologies: random, stratified, and systematic. It is demonstrated that selecting pipes based on the proposed clustering and sampling scheme can considerably improve the ability of the inspected subset to represent the condition of the wider network. With a smart sampling methodology, a smaller data sample can provide the same insight as a larger one. This methodology offers increased efficiency and cost savings for condition assessment processes and projects.
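The three sampling schemes named in the abstract (random, stratified, systematic) can be sketched as follows; the pipe inventory, strata, and sample sizes below are invented for illustration and are not the paper's data or clustering scheme.

```python
import random

# Hypothetical pipe inventory: (pipe_id, stratum), where strata would in
# practice come from clustering on age, material, diameter, soil type, etc.
random.seed(42)
strata = {"cast_iron_pre1960": 500, "cast_iron_post1960": 300, "PVC": 200}
pipes = [(f"{s}-{i}", s) for s, n in strata.items() for i in range(n)]

def random_sample(pipes, n):
    """Simple random sampling without replacement."""
    return random.sample(pipes, n)

def systematic_sample(pipes, n):
    """Every k-th pipe from a random start."""
    step = len(pipes) // n
    start = random.randrange(step)
    return pipes[start::step][:n]

def stratified_sample(pipes, n):
    """Proportional allocation: each stratum contributes its share."""
    total = len(pipes)
    by_stratum = {}
    for p in pipes:
        by_stratum.setdefault(p[1], []).append(p)
    sample = []
    for s, members in by_stratum.items():
        k = round(n * len(members) / total)
        sample.extend(random.sample(members, k))
    return sample

sample = stratified_sample(pipes, 100)
counts = {s: sum(1 for _, st in sample if st == s) for s in strata}
print(counts)  # proportional to stratum sizes, e.g. 50/30/20
```

Stratified selection guarantees that every pipe cluster is represented in the inspected subset, which is the intuition behind the improved network-level estimates the abstract reports.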

Keywords: condition assessment, pipe degradation, sampling, water main

Procedia PDF Downloads 146
5774 Flow Field Analysis of a Liquid Ejector Pump Using Embedded Large Eddy Simulation Methodology

Authors: Qasim Zaheer, Jehanzeb Masud

Abstract:

Understanding the entrainment and mixing phenomena in an ejector pump is of pivotal importance for design and performance estimation. In this paper, the turbulent vortical structures arising from the Kelvin-Helmholtz instability at the free surface between the motive and entrained fluid streams are simulated using the embedded LES methodology. The efficacy of embedded LES for simulating the complex flow field of an ejector pump is evaluated using ANSYS Fluent®. The enhanced mixing and entrainment process due to the breakdown of larger eddies into smaller ones, a consequence of the vortex stretching phenomenon, is captured in this study. Moreover, flow field characteristics of the ejector pump, such as pressure and velocity fields and mass flow rates, are analyzed and validated against experimental results.

Keywords: Kelvin-Helmholtz instability, embedded LES, complex flow field, ejector pump

Procedia PDF Downloads 288
5773 Reinventing Education Systems: Towards an Approach Based on Universal Values and Digital Technologies

Authors: Ilyes Athimni, Mouna Bouzazi, Mongi Boulehmi, Ahmed Ferchichi

Abstract:

The principles of good governance, universal values, and digitization are among the tools used to fight corruption and improve the quality of service delivery. In recent years, these tools have become one of the most debated topics in the field of education and a concern of many international organizations and institutions confronting the problem of corruption. Corruption in the education sector, particularly in higher education, has negative impacts on the quality of education systems and on the quality of administrative and educational services. Currently, the health crisis caused by the spread of the COVID-19 pandemic has revealed the difficulties encountered by education systems in most countries of the world: due to the poor governance of these systems, many educational institutions were unable to continue working remotely. To respond to these problems, our initiative is to propose a methodology for reinventing education systems based on universal values and digital technologies. This methodology includes a work strategy for educational institutions, covering both the provision of administrative services and the teaching method, based on information and communication technologies (ICTs), artificial intelligence, and intelligent agents. In addition, we propose a supervisory law that will be implemented and monitored by intelligent agents to improve accountability and transparency in educational institutions. We will also implement and evaluate a field experiment by applying the proposed methodology to the operation of an educational institution and comparing it to the traditional methodology through the results of teaching an educational program. With these specifications, we can reinvent quality education systems. We also expect the results of our proposal to play an important role at the local, regional, and international levels in motivating governments around the world to change their university governance policies.

Keywords: artificial intelligence, corruption in education, distance learning, education systems, ICTs, intelligent agents, good governance

Procedia PDF Downloads 205
5772 The Interaction of Job Involvement and Organizational Citizenship Behavior on Well-Being

Authors: Yu-Chen Wei

Abstract:

This study integrated the need fulfillment theory and affective event theory to investigate the effects of the interaction of job involvement and organizational citizenship behavior (OCB) on well-being. Data from 196 paired samples of employees and their supervisors in one supplementary school in Taiwan were analyzed. This study found that while neither job involvement nor OCB directly affects well-being, the interaction of job involvement and OCB can predict well-being. The findings of this study suggest that management can assist employees in improving their well-being by balancing job involvement and OCB.

Keywords: job involvement, organizational citizenship behavior, well-being, need fulfillment

Procedia PDF Downloads 85
5771 Urban Growth Analysis Using Multi-Temporal Satellite Images, Non-stationary Decomposition Methods and Stochastic Modeling

Authors: Ali Ben Abbes, Imed Riadh Farah, Vincent Barra

Abstract:

Remotely sensed data are a significant source for monitoring and updating land use/cover databases. Change detection in urban areas has become a subject of intensive research, and timely, accurate data on the spatio-temporal changes of urban areas are therefore required. The data extracted from multi-temporal satellite images are usually non-stationary; in fact, the changes evolve in both time and space. This paper proposes a methodology for change detection in urban areas that combines a non-stationary decomposition method with stochastic modeling. The input of the methodology is a sequence of satellite images I1, I2, ..., In acquired at different periods (t = 1, 2, ..., n). First, a preprocessing of the multi-temporal satellite images is applied (e.g., radiometric, atmospheric, and geometric corrections). The systematic study of global urban expansion in our methodology can be approached in two ways. The first considers the urban area as a single object, as opposed to non-urban areas (e.g., vegetation, bare soil, and water); the objective is to extract the urban mask. The second aims to obtain more detailed knowledge of the urban area by distinguishing different types of tissue within it. To validate our approach, we used a database of Tres Cantos-Madrid in Spain, derived from Landsat imagery over the period from January 2004 to July 2013 by collecting two frames per year at a spatial resolution of 25 meters. The results obtained show the effectiveness of our method.

Keywords: multi-temporal satellite image, urban growth, non-stationary, stochastic model

Procedia PDF Downloads 423
5770 Navigating Government Finance Statistics: Effortless Retrieval and Comparative Analysis through Data Science and Machine Learning

Authors: Kwaku Damoah

Abstract:

This paper presents a methodology and software application (app) designed to empower users in accessing, retrieving, and comparatively exploring data within the hierarchical network framework of the Government Finance Statistics (GFS) system. It explores the ease of navigating the GFS system and identifies the gaps filled by the new methodology and app. The GFS embodies a complex hierarchical network classification (HNC) structure encapsulating institutional units, revenues, expenses, assets, liabilities, and economic activities. Navigating this structure demands specialized knowledge, experience, and skill, posing a significant challenge for effective analytics and fiscal policy decision-making. Many professionals encounter difficulties deciphering these classifications, hindering confident use of the system. This accessibility barrier prevents a vast number of professionals, students, policymakers, and members of the public from leveraging the abundant data and information within the GFS. Leveraging the R programming language, data science analytics, and machine learning, an efficient methodology was developed that enables users to access, navigate, and conduct exploratory comparisons. The machine learning Fiscal Analytics App (FLOWZZ) democratizes access to advanced analytics through its user-friendly interface, breaking down expertise barriers.
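The navigation problem the abstract describes, following a hierarchical code down through nested classification levels, can be sketched minimally as below. The codes and labels are illustrative stand-ins, not an authoritative extract of the GFS classification, and the authors' app is built in R; this Python fragment only mirrors the drilldown idea.

```python
# Toy hierarchical classification tree, loosely modelled on GFS-style
# revenue codes (codes and labels here are illustrative only).
GFS_TREE = {
    "1": {"label": "Revenue", "children": {
        "11": {"label": "Taxes", "children": {
            "111": {"label": "Taxes on income, profits, and capital gains",
                    "children": {}},
            "113": {"label": "Taxes on property", "children": {}},
        }},
        "13": {"label": "Grants", "children": {}},
    }},
}

def drilldown(tree, code):
    """Follow a hierarchical code one digit at a time, returning the
    (code, label) path from the root down to the requested node."""
    node, path = {"children": tree}, []
    for depth in range(1, len(code) + 1):
        prefix = code[:depth]
        if prefix in node["children"]:
            node = node["children"][prefix]
            path.append((prefix, node["label"]))
    return path

print(drilldown(GFS_TREE, "111"))
```

A drilldown like this is what lets a non-specialist reach a line item without memorizing the classification: each digit narrows the scope, and the returned path documents the full context of the node.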

Keywords: data science, data wrangling, drilldown analytics, government finance statistics, hierarchical network classification, machine learning, web application

Procedia PDF Downloads 60
5769 Predicting the Turbulence Intensity, Excess Energy Available and Potential Power Generated by Building Mounted Wind Turbines over Four Major UK Cities

Authors: Emejeamara Francis

Abstract:

The future of potential wind energy applications within suburban/urban areas currently faces various problems. These include insufficient assessment of the urban wind resource and of the effectiveness of commercial gust control solutions, as well as the unavailability of effective and affordable tools for scoping the potential of urban wind applications within built-up environments. Effective assessment of potential urban wind installations requires an estimate of the total energy that would be available to them were effective control systems used, and an evaluation of the potential power generated by the wind system. This paper presents a methodology for predicting the power generated by a wind system operating within an urban wind resource. The method was developed by using high-temporal-resolution wind measurements from eight potential sites within urban and suburban environments as inputs to a vertical axis wind turbine multiple stream tube model. A relationship between the unsteady performance coefficient obtained from the stream tube model results and turbulence intensity is demonstrated. Hence, an analytical methodology for estimating the unsteady power coefficient at a potential turbine site is proposed. This is combined with analytical models developed to predict the wind speed and the excess energy content (EEC) available, in order to estimate the potential power generated by wind systems at different heights within a built environment. Estimates of turbulence intensity, wind speed, EEC, and turbine performance based on the current methodology allow a more complete assessment of the available wind resource and of potential urban wind projects. This methodology is applied to four major UK cities, namely Leeds, Manchester, London, and Edinburgh, and the potential to map turbine performance at different heights within a typical urban city is demonstrated.
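Turbulence intensity and excess energy content, the two resource descriptors used here, can be computed from a wind speed record as sketched below. The EEC formula used is one common formulation (the extra kinetic energy flux carried by fluctuations relative to the steady mean wind); the paper's exact definition may differ, and the wind record is synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic 1 Hz wind speed record (m/s); real urban records are gustier
# and non-Gaussian, this is purely for illustrating the two statistics.
u = 5.0 + 1.5 * rng.standard_normal(3600)
u = np.clip(u, 0.1, None)  # wind speed cannot be negative

def turbulence_intensity(u):
    """TI = sigma_u / U_mean over the averaging period."""
    return np.std(u) / np.mean(u)

def excess_energy_content(u):
    """Fractional excess of kinetic energy flux over that of the steady
    mean wind: (<u^3> - <u>^3) / <u>^3. One common formulation; the
    paper's exact definition of EEC may differ."""
    return (np.mean(u**3) - np.mean(u)**3) / np.mean(u)**3

print(f"TI  = {turbulence_intensity(u):.3f}")
print(f"EEC = {excess_energy_content(u):.3f}")
```

Because the energy flux scales with the cube of wind speed, even moderate turbulence produces a substantial EEC, which is why gust-tracking control can recover meaningful extra energy in urban flows.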

Keywords: small-scale wind, turbine power, urban wind energy, turbulence intensity, excess energy content

Procedia PDF Downloads 270
5768 Mapping the Turbulence Intensity and Excess Energy Available to Small Wind Systems over 4 Major UK Cities

Authors: Francis C. Emejeamara, Alison S. Tomlin, James Gooding

Abstract:

Due to the highly turbulent nature of urban air flows, and because turbines are likely to be located within the roughness sublayer of the urban boundary layer, proposed urban wind installations face major challenges compared to rural installations. The challenge of operating within turbulent winds can, however, be counteracted by the development of suitable gust tracking solutions. In order to assess the cost effectiveness of such controls, a detailed understanding of the urban wind resource, including its turbulent characteristics, is required. Estimating the ambient turbulence and total kinetic energy available at different control response times is essential in evaluating the potential performance of wind systems within the urban environment, should effective control solutions be employed. However, high-resolution wind measurements within the urban roughness sublayer are uncommon, and detailed CFD modelling approaches are too computationally expensive to apply routinely on a city-wide scale. This paper therefore presents an alternative semi-empirical methodology for estimating the excess energy content (EEC) present in the complex and gusty urban wind. An analytical methodology for predicting the total wind energy available at a potential turbine site is proposed by assessing the relationship between turbulence intensities and EEC for different control response times. The semi-empirical model is then combined with an analytical methodology initially developed to predict mean wind speeds at various heights within the built environment, based on detailed mapping of its aerodynamic characteristics. Under the current methodology, the additional estimates of turbulence intensity and EEC allow a more complete assessment of the available wind resource. 
The methodology is applied to four UK cities, with results showing the potential of mapping turbulence intensities and the total wind energy available at different heights within each city. Considering the effect of ambient turbulence and the choice of wind system, the wind resource over neighbourhood regions (at 250 m uniform resolution) and building rooftops within the four cities was assessed, with results highlighting the promise of mapping potential turbine sites within each city.

Keywords: excess energy content, small-scale wind, turbulence intensity, urban wind energy, wind resource assessment

Procedia PDF Downloads 469
5767 Spatial Interpolation Technique for the Optimisation of Geometric Programming Problems

Authors: Debjani Chakraborty, Abhijit Chatterjee, Aishwaryaprajna

Abstract:

Posynomials, a special class of polynomial-like functions that can have singularities, pose difficulties when solving geometric programming problems. In this paper, a methodology is proposed for obtaining extreme values of geometric programming problems by an nth-degree polynomial interpolation technique. The main idea in optimising the posynomial is to fit a best-approximating polynomial that has continuous gradient values throughout the range of the function. The approximating polynomial is smoothed to remove the discontinuities present in the feasible region and the objective function. This spatial interpolation method is capable of optimising univariate and multivariate geometric programming problems. An example of a bivariate nonlinear geometric programming problem is solved to demonstrate the robustness of the methodology. The method is also applicable to signomial programming problems.
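A minimal univariate sketch of the interpolate-then-optimise idea follows. The objective, sample grid, and polynomial degree are invented for illustration; the paper's actual smoothing and multivariate treatment may differ.

```python
import numpy as np

# Illustrative posynomial-type objective with a singularity at x = 0:
# f(x) = x + 4/x, whose true minimum on x > 0 is at x = 2 (f(2) = 4).
f = lambda x: x + 4.0 / x

# Sample away from the singularity and fit a smooth polynomial surrogate.
xs = np.linspace(0.5, 4.0, 15)
p = np.poly1d(np.polyfit(xs, f(xs), deg=8))

# The surrogate has a continuous gradient everywhere, so its stationary
# points are simply the real roots of p' inside the sampled range.
crit = [r.real for r in p.deriv().roots
        if abs(r.imag) < 1e-9 and 0.5 <= r.real <= 4.0]
x_star = min(crit, key=p)
print(x_star, p(x_star))  # close to the true optimum (2, 4)
```

Replacing the singular objective with a smooth interpolant is what makes gradient-based stationarity conditions usable over the whole feasible region.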

Keywords: geometric programming problem, multivariate optimisation technique, posynomial, spatial interpolation

Procedia PDF Downloads 359
5766 Applying Serious Game Design Frameworks to Existing Games for Integration of Custom Learning Objectives

Authors: Jonathan D. Moore, Mark G. Reith, David S. Long

Abstract:

Serious games (SGs) have been shown to be an effective teaching tool in many contexts. Because of the success of SGs, several design frameworks have been created to expedite the process of making original serious games that teach specific learning objectives (LOs). Even with these frameworks, the time required to create a custom SG from conception to implementation can range from months to years. Furthermore, it is even more difficult to design a game framework that allows an instructor to create customized game variants supporting multiple LOs within the same field. This paper proposes a refactoring methodology for applying the theoretical principles of well-established design frameworks to a pre-existing serious game. The expected result is a generalized game that can be quickly customized to teach LOs not originally targeted by the game. The methodology begins by describing the general components of a game, then uses a combination of two SG design frameworks to extract the teaching elements present in the game. The identified teaching elements are then used as the theoretical basis for determining the range of LOs that can be taught by the game. The paper evaluates the proposed methodology by presenting a case study of refactoring the serious game Battlespace Next (BSN) to teach joint military capabilities. The range of LOs that can be taught by the generalized BSN is identified, and examples of creating custom LOs are given. Survey results from users of the generalized game are also provided. Lastly, the expected impact of this work is discussed, and a road map for future work and evaluation is presented.

Keywords: serious games, learning objectives, game design, learning theory, game framework

Procedia PDF Downloads 109
5765 Remote Training with Self-Assessment in Electrical Engineering

Authors: Zoja Raud, Valery Vodovozov

Abstract:

The paper focuses on the organisation of a distance laboratory for training electrical engineering staff and students in the fields of electrical drives and power electronics. To support online knowledge acquisition and professional enhancement, the authors have taken up new challenges in remote education based on an active learning approach with self-assessment. Following a literature review and an explanation of the improved assessment methodology, the concept and technological basis of the lab arrangement are presented. To decrease the gap between distance study of up-to-date equipment and other educational activities in electrical engineering, improvements in following up the learners' progress and composing feedback are introduced. An authoring methodology that helps to personalise knowledge acquisition and enlarge Web-based possibilities is described. Educational management based on self-assessment is discussed.

Keywords: advanced training, active learning, distance learning, electrical engineering, remote laboratory, self-assessment

Procedia PDF Downloads 322
5764 A Methodology for Automatic Diversification of Document Categories

Authors: Dasom Kim, Chen Liu, Myungsu Lim, Su-Hyeon Jeon, ByeoungKug Jeon, Kee-Young Kwahk, Namgyu Kim

Abstract:

Recently, numerous documents, including unstructured data and text, have been created due to the rapid increase in the usage of social media and the Internet. Each document is usually assigned a specific category for the convenience of users. In the past, categorization was performed manually. However, manual categorization not only fails to guarantee accuracy but also requires a large amount of time and incurs huge costs. Many studies have been conducted on the automatic creation of categories to overcome the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to categorizing complex documents with multiple topics, because they assume that one document can be assigned to only one category. To overcome this limitation, some studies have attempted to assign each document to multiple categories. However, these are also limited in that their learning process requires training on a multi-categorized document set; they therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets are provided. To remove the requirement of a multi-categorized training set imposed by traditional multi-categorization algorithms, we previously proposed a methodology that can extend the category of a single-categorized document to multiple categories by analyzing the relationships among categories, topics, and documents. In this paper, we design a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.
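A toy sketch of the category-extension idea follows, assuming per-category and per-document topic distributions are already available (e.g., from a topic model). The categories, topic weights, and similarity threshold are all invented; the authors' actual relationship analysis among categories, topics, and documents may differ.

```python
from math import sqrt

# Hypothetical topic distributions: one profile per category (averaged
# over its single-categorized documents) and one for a new document.
category_profiles = {
    "economy":  [0.70, 0.20, 0.10],
    "politics": [0.30, 0.60, 0.10],
    "sports":   [0.05, 0.05, 0.90],
}
doc_topics = [0.50, 0.45, 0.05]  # a document mixing economy and politics

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def extend_categories(doc_topics, profiles, threshold=0.8):
    """Assign every category whose topic profile is similar enough,
    turning a single-category assignment into a multi-category one
    without any multi-categorized training data."""
    return sorted(c for c, p in profiles.items()
                  if cosine(doc_topics, p) >= threshold)

print(extend_categories(doc_topics, category_profiles))
```

The key point mirrored here is that the extension uses only single-categorized documents plus topic structure, so no multi-labeled training set is ever required.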

Keywords: big data analysis, document classification, multi-category, text mining, topic analysis

Procedia PDF Downloads 265
5763 Numerical Methodology to Support the Development of a Double Chamber Syringe

Authors: Lourenço Bastos, Filipa Carneiro, Bruno Vale, Rita Marques, Joana Silva, Ricardo Freitas, Ângelo Marques, Sara Cortez, Alberta Coelho, Pedro Parreira, Liliana Sousa, Anabela Salgueiro, Bruno Silva

Abstract:

Flushing is considered an adequate technique to reduce the risk of infection during the clinical practice of venous catheterization. Nonetheless, there is still a lack of adherence to this method, in part due to the complexity of the procedure. The SeringaDuo project aimed to develop an innovative double-chamber syringe for the intravenous sequential administration of drugs and serums. This device serves to improve adherence to the practice by reducing the number of manipulations needed, which also improves patient safety, and by promoting the flushing practice among health professionals through simplification of the task. To assist in the development of this innovative syringe, a numerical methodology was developed and validated to predict the syringe's mechanical and flow behavior during the loading and administration phases, as well as to evaluate the material behavior during production. For this, three commercial numerical simulation packages were used, namely ABAQUS, ANSYS/Fluent, and Moldflow. The methodology aimed to evaluate the feasibility of the concepts and to optimize the geometries of the syringe's components, creating an iterative product development process based on numerical simulations and validated by the production of prototypes. Through this methodology, it was possible to achieve a final design that fulfils all the defined characteristics and specifications. This iterative process based on numerical simulations is a powerful product development tool that yields fast and accurate results without a strict need for prototypes: consecutive constructions and evaluations of new concepts converge on an optimized solution that fulfils all the predefined specifications and requirements.

Keywords: venous catheterization, flushing, syringe, numerical simulation

Procedia PDF Downloads 162
5762 Designing Creative Events with Deconstructivism Approach

Authors: Maryam Memarian, Mahmood Naghizadeh

Abstract:

Deconstruction is an approach entirely incompatible with traditional, prevalent architecture. It attempts to place architecture in sharp contrast with its opposing events, attending to the neglected and missing aspects of architecture and deconstructing its stable structures. It also proceeds boldly beyond existing frameworks and intends to create a different and more efficient prospect for space. The aim of deconstructivist architecture is to satisfy both prospective and retrospective visions and to take into account all tastes of the present in order to transcend time. Likewise, it ventures to fragment the facts and symbols of the past and to extract from within them new concepts that coincide with today's circumstances. Since this approach attempts to surpass the limits of prevalent architecture, it can be employed to design places in which creative events occur and imagination and ambition flourish. Thought-provoking artistic events can grow and mature in such places and be represented in the best way possible to all people. The concept of event proposed in the plan grows out of the interaction between space and creation. In addition to triggering surprise and strong impressions, it is also considered a bold journey into the suspended realms of traditional conflicts in architecture, such as architecture-landscape, interior-exterior, center-margin, product-process, and stability-instability. In this project, recognition and organization first take place through an interpretive-historical research method, examining the inputs and collecting data. After evaluation through deductive reasoning, the data are finally interpreted. 
Given that the research topic is in its infancy, with no similar case in Iran and a limited number of corresponding instances across the world, the selected topic helps to shed light on unrevealed and neglected aspects of architecture. Similarly, criticizing, investigating, and comparing specific, highly prized cases in other countries with the project under study can serve as an introduction to this architectural style.

Keywords: anti-architecture, creativity, deconstruction, event

Procedia PDF Downloads 315
5761 The Methodology of System Modeling of Mechatronic Systems

Authors: Lakhoua Najeh

Abstract:

Aims of the work: After a presentation of the functionality of an example mechatronic system, a paint mixer system, we present the concepts of modeling and safe operation. This paper briefly discusses how to model and protect the functioning of a mechatronic system, relying mainly on functional analysis and safe operation techniques. Methods: For the study of the example mechatronic system, we use methods of external functional analysis that illustrate the relationships between a mechatronic system and its external environment. We then present the Safe Structured Analysis and Design Technique (Safe-SADT), which allows the representation of a mechatronic system, and propose a model of operating safety and automation. This model enables us to: perform a functional analysis of the mechatronic system based on the GRAFCET (Graphe Fonctionnel de Commande des Etapes et Transitions: Step Transition Function Chart) method; study the safe operation of the mechatronic system based on the Safe-SADT method; and automate the mechatronic system using a software tool. Results: The expected results are a model and the safe operation of a mechatronic system. The methodology enables us to analyze the relevance of the different models based on Safe-SADT and GRAFCET in relation to the control and monitoring functions, and to study the means of exploiting their synergy. Conclusion: In order to propose a general model of a mechatronic system, a model for the analysis, operating safety, and automation of a mechatronic system has been developed, and we propose to validate this methodology through a case study of a paint mixer system.

Keywords: mechatronic systems, system modeling, safe operation, Safe-SADT

Procedia PDF Downloads 236
5760 Life Time Improvement of Clamp Structural by Using Fatigue Analysis

Authors: Pisut Boonkaew, Jatuporn Thongsri

Abstract:

In the hard disk drive manufacturing industry, removing unnecessary parts and qualifying part quality before assembly are important processes. A clamp was therefore designed and fabricated as a fixture for holding parts during the testing process. Improving the clamp by trial-and-error testing consumes a long time, so simulation was adopted to improve the part and reduce the time required. The problem is that the present clamp has a low life expectancy because of the critical stress that occurs in it. Simulation was therefore used to study the behavior of stress and compressive force and to improve the clamp's life expectancy across all candidate designs, 27 in total after excluding repeated designs. The number of candidates follows the full factorial design rules used within the Six Sigma methodology. Six Sigma is a well-structured method for improving quality by detecting and reducing process variability, so that defects decrease while process capability increases. This research focuses on reducing stress and fatigue while keeping the compressive force within the acceptable range set by the company. In the simulation, ANSYS models the 3D CAD geometry under the same conditions as the experiment, and the force at each displacement from 0.01 to 0.1 mm is recorded. The ANSYS settings were verified by a mesh convergence study, and the percentage error relative to the experimental result was confirmed not to exceed the acceptable range. The improvement process then varies the degree, radius, and length parameters to reduce stress while keeping the force within the acceptable range. Fatigue analysis is carried out next, simulated through ANSYS, to guarantee that the lifetime is extended. The simulation is not only run but also confirmed against the actual clamp, in order to observe the difference in fatigue between the two designs. This brings a lifetime improvement of up to 57% compared with the actual clamp used in manufacturing. The study provides a setting precise and trustworthy enough to serve as a reference methodology for future designs. Because it combines and adapts the Six Sigma method, finite element analysis, fatigue analysis, and linear regression analysis for accurate calculation, this project is expected to save up to 60 million dollars annually.
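
To make the design-space step concrete, the sketch below enumerates a hypothetical 3-level full factorial over the three factors the abstract names (degree, radius, length), yielding the 27 candidate designs, and fits a linear regression to placeholder stress values standing in for ANSYS output. The factor levels and responses are invented for illustration only; they are not the paper's data.

```python
from itertools import product

import numpy as np

# Hypothetical factor levels; the abstract names degree, radius, and length
# as the design variables, and 3 levels each gives the 27 candidate designs.
degrees = [30.0, 45.0, 60.0]        # deg
radii = [0.5, 1.0, 1.5]             # mm
lengths = [10.0, 12.0, 14.0]        # mm

designs = list(product(degrees, radii, lengths))
assert len(designs) == 27  # full factorial: 3^3 combinations

# Placeholder stress responses standing in for ANSYS results (MPa).
rng = np.random.default_rng(0)
X = np.array(designs)
stress = 200.0 - 0.5 * X[:, 0] - 30.0 * X[:, 1] + 2.0 * X[:, 2] + rng.normal(0, 1.0, 27)

# Linear regression (the paper's "linear regression analysis") fitted by
# ordinary least squares to rank the influence of each factor on stress.
A = np.column_stack([np.ones(27), X])
coef, *_ = np.linalg.lstsq(A, stress, rcond=None)
print(dict(zip(["intercept", "degree", "radius", "length"], coef.round(2))))
```

With a balanced full factorial, the fitted coefficients directly rank each factor's effect on stress, which is the information needed to pick the candidate design to carry into fatigue analysis.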

Keywords: clamp, finite element analysis, structural, six sigma, linear regression analysis, fatigue analysis, probability

Procedia PDF Downloads 232
5759 Copula Autoregressive Methodology for Simulation of Solar Irradiance and Air Temperature Time Series for Solar Energy Forecasting

Authors: Andres F. Ramirez, Carlos F. Valencia

Abstract:

The increasing interest in renewable energy strategies and the drive to diminish the use of carbon-based energy sources have encouraged the development of novel strategies for integrating solar energy into the electricity network. Correctly including the fluctuating energy output of a photovoltaic (PV) system in an electric grid requires improved forecasting and simulation methodologies for solar energy potential, and an understanding not only of the mean value of the series but of the associated underlying stochastic process. We present a methodology for the synthetic generation of bivariate time series of solar irradiance (shortwave flux) and air temperature, based on copula functions that represent the cross-dependence and temporal structure of the data. We explore the advantages of this nonlinear time series method over traditional approaches that transform the data to normal distributions as an intermediate step. Copulas provide the flexibility to reproduce the serial variability of the real data in the simulation and allow more control over the desired properties of the generated series. We use discrete zero-mass density distributions to capture the nature of solar irradiance, alongside vector generalized linear models for the time-dependent distributions of the bivariate series. We found that the copula autoregressive methodology, including the zero-mass characteristics of the solar irradiance series, yields a significant improvement over state-of-the-art strategies. These results will help to better understand the fluctuating nature of solar energy, characterize the underlying stochastic process, and quantify the potential of integrating a photovoltaic (PV) generating system into a country's electricity network. Experimental analysis and application to real data substantiate the usefulness of the proposed methodology for forecasting solar irradiance time series and solar energy across northern-hemisphere, southern-hemisphere, and equatorial zones.
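
The abstract does not specify its copula model in full, so the sketch below illustrates only the general idea: a Gaussian copula driven by a latent bivariate AR(1) process, a zero-inflated gamma marginal for irradiance (the "zero mass"), and a normal marginal for air temperature. All parameters and distribution choices here are assumptions for illustration, not the authors' specification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
T, rho, cross = 1000, 0.8, 0.6   # series length, temporal AR(1) corr, cross-corr (assumed)

# Latent bivariate Gaussian AR(1) process driving the copula; innovations are
# correlated across the two series, and each component has unit stationary variance.
L = np.linalg.cholesky(np.array([[1.0, cross], [cross, 1.0]]))
z = np.zeros((T, 2))
for t in range(1, T):
    z[t] = rho * z[t - 1] + np.sqrt(1 - rho**2) * (L @ rng.standard_normal(2))

u = stats.norm.cdf(z)  # uniforms carrying the copula's dependence structure

# Marginals: zero-inflated gamma for irradiance (zero mass = night / heavy overcast),
# normal for air temperature. Parameters are purely illustrative.
p_zero = 0.45
irr = np.where(u[:, 0] < p_zero, 0.0,
               stats.gamma.ppf((u[:, 0] - p_zero) / (1 - p_zero), a=2.0, scale=150.0))
temp = stats.norm.ppf(u[:, 1], loc=18.0, scale=6.0)

print(irr[:5].round(1), temp[:5].round(1))
```

Separating the dependence (the latent AR(1) copula) from the marginals is what gives the approach its flexibility: the zero mass and skew of irradiance live entirely in the marginal transform, while serial and cross correlation live in the latent process.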

Keywords: copula autoregressive, solar irradiance forecasting, solar energy forecasting, time series generation

Procedia PDF Downloads 312
5758 Computational Assistance to Research Using Dynamic Vector Logistics of Processes for Critical Infrastructure Subject Continuity

Authors: Urbánek Jiří J., Krahulec Josef, Urbánek Jiří F., Johanidesová Jitka

Abstract:

This paper deals with computational assistance for the research and modelling of continuity in critical infrastructure subjects. It enables the use of the prevailing MS Office tooling (SmartArt, etc.) for mathematical models built with the DYVELOP (Dynamic Vector Logistics of Processes) method, which serves for investigating and modelling crisis situations within critical infrastructure organizations. The first part of the paper introduces the entities, operators, and actors of the DYVELOP method. The method uses just three operators of Boolean algebra and four types of entities: the Environment (ENV), the Process System, the Case, and the Controlling. The Process Systems (PrS) have five "brothers": Management PrS, Transformation PrS, Logistic PrS, Event PrS, and Operation PrS. The Cases have three "sisters": Process Cell Case, Use Case, and Activity Case. All of them require special Ctrl actors to control their functions, except ENV, which can do without Ctrl. The models' maps are called Blazons, and they can mathematically and graphically express the relationships among entities, actors, and processes. In the second part of the paper, the rich Blazons of the DYVELOP method are used for discovering and modelling cycling cases and their phases. The Blazons benefit from a live PowerPoint presentation for better comprehension of the paper's mission. The crisis management of an energy critical infrastructure organization must use these cycles to cope successfully with crisis situations; cycling through the cases several times is a necessary condition for encompassing both the emergency event and the mitigation of the organization's damages. An uninterrupted, continuous cycling process makes crisis management fruitful, and it is a good indicator and controlling actor of organizational continuity and of advanced possibilities for its sustainable development. Reliable research rules are derived for the safe and reliable continuity of an energy critical infrastructure organization in a crisis situation.
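
DYVELOP itself is a graphical notation, but the entity taxonomy the abstract describes can be sketched in code. The following hypothetical model encodes the four entity types, the Ctrl-actor requirement (with ENV exempt, and Ctrl actors assumed not to need a further Ctrl), and three Boolean operators; all names and the operator semantics are assumptions for illustration, not part of the published method.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class EntityKind(Enum):
    # The four DYVELOP entity types named in the abstract.
    ENVIRONMENT = "ENV"
    PROCESS_SYSTEM = "PrS"
    CASE = "Case"
    CONTROLLING = "Ctrl"

@dataclass
class Entity:
    name: str
    kind: EntityKind
    active: bool = True
    ctrl: Optional["Entity"] = None

    def __post_init__(self):
        # Every entity needs a Ctrl actor except ENV; we additionally assume
        # (hypothetically) that Ctrl actors do not require a further Ctrl.
        exempt = (EntityKind.ENVIRONMENT, EntityKind.CONTROLLING)
        if self.kind not in exempt and self.ctrl is None:
            raise ValueError(f"{self.kind.value} entity '{self.name}' needs a Ctrl actor")

# The three Boolean-algebra operators the method relies on (semantics assumed:
# combining the activity states of entities).
def AND(a, b): return a.active and b.active
def OR(a, b): return a.active or b.active
def NOT(a): return not a.active

env = Entity("power grid", EntityKind.ENVIRONMENT)
ctrl = Entity("crisis management", EntityKind.CONTROLLING)
event_prs = Entity("Event PrS", EntityKind.PROCESS_SYSTEM, ctrl=ctrl)
use_case = Entity("Use Case", EntityKind.CASE, active=False, ctrl=ctrl)

print(AND(env, event_prs), OR(event_prs, use_case), NOT(use_case))
```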

Keywords: blazons, computational assistance, DYVELOP method, critical infrastructure

Procedia PDF Downloads 375
5757 A Model of Human Security: A Comparison of Vulnerabilities and Timespace

Authors: Anders Troedsson

Abstract:

For us humans, risks are intimately linked to human vulnerabilities: where there is vulnerability, there is potentially insecurity, and risk. Reducing vulnerability through compensatory measures means increasing security and decreasing risk. The paper suggests that a meaningful way to approach the study of risks (including threats, assaults, crises, etc.) is to understand the vulnerabilities these external phenomena evoke in humans. As is argued, the basis of risk evaluation, as well as of responses, is the more or less subjective perception of the individual person, or group of persons, exposed to the external event or phenomenon in question. This perception is determined primarily by the vulnerability or vulnerabilities that the external factor is perceived to evoke. In this way, risk perception is primarily an inward dynamic rather than an outward one. A route towards understanding the perception of risks is therefore a closer scrutiny of the vulnerabilities they can evoke, thereby approaching what the paper calls the essence of risk (including threat, assault, etc.), or that which a certain perceived risk means to an individual or group of individuals. As a necessary basis for gauging the wide spectrum of potential risks and their meaning, the paper proposes a model of human vulnerabilities, drawing inter alia on a long tradition of needs theory. In order to account for the subjectivity factor, which mediates between the innate vulnerabilities on the one hand and the external event or phenomenon on the other, an ensuing ontological discussion of the timespace characteristics of risk, threat, and assault as perceived by humans leads to the positing of two dimensions. These two dimensions are applied to the vulnerabilities, resulting in a model featuring four realms of vulnerabilities that are related to each other and together represent a dynamic whole. In approaching the problem of risk perception, the paper thus defines the relevant realms of vulnerabilities, depicting them as a dynamic whole. With reference to a substantial body of literature and a growing international policy trend since the 1990s, this model is put in the language of human security, a concept relevant not only for international security studies and policy but also for other academic disciplines and spheres of human endeavor.

Keywords: human security, timespace, vulnerabilities, risk perception

Procedia PDF Downloads 328
5756 Estimation and Forecasting with a Quantile AR Model for Financial Returns

Authors: Yuzhi Cai

Abstract:

This talk presents a Bayesian approach to quantile autoregressive (QAR) time series model estimation and forecasting. We establish that the joint posterior distribution of the model parameters and future values is well defined, and that the associated MCMC algorithm for parameter estimation and forecasting converges to the posterior distribution quickly. We also present a forecast-combining technique that produces more accurate out-of-sample forecasts by using a weighted sequence of fitted QAR models, and we develop a moving-window method for checking the quality of the estimated conditional quantiles. We verify the methodology using simulation studies and then apply it to currency exchange rate data; an application of the method to USD-to-GBP daily exchange rates will also be discussed. The results show that an unequally weighted combining method outperforms the other forecasting methodologies.
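
The core of a QAR model can be illustrated without the talk's Bayesian machinery. The sketch below fits a QAR(1) model, y_t = a(tau) + b(tau) * y_{t-1}, at several quantile levels by minimizing the pinball (check) loss directly, rather than by the MCMC approach the talk describes; the simulated series and optimizer choice are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T = 2000
y = np.zeros(T)
for t in range(1, T):  # simulate a simple AR(1) "return" series (illustrative)
    y[t] = 0.3 * y[t - 1] + rng.standard_normal()

def pinball(params, tau):
    # Check loss: rho_tau(u) = u * (tau - 1{u < 0}), summed over residuals.
    a, b = params
    u = y[1:] - a - b * y[:-1]
    return np.sum(u * (tau - (u < 0)))

# Fit the QAR(1) model at the 5%, 50%, and 95% conditional quantiles.
fits = {tau: minimize(pinball, x0=[0.0, 0.0], args=(tau,), method="Nelder-Mead").x
        for tau in (0.05, 0.5, 0.95)}
for tau, (a, b) in fits.items():
    print(f"tau={tau}: intercept={a:+.2f}, slope={b:+.2f}")
```

With Gaussian innovations, the slope estimates should be near 0.3 at every quantile while the intercepts fan out, which is exactly the picture a QAR model is designed to capture: each conditional quantile gets its own autoregressive line.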

Keywords: combining forecasts, MCMC, quantile modelling, quantile forecasting, predictive density functions

Procedia PDF Downloads 342
5755 Structural Identification for Layered Composite Structures through a Wave and Finite Element Methodology

Authors: Rilwan Kayode Apalowo, Dimitrios Chronopoulos

Abstract:

An approach is proposed for identifying the geometric and material characteristics of layered composite structures through an inverse wave and finite element methodology. These characteristics are obtained from multi-frequency, single-shot measurements. It is established, however, that the frequency regime of the measurements does not matter: both ultrasonic and structural dynamics frequency spectra can be employed. Taking advantage of a full finite element (FE) description of the periodic composite, the scheme can account for arbitrarily complex structures. To demonstrate its robustness, the scheme is applied to a sandwich composite panel, and the results are compared with those of experimental characterization techniques. Excellent agreement with the experimental measurements is obtained.
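
The inverse-fitting step can be illustrated in miniature. In place of the paper's full wave-and-FE model, the sketch below uses an analytic thin-plate bending dispersion relation as the forward model and recovers the Young's modulus from synthetic "measured" wavenumbers by nonlinear least squares; the material values, noise level, and single-parameter setup are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import least_squares

# Known plate properties (assumed): density, Poisson ratio, thickness.
rho, nu, h = 2700.0, 0.33, 3e-3
E_true = 70e9  # ground-truth Young's modulus to be recovered

def wavenumber(omega, E):
    # Thin-plate bending dispersion: k^4 = omega^2 * rho * h / D,
    # with flexural rigidity D = E h^3 / (12 (1 - nu^2)).
    D = E * h**3 / (12 * (1 - nu**2))
    return (omega**2 * rho * h / D) ** 0.25

# Synthetic "measured" wavenumbers with 1% noise, standing in for test data.
omega = 2 * np.pi * np.linspace(200, 5000, 40)
rng = np.random.default_rng(7)
k_meas = wavenumber(omega, E_true) * (1 + 0.01 * rng.standard_normal(40))

def residual(p):
    return wavenumber(omega, p[0] * 1e9) - k_meas  # p[0] is E in GPa

sol = least_squares(residual, x0=[40.0], bounds=([1.0], [500.0]))
print(f"identified E ≈ {sol.x[0]:.1f} GPa")
```

In the actual methodology the forward model is a periodic FE description rather than this closed-form dispersion relation, and several geometric and material parameters are identified simultaneously, but the misfit-minimization structure is the same.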

Keywords: structural identification, non-destructive evaluation, finite elements, wave propagation, layered structures, ultrasound

Procedia PDF Downloads 132
5754 Study of Biofuel Produced by Babassu Oil Fatty Acids Esterification

Authors: F. A. F. da Ponte, J. Q. Malveira, I. A. Maciel, M. C. G. Albuquerque

Abstract:

In this work, the production of aviation biofuel by esterification of fatty acids (C6 to C16) was studied. The process variables in heterogeneous catalysis were evaluated using an experimental design, with temperature and reaction time as the studied parameters and the methyl ester content as the response. An ion exchange resin was used as the heterogeneous catalyst. The process was optimized using response surface methodology (RSM) with a second-order polynomial model. The results show that temperature and reaction time were the most influential variables. The best methyl ester conversion in the experimental design was obtained under the following conditions: 10 wt.% catalyst, 100 °C, and 4 hours of reaction, achieving a conversion of 96.5 wt.%.
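
The second-order RSM fit the abstract describes can be sketched directly: a polynomial in temperature and time with linear, quadratic, and interaction terms, fitted by least squares. The design points and ester contents below are invented for illustration and are not the paper's data.

```python
import numpy as np

# Hypothetical design points: temperature (°C) and reaction time (h).
T = np.array([60, 60, 80, 80, 100, 100, 60, 80, 100], dtype=float)
t = np.array([2.0, 4.0, 2.0, 4.0, 2.0, 4.0, 3.0, 3.0, 3.0])

# Illustrative methyl ester contents (wt.%) standing in for measured responses.
y = np.array([70.1, 78.3, 80.2, 88.5, 89.0, 96.4, 74.5, 84.9, 93.1])

# Second-order model: y = b0 + b1*T + b2*t + b3*T^2 + b4*t^2 + b5*T*t
X = np.column_stack([np.ones_like(T), T, t, T**2, t**2, T * t])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(temp, time):
    return b @ np.array([1.0, temp, time, temp**2, time**2, temp * time])

print(f"predicted ester content at 100 °C, 4 h: {predict(100, 4):.1f} wt.%")
```

On the illustrative data, the fitted surface predicts its highest response near the 100 °C / 4 h corner, mirroring the optimum the study reports.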

Keywords: esterification, ion-exchange resins, response surface methodology, biofuel

Procedia PDF Downloads 489
5753 Combination Approach Using Experiments and Optimal Experimental Design to Optimize Chemical Concentration in Alkali-Surfactant-Polymer Process

Authors: H. Tai Pham, Bae Wisup, Sungmin Jung, Ivan Efriza, Ratna Widyaningsih, Byung Un Min

Abstract:

The middle-phase microemulsion that forms between an Alkaline-Surfactant-Polymer (ASP) solution and oil plays an important role in the success of an ASP flooding process: a high-quality microemulsion phase has ultralow interfacial tensions and can increase oil recovery. This research used optimal experimental design and response surface methodology to predict the chemical concentrations in the ASP solution that maximize microemulsion quality. The optimal ASP formulation was then implemented in a core flooding test to investigate the effective injection volume. As a result, the optimum surfactant concentration in the ASP solution is 0.57 wt.% and the highest effective injection volume is 19.33% of pore volume.
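
A standard RSM step behind an "optimum concentration" result like the one reported here (though not detailed in the abstract) is locating the stationary point of a fitted second-order surface: for y = b0 + b·x + xᵀBx, the optimum is x* = -0.5 B⁻¹ b, a maximum when B is negative definite. The coefficients below are invented for illustration, chosen so the optimum lands near a 0.57 wt.% surfactant concentration.

```python
import numpy as np

# Illustrative fitted second-order surface in surfactant (x1) and alkali (x2)
# concentrations (wt.%): y = b0 + b.x + x.B.x, with a maximum (B negative definite).
b0 = 5.0
b = np.array([8.0, 4.0])
B = np.array([[-7.0, 1.0],
              [1.0, -5.0]])

# Stationary point of the quadratic: x* = -0.5 * B^{-1} b
x_star = -0.5 * np.linalg.solve(B, b)
y_star = b0 + b @ x_star + x_star @ B @ x_star

# Eigenvalues of B confirm whether the stationary point is a maximum.
assert np.all(np.linalg.eigvalsh(B) < 0), "not a maximum"
print(f"optimum concentrations: {x_star.round(3)} wt.%, predicted response: {y_star:.2f}")
```

Checking the definiteness of B matters in practice: a saddle-shaped fitted surface has no interior optimum, and ridge analysis would be needed instead.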

Keywords: optimize, ASP, response surface methodology, solubilization ratio

Procedia PDF Downloads 342
5752 Posterior Circulation Ischemic Strokes in Olympic and Division 1 Wrestlers

Authors: Christen Kutz

Abstract:

Objective: The aim of this study is to review a case series of four high-level Olympic and Division 1 wrestlers who experienced debilitating posterior circulation ischemic strokes during or after a competitive wrestling event, and to identify risk factors, etiology, and outcomes of stroke in young, healthy elite wrestlers. Background: Stroke occurs in one in 10,000 people under age 64. In young adults, the most common causes of stroke are cardiac embolism, hypercoagulable state, and vasculopathy; one-third of these strokes occur in young, fit individuals. There is little published literature on ischemic strokes in wrestlers. Given the nature of wrestling, injury to or dissection of neurovascular structures is a plausible mechanism, but very few case reports exist. Methodology: Four wrestlers under the age of 44 with a known history of ischemic stroke participated in individual interviews, either in person or virtually. Each wrestler provided demographic information, wrestling background, clinical presentation at the time of stroke, imaging results, potential risk factors, acute treatment, and recovery. Results: Three white male Division 1 wrestlers (two from Lehigh University, one from Lock Haven University) and one black male 2008 Olympian experienced posterior circulation strokes. Case #1 felt a "pop" while wrestling (lateral medullary infarct, possible vertebral artery dissection); Case #2 awoke with severe vertigo, sweating, and vomiting after wrestling the previous day (left cerebellar infarct, protein S deficiency); Case #3 experienced severe vertigo, ataxia, and a sensation of impending doom after wrestling earlier that week (left cerebellar infarct, hypoplastic left vertebral artery, positive anti-cardiolipin antibodies); Case #4 experienced severe dizziness and confusion (left cerebellar stroke, vertebral artery dissection, small PFO). Conclusion: Three wrestlers were started on anti-platelet therapy, had their risk factors modified, and returned to their sport; one wrestler was placed on anti-coagulation and retired from competition.

Keywords: stroke, wrestling, Olympic, posterior circulation

Procedia PDF Downloads 70
5751 Polynomially Adjusted Bivariate Density Estimates Based on the Saddlepoint Approximation

Authors: S. B. Provost, Susan Sheng

Abstract:

An alternative bivariate density estimation methodology is introduced in this presentation. The proposed approach involves estimating the density function associated with the marginal distribution of each of the two variables by means of the saddlepoint approximation technique and applying a bivariate polynomial adjustment to the product of these density estimates. Since the saddlepoint approximation is utilized in the context of density estimation, such estimates are determined from empirical cumulant-generating functions. In the univariate case, the saddlepoint density estimate is itself adjusted by a polynomial. Given a set of observations, the coefficients of the polynomial adjustments are obtained from the sample moments. Several illustrative applications of the proposed methodology shall be presented. Since this approach relies essentially on a determinate number of sample moments, it is particularly well suited for modeling massive data sets.
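
The univariate building block of this methodology can be sketched in a few lines: form the empirical cumulant-generating function K(t) = log(mean(exp(t·x))), solve the saddlepoint equation K'(s) = v for each evaluation point v, and plug into the saddlepoint density formula f(v) ≈ exp(K(s) - s·v) / sqrt(2π K''(s)). The sketch below omits the polynomial adjustment and the bivariate product step, and the sample, bracket, and grid are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(3)
x = rng.standard_normal(5000)  # sample whose density we estimate

def K(t):    # empirical cumulant-generating function
    return np.log(np.mean(np.exp(t * x)))

def Kp(t):   # K'(t) = E[x e^{tx}] / E[e^{tx}]
    w = np.exp(t * x)
    return np.sum(x * w) / np.sum(w)

def Kpp(t):  # K''(t) = E[x^2 e^{tx}] / E[e^{tx}] - K'(t)^2
    w = np.exp(t * x)
    m1 = np.sum(x * w) / np.sum(w)
    return np.sum(x**2 * w) / np.sum(w) - m1**2

def saddlepoint_density(v):
    s = brentq(lambda t: Kp(t) - v, -5.0, 5.0)  # solve saddlepoint equation K'(s) = v
    return np.exp(K(s) - s * v) / np.sqrt(2 * np.pi * Kpp(s))

grid = np.linspace(-2, 2, 9)
est = np.array([saddlepoint_density(v) for v in grid])
print(est.round(3))
```

For standard normal data the estimate at v = 0 should land near the true density value 1/sqrt(2π) ≈ 0.399, which makes the sketch easy to sanity-check.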

Keywords: density estimation, empirical cumulant-generating function, moments, saddlepoint approximation

Procedia PDF Downloads 275
5750 Investigation of Topic Modeling-Based Semi-Supervised Interpretable Document Classifier

Authors: Dasom Kim, William Xiu Shun Wong, Yoonjin Hyun, Donghoon Lee, Minji Paek, Sungho Byun, Namgyu Kim

Abstract:

There has been much research on document classification for automatically categorizing voluminous documents. Through document classification, a specific category can be assigned to each unlabeled document on the basis of various machine learning algorithms. However, labeling documents manually requires considerable time and effort. To overcome this limitation, semi-supervised learning, which uses unlabeled as well as labeled documents, was introduced. Still, traditional document classifiers, whether supervised or semi-supervised, cannot sufficiently explain the reason for, or the process behind, a classification. In this paper, we therefore propose a methodology to visualize the major topics and class components of each document. We believe that our methodology for visualizing the topics and classes of each document can enhance the reliability and explanatory power of document classifiers.
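
One common way to build an interpretable semi-supervised classifier of the kind described (not necessarily the authors' exact pipeline) is to learn topics from all documents, labeled and unlabeled alike, then train a classifier on the topic mixtures of the labeled subset, so each prediction can be explained by its topic composition. The toy corpus and labels below are invented for illustration.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.linear_model import LogisticRegression

# Tiny illustrative corpus; only the first four documents carry labels.
docs = [
    "stock market trading price shares", "market price investors stock fund",
    "soccer match goal team league", "team player goal season coach",
    "fund shares market investor price", "league coach player match goal",
]
labels = [0, 0, 1, 1]  # finance vs. sport, for the labeled subset only

# Step 1: learn topics from ALL documents (labeled and unlabeled alike).
counts = CountVectorizer().fit_transform(docs)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
theta = lda.fit_transform(counts)          # per-document topic mixtures

# Step 2: train the classifier on the topic mixtures of the labeled subset.
clf = LogisticRegression().fit(theta[:4], labels)

# The topic mixture of each document now explains its prediction.
pred = clf.predict(theta[4:])
print("unlabeled docs classified as:", pred, "topic mixtures:", theta[4:].round(2))
```

Because the classifier's features are topic proportions rather than raw word counts, each classification decision can be visualized as "this document is mostly topic A, and topic A is associated with class X", which is the kind of explanation the paper aims to provide.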

Keywords: data mining, document classifier, text mining, topic modeling

Procedia PDF Downloads 395
5749 Design of a Laboratory Test for Investigating Permanent Deformation of Asphalt

Authors: Esmaeil Ahmadinia, Frank Bullen, Ron Ayers

Abstract:

Many concerns have been raised in recent years about the adequacy of existing creep test methods for evaluating the rut resistance of asphalt mixes. Many researchers believe the main reason the creep tests fail to duplicate field results is the lack of realistic confinement of laboratory specimens: in-situ asphalt under axle loads is surrounded by a mass of asphalt, which provides stress-strain-generated confinement, whereas most existing creep tests are largely unconfined in nature. It has been hypothesized that by providing a degree of confinement representative of field conditions in a creep test, a better correlation between field and laboratory could be established. In this study, a new methodology is explored in which confinement is provided for asphalt specimens. The proposed methodology is founded on the current Australian test method, adapted to simulate field conditions through the provision of sample confinement.

Keywords: asphalt mixture, creep test, confinements, permanent deformation

Procedia PDF Downloads 315
5748 Analysis of Critical Success Factors of Six Sigma in Pakistani Small and Medium-Sized Enterprises

Authors: Zanjbeel Tabassum, Cahit Ali Bayraktar, Asfa Muhammad Din, Murat Durucu

Abstract:

Six Sigma is a quality improvement methodology widely adopted throughout the world. In this paper, an attempt has been made to identify the Critical Success Factors (CSFs) for successful implementation of Six Sigma in Pakistani Small and Medium-sized Enterprises (SMEs). A survey methodology was used to collect data from SMEs in Pakistan. The results of this exploratory empirical research reflect the importance of different CSFs for Six Sigma implementation in Pakistani SMEs. On the basis of the extracted factors, a framework is proposed for successful Six Sigma implementation in Pakistani SMEs. This study will provide a base for Pakistani SMEs and for future researchers working on Six Sigma implementation, helping them prepare a road map to remove the hurdles to Six Sigma implementation.
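
Factor extraction from survey data of this kind is often done with exploratory factor analysis. The sketch below generates hypothetical Likert-scale responses driven by two latent success factors and recovers the loading pattern; the factor names, item count, and data are all invented for illustration and do not reflect the paper's survey.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n = 300  # hypothetical number of SME respondents

# Two latent success factors (e.g., "management commitment", "training")
# generating eight 5-point Likert survey items; loadings are illustrative.
latent = rng.standard_normal((n, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.2], [0.8, 0.0],
                     [0.1, 0.9], [0.0, 0.8], [0.2, 0.7], [0.1, 0.8]])
items = latent @ loadings.T + 0.4 * rng.standard_normal((n, 8))
items = np.clip(np.round(3 + items), 1, 5)   # discretize to a 1-5 Likert scale

# Exploratory factor analysis to recover the two underlying factors.
fa = FactorAnalysis(n_components=2, random_state=0).fit(items)
print(np.abs(fa.components_).round(2))        # recovered loading pattern
```

In a real CSF study the recovered loadings would then be rotated and interpreted, with items grouping under each factor naming the critical success factors that feed the proposed framework.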

Keywords: critical success factors, small medium enterprises (SMEs), six sigma, Pakistan

Procedia PDF Downloads 352