Search results for: calibration data requirements
26796 RFID Logistic Management with Cold Chain Monitoring: Cold Store Case Study
Authors: Mira Trebar
Abstract:
Logistics processes for perishable food in the supply chain include distribution activities and real-time temperature monitoring to fulfil cold chain requirements. The paper presents the use of RFID (Radio Frequency Identification) technology as an identification tool for receiving and shipping activities in a cold store. The use of RFID data loggers with temperature sensors is also presented, to observe and store temperatures for the purpose of analyzing the processes and keeping historical data available for traceability and efficient recall management. Keywords: logistics, warehouse, RFID device, cold chain
26795 A Stochastic Volatility Model for Optimal Market-Making
Authors: Zubier Arfan, Paul Johnson
Abstract:
The electronification of financial markets and the rise of algorithmic trading have sparked considerable interest in the mathematical community, in the market-making problem in particular. The research presented in this short paper solves the classic stochastic control problem in order to derive the strategy for a market-maker. It also shows how to calibrate and simulate the strategy with real limit order book data for back-testing. The ambiguity of limit-order priority in back-testing is dealt with by considering optimistic and pessimistic priority scenarios. The model, although it outperforms a naive strategy, assumes constant volatility and is therefore not best suited to the LOB data. The Heston model is introduced to describe the price and variance process of the asset. The trader's constant absolute risk aversion utility function is optimised by numerically solving a 3-dimensional Hamilton-Jacobi-Bellman partial differential equation to find the optimal limit order quotes. The results show that the stochastic volatility market-making model is more suitable for a risk-averse trader and is also less sensitive to calibration error than the constant volatility model. Keywords: market-making, market microstructure, stochastic volatility, quantitative trading
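As an illustrative sketch only (not the authors' code), the Heston price/variance dynamics assumed in the abstract can be simulated with a full-truncation Euler scheme; the parameter names (kappa, theta, xi, rho) are the standard Heston notation, not values from the paper.

```python
import math
import random

def simulate_heston(s0, v0, kappa, theta, xi, rho, mu, dt, n_steps, rng):
    """Full-truncation Euler simulation of the Heston price/variance process."""
    s, v = s0, v0
    path = [s]
    for _ in range(n_steps):
        z1 = rng.gauss(0.0, 1.0)
        # Correlated Brownian increments for price and variance
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        v_pos = max(v, 0.0)  # full truncation keeps the variance non-negative
        s *= math.exp((mu - 0.5 * v_pos) * dt + math.sqrt(v_pos * dt) * z1)
        v += kappa * (theta - v_pos) * dt + xi * math.sqrt(v_pos * dt) * z2
        path.append(s)
    return path

rng = random.Random(7)
path = simulate_heston(100.0, 0.04, 1.5, 0.04, 0.3, -0.7, 0.0, 1.0 / 252, 252, rng)
```

A path like this would feed the back-testing loop; the market-making quotes themselves come from the HJB solution, which is beyond a short sketch.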
26794 Modelling of Meandering River Dynamics in Colombia: A Case Study of the Magdalena River
Authors: Laura Isabel Guarin, Juliana Vargas, Philippe Chang
Abstract:
The analysis and study of open channel flow dynamics for river applications has been based on flow modelling using discrete numerical models derived from the hydrodynamic equations. The overall spatial characteristics of rivers, i.e. their length-to-depth-to-width ratios, generally allow one to disregard processes occurring in the vertical and transverse dimensions, thus imposing hydrostatic pressure conditions and considering solely a 1D flow model along the river length. Through a calibration process, an accurate flow model may thus be developed, allowing for channel study and extrapolation of various scenarios. The Magdalena River in Colombia is a large river basin draining the country from south to north over 1550 km, with an average slope of 0.0024 and an average width of 275 m. The river displays high water level fluctuation and is characterized by a series of meanders. The city of La Dorada has been affected over the years by serious flooding in both the rainy and dry seasons. As the meander is evolving at a steady pace, repeated flooding has endangered a number of neighborhoods. This study was undertaken in order to correctly model the flow characteristics of the river in this region, to evaluate various scenarios, and to provide decision makers with erosion control options and a forecasting tool. Two field campaigns were completed over the dry and rainy seasons, including extensive topographical and channel surveys using a Topcon GR5 DGPS and a River Surveyor ADCP. In order to characterize the erosion process occurring through the meander, extensive suspended sediment and river bed samples were retrieved, as well as soil perforations over the banks. Hence, based on the DEM from the digital ground mapping survey and the field data, a 2DH flow model was prepared using the Iber freeware, based on the finite volume method in a non-structured mesh environment. The calibration process was carried out by comparison with available historical data from a nearby hydrologic gauging station.
Although the model was able to effectively predict overall flow processes in the region, its spatial characteristics and the limitations related to its pressure conditions did not allow for an accurate representation of the erosion processes occurring over specific bank areas and dwellings. In particular, a significant helical flow has been observed through the meander. Furthermore, the rapidly changing channel cross section, a consequence of severe erosion, has hindered the model's ability to provide decision makers with a valid, up-to-date planning tool. Keywords: erosion, finite volume method, flow dynamics, flow modelling, meander
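The 1D hydrostatic simplification described above can be illustrated with Manning's formula for a rectangular channel; this is a minimal sketch using the river's stated slope (0.0024) and width (275 m), with an assumed roughness coefficient and discharge, not the paper's Iber 2DH model.

```python
def manning_discharge(h, b, n, slope):
    """Discharge (m^3/s) of a rectangular channel of width b at depth h (Manning)."""
    area = b * h
    wetted_perimeter = b + 2.0 * h
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

def normal_depth(q_target, b, n, slope, lo=1e-6, hi=50.0, tol=1e-8):
    """Bisection for the depth at which the Manning discharge matches q_target.
    Discharge grows monotonically with depth, so bisection converges."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if manning_discharge(mid, b, n, slope) < q_target:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical discharge of 7000 m^3/s and roughness n = 0.03
depth = normal_depth(7000.0, 275.0, 0.03, 0.0024)
```

Calibration in the 1D setting amounts to adjusting n until modelled stages match the gauging-station record.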
26793 The Reality of Engineering Education in the Kingdom of Saudi Arabia and Its Suitability to the Requirements of the Labor Market
Authors: Hamad Albadr
Abstract:
With the shift in the orientation of universities, from custodians of knowledge and community culture to institutions responsible for preparing graduates to work according to the needs of community development, universities in today's world represent the prime motor of development in the community, finding appropriate solutions to the problems it faces and adapting to the demands of a changing environment. This paper reviews the reality of engineering education in the Kingdom of Saudi Arabia and its suitability to the requirements of the labor market. The university is viewed as an educational management system, using the systems analysis approach, one of the methods of modern management, to analyze the performance of organizations and institutions for administrative and quality assessment. According to this approach, the system is treated as a set of subsystems divided into inputs, processes, outputs, and the surrounding environment. Descriptive and analytical research methods are used to gather information and data and to analyze the answers of the study population, which consists of a random sample of about 500 professionals involved in employment in the business sector who are beneficiaries of the services that the universities provide. Keywords: universities in Saudi Arabia, engineering education, labor market, administrative, quality assessment
26792 Method for Selecting and Prioritising Smart Services in Manufacturing Companies
Authors: Till Gramberg, Max Kellner, Erwin Gross
Abstract:
This paper presents a comprehensive investigation into the topic of smart services and IIoT-Platforms, focusing on their selection and prioritization in manufacturing organizations. First, a literature review is conducted to provide a basic understanding of the current state of research in the area of smart services. Based on discussed and established definitions, a definition approach for this paper is developed. In addition, value propositions for smart services are identified based on the literature and expert interviews. Furthermore, the general requirements for the provision of smart services are presented. Subsequently, existing approaches for the selection and development of smart services are identified and described. In order to determine the requirements for the selection of smart services, expert opinions from successful companies that have already implemented smart services are collected through semi-structured interviews. Based on the results, criteria for the evaluation of existing methods are derived. The existing methods are then evaluated according to the identified criteria. Furthermore, a novel method for the selection of smart services in manufacturing companies is developed, taking into account the identified criteria and the existing approaches. The developed concept for the method is verified in expert interviews. The method includes a collection of relevant smart services identified in the literature. The actual relevance of the use cases in the industrial environment was validated in an online survey. The required data and sensors are assigned to the smart service use cases. The value proposition of the use cases is evaluated in an expert workshop using different indicators. Based on this, a comparison is made between the identified value proposition and the required data, leading to a prioritization process. The prioritization process follows an established procedure for evaluating technical decision-making processes. 
In addition to the technical requirements, the prioritization process includes other evaluation criteria, such as the economic benefit, the conformity of the new service offering with the company strategy, or the customer retention enabled by the smart service. Finally, the method is applied and validated in an industrial environment. The results of these experiments are critically reflected upon, and an outlook on future developments in the area of smart services is given. This research contributes to a deeper understanding of the selection and prioritization process, as well as of the technical considerations associated with smart service implementation in manufacturing organizations. The proposed method serves as a valuable guide for decision makers, helping them to effectively select the most appropriate smart services for their specific organizational needs. Keywords: smart services, IIoT, Industrie 4.0, IIoT platform, big data
26791 Publish/Subscribe Scientific Workflow Interoperability Framework (PS-SWIF) Architecture and Design
Authors: Ahmed Alqaoud
Abstract:
This paper describes the Publish/Subscribe Scientific Workflow Interoperability Framework (PS-SWIF) architecture and its components, which collectively provide interoperability between heterogeneous scientific workflow systems. Requirements to achieve interoperability are identified. The paper also provides a detailed investigation and design of models and solutions for the system requirements, and considers how the workflow interoperability models provided by the Workflow Management Coalition (WfMC) can be achieved using the PS-SWIF system. Keywords: publish/subscribe, scientific workflow, web services, workflow interoperability
26790 Edmonton Urban Growth Model as a Support Tool for the City Plan Growth Scenarios Development
Authors: Sinisa J. Vukicevic
Abstract:
Edmonton is currently one of the youngest North American cities and has achieved significant growth over the past 40 years. This strong urban shift requires a new approach to how the city is envisioned, planned, and built. That approach is evidence-based scenario development, and an urban growth model was a key support tool in framing Edmonton's development strategies, developing urban policies, and assessing policy implications. The urban growth model was developed on the Metronamica software platform. The Metronamica land use model evaluated the dynamics of land use change under the influence of key development drivers (population and employment), zoning, land suitability, and land and activity accessibility. The model was designed following the Big City Moves ideas: become greener as we grow, develop a rebuildable city, ignite a community of communities, foster a healing city, and create a city of convergence. The Big City Moves were converted into three development scenarios: 'Strong Central City', 'Node City', and 'Corridor City'. Each scenario has a narrative story that expresses its high-level goal, its approach to residential and commercial activities, its transportation vision, and its employment and environmental principles. Land use demand was calculated for each scenario according to specific density targets. Spatial policies were analyzed according to their level of importance within the policy set definition for the specific scenario, but also through the policy measures. The model was calibrated so as to reproduce the known historical land use pattern, using 2006 and 2011 land use data. The validation was done independently, meaning on data not used for the calibration: the model was validated with 2016 data.
In general, the modeling process contains three main phases: 'from qualitative storyline to quantitative modelling', 'model development and model run', and 'from quantitative modelling to qualitative storyline'. The model also incorporates five spatial indicators: distance from residences to work, distance from residences to recreation, distance to the river valley, urban expansion, and habitat fragmentation. The major findings of this research can be viewed from two perspectives: the planning perspective and the technology perspective. The planning perspective evaluates the model as a tool for scenario development. Using the model, we explored the land use dynamics under the influence of different sets of policies. The model enables a direct comparison between the three scenarios. We explored the similarities and differences of the scenarios and their quantitative indicators: land use change, population change (and spatial allocation), job allocation, density (population, employment, and dwelling units), habitat connectivity, proximity to objects of interest, etc. From the technology perspective, the model showed one very important characteristic: flexibility. The direction of policy testing changed many times during the consultation process, and the model's flexibility in accommodating all these changes was highly appreciated. The model satisfied our needs as a scenario development and evaluation tool, but also as a communication tool during the consultation process. Keywords: urban growth model, scenario development, spatial indicators, Metronamica
26789 Microsimulation of Potential Crashes as a Road Safety Indicator
Authors: Vittorio Astarita, Giuseppe Guido, Vincenzo Pasquale Giofre, Alessandro Vitale
Abstract:
Traffic microsimulation has been used extensively to evaluate the consequences of different traffic planning and control policies in terms of travel time delays, queues, pollutant emissions, and every other commonly measured performance, while traffic safety has not been considered in common traffic microsimulation packages as a measure of performance for different traffic scenarios. Vehicle conflict techniques, introduced for intersections in the early traffic research carried out at the General Motors laboratory in the USA and in the Swedish traffic conflict manual, have been applied to vehicle trajectories simulated in microscopic traffic simulators. The concept is that microsimulation can be used as a basis for calculating the number of conflicts that define the safety level of a traffic scenario. This allows engineers to identify unsafe road traffic maneuvers and helps in finding the right countermeasures to improve safety. Unfortunately, the most commonly used indicators do not consider conflicts between single vehicles and roadside obstacles and barriers, even though a great number of vehicle crashes involve roadside objects or obstacles; only some recently proposed indicators have tried to address this issue. This paper introduces a new procedure, based on the simulation of potential crash events, for the evaluation of safety levels in microsimulation traffic scenarios, which also takes into account potential crashes with roadside objects and barriers. The procedure can be used to define new conflict indicators. The proposed simulation procedure generates, through random perturbation of vehicle trajectories, a set of potential crashes which can be evaluated accurately in terms of DeltaV, the energy of the impact, and/or the expected number of injuries or casualties. The procedure can also be applied to real trajectories, giving rise to new surrogate safety performance indicators, which can be considered 'simulation-based'.
The methodology and a specific safety performance indicator are described and applied to a simulated test traffic scenario. Results indicate that the procedure is able to evaluate safety levels both at intersections and in the presence of roadside obstacles. The procedure produces results that are expressed in the same unit of measure for both vehicle-to-vehicle and vehicle-to-roadside-object conflicts. The total energy per square meter of all generated crashes can be used and is shown on a map of the test network after the application of a threshold to highlight the most dangerous points. Without any detailed calibration of the microsimulation model and without any calibration of the parameters of the procedure (standard values have been used), it is possible to identify dangerous points. A preliminary sensitivity analysis has shown that the results are not sensitive to the different energy thresholds and parameters of the procedure. This paper introduces this specific new procedure and its implementation in the form of a software package that is able to assess road safety while also considering potential conflicts with roadside objects. Some of the principles at the base of this specific model are discussed. The procedure can be applied with common microsimulation packages once vehicle trajectories and the positions of roadside barriers and obstacles are known. The procedure has many calibration parameters, and research efforts will have to be devoted to comparisons with real crash data in order to obtain the parameters that give an accurate evaluation of the risk of any traffic scenario. Keywords: road safety, traffic, traffic safety, traffic simulation
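The DeltaV and impact-energy measures mentioned above can be sketched with the standard one-dimensional plastic-collision formulas; this is an illustration of the severity metrics, not the paper's software, and the masses and speeds are hypothetical.

```python
def delta_v(m1, m2, v1, v2):
    """Speed change of vehicle 1 in a perfectly plastic (common-velocity) impact.
    As m2 grows large (a fixed roadside barrier), DeltaV tends to the closing speed."""
    return m2 / (m1 + m2) * abs(v1 - v2)

def dissipated_energy(m1, m2, v1, v2):
    """Kinetic energy (J) lost in the impact: reduced mass times v_rel^2 / 2."""
    v_rel = v1 - v2
    return 0.5 * (m1 * m2 / (m1 + m2)) * v_rel ** 2

# Two 1500 kg cars closing at 10 m/s: each experiences DeltaV of 5 m/s
dv = delta_v(1500.0, 1500.0, 20.0, 10.0)
# Vehicle against a (practically immovable) barrier: DeltaV approaches closing speed
dv_barrier = delta_v(1500.0, 1e12, 15.0, 0.0)
```

Summing the dissipated energy of all generated potential crashes over a grid cell gives the energy-per-square-meter map described in the abstract.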
26788 Knowledge Management and Tourism: An Exploratory Study Applied to Travel Agents in Egypt
Authors: Mohammad Soliman, Mohamed A. Abou-Shouk
Abstract:
Knowledge management focuses on the development, storage, retrieval, and dissemination of information and expertise. It has become an important tool for improving performance in tourism enterprises, including improving decision-making, developing customer services, and increasing sales and profits. Knowledge management adoption depends on human, organizational, and technological factors. This study aims to explore the concept of knowledge management among travel agents in Egypt: it explores the requirements of adoption and its impact on performance in these agencies. The study targets Category A travel agents in Egypt; the population of the study encompasses Category A travel agents with an online presence. An online questionnaire is used to collect data from managers of travel agents. The study is useful for travel agents, who are in urgent need of restructuring their intermediary role to support their survival in the global travel market. It sheds light on the requirements of adoption and the expected impact on performance. This could help travel agents assess their situation and determine the extent to which they are ready to adopt knowledge management. The study contributes to knowledge by providing insights from the tourism sector in a developing country, where the concept of knowledge management is still in its infancy. Keywords: knowledge management, knowledge management adoption, performance, travel agents
26787 CyberSecurity Malaysia: Towards Becoming a National Certification Body for Information Security Management Systems Internal Auditors
Authors: M. S. Razana, Z. W. Shafiuddin
Abstract:
Internal auditing is one of the most important activities for organizations that implement information security management systems (ISMS). The purpose of internal audits is to ensure that the ISMS implementation is in accordance with the ISO/IEC 27001 standard and the organization's own requirements for its ISMS. Competent internal auditors are the main element that contributes to the effectiveness of internal auditing activities. To address this need, CyberSecurity Malaysia is now in the process of becoming a certification body that certifies ISMS internal auditors. The certification scheme will assess the competence of internal auditors in generic management-system knowledge and skills, as well as in ISMS-specific knowledge and skills. The certification assessment is based on ISO/IEC 19011 Guidelines for auditing management systems, ISO/IEC 27007 Guidelines for information security management systems auditing, and ISO/IEC 27001 Information security management systems requirements. The certification scheme complies with ISO/IEC 17024 General requirements for bodies operating certification systems of persons. Candidates who pass the exam will be certified as ISMS Internal Auditors, whose competency will be evaluated every three years. Keywords: ISMS internal audit, ISMS internal auditor, ISO/IEC 17024, competence, certification
26786 Development of Alpha Spectroscopy Method with Solid State Nuclear Track Detector Using Aluminium Thin Films
Authors: Nidal Dwaikat
Abstract:
This work presents the development of an alpha spectroscopy method using solid-state nuclear track detectors and aluminium thin films. The resolution of the method is high, and it is able to discriminate between alpha particles of different incident energies. It can measure the exact number of alpha particles at a specific energy without needing a calibration of alpha track diameter versus alpha energy. The method was tested using a Cf-252 alpha standard source at energies of 5.11 MeV, 3.86 MeV, and 2.7 MeV, produced by varying the detector-source distance. On the front side, two detectors were covered with aluminium thin films, and the third detector was kept uncovered. The thicknesses of the aluminium films were selected carefully (using SRIM 2013) such that the first film blocks the two lower-energy alpha particles (3.86 MeV and 2.7 MeV) while alpha particles at the higher energy (5.11 MeV) penetrate the film and reach the detector's surface. The second film blocks only the alpha particles at the lowest energy, 2.7 MeV, and allows alpha particles at the two higher energies (5.11 MeV and 3.86 MeV) to penetrate and produce tracks. On the uncovered detector, alpha particles at all three energies produce tracks. For quality assurance and accuracy, the detectors were mounted on sufficiently thick copper substrates to block exposure from the backside. The tracks on the first detector are due to alpha particles at 5.11 MeV. The difference between the number of tracks on the second detector and on the first detector is due to alpha particles at 3.86 MeV. Finally, by subtracting the number of tracks on the second detector from the number of tracks on the third (uncovered) detector, we find the number of tracks due to alpha particles at 2.7 MeV. Once the efficiency calibration factor is known, the activity of the standard source can be calculated exactly. Keywords: aluminium thin film, alpha particles, copper substrate, CR-39 detector
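The subtraction scheme described in the abstract reduces to simple count arithmetic; the following sketch (with made-up track counts and efficiency) shows how the three detectors' totals separate into per-energy counts.

```python
def resolve_alpha_counts(n_det1, n_det2, n_det3):
    """Separate track counts per alpha energy from the three detectors.
    det1: film passes only 5.11 MeV; det2: film passes 5.11 and 3.86 MeV;
    det3: uncovered, records all three energies."""
    n_511 = n_det1
    n_386 = n_det2 - n_det1
    n_270 = n_det3 - n_det2
    return n_511, n_386, n_270

def source_activity(total_tracks, efficiency, exposure_time_s):
    """Activity (Bq) given a known detection-efficiency calibration factor."""
    return total_tracks / (efficiency * exposure_time_s)

# Hypothetical counts: 120, 300, and 520 tracks on the three detectors
counts = resolve_alpha_counts(120, 300, 520)  # per-energy counts
activity = source_activity(520, 0.26, 1000.0)  # assumed efficiency and exposure
```

The per-energy counts follow directly because each film acts as a sharp energy threshold for what reaches the detector surface.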
26785 Implementing Quality Function Deployment Tool for a Customer Driven New Product Development in a Kuwait SME
Authors: Asma AlQahtani, Jumana AlHadad, Maryam AlQallaf, Shoug AlHasan
Abstract:
New product development (NPD) is the complete process of bringing a new product to the customer by integrating two broad divisions: one involving idea generation, product design, and detail engineering; the other involving market research and marketing analysis. It is common practice for companies to undertake some of these tasks simultaneously (concurrent engineering) and also to treat them as an ongoing process (continuous development). The current study explores a framework and methodology for the new product development process utilizing the Quality Function Deployment (QFD) tool to bring the customer's opinion into the product development process. An elaborate customer survey with focus groups in the region was carried out to ensure that customer requirements are integrated into new products as early as the design stage, including identifying the recognition of need for the new product. A QFD matrix (House of Quality) was prepared that links customer requirements to product engineering requirements, and a feasibility study and risk assessment exercise was carried out for a small and medium enterprise (SME) in Kuwait for the development of the new product. SMEs in Kuwait, particularly in the manufacturing sector, are mainly focused on serving local demand, and a lack of product quality often adversely affects the companies' ability to compete on a regional or global basis. Further, a lack of focus on identifying customer requirements often deters SMEs from envisaging the idea of new product development. The current study therefore focuses on utilizing the QFD matrix from conceptual design to detail design and, to some extent, on extending this link to the design of the manufacturing system. The outcome of the project was the development of a prototype for a new molded product which can ensure consistency between the customer's requirements and the measurable characteristics of the product.
Engineering economics and cost studies were also undertaken to analyse the viability of the new product, the results of which were also linked to the successful implementation of the initial QFD matrix. Keywords: Quality Function Deployment, QFD matrix, new product development, NPD, Kuwait SMEs, prototype development
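The House of Quality scoring that links customer requirements to engineering requirements can be sketched as a weighted sum; this is the conventional QFD computation with the usual 9/3/1 relationship scale and illustrative numbers, not the study's actual matrix.

```python
def hoq_technical_priorities(customer_weights, relationship_matrix):
    """Absolute importance of each engineering characteristic:
    sum over customer requirements of (importance weight * relationship strength).
    relationship_matrix[i][j]: strength linking customer req i to characteristic j,
    conventionally 9 (strong), 3 (moderate), 1 (weak), or 0 (none)."""
    n_tech = len(relationship_matrix[0])
    scores = [0.0] * n_tech
    for weight, row in zip(customer_weights, relationship_matrix):
        for j, strength in enumerate(row):
            scores[j] += weight * strength
    return scores

# Two hypothetical customer requirements (weights 5 and 3)
# against two engineering characteristics
scores = hoq_technical_priorities([5, 3], [[9, 1], [3, 9]])
```

The highest-scoring characteristics are the ones engineering effort should target first, which is how the matrix drives the design-stage decisions described above.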
26784 A New Social Vulnerability Index for Evaluating Social Vulnerability to Climate Change at the Local Scale
Authors: Cuong V Nguyen, Ralph Horne, John Fien, France Cheong
Abstract:
Social vulnerability to climate change is increasingly being acknowledged, and proposals to measure and manage it are emerging. Building upon this work, this paper proposes an approach to social vulnerability assessment using a new mechanism to aggregate and account for causal relationships among the components of a Social Vulnerability Index (SVI). To operationalize this index, the authors propose a means to develop an appropriate primary dataset through the application of a specifically designed household survey questionnaire. The data collection and analysis, including calibration and calculation of the SVI, is demonstrated through application in a case study city in central coastal Vietnam. Calculating the SVI at the fine-grained local neighbourhood scale provides high resolution in vulnerability assessment and also obviates the need for secondary data, which may be unavailable or problematic, particularly at the local scale in developing countries. The SVI household survey is underpinned by the results of a Delphi survey, in-depth interviews, and focus group discussions with local environmental professionals and community members. The research reveals inherent limitations of existing SVIs but also indicates their potential for assessing social vulnerability and informing decisions on responding to climate change at the local scale. Keywords: climate change, local scale, social vulnerability, social vulnerability index
26783 Augmented Tourism: Definitions and Design Principles
Authors: Eric Hawkinson
Abstract:
After designing and implementing several iterations of augmented reality (AR) in tourism, this paper takes a deep look into design principles and implementation strategies for using AR in destination tourism settings. The study looks to define augmented tourism from past implementations as well as several use cases designed and implemented for tourism. The discussion leads to the formation of frameworks and best practices for AR, as well as virtual reality (VR), in tourism settings. The main affordances include guest autonomy, customized experiences, visitor data collection, and increased electronic word-of-mouth generation for promotional purposes. Challenges include the need for high levels of technology infrastructure, low adoption or 'buy-in' rates, high levels of calibration and customization, and the need for maintenance and support services. Suggestions are given as to how to leverage the affordances and meet the challenges of implementing AR for tourism. Keywords: augmented tourism, augmented reality, eTourism, virtual tourism, tourism design
26782 About the Case Portfolio Management Algorithms and Their Applications
Authors: M. Chumburidze, N. Salia, T. Namchevadze
Abstract:
This work deals with case processing problems in business. The task of strategic management of credit requirements in a case portfolio is discussed. An information model of credit requirements in a binary tree diagram is considered. Algorithms to solve the problem of prioritizing clusters of cases in business have been investigated, and an implementation of priority queues to support case management operations is presented. The corresponding pseudocode for the programming application has been constructed. The tools applied in this development are based on binary tree ordering algorithms, optimization theory, and business management methods. Keywords: credit network, case portfolio, binary tree, priority queue, stack
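A priority queue backed by binary-tree ordering, as the abstract describes, can be sketched with Python's heapq (a binary heap); the class and field names below are illustrative, not taken from the paper's pseudocode.

```python
import heapq

class CasePortfolio:
    """Priority queue of credit cases; a lower score means higher priority."""

    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker: equal priorities pop in insertion order

    def push(self, priority, case_id):
        heapq.heappush(self._heap, (priority, self._counter, case_id))
        self._counter += 1

    def pop(self):
        """Remove and return the highest-priority (lowest-score) case."""
        _, _, case_id = heapq.heappop(self._heap)
        return case_id

    def __len__(self):
        return len(self._heap)
```

Push and pop are both O(log n), which is the property the binary tree ordering buys for case management operations.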
26781 Comparison Between a Droplet Digital PCR and Real Time PCR Method in Quantification of HBV DNA
Authors: Surangrat Srisurapanon, Chatchawal Wongjitrat, Navin Horthongkham, Ruengpung Sutthent
Abstract:
HBV infection is a potentially serious public health problem, and the ability to measure HBV DNA concentration is important and continuously being improved. In quantitative polymerase chain reaction (qPCR), several factors are not standardized: the source of material, the calibration standard curve, and the PCR efficiency are inconsistent. Digital PCR (dPCR) is an alternative PCR-based technique for absolute quantification that uses Poisson statistics without requiring a standard curve. The aim of this study is therefore to compare the HBV DNA data sets generated by the dPCR and qPCR methods. All samples were quantified by Abbott's real-time PCR, and 54 samples with 2-6 log10 HBV DNA were selected for comparison with dPCR. Of these 54 samples, two outliers were defined as negative by dPCR, whereas the remaining 52 samples were positive by both tests. The difference between the two assays was less than 0.25 log IU/mL in 24/52 (46%) of the paired samples, less than 0.5 log IU/mL in 46/52 (88%), and less than 1 log in 50/52 (96%). The correlation coefficient was r = 0.788 with P-value < 0.0001. Compared to qPCR, the data generated by dPCR tended to be overestimated in samples with low HBV DNA concentrations and underestimated in samples with high viral loads. The variation in the dPCR measurement might be due to pre-amplification bias in the template. Moreover, a minor drawback of dPCR is the large quantity of DNA required compared to qPCR. Since the technology is relatively new, the limitations of this assay will be improved. Keywords: hepatitis B virus, real time PCR, digital PCR, DNA quantification
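The Poisson statistics behind dPCR absolute quantification reduce to a one-line correction: the mean number of copies per partition is the negative log of the fraction of negative partitions. The sketch below uses illustrative partition counts and an assumed partition volume, not values from the study.

```python
import math

def copies_per_microliter(positive_partitions, total_partitions, partition_volume_ul):
    """Absolute dPCR quantification via Poisson statistics: the mean copies
    per partition is -ln(fraction of negative partitions), so no standard
    curve is needed."""
    negative_fraction = 1.0 - positive_partitions / total_partitions
    lam = -math.log(negative_fraction)  # mean copies per partition
    return lam / partition_volume_ul

# Hypothetical run: half of 20000 partitions positive, 1 uL partitions
concentration = copies_per_microliter(10000, 20000, 1.0)  # equals ln(2)
```

The correction matters because, at higher concentrations, many partitions hold more than one copy; simply counting positives would underestimate the load.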
26780 Data-Driven Simulation Tools for DER and Battery Rich Power Grids
Authors: Ali Moradiamani, Samaneh Sadat Sajjadi, Mahdi Jalili
Abstract:
Power system analysis has long been a major research topic at the generation and transmission levels, in both industry and academia. Load flow and fault analysis scenarios are normally performed to study the performance of different parts of the grid in the context of, for example, voltage and frequency control. Software tools, such as PSCAD, PSSE, and PowerFactory DIgSILENT, have been developed to perform these analyses accurately. The distribution grid had been the passive part of the grid, known as the grid of consumers. However, a significant paradigm shift has happened with the emergence of Distributed Energy Resources (DERs) at the distribution level. This means that the concept of power system analysis needs to be extended to the distribution grid, especially considering self-sufficient technologies such as microgrids. Compared to the generation and transmission levels, the distribution level includes significantly more generation/consumption nodes, thanks to rooftop PV solar generation and battery energy storage systems. In addition, diverse consumption profiles are expected from household residents, resulting in a diverse set of scenarios. The emergence of electric vehicles will make the environment even more complicated, considering their charging (and possibly discharging) requirements. These complexities, as well as the large size of distribution grids, create challenges for the available power system analysis software. In this paper, we study the requirements of simulation tools for the distribution grid and how data-driven algorithms are required to increase the accuracy of the simulation results. Keywords: smart grids, distributed energy resources, electric vehicles, battery storage systems, simulation tools
Procedia PDF Downloads 10326779 Requirements to Establish a Taxi Sharing System in an Urban Area
Authors: Morteza Ahmadpur, Ilgin Gokasar, Saman Ghaffarian
Abstract:
That the transportation system plays an important role in the management of societies is an undeniable fact, and it is one of the most challenging issues in people's routine lives. As the population in urban areas increases, the demand for transportation modes also increases. Accordingly, a more flexible and dynamic transportation system is required to satisfy people's requirements. Nowadays, there is a significant increase in the number of environmental issues all over the world because of human activities. New technological achievements bring new horizons for humans, changing their lifestyle in every aspect of life, and transportation is no exception. By using new technology, societies can modernize their transportation systems and increase their feasibility. Real-time taxi sharing is one of the most novel and modern such systems in the world. Establishing this kind of system in an urban area requires the most advanced technologies in a transportation system: GPS navigation devices, computers, and social networks are just some parts of such a system. Like carpooling, real-time taxi sharing is one of the best ways to better utilize the empty seats in most cars and taxis, thus decreasing energy consumption and transport costs. It can serve areas not covered by a public transit system and act as a transit feeder service. Taxi sharing is also capable of serving one-time trips, not only recurrent commute trips or scheduled trips. In this study, we describe the requirements and parameters needed to establish a useful real-time ride sharing system for an urban area. The parameters and requirements of this study can be used in any urban area.Keywords: transportation, intelligent transportation systems, ride-sharing, taxi sharing
Procedia PDF Downloads 42726778 Assessment of Climate Change Impact on Meteorological Droughts
Authors: Alireza Nikbakht Shahbazi
Abstract:
Drought is one of the various factors affected by climate change, so efficient methods for estimating climate change impacts on drought should be investigated. The aim of this paper is to investigate climate change impacts on drought in the Karoon3 watershed, located in south-western Iran, in future periods. Atmospheric general circulation model (GCM) data under Intergovernmental Panel on Climate Change (IPCC) scenarios were used for this purpose. In this study, watershed drought under climate change impacts is simulated for future periods (2011 to 2099). The standardized precipitation index (SPI) was selected as the drought index and calculated from mean monthly precipitation data in the Karoon3 watershed over 6-, 12-, and 24-month periods. Statistical analysis of daily precipitation and minimum and maximum daily temperature was performed. LARS-WG5 was used to determine the feasibility of producing meteorological data for future periods. Model calibration and verification were performed for the base period (1980-2007). Meteorological data were simulated for future periods under the GCMs and IPCC climate change scenarios, and the drought status was then analyzed using SPI under climate change effects. Results showed that the differences between monthly maximum and minimum temperature will decrease under climate change, and spring precipitation will increase while summer and autumn rainfall will decrease. Precipitation occurs mainly between January and May in future periods, and the decline in summer and autumn precipitation leads to short-term drought in the study region. Normal and wet SPI categories are more frequent under the B1 and A2 emission scenarios than under A1B.Keywords: climate change impact, drought severity, drought frequency, Karoon3 watershed
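The SPI computation described above (aggregate precipitation over a window, fit a distribution, map quantiles to a standard normal) can be sketched as follows; the gamma fit and the 6-month default are the conventional choices, assumed here rather than taken from the paper:

```python
import numpy as np
from scipy import stats

def spi(monthly_precip, scale=6):
    """Standardized Precipitation Index at a given aggregation scale.

    1. Sum precipitation over rolling windows of `scale` months.
    2. Fit a gamma distribution to the nonzero totals.
    3. Map each total through the fitted CDF to a standard normal quantile.
    """
    totals = np.convolve(monthly_precip, np.ones(scale), mode="valid")
    shape, _, beta = stats.gamma.fit(totals[totals > 0], floc=0)
    q = np.mean(totals == 0)                       # probability of a zero total
    cdf = q + (1 - q) * stats.gamma.cdf(totals, shape, loc=0, scale=beta)
    return stats.norm.ppf(cdf)                     # SPI values, roughly N(0, 1)
```

Under the usual SPI classification, values below -1 indicate moderate drought and values below -2 extreme drought, which is how drought frequency and severity can be tabulated per scenario.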
Procedia PDF Downloads 24026777 Estimating Water Balance at Beterou Watershed, Benin Using Soil and Water Assessment Tool (SWAT) Model
Authors: Ella Sèdé Maforikan
Abstract:
Sustained water management requires quantitative information and knowledge of the spatiotemporal dynamics of the hydrological system within the basin, which can be achieved through research. Several studies have investigated both surface water and groundwater in the Beterou catchment; however, there are few published papers on the application of SWAT modeling there. The objective of this study was to evaluate the performance of SWAT in simulating the water balance within the watershed. The input data consist of a digital elevation model, land use maps, a soil map, climatic data, and discharge records. The model was calibrated and validated using the Sequential Uncertainty Fitting (SUFI2) approach. Calibration covered 1989 to 2006, with a four-year warm-up period (1985-1988); validation covered 2007 to 2020. The goodness of fit of the model was assessed using five indices: Nash-Sutcliffe efficiency (NSE), the ratio of the root mean square error to the standard deviation of measured data (RSR), percent bias (PBIAS), the coefficient of determination (R²), and Kling-Gupta efficiency (KGE). Results showed that the SWAT model successfully simulated river flow in the Beterou catchment, with NSE = 0.79, R² = 0.80, and KGE = 0.83 for calibration against NSE = 0.78, R² = 0.78, and KGE = 0.85 for validation, using site-based streamflow data. The relative error (PBIAS) ranges from -12.2% to 3.1%. The runoff curve number (CN2), moist bulk density (SOL_BD), baseflow alpha factor (ALPHA_BF), and available water capacity of the soil layer (SOL_AWC) were the most sensitive parameters. The study provides a basis for further research with uncertainty analysis, recommendations for model improvement, and an efficient means to improve rainfall and discharge measurement data.Keywords: watershed, water balance, SWAT modeling, Beterou
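The headline indices are simple functions of the observed and simulated flow series. A minimal sketch of their standard textbook definitions (not code from the study):

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean of obs."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def rsr(obs, sim):
    """RMSE normalized by the standard deviation of the observations."""
    return np.sqrt(np.mean((obs - sim) ** 2)) / obs.std()

def pbias(obs, sim):
    """Percent bias; positive values mean the model underestimates on average."""
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

def kge(obs, sim):
    """Kling-Gupta efficiency from correlation r, variability ratio alpha, bias ratio beta."""
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()
    beta = sim.mean() / obs.mean()
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)
```

KGE decomposes the fit into correlation, variability, and bias terms, which is why it can disagree with NSE on the same simulation.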
Procedia PDF Downloads 5526776 The Role of Microfinance in Economic Development
Authors: Babak Salekmahdy
Abstract:
Microfinance is often seen as a means of repairing credit markets and unleashing the potential contribution of impoverished people who rely on self-employment. Since the 1990s, the microfinance industry has expanded rapidly, opening the path for additional kinds of social entrepreneurship and social investment. However, current data indicate relatively modest average effects on consumers, prompting pushback against microfinance. This research reconsiders the claims made for microfinance, stressing the variety of evidence on impacts and the essential (but limited) role of repayments. The report concludes by describing a shift in thinking: from microfinance as narrowly defined enterprise finance to microfinance as more broadly defined household finance. Under this perspective, microfinance provides advantages by supplying liquidity for various requirements rather than only by increasing income.Keywords: microfinance, small business, economic development, credit markets
Procedia PDF Downloads 8226775 Developing API Economy: Associating Value to APIs and Microservices in an Enterprise
Authors: Mujahid Sultan
Abstract:
The IT industry has seen many transformations in Software Development Life Cycle (SDLC) methodologies and development approaches. SDLCs range from waterfall to agile, and development approaches from monoliths to microservices. The management, orchestration, and monetization of microservices have created an API economy in the modern enterprise. There are two approaches to API design: code-first and design-first. Design-first is gaining popularity in the industry, as it allows capturing API needs from the stakeholders, rather than the development teams guesstimating those needs, and associating a monetary value with the APIs and microservices. In this publication, we describe an approach to eliciting and organizing stakeholder needs and requirements for designing microservices and APIs.Keywords: requirements engineering, enterprise architecture, APIs, microservices, DevOps, continuous delivery, continuous integration, stakeholder viewpoints
Procedia PDF Downloads 19126774 Searchable Encryption in Cloud Storage
Authors: Ren Junn Hwang, Chung-Chien Lu, Jain-Shing Wu
Abstract:
Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, exactly retrieving a target file from among the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multi-keyword searches over encrypted cloud data by applying k-nearest neighbor technology. The protocol ranks the relevance scores of encrypted files and keywords, and prevents cloud servers from learning the search keywords submitted by a cloud user. To reduce file transfer communication costs, the cloud server returns encrypted files in order of relevance. Moreover, when a cloud user inputs a misspelled keyword and the number of wrong letters does not exceed a given threshold, the user can still retrieve the target files from the cloud server. In addition, the proposed scheme satisfies security requirements for outsourced data storage.Keywords: fault-tolerance search, multi-keywords search, outsource storage, ranked search, searchable encryption
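The relevance-ranking idea can be illustrated in the clear; in the actual scheme, secure k-NN transformations hide both the file vectors and the query so that the server computes these inner products without ever seeing the keywords. All names and keyword sets below are illustrative:

```python
import numpy as np

VOCAB = ["cloud", "storage", "encryption", "search"]   # illustrative keyword universe

def build_index(files):
    """One binary keyword-presence vector per file; in the real scheme
    these vectors would be encrypted before being handed to the server."""
    return {name: np.array([kw in kws for kw in VOCAB], dtype=float)
            for name, kws in files.items()}

def ranked_search(index, query_keywords, top_k=2):
    """Score each file by its inner product with the query vector
    (the number of matched keywords) and return the top_k file names."""
    q = np.array([kw in query_keywords for kw in VOCAB], dtype=float)
    scores = {name: float(vec @ q) for name, vec in index.items()}
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

index = build_index({"a.txt": {"cloud", "storage"},
                     "b.txt": {"cloud", "storage", "encryption"},
                     "c.txt": {"search"}})
top = ranked_search(index, {"cloud", "encryption"})   # ['b.txt', 'a.txt']
```

Returning only the top-k matches is what keeps the file transfer communication cost low, as the abstract notes.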
Procedia PDF Downloads 38326773 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company
Authors: Rahma Al Balushi
Abstract:
Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving that vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOP). Hence, a documented SOP for the geographical information system data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed as such by the Asset Information & GIS department. This paper describes in detail the GIS data submission process and the journey to develop the current process. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and updated data. Furthermore, due to this process, GIS has been, and is ready to be, integrated with other systems, as well as being the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made upon GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data systems update, data release/reporting, and data alterations also helped reduce the missing attributes of GIS transmission data. A considerable difference in geodatabase (GDB) completeness percentage was observed between 2017 and 2021. Overall, the conclusion is that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network.
This control extends to other applications and systems integrated with/related to GIS systems.Keywords: asset management ISO55001, standard procedures process, governance, geodatabase, NOC, CMMS
Procedia PDF Downloads 20726772 Development of GIS-Based Geotechnical Guidance Maps for Prediction of Soil Bearing Capacity
Authors: Q. Toufeeq, R. Kauser, U. R. Jamil, N. Sohaib
Abstract:
Foundation design of a structure requires soil investigation to avoid failures due to settlement, but such investigation is expensive and time-consuming. The development of new residential societies involves extensive leveling of large sites, accompanied by heavy land filling. Poor landfilling practices at deep depths cause differential settlement and consolidation of the underlying soil, which sometimes results in the collapse of structures. The extent of filling remains unknown to the individual developer unless a soil investigation is carried out, and soil investigation cannot be performed on every available site due to the costs involved. However, a fair estimate of bearing capacity can be made if such tests have already been done in the surrounding areas, and geotechnical guidance maps can provide a fair assessment of soil properties. Previously, GIS-based approaches using extrapolation and interpolation techniques have been applied to map bearing capacities, underground recharge, soil classification, geological hazards, landslide hazards, socio-economic factors, and soil liquefaction. Standard penetration test (SPT) data from surrounding sites were already available. Google Earth was used for digitization of the collected data, and a few points were reserved for data calibration and validation. The resultant geographic information system (GIS)-based guidance maps help anticipate bearing capacity in the real estate industry.Keywords: bearing capacity, soil classification, geographical information system, inverse distance weighted, radial basis function
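Of the interpolation techniques named in the keywords, inverse distance weighting is the simplest. A minimal sketch of estimating a soil property (e.g. an SPT-derived bearing capacity) at unsampled locations, with all coordinates and values illustrative:

```python
import numpy as np

def idw(sample_xy, sample_vals, query_xy, power=2.0):
    """Inverse distance weighted interpolation: each query point receives a
    weighted average of the sampled values, with weights 1 / distance**power."""
    d = np.linalg.norm(query_xy[:, None, :] - sample_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)              # guard against division by zero
    w = 1.0 / d ** power
    return (w @ sample_vals) / w.sum(axis=1)

# Two borehole locations with illustrative bearing-capacity values (kPa).
xy = np.array([[0.0, 0.0], [1.0, 0.0]])
vals = np.array([100.0, 200.0])
est = idw(xy, vals, np.array([[0.5, 0.0]]))   # midpoint: equal weights, 150.0
```

Radial basis functions, the other keyword, replace the 1/d**power weights with a smooth kernel but follow the same weighted-average pattern.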
Procedia PDF Downloads 13526771 Investigation of Information Security Incident Management Based on International Standard ISO/IEC 27002 in Educational Hospitals in 2014
Authors: Nahid Tavakoli, Asghar Ehteshami, Akbar Hassanzadeh, Fatemeh Amini
Abstract:
Introduction: The information security incident management guidelines were developed to help hospitals meet their information security event and incident management requirements. The purpose of this study was to investigate information security incident management in Isfahan's educational hospitals in accordance with the ISO/IEC 27002 standard. Methods: This was a cross-sectional study of the information security incident management of educational hospitals in 2014. Based on the ISO/IEC 27002 standard, two checklists were applied to check compliance with the standard on reporting information security events and weaknesses and on management of information security incidents and improvements. One inspector was trained to carry out the assessments in the hospitals. The data were analyzed with SPSS. Findings: Overall, the compliance score for information security incident management requirements in the two areas, reporting information security events and weaknesses and management of information security incidents and improvements, was 60%. There was a significant difference in compliance levels among the hospitals (p-value
26770 KBASE Technological Framework - Requirements
Authors: Ivan Stanev, Maria Koleva
Abstract:
Automated software development issues are addressed in this paper. The layers and packages of a Common Platform for Automated Programming (CPAP) are defined based on Service Oriented Architecture, cloud computing, Knowledge Based Automated Software Engineering (KBASE), and the method of automated programming. The tools of seven leading companies (AWS from Amazon, Azure from Microsoft, App Engine from Google, vCloud from VMware, Bluemix from IBM, Helion from HP, and OCPaaS from Oracle) are analyzed in the context of CPAP. Based on the results of the analysis, CPAP requirements are formulated.Keywords: automated programming, cloud computing, knowledge based software engineering, service oriented architecture
Procedia PDF Downloads 30126769 The Analysis of Regulation on Sustainability in the Financial Sector in Lithuania
Authors: Dalia Kubiliūtė
Abstract:
Lithuania is known as a trusted location for global business institutions and attracts investors with its competitive environment for financial service providers. Along with the aspiration to offer a strong, results-oriented, and innovation-driven environment for financial service providers, Lithuanian regulatory authorities consistently implement the European Union's high regulatory standards for financial activities, including sustainability-related disclosures. Since the European Union directed its policy toward the transition to a climate-neutral, green, competitive, and inclusive economy, additional regulatory requirements for financial market participants have been adopted: disclosure of sustainable activities, transparency, prevention of greenwashing, etc. The financial sector is one of the key factors influencing the implementation of sustainability objectives in European Union policies and mitigating the negative effects of climate change: public funds are not enough to make a significant impact on sustainable investments, so directing public and private capital to green projects may help finance the necessary changes. The topic of the study is original and has not yet been widely analyzed in Lithuanian legal discourse. Quantitative and qualitative methodologies and logical, systematic, and critical analysis principles are used; the aim of this study is thus to reveal the problems of implementing the regulation on sustainability in the Lithuanian financial sector. The additional regulatory requirements could cause serious changes in financial business operations: additional funds, employees, and time have to be dedicated for companies to implement these regulations. A lack of knowledge and data on how to implement the new requirements for sustainability reporting causes considerable uncertainty for financial market participants, and for some companies it might even become an essential question of business continuity.
It is considered that the supervisory authorities should find a balance between financial market needs and legal regulation.Keywords: financial, legal, regulatory, sustainability
Procedia PDF Downloads 10226768 GIS Data Governance: GIS Data Submission Process for Build-in Project, Replacement Project at Oman Electricity Transmission Company
Authors: Rahma Saleh Hussein Al Balushi
Abstract:
Oman Electricity Transmission Company's (OETC) vision is to be a renowned world-class transmission grid by 2025, and one of the indications of achieving that vision is obtaining Asset Management ISO 55001 certification, which requires setting out documented Standard Operating Procedures (SOP). Hence, a documented SOP for the geographical information system data process has been established. Also, to effectively manage and improve OETC power transmission, asset data and information need to be governed as such by the Asset Information & GIS department. This paper describes in detail the current GIS data submission process and the journey of developing it. The methodology used to develop the process is based on three main pillars: system and end-user requirements, risk evaluation, and data availability and accuracy. The output of this paper shows the dramatic change in the process used, which subsequently results in more efficient, accurate, and updated data. Furthermore, due to this process, GIS has been, and is ready to be, integrated with other systems, as well as being the source of data for all OETC users. Some decisions related to issuing No Objection Certificates (NOC) for excavation permits and scheduling asset maintenance plans in the Computerized Maintenance Management System (CMMS) have consequently been made upon GIS data availability. On the other hand, defining agreed and documented procedures for data collection, data systems update, data release/reporting, and data alterations has also contributed to reducing missing attributes and enhancing the data quality index of GIS transmission data. A considerable difference in geodatabase (GDB) completeness percentage was observed between 2017 and 2022. Overall, the conclusion is that through governance, the Asset Information & GIS department can control the GIS data process and collect, properly record, and manage asset data and information within the OETC network.
This control extends to other applications and systems integrated with/related to GIS systems.Keywords: asset management ISO55001, standard procedures process, governance, CMMS
Procedia PDF Downloads 12526767 Lean Environmental Management Integration System (LEMIS) Framework Development
Authors: A. P. Puvanasvaran, Suresh A. L. Vasu, N. Norazlin
Abstract:
The Lean Environmental Management Integration System (LEMIS) framework integrates the core elements of lean with ISO 14001. Curiosity about the relationship between continuous improvement and the sustainability of lean implementation motivated this study toward LEMIS. The characteristics of the ISO 14001 standard clauses and the core elements of lean principles are explored from past studies and literature reviews. A survey was carried out on ISO 14001 certified companies to examine continual improvement under the ISO 14001 standard. The study found a significant and positive relationship between the lean principles of value, value stream, flow, pull, and perfection and the ISO 14001 requirements. LEMIS is significant in supporting continuous improvement and sustainability. The integration system can be implemented in any manufacturing company. It raises awareness of why organizations need to sustain their environmental management system, while the lean principles can be adapted to streamline the company's daily activities. Throughout the study, it was proven that there is no sacrifice or trade-off between lean principles and ISO 14001 requirements. The framework developed in this study can be further simplified in the future, especially the method of crossing each sub-requirement of the ISO 14001 standard with the core elements of lean principles.Keywords: LEMIS, ISO 14001, integration, framework
Procedia PDF Downloads 406