Search results for: claim verification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 859

739 Transient Simulation Using SPACE for ATLAS Facility to Investigate the Effect of Heat Loss on Major Parameters

Authors: Suhib A. Abu-Seini, Kyung-Doo Kim

Abstract:

A heat loss model for the ATLAS facility was introduced using SPACE code predefined correlations and various dialing factors. All previous simulations were carried out with a heat-loss-free input: the facility was considered completely insulated, and the core power was reduced by the experimentally measured heat loss to compensate for it. This study instead considers heat loss throughout the simulation. The new heat loss model affects the SPACE code simulation, since heat leaking out of the system during a transient alters many parameters related to temperature and temperature difference. To demonstrate this, a Station Blackout followed by a multiple Steam Generator Tube Rupture accident is simulated using both the insulated-system approach and the newly introduced steady-state heat loss input. Major parameters such as system temperatures, pressures, and flow rates are compared, and analyses are proposed on that basis, since the experimental values will not serve as the reference to validate the expected outcome. This study not only shows the significance of heat loss consideration in the prevention and mitigation of various incidents, both design-basis and beyond-design-basis accidents, by giving a detailed account of the behavior of the ATLAS facility during both steady state and a major transient; it also presents a verification of how credible the acquired ATLAS data are, since the steady-state heat loss values were already mismatched between the SPACE simulation results and the ATLAS data acquisition system. Acknowledgement: This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy (MOTIE) of the Republic of Korea.

Keywords: ATLAS, heat loss, simulation, SPACE, station blackout, steam generator tube rupture, verification

Procedia PDF Downloads 203
738 A Microwave Heating Model for Endothermic Reaction in the Cement Industry

Authors: Sofia N. Gonçalves, Duarte M. S. Albuquerque, José C. F. Pereira

Abstract:

Microwave technology has been gaining importance in contributing to decarbonization processes in high-energy-demand industries. Despite the several numerical models presented in the literature, a proper Verification and Validation exercise is still lacking. This is important and required to evaluate the accuracy and adequacy of the physical process model. Another issue concerns impedance matching, an important mechanism used in microwave experiments to increase electromagnetic efficiency. Such a mechanism is not available in current computational tools, thus requiring an external numerical procedure. A numerical model was implemented to study the continuous processing of limestone with microwave heating. This process requires the material to be heated to a temperature that prompts a highly endothermic reaction. Both a 2D and a 3D model were built in COMSOL Multiphysics to solve the two-way coupling between the Maxwell and energy equations, along with the coupling between both heat transfer phenomena and the limestone endothermic reaction. The 2D model was used to study and evaluate the required numerical procedure, also serving as a benchmark test that allows other authors to implement impedance matching procedures. To achieve this goal, a controller built in MATLAB was used to continuously match the cavity impedance and predict the energy required by the system, thus successfully avoiding energy inefficiencies. The 3D model reproduces realistic results and therefore supports the main conclusions of this work. Limestone was modeled as a continuous flow under the transport of concentrated species, whose material and kinetic properties were taken from the literature. Verification and Validation of the coupled model was performed separately from that of the chemical kinetic model. The chemical kinetic model was found to correctly describe the chosen kinetic equation by comparing numerical results with experimental data.
A solution verification was made for the electromagnetic interface, where second-order and fourth-order accurate schemes were found for linear and quadratic elements, respectively, with numerical uncertainty lower than 0.03%. Regarding the coupled model, it was demonstrated that the numerical error diverges for the heat transfer interface with the mapped mesh. Results showed numerical stability for the triangular mesh, and the numerical uncertainty was less than 0.1%. This study evaluated the influence of limestone velocity, heat transfer, and load on thermal decomposition and overall process efficiency. The velocity and heat transfer coefficient were studied with the 2D model, while different loads of material were studied with the 3D model. Both models proved to be highly unstable when solving non-linear temperature distributions. High-velocity flows exhibited a propensity for thermal runaways, and the thermal efficiency tended to stabilize for the higher velocities and higher filling ratios. Microwave efficiency showed an optimal velocity for each heat transfer coefficient, pointing out that electromagnetic efficiency is a consequence of energy distribution uniformity. The 3D results indicated inefficient development of the electric field for low filling ratios. Thermal efficiencies higher than 90% were found for the higher loads, and microwave efficiencies of up to 75% were accomplished. The 80% filling ratio was shown to be the optimal load, with an associated global efficiency of 70%.

Keywords: multiphysics modeling, microwave heating, verification and validation, endothermic reactions modeling, impedance matching, limestone continuous processing

Procedia PDF Downloads 116
737 Stability Design by Geometrical Nonlinear Analysis Using Equivalent Geometric Imperfections

Authors: S. Fominow, C. Dobert

Abstract:

The present article describes research dealing with the development of equivalent geometric imperfections for the stability design of steel members considering lateral-torsional buckling. The application of these equivalent imperfections takes into account the stiffness-reducing effects due to inelasticity and residual stresses, which reduce the load carrying capacity of slender members and structures. This allows the application of a simplified design method performed in three steps: application of equivalent geometric imperfections, determination of internal forces using geometrically non-linear analysis (GNIA), and verification of the cross-section resistance at the most unfavourable location. All three verification steps are closely related and influence the results. The derivation of the equivalent imperfections was carried out in several steps. First, reference lateral-torsional buckling resistances for various rolled I-sections, slenderness grades, load shapes, and steel grades were determined. This was done either with geometrically and materially non-linear analysis with geometrical imperfections and residual stresses (GMNIA) or, for standard cases, with the equivalent member method. With the aim of obtaining lateral-torsional buckling resistances identical to the reference resistances from the application of the design method, the required sizes of the equivalent imperfections were derived. For this purpose, a program based on the finite element method (FEM) has been developed. Based on these results, several proposals for the specification of equivalent geometric imperfections have been developed. These differ in the shape of the applied equivalent geometric imperfection, the model of the cross-sectional resistance, and the steel grade. The proposed design methods allow a wide range of applications and a reliable calculation of the lateral-torsional buckling resistances, as comparisons between the calculated resistances and the reference resistances have shown.

Keywords: equivalent geometric imperfections, GMNIA, lateral-torsional buckling, non-linear finite element analysis

Procedia PDF Downloads 131
736 Simulation and Performance Evaluation of Transmission Lines with Shield Wire Segmentation against Atmospheric Discharges Using ATPDraw

Authors: Marcio S. da Silva, Jose Mauricio de B. Bezerra, Antonio E. de A. Nogueira

Abstract:

This paper presents a performance analysis of transmission line shield wires against atmospheric discharges when the option of sectioning the shield wire is taken, and verifies the tolerability of the change. The goal of this work was to build a complete model of a transmission line in the ATPDraw program, with the shield wire grounded at every tower and at only some towers. The methodology used for the proposed evaluation was to choose an actual transmission line to serve as a case study. After the transmission line was chosen and its full topology and materials verified, a complete model of the line was built in the ATPDraw software. Several atmospheric discharges were then simulated by striking the grounded shield wires at each tower. These simulations served to identify the behavior of the existing line under atmospheric discharges. After this first analysis, the same line was reconsidered with shield wire segmentation. The shield wire segmentation technique aims to reduce induced losses in shield wires and is adopted on some transmission lines in Brazil. Under the same atmospheric discharge conditions, the transmission line, this time with shield wire segmentation, was evaluated again. The results obtained showed that similar performance against atmospheric discharges can be achieved between a line with the shield wire grounded at multiple towers and the same line with shield wire segmentation, provided some precautions are adopted: verification of the ground resistance of the segmented shield wire, adequacy of the maximum length of the segmented gap, and evaluation of the separation length of the electrodes of the insulator spark gap, among others. In conclusion, provided the correct assessment is made and the correct adjustment criteria are adopted, a transmission line with shield wire segmentation can perform very similarly to the traditional design with multiple grounds. This solution contributes in a very important way to the reduction of energy losses in transmission lines.

Keywords: atmospheric discharges, ATPDraw, shield wire, transmission lines

Procedia PDF Downloads 144
735 Verification of the Supercavitation Phenomena: Investigation of the Cavity Parameters and Drag Coefficients for Different Types of Cavitator

Authors: Sezer Kefeli, Sertaç Arslan

Abstract:

Supercavitation is a pressure-dependent process that provides an opportunity to eliminate wetted-surface effects on an underwater vehicle, owing to the differences in viscosity and velocity between the liquid (freestream) and gas phases. Cavitation occurs due to a rapid pressure drop or a temperature rise in the liquid phase. In this paper, pressure-based cavitation is investigated, as it is the type generally encountered underwater. These vapor-filled, pressure-based cavities are unstable and harmful to any underwater vehicle, because the cavities (bubbles or voids) produce intense shock waves as they collapse. Supercavitation, on the other hand, is a desired and stabilized phenomenon, unlike general pressure-based cavitation. Supercavitation offers the prospect of minimizing form drag, which has revived interest in supercavitating vehicles. When the proper circumstances are set up, by either increasing the operating speed of the underwater vehicle or decreasing the pressure difference between the free stream and the artificial cavity pressure, continuous supercavitation is obtainable. There are two ways to obtain stable and continuous supercavitation, called natural and artificial supercavitation. To generate natural supercavitation, various mechanical structures, called cavitators, have been devised. In the literature, many cavitator types have been studied since the 1900s, either experimentally or numerically on CFD platforms, with the intent of observing natural supercavitation. In this paper, experimental results are first compiled, and trend lines are generated for the supercavitation parameters in terms of the cavitation number (σ), the form drag coefficient (C_D), the dimensionless cavity diameter (d_m/d_c), and the cavity length (L_c/d_c). Natural cavitation verification studies are then carried out for disk- and cone-shaped cavitators.
In addition, the supercavitation parameters are numerically analyzed at different operating conditions, and the CFD results are fitted to the trend lines of the experimental results. The aims of this paper are to derive one generally accepted drag coefficient equation for disk and cone cavitators at different cavitator half angles and to investigate the supercavitation parameters with respect to the cavitation number. Moreover, 165 CFD analyses are performed at different cavitation numbers in FLUENT version 21R2. Five cavitator types are modeled in SCDM with respect to the cavitator half angle. A CFD database is then built from the numerical results, and new trend lines are generated for the supercavitation parameters. These trend lines are compared with the experimental results. Finally, the generally accepted drag coefficient equation and the equations of the supercavitation parameters are derived.
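The kind of trend-line fit described above can be sketched briefly. A classical linear relation for disk cavitators, C_D(σ) = C_D0 (1 + σ) with C_D0 ≈ 0.82, is often quoted in the supercavitation literature; the data points below are synthetic stand-ins, not the paper's measurements.

```python
import numpy as np

# Hedged sketch: fit the single coefficient C_D0 in C_D = C_D0 * (1 + sigma)
# by least squares. The "measurements" are synthetic for illustration.
sigma = np.array([0.05, 0.10, 0.15, 0.20, 0.25])  # cavitation numbers
cd = 0.82 * (1.0 + sigma)                         # synthetic drag data

x = 1.0 + sigma
cd0 = float(np.dot(x, cd) / np.dot(x, x))  # closed-form 1-parameter fit
print(round(cd0, 2))  # → 0.82 on this synthetic data
```

With real experimental or CFD data the same one-line fit yields the trend-line coefficient, and the residuals indicate how well the linear law holds across cavitator half angles.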

Keywords: cavity envelope, CFD, high speed underwater vehicles, supercavitation, supercavitating flows, supercavitation parameters, drag reduction, viscous force elimination, natural cavitation verification

Procedia PDF Downloads 108
734 VeriFy: A Solution to Implement Autonomy Safely and According to the Rules

Authors: Michael Naderhirn, Marco Pavone

Abstract:

Problem statement, motivation, and aim of work: So far, control algorithms have been developed by control engineers so that the controller fits a specification, verified by testing. When it comes to certifying an autonomous car in highly complex scenarios, the challenge is much greater, since such a controller must mathematically guarantee that it implements the rules of the road while also guaranteeing aspects like safety and real-time executability. What if it became possible to solve this demanding problem by combining formal verification and system theory? The aim of this work is to present a workflow that solves the above-mentioned problem. Summary of the presented results / main outcomes: We show the use of an English-like language to transform the rules of the road into a system specification for an autonomous car. The language-based specifications are used to define system functions and interfaces. Based on that, a formal model is developed that correctly models the specifications. On the other side, a mathematical model describing the system's dynamics is used to calculate the system's reachable set, which is further used to determine the system input boundaries. A motion planning algorithm is then applied inside the system boundaries, in combination with the formal specification model, to find an optimized trajectory while satisfying the specifications. The result is a control strategy that can be applied in real time, independent of the scenario, with a mathematical guarantee of satisfying a predefined specification. We demonstrate the applicability of the method in simulated driving scenarios and discuss a potential certification.
Originality, significance, and benefit: To the authors' best knowledge, this is the first time that an automated workflow has been shown which combines a specification in an English-like language and a mathematical model, in a formally verified way, to synthesize a controller for potential real-time applications such as autonomous driving.
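The reachable-set step above can be illustrated with a minimal sketch. This is not the authors' tool: one common, conservative way to bound a reachable set is interval propagation for a discrete-time linear system x[k+1] = A x[k] + B u[k] with bounded input; the matrices below are made-up examples.

```python
import numpy as np

# Illustrative interval reachability sketch (assumed dynamics, not the
# paper's vehicle model): a simple position/velocity system with |u| <= u_max.
A = np.array([[1.0, 0.1], [0.0, 0.9]])
B = np.array([[0.0], [0.1]])
u_max = 1.0

def propagate_box(lo, hi, steps):
    """Propagate an axis-aligned box through the dynamics, inflating by the
    worst-case input at every step (a conservative over-approximation)."""
    Ap, An = np.maximum(A, 0), np.minimum(A, 0)  # split A by sign
    for _ in range(steps):
        new_lo = Ap @ lo + An @ hi - np.abs(B).ravel() * u_max
        new_hi = Ap @ hi + An @ lo + np.abs(B).ravel() * u_max
        lo, hi = new_lo, new_hi
    return lo, hi

lo, hi = propagate_box(np.array([-0.1, -0.1]), np.array([0.1, 0.1]), 10)
print(np.all(lo <= hi))  # → True: the box remains a valid interval
```

Any state box that stays inside the road-rule constraints for all steps is certifiably safe under this over-approximation, which is the basic idea behind using reachability to bound admissible inputs.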

Keywords: formal system verification, reachability, real time controller, hybrid system

Procedia PDF Downloads 217
733 Process of Analysis, Evaluation and Verification of the 'Real' Redevelopment of the Public Open Space at the Neighborhood’s Stairs: Case Study of Serres, Greece

Authors: Ioanna Skoufali

Abstract:

The present study is directed towards adaptation to climate change, closely related to the phenomenon of the urban heat island (UHI). This issue is widespread and common to different urban realities, particularly in Mediterranean cities characterized by a dense urban fabric. The attention of this work on the redevelopment of open space is focused on mitigation techniques aiming to solve local problems, such as microclimatic parameters and summer thermal comfort conditions, related to urban morphology. This quantitative analysis, evaluation, and verification survey involves a methodological elaboration applied to a real case study in Serres, with the experimental support of the ENVI-met Pro V4.1 and BioMet software, developed: i) in two phases, concerning the ante-operam (phase a1, 2013) and the post-operam (phase a2, 2016); ii) in scenario A (+25% green, 2017). The first study tends to identify the main intervention strategies, namely the application of cool pavements, the increase of green surfaces, the creation of water surfaces, and external fans; moreover, it attains the minimum results required by the National Program 'Bioclimatic improvement project for public open space', EPPERAA (ESPA 2007-2013), related to four environmental parameters: TAir = 1.5 °C, TSurface = 6.5 °C, CDH = 30%, and PET = 20%. In addition, the second study indicates a greater potential for improvement than the post-operam intervention, obtained by increasing the vegetation within the district towards the SW/SE. The final objective of this in-depth design is to be transferable to homogeneous cases of urban regeneration processes, with evident effects on the efficiency of microclimatic mitigation and thermal comfort.

Keywords: cool pavements, microclimate parameters (TAir, Tsurface, Tmrt, CDH), mitigation strategies, outdoor thermal comfort (PET & UTCI)

Procedia PDF Downloads 171
732 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges

Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch

Abstract:

Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data make it possible to gain insights into the molecular differences between tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints specifically measuring a mechanism is relatively straightforward, the interpretation of big data is more complex and benefits from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions for verifying systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow methods to be benchmarked for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been conducted successfully and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest does not have a gold standard but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases, and services in biology and medicine.
We will present the results we obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods to allow for the consolidation of conclusions.
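The "aggregation beats individuals" finding mentioned above is the familiar ensemble effect, which can be sketched with synthetic data (this is an illustration only, not challenge data or the project's actual scoring method):

```python
import numpy as np

# Wisdom-of-the-crowd sketch: averaging several noisy predictors usually
# beats the typical individual predictor. All data here are synthetic.
rng = np.random.default_rng(0)
truth = rng.normal(size=200)                          # hidden ground truth
preds = truth + rng.normal(scale=1.0, size=(5, 200))  # 5 noisy "teams"

indiv_mse = ((preds - truth) ** 2).mean(axis=1)        # each team's error
agg_mse = ((preds.mean(axis=0) - truth) ** 2).mean()   # aggregated error

print(agg_mse < indiv_mse.mean())  # → True
```

Averaging n independent, unbiased predictors reduces the noise variance by a factor of about n, which is why aggregated challenge submissions tend to outperform most single submissions.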

Keywords: big data interpretation, datathon, systems toxicology, verification

Procedia PDF Downloads 258
731 Fusion of Finger Inner Knuckle Print and Hand Geometry Features to Enhance the Performance of Biometric Verification System

Authors: M. L. Anitha, K. A. Radhakrishna Rao

Abstract:

With the advent of modern computing technology, there is an increased demand for recognition systems capable of verifying the identity of individuals. Recognition systems are required by several civilian and commercial applications to provide access to secured resources. Traditional recognition systems, which are based on physical identities, are not sufficiently reliable to satisfy security requirements, owing to advances in forgery and identity impersonation methods. Recognizing individuals based on their unique physiological characteristics, known as biometric traits, is a reliable technique, since these traits are not transferable and cannot be stolen or lost. Since the performance of a biometric recognition system depends on the particular trait utilized, the present work proposes a fusion approach that combines the inner knuckle print (IKP) of the middle, ring, and index fingers with the geometrical features of the hand. The hand image, captured with a digital camera, is preprocessed to find the finger IKP as the region of interest (ROI) and the hand geometry features. Geometrical features are represented as distances between key points, and IKP features are extracted by applying a local binary pattern descriptor to the IKP ROI. Decision-level AND fusion was adopted, which improved the performance of the combined scheme. The proposed approach was tested on a database collected at our institute. The approach is of significance since both the hand geometry and IKP features can be extracted from the palm region of the hand. The fusion of these features yields a false acceptance rate of 0.75% and a false rejection rate of 0.86% in the verification tests conducted, which is lower than the results obtained using the individual traits.
The results confirm the usefulness of the proposed approach and the suitability of the selected features for developing a biometric recognition system based on features from the palmar region of the hand.
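The two ingredients named above, a local binary pattern descriptor and decision-level AND fusion, can be sketched as follows. This is a generic illustration, not the paper's pipeline: the thresholds, the tiny random ROI, and the distance measures are all assumptions.

```python
import numpy as np

def lbp_histogram(roi):
    """Basic 8-neighbour LBP codes of an ROI, as a normalized histogram."""
    c = roi[1:-1, 1:-1]  # interior pixels (the LBP centers)
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros(c.shape, dtype=np.int64)
    for bit, (dy, dx) in enumerate(shifts):
        nb = roi[1 + dy:roi.shape[0] - 1 + dy, 1 + dx:roi.shape[1] - 1 + dx]
        codes |= (nb >= c).astype(np.int64) << bit  # set bit if neighbour >= center
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def verify(claim_hist, probe_hist, geom_claim, geom_probe,
           lbp_thr=0.25, geom_thr=5.0):
    """Decision-level AND fusion: accept only if BOTH the IKP descriptor
    and the hand-geometry distances agree (thresholds are illustrative)."""
    lbp_ok = np.abs(claim_hist - probe_hist).sum() < lbp_thr    # L1 distance
    geom_ok = np.abs(geom_claim - geom_probe).max() < geom_thr  # key-point distances
    return bool(lbp_ok and geom_ok)

rng = np.random.default_rng(1)
roi = rng.integers(0, 256, (32, 32))
h = lbp_histogram(roi)
print(verify(h, h, np.array([40.0, 55.0]), np.array([41.0, 54.0])))  # → True
```

AND fusion trades a higher false rejection rate for a lower false acceptance rate, since an impostor must now defeat both modalities, which matches the direction of the improvement reported in the abstract.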

Keywords: biometrics, hand geometry features, inner knuckle print, recognition

Procedia PDF Downloads 193
730 Numerical Investigation on Anchored Sheet Pile Quay Wall with Separated Relieving Platform

Authors: Mahmoud Roushdy, Mohamed El Naggar, Ahmed Yehia Abdelaziz

Abstract:

Anchored sheet pile walls have been used worldwide as front quay walls for decades. With the increase in vessel drafts and weights, those sheet pile walls need to be upgraded by increasing the depth of the dredging line in front of the wall. A system has recently been used to increase the depth in front of the wall by installing a separated platform, supported on a deep foundation (a so-called relieving platform), behind the sheet pile wall. The platform is structurally separated from the front wall. This paper presents a numerical investigation, utilizing finite element analysis, of the behavior of separated relieving platforms installed within existing anchored sheet pile quay walls. The investigation was done in two steps: a verification step followed by a parametric study. In the verification step, the numerical model was verified against field measurements performed by others. The validated model was extended within the parametric study to a series of models with different backfill soils, separation gap widths, and numbers of pile rows supporting the platform. The results of the numerical investigation show that using stiff clay as backfill soil (neglecting consolidation) gives better performance for the front wall and the first pile row compared to sandy backfills. The degree of compaction of the sandy backfill slightly increases lateral deformations but reduces the bending moment acting on the pile rows, while the effect on the front wall is minor. In addition, an increase in the separation gap width gradually increases the bending moments on the front wall regardless of the backfill soil type, while the effect on the pile rows is reversed (a gradual decrease). Finally, the paper studies the possibility of reducing the number of pile rows along with the separation, to take advantage of the positive effect of the separation on the piles.

Keywords: anchored sheet pile, relieving platform, separation gap, upgrade quay wall

Procedia PDF Downloads 59
729 Freedom, Thought, and the Will: A Philosophical Reconstruction of Muhammad Iqbal’s Conception of Human Agency

Authors: Anwar ul Haq

Abstract:

Muhammad Iqbal was arguably the most significant South Asian Islamic philosopher of the last two centuries. While he is the most revered philosopher of the region, particularly in Pakistan, he is probably the least studied philosopher outside it. The paper offers a philosophical reconstruction of Iqbal's view of human agency in three sections. Section 1 focuses on Iqbal's starting point of reflection in practical philosophy (inspired by Kant): our consciousness of ourselves as free agents. The paper brings out Iqbal's continuity with Kant but also his divergence, in particular his non-Kantian view that we possess a non-sensory intuition of ourselves as free personal causes. It also offers an argument on Iqbal's behalf for this claim, which is meant as a defense against a Kantian objection to the possibility of an intuition of freedom and a skeptic's challenge to the possibility of freedom in general. The remaining part of the paper offers a reconstruction of Iqbal's two preconditions of the possibility of free agency. Section 2 discusses the first precondition, namely, the unity of consciousness involved in thought (this is a precondition of agency whether or not it is free). The unity has two aspects, a quantitative (or numerical) aspect and a qualitative (or rational) one. Section 2 offers a defense of these two aspects of the unity of consciousness presupposed by agency by focusing, with Iqbal, on the case of inference. Section 3 discusses a second precondition of the possibility of free agency: that thought and will must be identical in a free agent. Iqbal offers this condition in relief against Bergson's view. Bergson (on Iqbal's reading of him) argues that freedom of the will is possible only if the will's ends are entirely its own and are wholly undetermined by anything from without, not even by thought. Iqbal observes that Bergson's position ends in an insurmountable dualism of will and thought.
Bergson's view, Iqbal argues in particular, rests on an untenable conception of what an end consists in. An end, correctly understood, is framed by a thinking faculty, the intellect, and not by an extra-rational faculty. The present section outlines Iqbal's argument for this claim, which rests on the premise that ends possess a certain unity, one intrinsic to particular ends and holding different ends together, and this unity is none other than the quantitative and qualitative unity of a thinking consciousness in its practical application. Having secured the rational origin of ends, Iqbal argues that a free will must be identical with thought, or else it will be determined from without and will not be free on that account. Freedom of the self is not a freedom from thought but a freedom in thought: it involves the ability to live a thoughtful life.

Keywords: Iqbal, freedom, will, self

Procedia PDF Downloads 39
728 Modeling of Cf-252 and PuBe Neutron Sources by Monte Carlo Method in Order to Develop Innovative BNCT Therapy

Authors: Marta Błażkiewicz, Adam Konefał

Abstract:

Currently, boron-neutron therapy is carried out mainly with neutron beams generated in research nuclear reactors. This fact limits the possibility of realizing BNCT in centers distant from such reactors. Moreover, the number of active nuclear reactors in operation in the world is decreasing, due to the limited lifetime of their operation and the lack of new installations. Therefore, the possibilities of carrying out boron-neutron therapy based on a neutron beam from an experimental reactor are shrinking. The use of nuclear power reactors for BNCT purposes is impossible, however, because their infrastructure is not intended for radiotherapy. A serious challenge, therefore, is to find ways to perform boron-neutron therapy based on neutrons generated outside a research nuclear reactor. This work meets that challenge. Its goal is to develop a BNCT technique based on commonly available neutron sources such as Cf-252 and PuBe, which would enable the above-mentioned therapy in medical centers unrelated to nuclear research reactors. Advances in the field of neutron source fabrication make it possible to achieve strong neutron fluxes. The current stage of the research focuses on the development of virtual models of the above-mentioned sources using the Monte Carlo simulation method. In this study, the GEANT4 toolkit was used, including its High Precision Neutron model for simulating neutron-matter interactions. The models of the neutron sources were verified experimentally using the activation detector method with indium foils and the cadmium difference method, which allows the indium activation contributions from thermal and resonance neutrons to be separated. Due to the large number of factors affecting the result of the verification experiment, a 10% discrepancy between the simulation and experiment results was accepted.
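The cadmium difference step mentioned above rests on a simple subtraction: a cadmium cover absorbs thermal neutrons, so the activity of a Cd-covered indium foil reflects only the epithermal (resonance) contribution. A hedged sketch follows; the activity values and the correction factor are invented for illustration, not taken from the experiment.

```python
# Hedged sketch of the cadmium difference method. Numbers are illustrative.

def cadmium_difference(a_bare, a_cd, f_cd=1.0):
    """Split a measured saturation activity into thermal and epithermal
    (resonance) parts. f_cd corrects for epithermal absorption in the
    cadmium cover (close to 1 for a thin cover)."""
    a_epithermal = f_cd * a_cd      # what the Cd-covered foil still sees
    a_thermal = a_bare - a_epithermal  # thermal part removed by the cover
    return a_thermal, a_epithermal

a_th, a_epi = cadmium_difference(a_bare=1500.0, a_cd=400.0)
print(a_th, a_epi)  # → 1100.0 400.0
```

In practice the separated activities are compared with the GEANT4-predicted reaction rates for each neutron energy group, which is what makes the method useful for verifying the source models.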

Keywords: BNCT, virtual models, neutron sources, Monte Carlo, GEANT4, neutron activation detectors, gamma spectroscopy

Procedia PDF Downloads 160
727 Parallel Version of Reinhard’s Color Transfer Algorithm

Authors: Abhishek Bhardwaj, Manish Kumar Bajpai

Abstract:

An image, with its content and color schema, presents an effective mode of information sharing and processing. By changing its color schema, different visions and prospects are discovered by users. This phenomenon of color transfer is used by social media and other entertainment channels. Reinhard et al.'s algorithm was the first to solve this color transfer problem. In this paper, we make the algorithm efficient by introducing domain parallelism among different processors. We also comment on the factors that affect the speedup of this problem. Finally, by analyzing the experimental data, we propose a novel and efficient parallel version of Reinhard's algorithm.
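The statistics-matching step at the core of Reinhard et al.'s color transfer can be sketched briefly: shift and scale each channel of the source image so that its mean and standard deviation match the target's. The full algorithm works in the decorrelated lαβ color space; this per-channel version and the random stand-in images are simplifications for illustration.

```python
import numpy as np

# Simplified Reinhard-style color transfer: per-channel mean/std matching.
# (The original operates in the lab color space; random images stand in
# for real source and target photographs.)
def reinhard_transfer(source, target):
    src_mu, src_sd = source.mean(axis=(0, 1)), source.std(axis=(0, 1))
    tgt_mu, tgt_sd = target.mean(axis=(0, 1)), target.std(axis=(0, 1))
    return (source - src_mu) * (tgt_sd / src_sd) + tgt_mu

rng = np.random.default_rng(0)
src = rng.random((64, 64, 3))
tgt = rng.random((64, 64, 3)) * 0.5 + 0.25
out = reinhard_transfer(src, tgt)
print(np.allclose(out.mean(axis=(0, 1)), tgt.mean(axis=(0, 1))))  # → True
```

Because the per-pixel transform is independent once the global statistics are known, the image domain can be partitioned into tiles processed on different processors, which is the kind of domain parallelism the paper exploits.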

Keywords: Reinhard et al’s algorithm, color transferring, parallelism, speedup

Procedia PDF Downloads 583
726 Fragile States as the Fertile Ground for Non-State Actors: Colombia and Somalia

Authors: Giorgi Goguadze, Jakub Zajączkowski

Abstract:

This paper is written to overview the connection between fragile states and non-state actors; fragile states may range from weak through failing to failed. We discuss two countries, one of which is weak (Colombia) and the other already failed (Somalia). We try to understand what feeds malign non-state actors, such as terrorist organizations, criminal entities, and other cells, in these countries, what threats they represent, and how to eliminate these dangers in both the national and the international scope. This paper is mainly based on a literature overview and the authors' own perspective, and does not claim to be a contribution to the scientific literature.

Keywords: fragile states, terrorism, tribalism, Somalia

Procedia PDF Downloads 344
725 Review of Studies on Agility in Knowledge Management

Authors: Ferdi Sönmez, Başak Buluz

Abstract:

Agility in Knowledge Management (AKM) tries to capture agility requirements, and the respective answers to them, within the framework of knowledge and learning for organizations. Since it is a rather new construct, it is difficult to claim that it has been sufficiently discussed and analyzed in either practical or theoretical realms. Like the term 'agile learning', it is most commonly addressed in the software development and information technology fields and in related areas where those technologies can be applied. The organizational perspective on AKM seems to need more time to become scholarly mature. Nevertheless, implicit usages of the term occasionally appear in the literature. This research aims to explore the conceptual background of agility in KM, re-conceptualize it, and extend it to business applications, with a special focus on e-business.

Keywords: knowledge management, agility requirements, agility, knowledge

Procedia PDF Downloads 239
724 Comparison of Different Reanalysis Products for Predicting Extreme Precipitation in the Southern Coast of the Caspian Sea

Authors: Parvin Ghafarian, Mohammadreza Mohammadpur Panchah, Mehri Fallahi

Abstract:

Synoptic patterns from the surface up to the tropopause are very important for forecasting weather and atmospheric conditions, and many tools exist to prepare and analyze such maps. World forecasting centers use reanalysis data, outputs of numerical weather prediction models, satellite images, meteorological radar, and weather station data to predict the weather. Forecasting extreme precipitation on the southern coast of the Caspian Sea (CS) is a major issue due to the complex topography, and the area contains several climate types. In this research, we used two reanalysis datasets, the ECMWF Reanalysis 5th Generation (ERA5) and the National Centers for Environmental Prediction / National Center for Atmospheric Research (NCEP/NCAR) reanalysis, for verification of the numerical model. ERA5 is the latest ECMWF reanalysis; its temporal resolution is hourly, while NCEP/NCAR data are available every six hours. Atmospheric parameters such as mean sea level pressure, geopotential height, relative humidity, wind speed and direction, and sea surface temperature were selected and analyzed, and different precipitation types (rain and snow) were considered. The results showed that NCEP/NCAR better demonstrates the intensity of atmospheric systems, whereas ERA5 is suitable for extracting parameter values at specific points and for analyzing snowfall events over the CS (snow cover and snow depth). Sea surface temperature plays the main role in generating instability over the CS, especially when cold air passes over it; the NCEP/NCAR sea surface temperature product has low resolution near the coast. Both datasets were able to detect the meteorological synoptic patterns that led to heavy rainfall over the CS; however, due to their time lag they are not suitable for forecast centers, and their application lies in research and in the verification of meteorological models.
Finally, ERA5 has better resolution than the NCEP/NCAR reanalysis, but NCEP/NCAR data are available from 1948 and are appropriate for long-term research.

Keywords: synoptic patterns, heavy precipitation, reanalysis data, snow

Procedia PDF Downloads 93
723 Modeling and Analyzing the WAP Class 2 Wireless Transaction Protocol Using Event-B

Authors: Rajaa Filali, Mohamed Bouhdadi

Abstract:

This paper presents an incremental formal development of the Wireless Transaction Protocol (WTP) in Event-B. WTP is part of the Wireless Application Protocol (WAP) architecture and provides a reliable request-response service. To model and verify the protocol, we use the formal technique Event-B, which provides an accessible and rigorous development method. The interaction between modelling and proving reduces complexity and helps to eliminate misunderstandings, inconsistencies, and specification gaps. As a result, the verification of WTP allows us to find some deficiencies in the current specification.

Keywords: event-B, wireless transaction protocol, proof obligation, refinement, Rodin, ProB

Procedia PDF Downloads 290
722 Application of Groundwater Level Data Mining in Aquifer Identification

Authors: Liang Cheng Chang, Wei Ju Huang, You Cheng Chen

Abstract:

Investigation and research are key to the conjunctive use of surface water and groundwater resources, and the hydrogeological structure is an important basis for groundwater analysis and simulation. Traditionally, the hydrogeological structure is determined manually from geological drill logs, the structure of wells, groundwater levels, and so on. In Taiwan, a groundwater observation network has been built, and a large amount of groundwater-level observation data is available. The groundwater level is the state variable of the groundwater system: it reflects the system response combining the hydrogeological structure with groundwater injection and extraction. This study applies analytical tools to the observation database to develop a methodology for identifying confined and unconfined aquifers. These tools include frequency analysis, cross-correlation analysis between rainfall and groundwater level, groundwater regression curve analysis, and a decision tree. The developed methodology is then applied to groundwater layer identification in two groundwater systems: the Zhuoshui River alluvial fan and the Pingtung Plain. The frequency analysis applies the Fourier transform to the time-series groundwater-level observations to analyze the daily-frequency amplitude of the groundwater level caused by artificial groundwater extraction. The cross-correlation analysis between rainfall and groundwater level is used to obtain the groundwater replenishment time between infiltration and the peak groundwater level during wet seasons. The groundwater regression curve, the average rate of groundwater regression, is used to analyze the internal flux in the groundwater system and the flux caused by artificial behaviors. The decision tree combines the information obtained from these analytical tools and optimizes the estimation of the hydrogeological structure.
The developed method reaches a training accuracy of 92.31% and a verification accuracy of 93.75% on the Zhuoshui River alluvial fan, and a training accuracy of 95.55% and a verification accuracy of 100% on the Pingtung Plain. This high accuracy indicates that the developed methodology is a useful tool for identifying hydrogeological structures.
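The frequency-analysis step can be illustrated with a short sketch. Assuming hourly groundwater-level samples (an assumption for illustration; the paper does not state its sampling rate), the amplitude of the one-cycle-per-day component, the signature of daily pumping cycles, can be read off the discrete Fourier transform:

```python
import numpy as np

def daily_amplitude(levels, samples_per_day=24):
    """Amplitude of the 1-cycle-per-day component of an hourly
    groundwater-level series, via the discrete Fourier transform.
    A strong daily component suggests artificial extraction."""
    x = np.asarray(levels, dtype=float)
    x = x - x.mean()                      # remove the mean level
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / samples_per_day)  # cycles/day
    k = np.argmin(np.abs(freqs - 1.0))    # bin nearest one cycle per day
    return 2.0 * np.abs(spec[k]) / len(x)
```

For a series containing a 0.5 m daily oscillation, the function recovers an amplitude of about 0.5; a near-zero value would indicate no daily pumping signature at that well.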

Keywords: aquifer identification, decision tree, groundwater, Fourier transform

Procedia PDF Downloads 131
721 Performance of the Strong Stability Method in the Univariate Classical Risk Model

Authors: Safia Hocine, Zina Benouaret, Djamil A¨ıssani

Abstract:

In this paper, we study the performance of the strong stability method in the univariate classical risk model. We are interested in the stability bounds established using two approaches: the first is based on the strong stability method developed for general Markov chains, and the second on the theory of regenerative processes. By adopting an algorithmic procedure, we study the performance of the stability method in the case of exponentially distributed claim amounts. After presenting the stability bounds numerically and graphically, we interpret and compare the results.

Keywords: Markov chain, regenerative process, risk model, ruin probability, strong stability

Procedia PDF Downloads 291
720 Postmodern Communication Through Semiology

Authors: Mladen Milicevic

Abstract:

This paper takes a semiological approach to show that meaning is located neither in the art object nor exclusively in the mind of the perceiver, but rather in the relationship between the two. The ultimate intention of making art is for it to be presented to and perceived by subjective human beings, yet there will be as many different interpretations of the art presented as there are individuals in the audience. To support this claim, recent research from neuroscience, cognitive psychology, and Neo-Darwinism is used. The paper draws on Richard Dawkins' concept of memes as one of the main tools for explaining how differences arise within various socio-cultural environments. Analyzing the pitfalls of the modernist worldview, the author proposes postmodern methods as more efficient ways of understanding today's complexities in art, culture, and the world. Deconstructing how these differences have come about opens a possibility for transgressing opposing, and often adamant, viewpoints.

Keywords: semiology, music, meme, postmodern

Procedia PDF Downloads 370
719 Router 1X3 - RTL Design and Verification

Authors: Nidhi Gopal

Abstract:

Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines within a single network). This paper emphasizes the study of the router device and its top-level architecture, and shows how the router's sub-modules, i.e., the register, FIFO, FSM, and synchronizer, are synthesized, simulated, and finally connected to the top module.
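The FIFO sub-module's contract can be sketched behaviorally before committing to RTL. The following is an illustrative Python model (not the paper's Verilog/VHDL), showing the full/empty flags and back-pressure behavior such a module typically exposes:

```python
class Fifo:
    """Behavioral sketch of a router FIFO sub-module: fixed depth,
    full/empty status flags, and rejected writes when full (back-pressure)."""
    def __init__(self, depth=16):
        self.depth = depth
        self.buf = []

    def full(self):
        return len(self.buf) == self.depth

    def empty(self):
        return len(self.buf) == 0

    def write(self, packet):
        if self.full():
            return False       # caller must hold the packet and retry
        self.buf.append(packet)
        return True

    def read(self):
        if self.empty():
            return None        # nothing to forward
        return self.buf.pop(0) # first-in, first-out order
```

Such a model is useful as a golden reference when verifying the synthesized RTL against expected read/write ordering.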

Keywords: data packets, networking, router, routing

Procedia PDF Downloads 761
718 Determiner Phrase in Persian

Authors: Reza Morad Sahraei, Roghayeh Kazeminahad

Abstract:

Surveying the structure of the NP in Persian, this article tries to show that most NP constituents are either independent of each other or dependent on the Determiner Phrase (DP). The authors follow a uniform minimal analysis to illustrate the structural positions of the relevant constituents of the DP, including the Possessive Phrase, the Ezafat Phrase, and the Quantifier Phrase, under the tree diagram. The most important point of this article is the claim that the NP is mostly one of the dependents of the DP. Hence, the final section of the article analyzes the structure of the DP in Persian. The DP analysis undertaken here has several advantages: it explains the internal relevance of all DP constituents, provides them all with a uniform analysis, and bears out the semantic importance of the Persian genitive marker and its role in parsing.

Keywords: determiner phrase (DP), ezafat phrase (Ezaf P), noun phrase (NP), possessive phrase (PossP), quantifier phrase (QP)

Procedia PDF Downloads 552
717 Execution of Joinery in Large Scale Projects: Middle East Region as a Case Study

Authors: Arsany Philip Fawzy

Abstract:

This study addresses the hurdles of project management in the joinery field. It is divided into two sections. The first sheds light on how to execute large-scale projects, with a specific focus on the Middle East region, and raises the major obstacles that may face the joinery team, from site clearance to the coordination between the joinery team and the construction team. The second section technically analyzes the commercial side of joinery and how to control the main cost of the project to avoid financial problems, and suggests empirical solutions for monitoring cost impact (e.g., variation of contract quantity and claims).

Keywords: clearance, quality, cost, variation, claim

Procedia PDF Downloads 69
716 A Hazard Rate Function for the Time of Ruin

Authors: Sule Sahin, Basak Bulut Karageyik

Abstract:

This paper introduces a hazard rate function for the time of ruin to calculate the conditional probability of ruin for very small intervals. We call this function the force of ruin (FoR). We obtain the expected time of ruin and conditional expected time of ruin from the exact finite time ruin probability with exponential claim amounts. Then we introduce the FoR which gives the conditional probability of ruin and the condition is that ruin has not occurred at time t. We analyse the behavior of the FoR function for different initial surpluses over a specific time interval. We also obtain FoR under the excess of loss reinsurance arrangement and examine the effect of reinsurance on the FoR.
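By analogy with the hazard rate of survival analysis, the force of ruin described above can plausibly be written as follows (an assumed form for illustration; the paper's exact definition may differ), where $\psi(u,t)$ denotes the finite-time ruin probability with initial surplus $u$:

```latex
\mathrm{FoR}(u,t) \;=\; \frac{\partial \psi(u,t)/\partial t}{1-\psi(u,t)},
\qquad
\Pr\bigl[\text{ruin in } (t,\,t+\mathrm{d}t] \mid \text{no ruin by } t\bigr]
\;\approx\; \mathrm{FoR}(u,t)\,\mathrm{d}t .
```

The denominator conditions on survival (no ruin) up to time $t$, which is exactly the conditioning the abstract describes.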

Keywords: conditional time of ruin, finite time ruin probability, force of ruin, reinsurance

Procedia PDF Downloads 362
715 Walking across the Government of Egypt: A Single Country Comparative Study of the Past and Current Condition of the Government of Egypt

Authors: Homyr L. Garcia, Jr., Anne Margaret A. Rendon, Carla Michaela B. Taguinod

Abstract:

Nothing is constant in this world but change. This is a reality that many people fail to recognize, perhaps because they attach little or no value to things that are happening until they are gone. For many years, Egypt was known for its stable government, which withstood problems and crises that challenged the country in ways that could hardly be imagined. Then, seemingly in a snap of a finger, that stability vanished and was replaced by a crisis that produced failures in parts of the government; the problem continued to worsen, and the current situation of Egypt reflects this. As the researchers studied the reasons why the government of Egypt is unstable, they concluded that they might be able to propose ways in which the country could be helped or improved. The instability of the government of Egypt is the product of the combined problems affecting the lives of the people. Among the reasons the researchers found are the following: 1) unending doubts of the people regarding the ruling capacity of elected presidents, 2) the removal of President Mohamed Morsi from office, 3) the economic crisis, 4) numerous protests and revolutions, 5) the resignation of long-term President Hosni Mubarak, and 6) the office of the President being available, most likely, only to a chosen successor. According to previous research, there are also two plausible scenarios for the instability of Egypt: 1) a military intervention, specifically by the Supreme Council of the Armed Forces (SCAF), resulting from a contested succession, and 2) an Islamist push for political power, which highlights the claim that religion is a hindrance to the development of the country and its government.
From these eight possible reasons, the researchers decided to focus on the economic crisis, since the instability is most clearly seen in the country's economy, which directly affects the people and the government itself. They hypothesize that a stable economy is a prerequisite for a stable government. If they can support this claim using the Social Autopsy research design for the qualitative method and Pearson's correlation coefficient for the quantitative method, the researchers may be able to produce a proposal on how Egypt can stabilize its government and avoid such problems. The hypothesis is grounded in Rational Action Theory, a theory for understanding and modeling social, economic, and individual behavior.
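The quantitative step mentioned above, Pearson's correlation coefficient, is a standard computation. A minimal sketch (the series names are hypothetical; the abstract does not specify which indicators the researchers correlate):

```python
import math

def pearson_r(x, y):
    """Pearson's correlation coefficient between two equal-length series,
    e.g. a yearly economic indicator vs. a government-stability index
    (hypothetical pairing for illustration)."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1 would support the hypothesis that economic stability and government stability move together; a value near 0 would undercut it.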

Keywords: Pearson’s correlation coefficient, rational action theory, social autopsy research design, supreme council of the armed forces (SCAF)

Procedia PDF Downloads 380
714 A Hyperexponential Approximation to Finite-Time and Infinite-Time Ruin Probabilities of Compound Poisson Processes

Authors: Amir T. Payandeh Najafabadi

Abstract:

This article considers the problem of evaluating the infinite-time (or finite-time) ruin probability under a given compound Poisson surplus process by approximating the claim size distribution with a finite mixture of exponentials, i.e., a hyperexponential distribution. It restates the infinite-time (or finite-time) ruin probability as a solvable ordinary differential equation (or partial differential equation). The findings are illustrated through a simulation study.
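The setting can be made concrete with a small Monte Carlo sketch (an illustration of the surplus process, not the paper's ODE/PDE method): the surplus is U(t) = u + ct − S(t), claims arrive as a Poisson process, and each claim size is drawn from a hyperexponential distribution, i.e., exponential with rate rates[i] with probability probs[i].

```python
import random

def ruin_probability_mc(u, c, lam, probs, rates, horizon,
                        n_paths=20000, seed=1):
    """Monte Carlo estimate of the finite-time ruin probability for a
    compound Poisson surplus process with hyperexponential claim sizes.
    Ruin can only occur at claim instants, so we only check there."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, surplus = 0.0, u
        while True:
            w = rng.expovariate(lam)       # inter-arrival time of next claim
            t += w
            if t > horizon:
                break                       # survived the finite horizon
            surplus += c * w                # premiums earned since last claim
            i = rng.choices(range(len(probs)), weights=probs)[0]
            surplus -= rng.expovariate(rates[i])  # hyperexponential claim
            if surplus < 0:
                ruined += 1
                break
    return ruined / n_paths
```

Such a simulation gives a reference point against which the ODE/PDE-based approximation of the paper could be checked; a larger initial surplus u should yield a smaller estimated ruin probability.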

Keywords: ruin probability, compound Poisson processes, mixture exponential (hyperexponential) distribution, heavy-tailed distributions

Procedia PDF Downloads 312
713 Bug Localization on Single-Line Bugs of Apache Commons Math Library

Authors: Cherry Oo, Hnin Min Oo

Abstract:

Software bug localization is one of the most costly tasks in program repair. There is therefore a high demand for automated bug localization techniques that can guide programmers to the locations of bugs with little human intervention. Spectrum-based bug localization aims to help software developers discover bugs rapidly by investigating abstractions of program traces to produce a ranking of the most likely buggy modules. Using the Apache Commons Math library project, we study the diagnostic accuracy of our spectrum-based bug localization metric. Our results show that a well-chosen similarity coefficient, used to inspect the program spectra, is particularly effective at localizing single-line bugs.
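To make the spectrum-based idea concrete, here is a sketch using the Ochiai coefficient, one common similarity coefficient in this family (the abstract does not say which coefficient the authors found best, so Ochiai here is an assumption for illustration). A program spectrum records which elements each test covered, together with each test's pass/fail verdict:

```python
import math

def ochiai_ranking(spectra, failures):
    """Rank program elements by the Ochiai similarity coefficient.
    spectra[t][e] = 1 if test t covered element e; failures[t] = True
    if test t failed. Higher score = more suspicious element."""
    n_elems = len(spectra[0])
    total_fail = sum(failures)
    scores = []
    for e in range(n_elems):
        ef = sum(1 for t, row in enumerate(spectra) if row[e] and failures[t])
        ep = sum(1 for t, row in enumerate(spectra) if row[e] and not failures[t])
        denom = math.sqrt(total_fail * (ef + ep))   # Ochiai denominator
        scores.append(ef / denom if denom else 0.0)
    return sorted(range(n_elems), key=lambda e: scores[e], reverse=True)
```

An element executed by all failing tests and no passing tests scores 1.0 and tops the ranking, which is exactly the behavior wanted for single-line bugs.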

Keywords: software testing, bug localization, program spectra, bug

Procedia PDF Downloads 116
712 Extraction of Text Subtitles in Multimedia Systems

Authors: Amarjit Singh

Abstract:

In this paper, a method for extracting text subtitles from large videos is proposed. Video data needs to be annotated for many multimedia applications, and text is incorporated into digital video to provide useful information about it; detecting the text present in a video therefore supports both understanding and video indexing. This is achieved in two steps: the first is text localization, and the second is text verification. The method of text detection can be extended to text recognition, which finds applications in automatic video indexing, video annotation, and content-based video retrieval. The method has been tested on various types of videos.
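The localization step can be illustrated with a simple edge-density heuristic (a generic sketch, not the paper's method): subtitle text produces dense vertical strokes, so rows of a grayscale frame with a high proportion of strong horizontal gradients are candidate text rows, which a verification step would then filter further.

```python
import numpy as np

def candidate_text_rows(frame, density_thresh=0.2):
    """Step 1 (localization) sketch: return rows of a grayscale frame
    whose strong-horizontal-gradient density is high. Step 2
    (verification) -- size/temporal-persistence checks -- is omitted."""
    g = np.abs(np.diff(frame.astype(float), axis=1))  # horizontal gradient
    edges = g > g.mean() + g.std()                    # strong-edge mask
    density = edges.mean(axis=1)                      # per-row edge density
    return np.where(density > density_thresh)[0]
```

On real frames the threshold would need tuning, and verification across consecutive frames is what separates genuine subtitles from textured background.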

Keywords: video, subtitles, extraction, annotation, frames

Procedia PDF Downloads 571
711 Practical Application of Business Processes Simulation

Authors: M. Gregušová, V. Schindlerová, I. Šajdlerová, P. Mohyla, J. Kedroň

Abstract:

Company managers are always looking for more opportunities to succeed in today's fiercely competitive market. Maintaining a place among the successful companies, or coming up with a revolutionary business idea, is much more difficult than before. Every new or improved method, tool, or approach that can improve the functioning of business processes, or even the entire system, is worth checking and verifying. The use of simulation in the design of manufacturing systems and their management in practice is one way to find the optimal parameters of manufacturing processes and systems without increased risk. The paper presents an example of using simulation to solve a bottleneck problem in a concrete company.

Keywords: practical applications, business processes, systems, simulation

Procedia PDF Downloads 611
710 Peaceful Coexistence with Non-Muslims from the Perspective of Quran

Authors: Mohsen Nouraei

Abstract:

Peaceful coexistence with other religions is one of the most important matters raised by the issue of religious diversity. Some believe that the Quranic policy toward non-Muslims is based on war and attribute the early spread of Islam to the force of the sword. This article, written in a descriptive and analytical method, investigates this claim and evaluates it against the teachings and instructions of the Quran. The result shows not only that the teachings of the Quran do not cause such problems, but also that the Quranic verses oblige Muslims to interact peacefully with their doctrinal opponents and to exercise justice in this regard. The paper shows that the principle of interaction with non-Muslims is based on peace and coexistence, and that Islam inspires religious coexistence with the followers of other religions.

Keywords: Quran, peace, religious coexistence, Christians, Jews

Procedia PDF Downloads 387