Search results for: loss distribution approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20268

16128 From Mathematics Project-Based Learning to Commercial Product Using Geometer’s Sketchpad (GSP)

Authors: Krongthong Khairiree

Abstract:

The purpose of this research study is to explore a mathematics project-based learning approach and the use of technology in the context of school mathematics in Thailand. Data were collected from six sample secondary schools, and the students were 6-14 years old. Research findings show that through the mathematics project-based learning approach and the use of GSP, students found mathematics learning fun and challenging. In interviews, the students revealed that GSP enabled them to visualize and create graphical representations, which helped them develop their mathematical thinking skills, concepts, and understanding. The students had fun creating a variety of graphs of functions that they could not produce by drawing on graph paper. In addition, there is evidence of the students' ability to connect mathematics to real life outside the classroom and to commercial products, such as weaving, broomstick patterning, and ceramics design.

Keywords: mathematics, project-based learning, Geometer’s Sketchpad (GSP), commercial products

Procedia PDF Downloads 324
16127 Bioincision of Gmelina arborea Roxb. Heartwood with Inonotus dryophilus (Berk.) Murr. for Improved Chemical Uptake and Penetration

Authors: A. O. Adenaiya, S. F. Curling, O. Y. Ogunsanwo, G. A. Ormondroyd

Abstract:

Treatment of wood with chemicals in order to prolong its service life may prove difficult in some refractory wood species. This impermeability in wood is usually due to biochemical changes which occur during heartwood formation. Bioincision, which is a short-term, controlled microbial decomposition of wood, is one of the promising approaches capable of improving the amenability of refractory wood to chemical treatments. Gmelina arborea, a mainstay timber species in Nigeria, has impermeable heartwood due to the excessive tyloses which occlude its vessels. Therefore, the chemical uptake and penetration in Gmelina arborea heartwood bioincised with the fungus Inonotus dryophilus were investigated. Five mature Gmelina arborea trees were harvested at the departmental plantation in Ajibode, Ibadan, Nigeria, and a bolt of 300 cm was obtained from the basal portion of each tree. The heartwood portion of the bolts was extracted and converted into samples of 20 mm x 20 mm x 60 mm, which were subsequently conditioned (20 °C at 65% relative humidity). Twenty wood samples each were bioincised with the white-rot fungus Inonotus dryophilus (ID, 999) for 3, 5, 7, and 9 weeks using standard procedure, while a set of sterile control samples was prepared. Ten of each bioincised and control sample were pressure-treated with 5% Tanalith preservative, while the other ten of each were pressure-treated with a liquid dye for easy traceability of the chemical in the wood, both using a full-cell treatment process. The bioincised and control samples were evaluated for their weight loss before chemical treatment (WL, %), preservative absorption (PA, kg/m³), preservative retention (PR, kg/m³), axial absorption (AA, kg/m³), lateral absorption (LA, kg/m³), axial penetration depth (APD, mm), radial penetration depth (RPD, mm), and tangential penetration depth (TPD, mm). The data obtained were analyzed using ANOVA at α = 0.05. Results show that weight loss was least in the samples bioincised for three weeks (0.09%) and highest after 7 weeks of bioincision (0.48%). The samples bioincised for 3 weeks had the least PA (106.72 kg/m³) and PR (5.87 kg/m³), while the highest PA (134.9 kg/m³) and PR (7.42 kg/m³) were observed after 7 weeks of bioincision. The AA ranged from 27.28 kg/m³ (3 weeks) to 67.05 kg/m³ (5 weeks), while the LA was least after 5 weeks of incubation (28.1 kg/m³) and highest after 9 weeks (71.74 kg/m³). Significantly lower APD was observed in control samples (6.97 mm) than in the samples bioincised for 9 weeks (19.22 mm). The RPD increased from 0.08 mm (control samples) to 3.48 mm (5 weeks), while TPD ranged from 0.38 mm (control samples) to 0.63 mm (9 weeks), implying that liquid flow in the wood was predominantly through the axial pathway. Bioincising G. arborea heartwood with I. dryophilus for 9 weeks is capable of enhancing chemical uptake and deeper penetration of chemicals in the wood through the degradation of the occluding vessel tyloses, accompanied by only minimal degradation of the polymeric wood constituents.
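
As an illustrative aside, the statistical step reported here (one-way ANOVA across bioincision durations at α = 0.05) can be sketched as follows; the replicate values are synthetic placeholders, not the study's measurements.

```python
import numpy as np
from scipy.stats import f_oneway

# Hedged sketch of the reported analysis: one-way ANOVA across bioincision
# durations at alpha = 0.05. Values below are synthetic, not the study's data.
rng = np.random.default_rng(0)
groups = {weeks: rng.normal(100 + 8 * weeks, 10, size=10)  # e.g., PA in kg/m^3
          for weeks in (3, 5, 7, 9)}

f_stat, p_value = f_oneway(*groups.values())
alpha = 0.05
print(f"F = {f_stat:.2f}, p = {p_value:.4f}",
      "-> significant" if p_value < alpha else "-> not significant")
```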

Keywords: bioincision, chemical uptake, penetration depth, refractory wood, tyloses

Procedia PDF Downloads 88
16126 Statistical Design of Synthetic VP X-bar Control Chart Using Markov Chain Approach

Authors: Ali Akbar Heydari

Abstract:

Control charts are an important tool of statistical quality control. These charts are used to detect and eliminate unwanted special causes of variation that occur during a period of time. The design and operation of control charts require the determination of three design parameters: the sample size (n), the sampling interval (h), and the width coefficient of the control limits (k). The variable parameters (VP) x-bar control chart is an x-bar chart in which all the design parameters vary between two values. These values are a function of the most recent process information; in fact, in the VP x-bar chart, the position of each sample point on the chart establishes the size of the next sample and the time of its sampling. The synthetic x-bar control chart, which integrates the x-bar chart and the conforming run length (CRL) chart, provides significant improvement in terms of detection power over the basic x-bar chart for all levels of mean shifts. In this paper, we introduce the synthetic VP x-bar control chart for monitoring changes in the process mean. To determine the design parameters, we used a statistical design based on the minimum out-of-control average run length (ARL) criterion. The optimal chart parameters of the proposed chart are obtained using the Markov chain approach. A numerical example is also given to show the performance of the proposed chart and to compare it with other control charts. The results show that our proposed synthetic VP x-bar control chart performs better than the synthetic x-bar control chart for all shift parameter values. Also, the synthetic VP x-bar control chart performs better than the VP x-bar control chart for moderate or large shift parameter values.
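
As a hedged illustration of the Markov chain step (standard ARL machinery, not the authors' exact chart model): the in-control region is discretized into transient states, and the ARL follows from the fundamental matrix of the chain. The transition probabilities below are made up for illustration.

```python
import numpy as np

# Minimal sketch of ARL computation from a Markov chain model of a control
# chart. Q holds one-step transition probabilities among the transient
# (in-control) states; absorption corresponds to an out-of-control signal.
# This 3-state Q is an illustrative placeholder, not a fitted chart model.
Q = np.array([
    [0.80, 0.10, 0.05],
    [0.10, 0.75, 0.10],
    [0.05, 0.10, 0.70],
])

# ARL vector: (I - Q)^{-1} * 1, one entry per possible starting state.
I = np.eye(Q.shape[0])
arl = np.linalg.solve(I - Q, np.ones(Q.shape[0]))
print("ARL by starting state:", arl)
```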

Keywords: control chart, Markov chain approach, statistical design, synthetic, variable parameter

Procedia PDF Downloads 143
16125 Operations Research Applications in Audit Planning and Scheduling

Authors: Abdel-Aziz M. Mohamed

Abstract:

This paper presents a state-of-the-art survey of the operations research models developed for internal audit planning. Two alternative approaches have been followed in the literature for audit planning: (1) identifying the optimal audit frequency; and (2) determining the optimal audit resource allocation. The first approach identifies the elapsed time between two successive audits, which can be presented as the optimal number of audits in a given planning horizon, or the optimal number of transactions after which an audit should be performed. It also includes the optimal audit schedule. The second approach determines the optimal allocation of audit frequency among all auditable units in the firm. In our review, we discuss both the deterministic and probabilistic models developed for audit planning. In addition, game theory models are reviewed to find the optimal auditing strategy based on the interactions between the auditors and the clients.
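
As a hedged sketch of the second approach (audit resource allocation), a simple linear program can allocate a fixed budget of audit hours across units to maximize expected risk coverage; the unit count, risk weights, hour caps, and budget below are hypothetical, not drawn from any surveyed model.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: expected loss reduction per audit hour for four auditable
# units, hours needed for a full audit of each unit, and a total hour budget.
risk_reduction = np.array([5.0, 3.0, 4.5, 2.0])   # benefit per hour audited
max_hours = np.array([120, 80, 100, 60])          # cap per unit (full audit)
budget = 200                                      # total audit hours available

# linprog minimizes, so negate the benefit to maximize total risk reduction.
res = linprog(
    c=-risk_reduction,
    A_ub=np.ones((1, 4)), b_ub=[budget],   # total allocated hours <= budget
    bounds=[(0, h) for h in max_hours],    # 0 <= hours_i <= full-audit hours
    method="highs",
)
print("hours per unit:", res.x, "total risk reduction:", -res.fun)
```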

Keywords: operations research applications, audit frequency, audit-staff scheduling, audit planning

Procedia PDF Downloads 800
16124 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups

Authors: Naushad Mamode Khan

Abstract:

The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs) and handles equi-, over-, and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify, which restricts likelihood-based estimation methodology. The joint generalized quasi-likelihood approach (GQL-I) was considered instead, but it is rather computationally intensive and may even fail to estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments; it is shown to yield estimates as efficient as those of GQL-I while being far more computationally stable.
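
A minimal sketch of a generalized quasi-likelihood update for a log-linear count model may help fix ideas; it assumes a simple working variance Var(y) ≈ μ, whereas the paper's CMP-INAR(1) covariance structure is far richer. The Fisher-scoring step solves the quasi-score equation D'Σ⁻¹(y − μ) = 0.

```python
import numpy as np

# Hedged GQL/Fisher-scoring sketch for a log-linear count model, assuming a
# working independence covariance (an intentional simplification).
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
beta_true = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ beta_true))

beta = np.zeros(2)
for _ in range(25):
    mu = np.exp(X @ beta)            # mean under the log link
    D = X * mu[:, None]              # derivative d(mu)/d(beta)
    Sigma_inv = np.diag(1.0 / mu)    # working covariance: Var(y_i) ~ mu_i
    score = D.T @ Sigma_inv @ (y - mu)
    info = D.T @ Sigma_inv @ D
    step = np.linalg.solve(info, score)
    beta += step
    if np.max(np.abs(step)) < 1e-8:
        break
print("estimated beta:", beta)
```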

Keywords: longitudinal, com-Poisson, ill-conditioned, INAR(1), GLMs, GQL

Procedia PDF Downloads 341
16123 Financial Assets Return, Economic Factors and Investor's Behavioral Indicators Relationships Modeling: A Bayesian Networks Approach

Authors: Nada Souissi, Mourad Mroua

Abstract:

The main purpose of this study is to examine the interaction between financial asset volatility, economic factors, and investor behavioral indicators related to both company and market stocks for the period from January 2000 to January 2020. Using multiple linear regression and Bayesian network modeling, the results reveal both positive and negative relationships between the investor psychology index, economic factors, and predicted stock market returns. We show that the application of a discrete Bayesian network helps identify the different cause-and-effect relationships between the economic and financial variables and the psychology index.

Keywords: financial asset return predictability, economic factors, investor's psychology index, Bayesian approach, probabilistic networks, parametric learning

Procedia PDF Downloads 124
16122 Solving Flowshop Scheduling Problems with Ant Colony Optimization Heuristic

Authors: Arshad Mehmood Ch, Riaz Ahmad, Imran Ali Ch, Waqas Durrani

Abstract:

This study deals with the application of the Ant Colony Optimization (ACO) approach to solve the no-wait flowshop scheduling problem (NW-FSSP). The ACO algorithm thus developed was coded in MATLAB. The paper covers the detailed steps to apply ACO and focuses on judging the strength of ACO in relation to other solution techniques previously applied to the no-wait flowshop problem. The general-purpose approach was able to find reasonably accurate solutions for almost all the problems under consideration and was able to handle a fairly large spectrum of problems with far less CPU effort. Careful scrutiny of the results reveals that the algorithm presented performs better than other approaches, such as genetic algorithm and tabu search heuristics, previously applied to the NW-FSSP data sets.
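
A minimal sketch of the fitness evaluation at the heart of any NW-FSSP solver (an illustrative helper, not the authors' MATLAB code): in a no-wait flowshop, each job must flow through the machines without waiting, so consecutive jobs are separated by a fixed minimal start-to-start delay, and the makespan follows directly from the sequence.

```python
import numpy as np

# Illustrative sketch: makespan of a job sequence in a no-wait flowshop.
# p[i, k] = processing time of job i on machine k (made-up data below).
def nw_makespan(p: np.ndarray, seq: list[int]) -> float:
    m = p.shape[1]
    # Minimal start-to-start delay d(i, j) so that job j never waits
    # between machines when it directly follows job i.
    def delay(i, j):
        return max(p[i, :k + 1].sum() - p[j, :k].sum() for k in range(m))
    total = sum(delay(a, b) for a, b in zip(seq, seq[1:]))
    return total + p[seq[-1]].sum()   # last job runs through all machines

p = np.array([[2, 3], [4, 1], [3, 2]])   # 3 jobs x 2 machines
print(nw_makespan(p, [0, 1, 2]))
```

An ACO layer would then construct candidate sequences probabilistically from pheromone trails and heuristic information, scoring each with a makespan evaluation like this one.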

Keywords: no-wait, flowshop, scheduling, ant colony optimization (ACO), makespan

Procedia PDF Downloads 421
16121 Convectory Policing: Reconciling Historic and Contemporary Models of Police Service Delivery

Authors: Mark Jackson

Abstract:

Description: This paper is based on a theoretical analysis of the efficacy of the dominant model of policing in western jurisdictions. Those results are then compared with a similar analysis of a traditional reactive model. It is found that neither model provides for optimal delivery of services. Instead, optimal service can be achieved by a synchronous hybrid model, termed the Convectory Policing approach. Methodology and Findings: For over three decades, problem oriented policing (PO) has been the dominant model for western police agencies. Initially based on the work of Goldstein during the 1970s, the problem oriented framework has spawned endless variants and approaches, most of which embrace a problem solving rather than a reactive approach to policing. These include the Area Policing Concept (APC) applied in many smaller jurisdictions in the USA, the Scaled Response Policing Model (SRPM) currently under trial in Western Australia, and the Proactive Pre-Response Approach (PPRA), which has also seen some success. All of these, in some way or another, are largely based on a model that eschews a traditional reactive model of policing. Convectory Policing (CP) is an alternative model which challenges the underpinning assumptions that have seen the proliferation of the PO approach in the last three decades, and it begins by questioning the economics on which PO is based. It is argued that, in essence, PO relies on an unstated, and often unrecognised, assumption that resources will be available to meet demand for policing services while at the same time maintaining the capacity to deploy staff to develop solutions to the problems which were ultimately manifested in those same calls for service. The CP model relies on observations from numerous western jurisdictions to challenge the validity of that underpinning assumption, particularly in a fiscally tight environment. In deploying staff to pursue and develop solutions to underpinning problems, there is clearly an opportunity cost: those same staff cannot be allocated to alternative duties while engaged in a problem solution role. At the same time, resources in use responding to calls for service are unavailable, while committed to that role, to pursue solutions to the problems giving rise to those same calls for service. The two approaches, reactive and PO, are therefore dichotomous: one cannot be optimised while the other is being pursued. Convectory Policing is a pragmatic response to the schism between the competing traditional and contemporary models. If it is not possible to serve either model with any real rigour, it becomes necessary to tailor an approach to deliver specific outcomes against which success or otherwise might be measured. CP proposes that a structured, roster-driven approach to calls for service, combined with the application of what is termed a resource-effect response capacity, has the potential to resolve the inherent conflict between traditional and contemporary models of policing and the expectations of the community in terms of community-policing-based problem solving models.

Keywords: policing, reactive, proactive, models, efficacy

Procedia PDF Downloads 468
16120 Practical Methods for Automatic MC/DC Test Cases Generation of Boolean Expressions

Authors: Sekou Kangoye, Alexis Todoskoff, Mihaela Barreau

Abstract:

Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that aims to prove that every condition involved in a Boolean expression can influence the result of that expression. In the automotive context, MC/DC is highly recommended, and even required, for testing most safety- and security-related applications. However, due to the complex Boolean expressions often embedded in those applications, generating a set of MC/DC-compliant test cases for any of these expressions is a nontrivial task and can be time-consuming for testers. In this paper, we present an approach to automatically generate MC/DC test cases for any Boolean expression. We introduce novel techniques, essentially based on binary trees, to quickly and optimally generate MC/DC test cases for the expressions. Thus, the approach can be used to reduce the manual testing effort of testers.
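
As a hedged point of reference (the paper's binary-tree technique is more efficient than this), unique-cause MC/DC pairs can be found by brute force: for each condition, search for two test vectors that differ only in that condition and produce different outcomes.

```python
from itertools import product

# Brute-force sketch of unique-cause MC/DC pair generation; illustrative only,
# not the paper's binary-tree method.
def mcdc_pairs(expr, n):
    """expr: function of n booleans; returns one independence pair per condition."""
    vectors = list(product([False, True], repeat=n))
    pairs = {}
    for c in range(n):
        for v in vectors:
            w = tuple(not b if k == c else b for k, b in enumerate(v))
            if expr(*v) != expr(*w):    # toggling condition c alone flips the result
                pairs[c] = (v, w)
                break
    return pairs

# Example expression: A and (B or C)
expr = lambda a, b, c: a and (b or c)
for cond, (v, w) in mcdc_pairs(expr, 3).items():
    print(f"condition {cond}: {v} -> {expr(*v)}, {w} -> {expr(*w)}")
```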

Keywords: binary trees, MC/DC, test case generation, nontrivial task

Procedia PDF Downloads 423
16119 River Habitat Modeling for the Entire Macroinvertebrate Community

Authors: Pinna Beatrice, Laini Alex, Negro Giovanni, Burgazzi Gemma, Viaroli Pierluigi, Vezza Paolo

Abstract:

Habitat models rarely consider macroinvertebrates as ecological targets in rivers. Available approaches mainly focus on single macroinvertebrate species, not addressing the ecological needs and functionality of the entire community. This research aimed to provide an approach to model the habitat of the macroinvertebrate community. The approach is based on the recently developed Flow-T index, together with a Random Forest (RF) regression, which is employed to apply the Flow-T index at the meso-habitat scale. Using different datasets gathered from both field data collection and 2D hydrodynamic simulations, the model was calibrated in the Trebbia River (2019 campaign) and then validated in the Trebbia, Taro, and Enza Rivers (2020 campaign). The three rivers are characterized by a braiding morphology, gravel riverbeds, and summer low flows. The RF model selected 12 mesohabitat descriptors as important for the macroinvertebrate community. These descriptors belong to different frequency classes of water depth, flow velocity, substrate grain size, and connectivity to the main river channel. The cross-validation R² coefficient (R²cv) for the training dataset is 0.71 for the Trebbia River (2019), whereas the R² coefficient for the validation datasets (Trebbia, Taro, and Enza Rivers, 2020) is 0.63. The agreement between the simulated results and the experimental data shows sufficient accuracy and reliability. The outcomes of the study reveal that the model can identify the ecological response of the macroinvertebrate community to possible flow regime alterations and river morphological modifications. Lastly, the proposed approach allows extending the MesoHABSIM methodology, widely used for fish habitat assessment, to a different ecological target community. Further applications of the approach can be related to flow design in both perennial and non-perennial rivers, including river reaches in which fish fauna is absent.
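
A minimal sketch of the regression step described here, with synthetic data standing in for the mesohabitat descriptors and the Flow-T-based target:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Hedged sketch of the habitat-suitability regression: a Random Forest maps
# mesohabitat descriptors (depth/velocity/substrate frequency classes,
# connectivity) to a community index such as Flow-T. Data here are synthetic.
rng = np.random.default_rng(42)
X = rng.random((150, 12))                                       # 12 descriptors
y = 0.6 * X[:, 0] + 0.3 * X[:, 3] + rng.normal(0, 0.05, 150)    # toy target

rf = RandomForestRegressor(n_estimators=500, random_state=0)
scores = cross_val_score(rf, X, y, cv=5, scoring="r2")   # cross-validated R^2
print("mean cross-validation R^2:", scores.mean())

rf.fit(X, y)
print("descriptor importances:", rf.feature_importances_.round(2))
```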

Keywords: ecological flows, macroinvertebrate community, mesohabitat, river habitat modeling

Procedia PDF Downloads 78
16118 Detection of New Attacks on Ubiquitous Services in Cloud Computing and Countermeasures

Authors: L. Sellami, D. Idoughi, P. F. Tiako

Abstract:

Cloud computing provides infrastructure to the enterprise through the Internet, allowing access to cloud services at any time and anywhere. This pervasive aspect of the services, the distributed nature of data, and the wide use of information make cloud computing vulnerable to intrusions that violate the security of the cloud. This requires the use of security mechanisms, such as intrusion detection systems (IDS), to detect malicious behavior in network communications and hosts. In this article, we focus on the detection of intrusions into the cloud using IDSs, basing our work on client authentication in the computing cloud. This technique makes it possible to detect abnormal use of ubiquitous services and to prevent intrusions into cloud computing. The approach is based on client authentication data. Our IDS provides intrusion detection both inside and outside the cloud computing network, as a double protection approach: security at the user node and global security of the cloud computing network.

Keywords: cloud computing, intrusion detection system, privacy, trust

Procedia PDF Downloads 302
16117 Fusion of Finger Inner Knuckle Print and Hand Geometry Features to Enhance the Performance of Biometric Verification System

Authors: M. L. Anitha, K. A. Radhakrishna Rao

Abstract:

With the advent of modern computing technology, there is an increased demand for developing recognition systems that have the capability of verifying the identity of individuals. Recognition systems are required by several civilian and commercial applications for providing access to secured resources. Traditional recognition systems that are based on physical identities are not sufficiently reliable to satisfy the security requirements, due to advances in forgery and identity impersonation methods. Recognizing individuals based on their unique physiological characteristics, known as biometric traits, is a reliable technique, since these traits are not transferable and cannot be stolen or lost. Since the performance of a biometric recognition system depends on the particular trait that is utilized, the present work proposes a fusion approach which combines the inner knuckle print (IKP) trait of the middle, ring, and index fingers with the geometrical features of the hand. The hand image captured from a digital camera is preprocessed to find the finger IKP as the region of interest (ROI) and the hand geometry features. Geometrical features are represented as the distances between different key points, and IKP features are extracted by applying a local binary pattern descriptor to the IKP ROI. Decision-level AND fusion was adopted, which improved the performance of the combined scheme. The proposed approach is tested on the database collected at our institute. The approach is significant since both hand geometry and IKP features can be extracted from the palm region of the hand. The fusion of these features yields a false acceptance rate of 0.75% and a false rejection rate of 0.86% for the verification tests conducted, which is lower than the results obtained using the individual traits. The results confirm the usefulness of the proposed approach and the suitability of the selected features for developing a biometric recognition system based on features from the palmar region of the hand.
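
A hedged sketch of the described pipeline (LBP histogram on an IKP ROI plus decision-level AND fusion); the thresholds, distance measure, and synthetic arrays are assumptions for illustration, not the paper's tuned system.

```python
import numpy as np
from skimage.feature import local_binary_pattern

# Hedged sketch: LBP features on an IKP ROI, distance-based decisions per
# trait, and AND fusion at the decision level. All data below are synthetic.
def lbp_histogram(roi: np.ndarray, points=8, radius=1) -> np.ndarray:
    codes = local_binary_pattern(roi, points, radius, method="uniform")
    hist, _ = np.histogram(codes, bins=points + 2, range=(0, points + 2))
    return hist / hist.sum()

def matches(a, b, threshold):
    return np.linalg.norm(a - b) < threshold   # simple distance-based decision

rng = np.random.default_rng(0)
probe_ikp = (rng.random((64, 64)) * 255).astype(np.uint8)
gallery_ikp = (rng.random((64, 64)) * 255).astype(np.uint8)
probe_geom, gallery_geom = rng.random(10), rng.random(10)  # key-point distances

ikp_ok = matches(lbp_histogram(probe_ikp), lbp_histogram(gallery_ikp), 0.2)
geom_ok = matches(probe_geom, gallery_geom, 1.0)
print("verified:", ikp_ok and geom_ok)   # AND fusion: both traits must agree
```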

Keywords: biometrics, hand geometry features, inner knuckle print, recognition

Procedia PDF Downloads 210
16116 The Effects of Damping Devices on Displacements, Velocities and Accelerations of Structures

Authors: Radhwane Boudjelthia

Abstract:

The most recent earthquakes that occurred in the world, and particularly in Algeria, have killed thousands of people and caused severe damage. The example etched in our memory is the earthquake in the regions of Boumerdes and Algiers (the Boumerdes earthquake of May 21, 2003). For all the actors involved in the building process, the earthquake is the litmus test for construction. The goal we set ourselves is to contribute to the implementation of a thoughtful approach to the seismic protection of structures. For many engineers, the most conventional approach to protecting works (buildings and bridges) from the effects of earthquakes is to increase rigidity. This approach is not always effective, especially when the context favors the phenomena of resonance and amplification of seismic forces. The field of earthquake engineering has therefore made significant advances, catalyzed among other things by the development of computational techniques and the use of powerful test facilities. This has led to the emergence of several innovative technologies, such as the introduction of special isolation devices between the infrastructure and the superstructure. This approach, commonly known as "seismic isolation", absorbs significant forces without damage to the structure, thus ensuring the protection of lives and property. In addition, the forces transmitted to the construction by the ground shaking are located mainly at the supports. With this approach, the natural period of the construction increases and the seismic loads are reduced, so the seismic motion is attenuated. Likewise, base isolation may be used in combination with earthquake dampers in order to control the deformation of the isolation system and the absolute displacement of the superstructure located above the isolation interface. Alternatively, earthquake dampers alone can be used to reduce the oscillation amplitudes and thus reduce the seismic loads. The use of damping devices represents an effective solution for the rehabilitation of existing structures. Given that all these acceleration-reducing means are considered passive, much research has been conducted over several years to develop active control systems for the response of buildings to earthquakes.
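
A brief worked relation (standard structural dynamics background, not taken from the paper) makes the period-lengthening argument concrete:

```latex
T = 2\pi\sqrt{\frac{m}{k}}, \qquad V = m\,S_a(T), \qquad
S_a(T) \propto \frac{1}{T} \ \text{(long-period branch of the design spectrum)}
```

Isolation lowers the effective lateral stiffness k, so the period T grows; on the descending branch of the design spectrum the spectral acceleration S_a, and hence the base shear V, falls accordingly.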

Keywords: earthquake, building, seismic forces, displacement, resonance, response

Procedia PDF Downloads 115
16115 Community-Based Settlement Environment in Malalayang Coastal Area, Manado City

Authors: Teguh R. Hakim, Frenny F. F. Kairupan, Alberta M. Mantiri

Abstract:

The face of a coastal city is generally like that of other cities, showing dualities: traditional and modern, rural and urban, planned and unplanned, slum and high quality. Manado City is located on the northern coast of the island of Sulawesi, Indonesia. Urban environmental problems have occurred in this city as an impact of this urban dualism. Overcrowding, inadequate infrastructure, and limited human resources are the main causes of the untidiness of the coastal settlements in Malalayang. This affects social and economic activities and the level of public health in the Malalayang coastal environment of Manado City. It is a serious problem that must be tackled jointly by the government, private parties, and the community. A community-based arrangement of the settlement environment is one solution for realizing livable coastal settlements. This research aims to analyze the involvement of local communities in the arrangement of the settlement. A participatory approach is the model used in this study. Its application is mainly at the macro and meso scales (region, city, and environment), or community architecture. The participatory model leads to a more operational research approach to finding solutions and answers to the problems of the settlement. It is a research model that involves researchers and the community as both object and subject of the research, in which, in addition to the research itself, other forms of participation are developed to design and build together. The expected results of this study are to educate the community about the environment and to set up a livable settlement for the sake of improving quality of life. The study also provides input to the government in applying the pattern of development to be implemented in the future.

Keywords: arrangements the coastal environment, community participation, urban environmental problems, livable settlement

Procedia PDF Downloads 225
16114 Mechanisms Leading to the Protective Behavior of Ethanol Vapour Drying of Probiotics

Authors: Shahnaz Mansouri, Xiao Dong Chen, Meng Wai Woo

Abstract:

A new antisolvent vapour precipitation approach was used to make ultrafine submicron probiotic encapsulates. The approach uses ethanol vapour to precipitate submicron encapsulates within relatively large droplets. Surprisingly, the probiotics (Lactobacillus delbrueckii ssp. bulgaricus, Streptococcus thermophilus) showed relatively high survival even under destructive ethanolic conditions within the droplet. This unusual behaviour was deduced to be caused by the denaturation and aggregation of the milk proteins, forming an ethanolic protective matrix for the probiotics. Skim milk droplets, which are rich in casein and contain naturally occurring minerals, provided higher ethanolic protection compared with whey protein isolate and lactose droplets.

Keywords: whey, skim milk, probiotic, antisolvent, precipitation, encapsulation, denaturation, aggregation

Procedia PDF Downloads 510
16113 AS-Geo: Arbitrary-Sized Image Geolocalization with Learnable Geometric Enhancement Resizer

Authors: Huayuan Lu, Chunfang Yang, Ma Zhu, Baojun Qi, Yaqiong Qiao, Jiangqian Xu

Abstract:

Image geolocalization has great application prospects in fields such as autonomous driving and virtual/augmented reality. In practical application scenarios, the size of the image to be located is not fixed, and it is impractical to train different networks for all possible sizes. When the image size does not match the input size of the descriptor extraction model, existing image geolocalization methods usually scale or crop the image in some common way. This results in the loss of information important to the geolocalization task, thus affecting the performance of the method. For example, excessive down-sampling can lead to blurred building contours, and inappropriate cropping can lead to the loss of key semantic elements, resulting in incorrect geolocation results. To address this problem, this paper designs a learnable image resizer and proposes an arbitrary-sized image geolocalization method. (1) The designed learnable image resizer employs the self-attention mechanism to enhance the geometric features of the resized image. First, it applies bilinear interpolation to the input image and its feature maps to obtain the initial resized image and the resized feature maps. Then, SKNet (selective kernel network) is used to approximate the best receptive field, thus keeping the geometric shapes consistent with the original image, and SENet (squeeze-and-excitation network) is used to automatically select the feature maps with strong contour information, enhancing the geometric features. Finally, the enhanced geometric features are fused with the initial resized image to obtain the final resized image. (2) The proposed image geolocalization method embeds the above image resizer as a front layer of the descriptor extraction network. It not only enables the network to be compatible with arbitrary-sized input images but also enhances the geometric features that are crucial to the image geolocalization task. Moreover, a triplet attention mechanism is added after the first convolutional layer of the backbone network to optimize the utilization of the geometric elements extracted by that layer. Finally, the local features extracted by the backbone network are aggregated to form image descriptors for geolocalization. The proposed method was evaluated on several mainstream datasets, such as Pittsburgh30K, Tokyo24/7, and Places365. The results show that the proposed method has excellent size compatibility and compares favorably with recent mainstream geolocalization methods.
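
A hedged sketch of the resizer idea (the paper combines bilinear interpolation with SKNet and SENet branches; only a simplified SE-style channel-attention branch is shown, with made-up channel sizes):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Simplified learnable resizer: bilinear base resize plus a squeeze-and-
# excitation branch that re-weights contour-bearing channels, fused residually.
class LearnableResizer(nn.Module):
    def __init__(self, channels=16, out_size=(224, 224)):
        super().__init__()
        self.out_size = out_size
        self.conv = nn.Conv2d(3, channels, 3, padding=1)
        # SE gate: global pooling + bottleneck MLP produces per-channel weights.
        self.se = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 4, 1), nn.ReLU(),
            nn.Conv2d(channels // 4, channels, 1), nn.Sigmoid(),
        )
        self.out = nn.Conv2d(channels, 3, 3, padding=1)

    def forward(self, x):
        base = F.interpolate(x, self.out_size, mode="bilinear",
                             align_corners=False)
        f = F.relu(self.conv(base))
        f = f * self.se(f)            # emphasize channels with strong contours
        return base + self.out(f)     # fuse enhanced features with base resize

img = torch.randn(1, 3, 393, 517)     # arbitrary-sized input
print(LearnableResizer()(img).shape)  # -> torch.Size([1, 3, 224, 224])
```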

Keywords: image geolocalization, self-attention mechanism, image resizer, geometric feature

Procedia PDF Downloads 197
16112 Retrospective Reconstruction of Time Series Data for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

The development, operation, and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability concerns of every region, and the features of such systems have a great influence on all components of sustainability. In order to optimize the processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modelling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modelling the sustainability and operational effectiveness of a particular IWMS is not in the scope of the present research. The complexity of the systems and the large number of variables require a complex approach to model the outcomes and future risks, one able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modelling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix containing all the factors affecting the system in focus, with all their interconnections. The other is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop such time series by content analysis.
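
For orientation, the standard FCM dynamics that consume these two inputs can be sketched as follows (generic FCM machinery, not the authors' waste-management model; the connection matrix and initial activations are made up):

```python
import numpy as np

# Hedged sketch of a Fuzzy Cognitive Map update: factor activations evolve by
# weighted influences plus a squashing function until a steady state emerges.
W = np.array([                 # made-up connection matrix, W[j, i]: factor j -> i
    [0.0,  0.6, -0.3],
    [0.4,  0.0,  0.5],
    [-0.2, 0.3,  0.0],
])
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

a = np.array([0.5, 0.2, 0.8])  # initial activations (start of the time series)
for _ in range(50):
    a_next = sigmoid(a + a @ W)   # self-memory term plus incoming influences
    if np.allclose(a_next, a, atol=1e-6):
        break                     # fixed point: the scenario's steady state
    a = a_next
print("steady-state activations:", a.round(3))
```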

Keywords: content analysis, factors, integrated waste management system, time series

Procedia PDF Downloads 313
16111 Anti-Corruption in Adverse Contexts: A Strategic Approach

Authors: Mushtaq H. Khan, Antonio Andreoni, Pallavi Roy

Abstract:

Developing countries are characterized by political settlements where formal rules are generally weakly enforced and widely violated. Conventional anti-corruption strategies that focus on improving the general enforcement of the rule of law and raising the costs of corruption facing individual public officials have typically delivered poor results in these contexts. Our alternative approach is to identify anti-corruption strategies that have a high impact and that are feasible to implement in these contexts, working from the bottom up: identifying the characteristics of the corruption constraining particular development outcomes. By drawing on theories of rents and rent seeking and theories of political settlements, we can assess the developmental impact of particular anti-corruption strategies and the feasibility of implementing them. We argue that feasible anti-corruption in these contexts cannot be based solely on conventional anti-corruption strategies. In societies with widespread rule violations, high-impact anti-corruption is only likely to be feasible if the overall strategy succeeds in aligning the interests and capabilities of powerful organizations at the sectoral level to support the enforcement of particular sets of rules. We examine four related strategies for changing these incentives and capabilities of critical stakeholders at the local or sectoral level, and we argue that this can provide a framework for organizing research on the impact and feasibility of anti-corruption activities in different priority areas in particular countries.

Keywords: anti-corruption, development, political settlements analysis, rule of law

Procedia PDF Downloads 395
16110 Dynamic Fault Tree Analysis of Dynamic Positioning System through Monte Carlo Approach

Authors: A. S. Cheliyan, S. K. Bhattacharyya

Abstract:

Dynamic Positioning Systems (DPS) are employed in marine vessels of the offshore oil and gas industry. A DPS is a computer-controlled system that automatically maintains a ship's position and heading by using its own thrusters. Its reliability can be assessed through a conventional fault tree. However, complex behaviours such as sequence-dependent failure, redundancy management, and the priority of failing events cannot be analyzed by conventional fault trees. The Dynamic Fault Tree (DFT) addresses these shortcomings of the conventional fault tree by defining additional gates, called dynamic gates. A Monte Carlo based simulation approach has been adopted for the dynamic gates. This method of realistic modeling of DPS gives meaningful insight into the system reliability and the ability to improve it.
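
A hedged Monte Carlo sketch of one dynamic gate, the priority-AND (PAND), which signals failure only if component A fails before component B; the failure rates and mission time are assumptions, not DPS data.

```python
import numpy as np

# Monte Carlo evaluation of a PAND gate: fail iff A fails before B, and B
# fails within the mission time. Rates and mission length are made up.
rng = np.random.default_rng(1)
n, mission = 100_000, 8760.0           # trials; hours (one year, assumed)
lam_a, lam_b = 1e-4, 2e-4              # assumed failure rates per hour

t_a = rng.exponential(1 / lam_a, n)    # sampled failure times of A
t_b = rng.exponential(1 / lam_b, n)    # sampled failure times of B
pand_fail = (t_a < t_b) & (t_b < mission)   # A first, both within mission
print("PAND failure probability:", pand_fail.mean())
```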

Keywords: dynamic positioning system, dynamic fault tree, Monte Carlo simulation, reliability assessment

Procedia PDF Downloads 756
16109 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data plays a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates at different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibits a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
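
A hedged sketch of the two denoising steps described (stacking repeated transients, then wavelet shrinkage); the decay curve, noise level, wavelet choice, and threshold rule are assumptions for illustration.

```python
import numpy as np
import pywt

# Signal averaging followed by wavelet shrinkage on a synthetic TEM transient.
rng = np.random.default_rng(0)
t = np.logspace(-5, -2, 256)                     # decay-time gates, seconds
clean = 1e-6 * t ** -1.5                         # idealized TEM decay curve
stack = clean + rng.normal(0.0, 5.0, (64, 256))  # 64 noisy repeated transients

avg = stack.mean(axis=0)                         # averaging: SNR grows ~ sqrt(64)

coeffs = pywt.wavedec(avg, "db4", level=4)       # discrete wavelet decomposition
sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate from finest scale
thr = sigma * np.sqrt(2 * np.log(avg.size))      # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")[:avg.size]
print("residual std before/after:", np.std(avg - clean), np.std(denoised - clean))
```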

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 73
16108 Music Note Detection and Dictionary Generation from Music Sheet Using Image Processing Techniques

Authors: Muhammad Ammar, Talha Ali, Abdul Basit, Bakhtawar Rajput, Zobia Sohail

Abstract:

Music note detection has been an area of study for the past few years and has its own influence on music file generation from sheet music. We propose a method to detect music notes on sheet music using basic thresholding and blob detection. Subsequently, we create a notes dictionary using a semi-supervised learning approach: after note detection, for each test image, the new symbols are added to the dictionary, making the note detection semi-automatic. The experiments were done on images from a dataset and also on captured images. The developed approach showed almost 100% accuracy on the dataset images, whereas varying results were seen on captured images.
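
A hedged sketch of the detection step (basic thresholding plus blob detection); the file name and parameter values are assumptions, not the paper's settings.

```python
import cv2

# Thresholding + blob detection on a sheet-music image (illustrative only).
img = cv2.imread("sheet.png", cv2.IMREAD_GRAYSCALE)   # hypothetical input file

# Basic thresholding: note heads stay dark (0) on a white background.
_, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)

params = cv2.SimpleBlobDetector_Params()
params.filterByArea = True
params.minArea, params.maxArea = 20.0, 300.0   # rough note-head size range
params.filterByCircularity = True
params.minCircularity = 0.6                    # note heads are roughly elliptical

detector = cv2.SimpleBlobDetector_create(params)   # finds dark blobs by default
keypoints = detector.detect(binary)
print(f"detected {len(keypoints)} candidate note heads")
# Each keypoint's y-coordinate relative to the staff lines maps to a pitch;
# patches cropped around keypoints would populate the semi-supervised dictionary.
```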

Keywords: music note, sheet music, optical music recognition, blob detection, thresholding, dictionary generation

Procedia PDF Downloads 160
16107 Investigations on Pyrolysis Model for Radiatively Dominant Diesel Pool Fire Using Fire Dynamics Simulator

Authors: Siva K. Bathina, Sudheer Siddapureddy

Abstract:

Pool fires are formed when a flammable liquid accidentally spills on the ground or water and ignites. A pool fire is a buoyancy-driven diffusion flame. Many pool fire accidents have occurred during the processing, handling, and storing of liquid fuels in the chemical and oil industries. Such accidents cause enormous damage to property as well as loss of life. Pool fires are complex in nature due to the strong interaction among combustion, heat and mass transfer, and pyrolysis at the fuel surface. Moreover, the experimental study of such large complex fires involves fire safety issues and difficulties in performing experiments. In the present work, large eddy simulations are performed to study such complex fire scenarios using Fire Dynamics Simulator. A 1 m diesel pool fire is considered for the studied cases, diesel being the fuel most commonly involved in fire accidents. Fire simulations are performed with two different boundary conditions: in one, the fuel is in the liquid state and a pyrolysis model is invoked; in the other, the fuel is assumed to be initially in the vapor state and the mass loss rate is prescribed. A domain of size 11.2 m × 11.2 m × 7.28 m with a uniform structured grid is chosen for the numerical simulations. A grid sensitivity analysis is performed, and a non-dimensional grid size of 12, corresponding to an 8 cm cell size, is adopted. Flame properties like the mass burning rate, irradiance, and time-averaged axial flame temperature profile are predicted. The predicted steady-state mass burning rate is 40 g/s and is within the uncertainty limits of previously reported experimental data (39.4 g/s). The profile of irradiance with height at a distance from the fire is somewhat in line with the experimental data, though the location of the maximum irradiance is shifted higher. This may be due to the lack of sophisticated models for species transport along with combustion and radiation in the continuous zone. Furthermore, the axial temperatures are not predicted well (for either boundary condition) in any of the zones. The present study shows that the existing models are not sufficient for modeling blended fuels like diesel. The predictions are strongly dependent on the experimental values of the soot yield. Future experiments are necessary for generalizing the soot yield for different fires.
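
For context, the non-dimensional grid size mentioned here is usually the ratio D*/δx of the characteristic fire diameter to the cell size, a standard FDS resolution check. The sketch below computes it; the heat release rate is an assumption, estimated as burning rate times an assumed heat of combustion of diesel.

```python
# Standard FDS grid-resolution check, D*/dx. The heat release rate is an
# assumption here (burning rate x assumed heat of combustion of diesel).
g, rho, cp, T = 9.81, 1.204, 1.005, 293.0  # ambient air (SI; cp in kJ/(kg.K))
m_dot = 0.040            # predicted steady mass burning rate, kg/s
dH = 44_400.0            # assumed heat of combustion of diesel, kJ/kg
Q = m_dot * dH           # heat release rate, kW (~1.8 MW)

# Characteristic fire diameter: D* = (Q / (rho * cp * T * sqrt(g)))^(2/5)
D_star = (Q / (rho * cp * T * g ** 0.5)) ** 0.4
dx = 0.08                # grid cell size used in the study, m
print(f"D* = {D_star:.2f} m, D*/dx = {D_star / dx:.1f}")
# Compare against the non-dimensional grid size of 12 reported in the study.
```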

Keywords: burning rate, fire accidents, fire dynamics simulator, pyrolysis

Procedia PDF Downloads 177
16106 RPM-Synchronous Non-Circular Grinding: An Approach to Enhance Efficiency in Grinding of Non-Circular Workpieces

Authors: Matthias Steffan, Franz Haas

Abstract:

Grinding is one of the last steps in a value-added manufacturing chain. Within this step, workpiece geometry and surface roughness are determined. Up to this process stage, considerable costs and energy have already been spent on the components. According to the current state of the art, large safety reserves are therefore calculated in order to guarantee process capability. Especially for non-circular grinding, this leads to considerable losses of process efficiency. With present technology, the various non-circular geometries on a workpiece must be ground one after another in an oscillating process in which the X- and Q-axes of the machine are coupled. With the approach of RPM-synchronous non-circular grinding, such workpieces can be machined in an ordinary plunge grinding process, with the rotational rates of the workpiece and the grinding wheel held in a fixed ratio. A non-circular grinding wheel is used to transfer its geometry onto the workpiece. The authors use a worldwide unique machine tool that was especially designed for this technology. Very high revolution rates on the workpiece spindle (up to 4500 rpm) are mandatory for the success of this grinding process. The grinding is performed in a two-step process. For roughing, a highly porous vitrified-bonded grinding wheel with a medium grain size is used. It ensures high specific material removal rates for efficiently producing the non-circular geometry on the workpiece. This process step is adaptively controlled by a force control algorithm, which uses data acquired from a three-component force sensor located in the dead centre of the tailstock. For finishing, a grinding wheel with a fine grain size is used. Roughing and finishing are performed consecutively within the same clamping of the workpiece, using two locally separated grinding spindles. The approach of RPM-synchronous non-circular grinding shows great efficiency enhancement in non-circular grinding. For the first time, three-dimensional non-circular shapes can be ground, which opens up various fields of application. The automotive industry in particular shows great interest in this emerging finishing technology.

Keywords: efficiency enhancement, finishing machining, non-circular grinding, rpm-synchronous grinding

Procedia PDF Downloads 266
16105 Integrated Clean Development Mechanism and Risk Management Approach for Infrastructure Transportation Project

Authors: Debasis Sarkar

Abstract:

The clean development mechanism (CDM) can act as an effective instrument for mitigating climate change, effectively reducing emissions of CO2 and other greenhouse gases (GHG). The construction of a mega infrastructure project, like an underground corridor for metro rail operation, involves the consumption of a substantial quantity of concrete, which in turn consumes huge quantities of energy-intensive materials like cement and steel. This paper is an attempt to develop an integrated clean development mechanism and risk management approach for the sustainable development of an underground corridor metro rail project in India during its construction phase. It was observed that about a 35% reduction in CO2 emissions can be obtained by adding fly ash as a partial replacement for cement. The reduced CO2 emissions, about 21,646.36 MT, would result in cost savings of approximately INR 8.5 million (USD 129,878). However, the construction and operation of such infrastructure projects are subject to huge risks and uncertainties throughout all phases of the project, reducing the probability of successful completion within the stipulated time and cost frame. Thus, an integrated approach combining CDM with risk management would enable the metro rail authorities to develop a sustainable risk mitigation framework to ensure greater cost and energy savings and smaller time and cost overruns.
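
A back-of-the-envelope check of the reported figures (the implied carbon price and exchange rate are inferences from the abstract's own numbers, not stated inputs):

```python
# Consistency check of the abstract's reported figures; the implied carbon
# price and FX rate are inferred, not independently stated inputs.
co2_reduced_mt = 21_646.36          # metric tonnes of CO2 avoided
savings_inr = 8_500_000             # reported savings, INR
savings_usd = 129_878               # reported savings, USD

print("implied price: INR %.0f per tonne CO2" % (savings_inr / co2_reduced_mt))
print("implied FX rate: INR %.1f per USD" % (savings_inr / savings_usd))
```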

Keywords: clean development mechanism (CDM), infrastructure transportation, project risk management, underground metro rail

Procedia PDF Downloads 463
16104 Machine That Provides Mineral Fertilizer Equal to the Soil on the Slopes

Authors: Huseyn Nuraddin Qurbanov

Abstract:

A reliable food supply for the population of the republic is one of the main directions of the state's economic policy, and grain growing, the basis of agriculture, is important in this area. In the cultivation of cereals on slopes, the application of equal amounts of mineral fertilizer under the soil before sowing is a very important technological process. The low level of technical equipment in this area prevents producers from supplying the country with the necessary quality of cereals. Experience in the operation of modern technical means has shown that, at present, there is a need to apply an equal amount of fertilizer under the soil on slopes in full compliance with agro-technical requirements. No fundamental changes have been made to the industrial machines that place fertilizer under the soil, and fertilizer has been applied unequally under the soil on slopes. This technological shortcoming leads to the destruction of new seedlings and reduced productivity, owing to frost intolerance during winter for plants sown in autumn. In specific climatic conditions, there is an optimal fertilization rate for each agricultural product, and the placement of fertilizers in the soil is one of the conditions that increases their efficiency in the field. The development of a new technical proposal for fertilizing and ploughing slopes in equal amounts, with improved technological and design parameters and consideration of the physical and mechanical properties of fertilizers, is therefore very important. Taking the above into account, a combined plough was developed in our laboratory. The combined plough carries out the pre-sowing technological operation in the cultivation of cereals, providing a smooth, equal amount of mineral fertilizer under the soil on slopes. Mathematical models of a smooth spreader that evenly distributes fertilizer in the field have been developed. Diagrams and graphs of the distribution over the eight sections of the smooth spreader were constructed for the inclination angles of the slopes. The percentage of equal distribution in the field and the productivity were established by practical and theoretical analysis.

Keywords: combined plough, mineral fertilizer, equal sowing, fertilizer norm, grain-crops, sowing fertilizer

Procedia PDF Downloads 124
16103 Deep-Learning Based Approach to Facial Emotion Recognition through Convolutional Neural Network

Authors: Nouha Khediri, Mohammed Ben Ammar, Monji Kherallah

Abstract:

Recently, facial emotion recognition (FER) has become increasingly essential to understanding the state of the human mind, yet accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER, benefiting from deep learning, especially CNNs and VGG16. First, the data are pre-processed with data cleaning and data rotation. Then, we augment the data and proceed to our FER model, which contains five convolutional layers and five pooling layers. Finally, a softmax classifier is used in the output layer to recognize emotions. This paper also reviews work on facial emotion recognition based on deep learning. Experiments show that our model outperforms the other methods using the same FER2013 database and yields a recognition rate of 92%. We also put forward some suggestions for future work.
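
A minimal sketch of the stated architecture (five convolution layers, five pooling layers, softmax output); the filter counts and dense width are assumptions, while the 48x48 grayscale input and seven classes follow the FER2013 format.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Five conv + pool blocks and a softmax head, per the abstract's description.
# Filter counts are assumptions; FER2013 images are 48x48 grayscale, 7 classes.
model = keras.Sequential()
model.add(keras.Input(shape=(48, 48, 1)))
for filters in (32, 64, 128, 256, 256):
    model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
    model.add(layers.MaxPooling2D(2))
model.add(layers.Flatten())
model.add(layers.Dense(128, activation="relu"))
model.add(layers.Dense(7, activation="softmax"))   # seven FER2013 emotions

model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```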

Keywords: CNN, deep-learning, facial emotion recognition, machine learning

Procedia PDF Downloads 75
16102 Design of a Dual-Band Band-Pass Filter Using Stepped Impedance

Authors: Fawzia Al-Sakeer, Hassan Aldeeb

Abstract:

Development in the communications field is proceeding at an amazing speed, which has led researchers to improve and develop electronic circuits by increasing their efficiency and reducing their size, in order to reduce the weight of electronic devices. One of the most important of these circuits is the band-pass filter, which motivated this research. The aim is to design a dual-band band-pass filter using a stepped-impedance microstrip transmission line. We designed a filter that works at two center frequency bands using the ADS program, and the results were excellent: we obtained the two design frequencies of 1 and 3 GHz and a return loss (S11) of more than 21 dB, within a small area.

Keywords: band pass filter, dual band band-pass filter, ADS, microstrip filter, stepped impedance

Procedia PDF Downloads 49
16101 An Optimal Steganalysis Based Approach for Embedding Information in Image Cover Media with Security

Authors: Ahlem Fatnassi, Hamza Gharsellaoui, Sadok Bouamama

Abstract:

This paper deals with the fields of steganography and steganalysis. Steganography involves hiding information in a cover media to obtain the stego media, in such a way that the cover media is perceived not to have any embedded message for its unintended recipients. Steganalysis is the mechanism of detecting the presence of hidden information in the stego media, and it can lead to the prevention of disastrous security incidents. In this paper, we provide a critical review of the steganalysis algorithms available to analyze the characteristics of an image stego media against the corresponding cover media and to understand the process of embedding the information and its detection. We anticipate that this paper can also give a clear picture of the current trends in steganography, so that appropriate steganalysis algorithms can be developed and improved.

Keywords: optimization, heuristics and metaheuristics algorithms, embedded systems, low-power consumption, steganalysis heuristic approach

Procedia PDF Downloads 280
16100 Surface to the Deeper: A Universal Entity Alignment Approach Focusing on Surface Information

Authors: Zheng Baichuan, Li Shenghui, Li Bingqian, Zhang Ning, Chen Kai

Abstract:

Entity alignment (EA) in knowledge graphs often plays a pivotal role in knowledge graph integration, where structural differences commonly exist between the source and target graphs, such as the presence or absence of attribute information and the types of attribute information (text, timestamps, images, etc.). However, most current research efforts focus on improving alignment accuracy, often at the cost of an increased reliance on specific structures, a dependency that inevitably diminishes their practical value and causes difficulties when facing knowledge graph alignment tasks with varying structures. Therefore, we propose a universal knowledge graph alignment approach that utilizes only the common basic structures shared by knowledge graphs. We have demonstrated through experiments that our method achieves state-of-the-art performance in fair comparisons.

Keywords: knowledge graph, entity alignment, transformer, deep learning

Procedia PDF Downloads 29
16099 Probabilistic Approach to Contrast Theoretical Predictions from a Public Corruption Game Using Bayesian Networks

Authors: Jaime E. Fernandez, Pablo J. Valverde

Abstract:

This paper presents a methodological approach that aims to contrast and validate theoretical results from a corruption network game through probabilistic analysis of simulated microdata using Bayesian Networks (BNs). The research develops a public corruption model in a game theory framework. Theoretical results suggest a series of 'optimal settings' of the model's exogenous parameters that boost the emergence of corruption. The paper contrasts these outcomes with probabilistic inference results based on BNs fitted to simulated microdata. Principal findings indicate that probabilistic reasoning based on BNs significantly improves parameter specification and causal analysis in a public corruption game.
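
A hedged sketch of the methodology (fit a discrete BN to simulated microdata, then query corruption probabilities under given parameter settings); the variable names, network structure, and tiny dataset are all hypothetical.

```python
import pandas as pd
from pgmpy.models import BayesianNetwork
from pgmpy.estimators import MaximumLikelihoodEstimator
from pgmpy.inference import VariableElimination

# Hypothetical microdata: binary game parameters and a corruption outcome.
data = pd.DataFrame({
    "penalty":    [0, 0, 1, 1, 0, 1, 0, 1, 1, 0],
    "monitoring": [0, 1, 0, 1, 1, 1, 0, 0, 1, 0],
    "corruption": [1, 1, 1, 0, 0, 0, 1, 1, 0, 1],
})
bn = BayesianNetwork([("penalty", "corruption"), ("monitoring", "corruption")])
bn.fit(data, estimator=MaximumLikelihoodEstimator)

inference = VariableElimination(bn)
posterior = inference.query(["corruption"],
                            evidence={"penalty": 1, "monitoring": 1})
print(posterior)   # P(corruption | high penalty, high monitoring)
```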

Keywords: Bayesian networks, probabilistic reasoning, public corruption, theoretical games

Procedia PDF Downloads 192