Search results for: database approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 15202


13252 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups

Authors: Naushad Mamode Khan

Abstract:

The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs), as it handles equi-, over-, and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify and thus restricts likelihood-based estimation. The joint generalized quasi-likelihood approach (GQL-I) was instead considered, but it is rather computationally intensive and may even fail to estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments and is shown to yield estimates as efficient as those of GQL-I while being far more computationally stable.
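
For reference, the generic GQL estimating equation that all three variants build on can be written in its standard textbook form (the single-score construction specific to GQL-III is not reproduced here): β̂ solves Σᵢ (∂μᵢᵀ/∂β) Σᵢ⁻¹ (yᵢ − μᵢ) = 0, where yᵢ is the vector of repeated CMP counts for subject i, μᵢ its mean under the INAR(1) specification, and Σᵢ the working covariance matrix whose ill-conditioning is the source of the computational instability noted above.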

Keywords: longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL

Procedia PDF Downloads 355
13251 Financial Assets Return, Economic Factors and Investor's Behavioral Indicators Relationships Modeling: A Bayesian Networks Approach

Authors: Nada Souissi, Mourad Mroua

Abstract:

The main purpose of this study is to examine the interaction between financial asset volatility, economic factors, and investors' behavioral indicators related to both company and market stocks for the period from January 2000 to January 2020. Using multiple linear regression and Bayesian Network modeling, the results show both positive and negative relationships between the investors' psychology index, economic factors, and predicted stock market returns. We reveal that the application of the discrete Bayesian Network helps identify the different cause-and-effect relationships between all the economic and financial variables and the psychology index.
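
A minimal sketch of the two modelling steps described above, written in Python with statsmodels and the pgmpy library; the variable names, discretisation, and network structure below are placeholders, not the ones used in the study.

```python
import pandas as pd
import statsmodels.api as sm
from pgmpy.models import BayesianNetwork
from pgmpy.estimators import MaximumLikelihoodEstimator
from pgmpy.inference import VariableElimination

# Hypothetical monthly data: psychology index, economic factors, market return.
df = pd.read_csv("market_data.csv")  # placeholder columns: psych_index, gdp_growth, inflation, market_return

# Step 1: multiple linear regression of returns on the candidate drivers.
X = sm.add_constant(df[["psych_index", "gdp_growth", "inflation"]])
print(sm.OLS(df["market_return"], X).fit().summary())

# Step 2: discretise the variables and fit a discrete Bayesian network.
disc = df.apply(lambda s: pd.qcut(s, 3, labels=["low", "mid", "high"]))
bn = BayesianNetwork([("gdp_growth", "market_return"),
                      ("inflation", "market_return"),
                      ("psych_index", "market_return")])  # assumed structure
bn.fit(disc, estimator=MaximumLikelihoodEstimator)

# Query the predicted return distribution given a pessimistic psychology index.
infer = VariableElimination(bn)
print(infer.query(["market_return"], evidence={"psych_index": "low"}))
```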

Keywords: Financial asset return predictability, Economic factors, Investor's psychology index, Bayesian approach, Probabilistic networks, Parametric learning

Procedia PDF Downloads 150
13250 Solving Flowshop Scheduling Problems with Ant Colony Optimization Heuristic

Authors: Arshad Mehmood Ch, Riaz Ahmad, Imran Ali Ch, Waqas Durrani

Abstract:

This study deals with the application of the Ant Colony Optimization (ACO) approach to solve the no-wait flowshop scheduling problem (NW-FSSP). The ACO algorithm so developed has been coded in Matlab. The paper covers detailed steps to apply ACO and focuses on judging the strength of ACO in relation to other solution techniques previously applied to solve the no-wait flowshop problem. The general-purpose approach was able to find reasonably accurate solutions for almost all the problems under consideration and was able to handle a fairly large spectrum of problems with far less CPU effort. Careful scrutiny of the results reveals that the presented algorithm performs better than other approaches, such as genetic algorithm and tabu search heuristics, earlier applied to solve the NW-FSSP data sets.
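
A compact sketch of the core ACO loop for the no-wait flowshop, shown in Python rather than the authors' Matlab code; the job data and parameter values are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
p = rng.integers(1, 20, size=(6, 3)).astype(float)  # processing times: 6 jobs x 3 machines
n = len(p)

# Minimum start-time delay between two consecutive jobs in a no-wait flowshop.
def delay(i, j):
    return max(p[i, :k + 1].sum() - p[j, :k].sum() for k in range(p.shape[1]))

def makespan(seq):
    return sum(delay(a, b) for a, b in zip(seq, seq[1:])) + p[seq[-1]].sum()

D = np.array([[delay(i, j) if i != j else 1.0 for j in range(n)] for i in range(n)])
tau = np.ones((n, n))          # pheromone trails
eta = 1.0 / D                  # heuristic desirability (small delay = attractive)
alpha, beta, rho, ants, iters = 1.0, 2.0, 0.1, 10, 100

best_seq, best_cmax = None, np.inf
for _ in range(iters):
    for _ in range(ants):
        seq = [rng.integers(n)]
        while len(seq) < n:
            cand = [j for j in range(n) if j not in seq]
            w = np.array([tau[seq[-1], j] ** alpha * eta[seq[-1], j] ** beta for j in cand])
            seq.append(cand[rng.choice(len(cand), p=w / w.sum())])
        cmax = makespan(seq)
        if cmax < best_cmax:
            best_seq, best_cmax = seq, cmax
    tau *= (1 - rho)                              # evaporation
    for a, b in zip(best_seq, best_seq[1:]):      # deposit along the best sequence
        tau[a, b] += 1.0 / best_cmax

print(best_seq, best_cmax)
```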

Keywords: no-wait, flowshop, scheduling, ant colony optimization (ACO), makespan

Procedia PDF Downloads 435
13249 Convectory Policing: Reconciling Historic and Contemporary Models of Police Service Delivery

Authors: Mark Jackson

Abstract:

Description: This paper is based on a theoretical analysis of the efficacy of the dominant model of policing in western jurisdictions. Those results are then compared with a similar analysis of a traditional reactive model. It is found that neither model provides for optimal delivery of services. Instead, optimal service can be achieved by a synchronous hybrid model, termed the Convectory Policing approach. Methodology and Findings: For over three decades, problem-oriented policing (PO) has been the dominant model for western police agencies. Initially based on the work of Goldstein during the 1970s, the problem-oriented framework has spawned endless variants and approaches, most of which embrace a problem-solving rather than a reactive approach to policing. This has included the Area Policing Concept (APC) applied in many smaller jurisdictions in the USA, the Scaled Response Policing Model (SRPM) currently under trial in Western Australia, and the Proactive Pre-Response Approach (PPRA), which has also seen some success. All of these, in some way or another, are largely based on a model that eschews a traditional reactive model of policing. Convectory Policing (CP) is an alternative model which challenges the underpinning assumptions that have seen the proliferation of the PO approach in the last three decades, and it commences by questioning the economics on which PO is based. It is argued that, in essence, PO relies on an unstated, and often unrecognised, assumption that resources will be available to meet demand for policing services, while at the same time maintaining the capacity to deploy staff to develop solutions to the problems which were ultimately manifested in those same calls for service. The CP model relies on observations from numerous western jurisdictions to challenge the validity of that underpinning assumption, particularly in a fiscally tight environment. In deploying staff to pursue and develop solutions to underpinning problems, there is clearly an opportunity cost: those same staff cannot be allocated to alternative duties while engaged in a problem-solution role. At the same time, resources in use responding to calls for service are unavailable, while committed to that role, to pursue solutions to the problems giving rise to those same calls for service. The two approaches, reactive and PO, are therefore dichotomous: one cannot be optimised while the other is being pursued. Convectory Policing is a pragmatic response to the schism between the competing traditional and contemporary models. If it is not possible to serve either model with any real rigour, it becomes necessary to tailor an approach to deliver specific outcomes against which success or otherwise might be measured. CP proposes that a structured, roster-driven approach to calls for service, combined with the application of what is termed a resource-effect response capacity, has the potential to resolve the inherent conflict between traditional and contemporary models of policing and the expectations of the community in terms of community-policing-based problem-solving models.

Keywords: policing, reactive, proactive, models, efficacy

Procedia PDF Downloads 484
13248 Quantum Modelling of AgHMoO4, CsHMoO4 and AgCsMoO4 Chemistry in the Field of Nuclear Power Plant Safety

Authors: Mohamad Saab, Sidi Souvi

Abstract:

In a major nuclear accident, the released fission products (FPs) and the structural materials are likely to influence the transport of iodine in the reactor coolant system (RCS) of a pressurized water reactor (PWR). So far, the thermodynamic data on cesium and silver species used to estimate the magnitude of FP release show some discrepancies; the data are scarce and not reliable. For this reason, it is crucial to review the thermodynamic values related to cesium and silver materials. To this end, we have used state-of-the-art quantum chemical methods to compute the formation enthalpies and entropies of AgHMoO₄, CsHMoO₄, and AgCsMoO₄ in the gas phase. Different quantum chemical methods have been investigated (DFT and CCSD(T)) in order to predict the geometrical parameters and the energetics, including the correlation energy. The geometries were optimized with the TPSSh-5%HF method, followed by a single-point calculation of the total electronic energies using the CCSD(T) wave function method. We thus propose standard enthalpies of formation of AgHMoO₄, CsHMoO₄, and AgCsMoO₄ with a final uncertainty of about 2 kJ mol⁻¹.

Keywords: nuclear accident, ASTEC code, thermochemical database, quantum chemical methods

Procedia PDF Downloads 189
13247 Reliable Method for Estimating Rating Curves in Natural Rivers

Authors: Arash Ahmadi, Amirreza Kavousizadeh, Sanaz Heidarzadeh

Abstract:

The stage-discharge curve is one of the conventional methods for continuous river flow measurement. In this paper, an innovative approach is proposed for predicting the stage-discharge relationship using the application of isovel contours. Using the proposed method, it is possible to estimate the stage-discharge curve for the whole section using discharge information from just one arbitrary water level. For this purpose, multivariate relationships are used to determine the mean velocity in a cross-section. The unknown exponents of the proposed relationship have been obtained by using the second version of the Strength Pareto Evolutionary Algorithm (SPEA2), and the appropriate equation was selected by applying the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) approach. Results showed a close agreement between the estimated and observed data in the different cross-sections.
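
To illustrate the TOPSIS step used to select the preferred equation, a minimal implementation is sketched below; the decision matrix, criteria, and weights are hypothetical, not the values used in the study.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) against criteria (columns) with TOPSIS.
    benefit[j] is True if larger values of criterion j are better."""
    M = np.asarray(matrix, dtype=float)
    W = np.asarray(weights, dtype=float) / np.sum(weights)
    norm = M / np.sqrt((M ** 2).sum(axis=0))          # vector normalisation
    V = norm * W                                       # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_minus / (d_plus + d_minus)                # closeness coefficient

# Hypothetical example: 3 candidate rating-curve equations scored on
# RMSE (cost), Nash-Sutcliffe efficiency (benefit) and bias (cost).
scores = topsis([[0.12, 0.91, 0.05],
                 [0.10, 0.93, 0.08],
                 [0.15, 0.88, 0.02]],
                weights=[0.4, 0.4, 0.2],
                benefit=[False, True, False])
print(scores.argsort()[::-1])  # alternatives ranked best-first
```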

Keywords: rating curves, SPEA2, natural rivers, bed roughness distribution

Procedia PDF Downloads 159
13246 Streamlining Cybersecurity Risk Assessment for Industrial Control and Automation Systems: Leveraging the National Institute of Standards and Technology’s Risk Management Framework (RMF) Using Model-Based System Engineering (MBSE)

Authors: Gampel Alexander, Mazzuchi Thomas, Sarkani Shahram

Abstract:

The cybersecurity landscape is constantly evolving, and organizations must adapt to the changing threat environment to protect their assets. The implementation of the NIST Risk Management Framework (RMF) has become critical in ensuring the security and safety of industrial control and automation systems. However, cybersecurity professionals are facing challenges in implementing the RMF, leading to systems operating without authorization and being non-compliant with regulations. The current approach to RMF implementation, based on business practices, is limited and insufficient, leaving organizations vulnerable to cyberattacks that result in the loss of personal consumer data and critical infrastructure details. To address these challenges, this research proposes a Model-Based Systems Engineering (MBSE) approach to implementing cybersecurity controls and assessing risk through the RMF process. The study emphasizes the need to shift to a modeling approach, which can streamline the RMF process and eliminate the bloated structures that make it difficult to receive an Authorization-To-Operate (ATO). The study focuses on the practical application of MBSE in industrial control and automation systems to improve the security and safety of operations. It is concluded that MBSE can be used to solve the implementation challenges of the NIST RMF process and that it provides a more effective and efficient method for implementing cybersecurity controls and assessing risk. Future work involves exploring the broader applicability of MBSE to other industries and domains beyond industrial control and automation systems.

Keywords: authorization-to-operate (ATO), industrial control systems (ICS), model-based systems engineering (MBSE), risk management framework (RMF)

Procedia PDF Downloads 95
13245 Emotional Intelligence in the Modern World: A Quantitative and Qualitative Study of the UMCS Students

Authors: Anna Dabrowska

Abstract:

Taking Daniel Goleman’s (1994) belief that success in life depends 20% on IQ and 80% on emotional intelligence, and that it is worth considering emotional intelligence as an important factor in human performance and development potential, the aim of the paper is to explore the range of emotions experienced by university students who represent Society 5.0. This quantitative and qualitative study is meant to explore not only the list of emotions most and least experienced by the students, but also the main reasons behind these feelings. The database of the study consists of 115 respondents out of 129 students of the 1st and 5th year of Applied Linguistics at Maria Curie-Skłodowska University, which constitutes 89% of those surveyed. The data is extracted from an anonymous questionnaire, which comprises young people’s answers and discourse concerning the causes of their most experienced emotions. Following Robert Plutchik’s theory of eight primary emotions, i.e. anger, fear, sadness, disgust, surprise, anticipation, trust, and joy, we adopt his argument for the primacy of these emotions by showing each to be the trigger of behaviour with high survival value. In fact, all other emotions are mixed or derivative states; that is, they occur as combinations, mixtures, or compounds of the primary emotions. Accordingly, the eight primary emotions, and their mixed states, are examined in the study on the students.

Keywords: emotions, intelligence, students, discourse study, emotional intelligence

Procedia PDF Downloads 42
13244 Developing Rice Disease Analysis System on Mobile via iOS Operating System

Authors: Rujijan Vichivanives, Kittiya Poonsilp, Canasanan Wanavijit

Abstract:

This research aims to create mobile tools to analyze rice diseases quickly and easily. The principles of object-oriented software engineering and the Objective-C language were used for the software development methodology, and the decision tree technique was used as the analysis method. Application users can select the features of a rice disease or the color that appears on the rice leaves and view the recognition results on the iOS mobile screen. After completing the software development, unit testing and integration testing were used to check program validity. In addition, three plant experts and forty farmers assessed the usability and benefit of this system. Overall user satisfaction was found to be at a good level (57%). The plant experts suggested adding various disease symptoms to the database for more precise analysis results. For further research, it is suggested that an image processing system should be developed as a tool that allows users to search for and analyze rice diseases more conveniently and with greater accuracy.
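
A minimal sketch of the decision-tree analysis step, shown here in Python with scikit-learn rather than the Objective-C used in the app; the feature encoding and disease labels are made-up placeholders.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical training records: encoded leaf colour and lesion features -> disease label.
# Features: [lesion_colour, lesion_shape, leaf_part] encoded as small integers.
X = [[0, 1, 0], [0, 1, 1], [1, 0, 0], [1, 2, 1], [2, 2, 0], [2, 0, 1]]
y = ["brown_spot", "brown_spot", "leaf_blast", "leaf_blast", "bacterial_blight", "bacterial_blight"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["lesion_colour", "lesion_shape", "leaf_part"]))

# Classify a new observation selected by the user in the app.
print(tree.predict([[1, 2, 0]]))
```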

Keywords: rice disease, data analysis system, mobile application, iOS operating system

Procedia PDF Downloads 287
13243 Practical Methods for Automatic MC/DC Test Cases Generation of Boolean Expressions

Authors: Sekou Kangoye, Alexis Todoskoff, Mihaela Barreau

Abstract:

Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that aims to prove that every condition involved in a Boolean expression can influence the result of that expression. In the automotive context, MC/DC is highly recommended and even required for testing most security- and safety-related applications. However, due to the complex Boolean expressions that are often embedded in those applications, generating a set of MC/DC-compliant test cases for any of these expressions is a nontrivial task and can be time-consuming for testers. In this paper, we present an approach to automatically generate MC/DC test cases for any Boolean expression. We introduce novel techniques, essentially based on binary trees, to quickly and optimally generate MC/DC test cases for the expressions. Thus, the approach can be used to reduce the manual testing effort of testers.
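
As a brute-force reference for what an MC/DC-compliant set must contain, the sketch below searches, for each condition, for a pair of test vectors that differ only in that condition and flip the decision outcome; the paper's binary-tree technique avoids this exhaustive enumeration, and the example expression is made up.

```python
from itertools import product

def mcdc_pairs(decision, n_conditions):
    """For each condition, find (if any) an independence pair of test vectors
    that differ only in that condition and change the decision outcome."""
    pairs = {}
    vectors = list(product([False, True], repeat=n_conditions))
    for i in range(n_conditions):
        for v in vectors:
            w = tuple(not b if k == i else b for k, b in enumerate(v))
            if decision(*v) != decision(*w):
                pairs[i] = (v, w)
                break
    return pairs

# Example decision with three conditions: (A and B) or C.
decision = lambda a, b, c: (a and b) or c
for cond, (v, w) in mcdc_pairs(decision, 3).items():
    print(f"condition {cond}: {v} vs {w}")
```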

Keywords: binary trees, MC/DC, test case generation, nontrivial task

Procedia PDF Downloads 447
13242 River Habitat Modeling for the Entire Macroinvertebrate Community

Authors: Pinna Beatrice, Laini Alex, Negro Giovanni, Burgazzi Gemma, Viaroli Pierluigi, Vezza Paolo

Abstract:

Habitat models rarely consider macroinvertebrates as ecological targets in rivers. Available approaches mainly focus on single macroinvertebrate species, not addressing the ecological needs and functionality of the entire community. This research aimed to provide an approach to model the habitat of the macroinvertebrate community. The approach is based on the recently developed Flow-T index, together with a Random Forest (RF) regression, which is employed to apply the Flow-T index at the meso-habitat scale. Using different datasets gathered from both field data collection and 2D hydrodynamic simulations, the model was calibrated in the Trebbia River (2019 campaign) and then validated in the Trebbia, Taro, and Enza Rivers (2020 campaign). The three rivers are characterized by a braiding morphology, gravel riverbeds, and summer low flows. The RF model selected 12 mesohabitat descriptors as important for the macroinvertebrate community. These descriptors belong to different frequency classes of water depth, flow velocity, substrate grain size, and connectivity to the main river channel. The cross-validation R² coefficient (R²cv) of the training dataset is 0.71 for the Trebbia River (2019), whereas the R² coefficient for the validation datasets (Trebbia, Taro, and Enza Rivers, 2020) is 0.63. The agreement between the simulated results and the experimental data shows sufficient accuracy and reliability. The outcomes of the study reveal that the model can identify the ecological response of the macroinvertebrate community to possible flow regime alterations and river morphological modifications. Lastly, the proposed approach allows extending the MesoHABSIM methodology, widely used for fish habitat assessment, to a different ecological target community. Further applications of the approach can be related to flow design in both perennial and non-perennial rivers, including river reaches in which fish fauna is absent.
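
A minimal sketch of the Random Forest regression and cross-validation step described above (hypothetical file and column names; the twelve descriptors selected in the study are not reproduced here):

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Hypothetical mesohabitat table: frequency classes of depth, velocity,
# substrate grain size, connectivity, and the Flow-T habitat index as target.
df = pd.read_csv("mesohabitats_trebbia_2019.csv")  # placeholder file name
X = df.drop(columns=["flow_t_index"])
y = df["flow_t_index"]

rf = RandomForestRegressor(n_estimators=500, random_state=0)
print("cross-validated R2:", cross_val_score(rf, X, y, cv=5, scoring="r2").mean())

rf.fit(X, y)
# Descriptor importance, analogous to the selection of the key mesohabitat descriptors.
print(pd.Series(rf.feature_importances_, index=X.columns).sort_values(ascending=False))
```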

Keywords: ecological flows, macroinvertebrate community, mesohabitat, river habitat modeling

Procedia PDF Downloads 94
13241 Detection of New Attacks on Ubiquitous Services in Cloud Computing and Countermeasures

Authors: L. Sellami, D. Idoughi, P. F. Tiako

Abstract:

Cloud computing provides infrastructure to the enterprise through the Internet, allowing access to cloud services anytime and anywhere. This pervasive aspect of the services, the distributed nature of data, and the wide use of information make cloud computing vulnerable to intrusions that violate the security of the cloud. This requires the use of security mechanisms to detect malicious behavior in network communications and hosts, such as intrusion detection systems (IDS). In this article, we focus on the detection of intrusions into the cloud using IDSs. We base ourselves on client authentication in the computing cloud. This technique allows detecting the abnormal use of ubiquitous services and prevents intrusions into cloud computing. It is an approach based on client authentication data. Our IDS provides intrusion detection inside and outside the cloud computing network. It is a double protection approach: security at the user node and global security of the cloud computing network.

Keywords: cloud computing, intrusion detection system, privacy, trust

Procedia PDF Downloads 324
13240 A Neuron Model of Facial Recognition and Detection of an Authorized Entity Using a Machine Learning System

Authors: J. K. Adedeji, M. O. Oyekanmi

Abstract:

This paper critically examines the use of machine learning procedures in curbing unauthorized access to valuable areas of an organization. The use of passwords, PIN codes, and user identification has in recent times been only partially successful in curbing crimes involving identities, hence the need for the design of a system which incorporates biometric characteristics such as DNA and pattern recognition of variations in facial expressions. The facial model used is based on the OpenCV library, which relies on certain physiological features. A Raspberry Pi 3 module is used to compile the OpenCV library, which extracts and stores the detected faces in the datasets directory through the use of a camera. The model is trained with a 50-epoch run on the database and recognized by the Local Binary Pattern Histogram (LBPH) recognizer contained in OpenCV. The training algorithm used by the neural network is backpropagation, coded in Python, with 200 epoch runs to identify specific resemblance in the exclusive OR (XOR) output neurons. The research, however, confirmed that physiological parameters are more effective measures to curb crimes relating to identities.
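
A minimal sketch of the detection-plus-LBPH-recognition pipeline described above, assuming the opencv-contrib-python package (which provides cv2.face); file paths, labels, and the confidence threshold are placeholders.

```python
import cv2
import numpy as np

# Haar cascade for face detection (ships with OpenCV).
cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def face_roi(path):
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    x, y, w, h = cascade.detectMultiScale(gray, 1.3, 5)[0]   # first detected face
    return cv2.resize(gray[y:y + h, x:x + w], (200, 200))

# Training images stored in the datasets directory, one numeric label per person.
faces = [face_roi("datasets/person0_1.jpg"), face_roi("datasets/person1_1.jpg")]
labels = np.array([0, 1])

recognizer = cv2.face.LBPHFaceRecognizer_create()   # requires opencv-contrib-python
recognizer.train(faces, labels)

# Recognise a face captured by the camera; a lower confidence value means a better match.
label, confidence = recognizer.predict(face_roi("capture.jpg"))
print("authorized" if confidence < 60 else "unknown", label, confidence)
```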

Keywords: biometric characteristics, facial recognition, neural network, OpenCV

Procedia PDF Downloads 256
13239 The Effects of Damping Devices on Displacements, Velocities and Accelerations of Structures

Authors: Radhwane Boudjelthia

Abstract:

The most recent earthquakes that occurred in the world, and particularly in Algeria, have killed thousands of people and caused severe damage. The example that is etched in our memory is the earthquake in the regions of Boumerdes and Algiers (Boumerdes earthquake of May 21, 2003). For all the actors involved in the building process, the earthquake is the litmus test for construction. The goal we set ourselves is to contribute to the implementation of a thoughtful approach to the seismic protection of structures. For many engineers, the most conventional approach to protecting works (buildings and bridges) from the effects of earthquakes is to increase rigidity. This approach is not always effective, especially when there is a context that favors the phenomenon of resonance and the amplification of seismic forces. Therefore, the field of earthquake engineering has made significant inroads, catalyzed among other things by the development of computational techniques and the use of powerful test facilities. This has led to the emergence of several innovative technologies, such as the introduction of special isolation devices between the infrastructure and the superstructure. This approach, commonly known as "seismic isolation", absorbs significant seismic forces without the structure being damaged, thus ensuring the protection of lives and property. In addition, the forces imposed on the construction by the ground shaking are concentrated mainly at the supports. With these devices, the natural period of the construction increases, and seismic loads are reduced; thus, there is an attenuation of the seismic movement. Likewise, the base isolation mechanism may be used in combination with earthquake dampers in order to control the deformation of the isolation system and the absolute displacement of the superstructure located above the isolation interface. On the other hand, these earthquake dampers can also be used alone to reduce the oscillation amplitudes and thus reduce seismic loads. The use of damping devices represents an effective solution for the rehabilitation of existing structures. Since all these acceleration-reducing means are considered passive, much research has been conducted for several years to develop an active control system of the response of buildings to earthquakes.
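
The period-lengthening argument can be made concrete with the standard single-degree-of-freedom relation T = 2π√(m/k), a textbook formula rather than one stated in the abstract: the low lateral stiffness k of an isolation layer lengthens the natural period T, shifting the structure away from the dominant periods of the ground motion and reducing the spectral acceleration it attracts, while supplemental dampers limit the large displacements that concentrate at the isolation interface.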

Keywords: earthquake, building, seismic forces, displacement, resonance, response

Procedia PDF Downloads 127
13238 Community-Based Settlement Environment in Malalayang Coastal Area, Manado City

Authors: Teguh R. Hakim, Frenny F. F. Kairupan, Alberta M. Mantiri

Abstract:

The face of a coastal city is generally the same as that of other cities, showing dualistic contrasts: traditional and modern, rural and urban, planned and unplanned, slum and high quality. Manado City is located on the northern coast of the island of Sulawesi, Indonesia. Urban environmental problems have occurred in this city as an impact of this urban dualism. Overcrowding, inadequate infrastructure, and limited human resources are the main causes of the untidiness of the coastal settlements in Malalayang. This affects social and economic activities and the level of public health in the coastal environment of Malalayang, Manado City. This is becoming a serious problem which must be tackled jointly by the government, private parties, and the community. A community-based arrangement of the settlement environment is one solution to make the city's coastal settlements livable. This research aims to analyze the involvement of local communities in the arrangement of the settlement. A participatory approach is the model used in this study. Its application is mainly at the macro and meso scales (region, city, and environment), or community architecture. The participatory model leads to a more operational research approach to finding solutions to the problems of the settlement. The participatory approach is a research model that involves researchers and society as both the object and the subject of the research and which, in the process, in addition to researching, also develops other forms of participation in designing and building together. The expected results of this study are to provide education to the community about the environment and to set up a livable settlement for the sake of improving the quality of life. The study also provides input to the government in applying the pattern of development that will be implemented in the future.

Keywords: arrangement of the coastal environment, community participation, urban environmental problems, livable settlement

Procedia PDF Downloads 239
13237 Mechanisms Leading to the Protective Behavior of Ethanol Vapour Drying of Probiotics

Authors: Shahnaz Mansouri, Xiao Dong Chen, Meng Wai Woo

Abstract:

A new antisolvent vapour precipitation approach was used to make ultrafine submicron probiotic encapsulates. The approach uses ethanol vapour to precipitate submicron encapsulates within relatively large droplets. Surprisingly, the probiotics (Lactobacillus delbrueckii ssp. bulgaricus, Streptococcus thermophilus) showed relatively high survival even under the destructive ethanolic conditions within the droplet. This unusual behaviour was deduced to be caused by the denaturation and aggregation of the milk protein, forming an ethanolic protective matrix for the probiotics. Skim milk droplets, which are rich in casein and contain naturally occurring minerals, provided higher ethanolic protection compared with whey protein isolate and lactose droplets.

Keywords: whey, skim milk, probiotic, antisolvent, precipitation, encapsulation, denaturation, aggregation

Procedia PDF Downloads 522
13236 Retrospective Reconstruction of Time Series Data for Integrated Waste Management

Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy

Abstract:

The development, operation, and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability concerns of every region. The features of such systems have great influence on all of the components of sustainability. In order to optimize the processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modelling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modelling the sustainability and operational effectiveness of a certain IWMS is not in the scope of the present research. The complexity of the systems and the large number of variables require a complex approach to model the outcomes and future risks. This complex method should be able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach in modelling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix containing all the factors affecting the system in focus, with all their interconnections. The other input data set is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop time series by content analysis.
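
A minimal sketch of how the two input data sets drive an FCM simulation; the connection matrix and initial activations below are made-up placeholders, not the IWMS factors of the study.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def run_fcm(W, A0, steps=20):
    """Iterate the standard FCM update A(t+1) = f(A(t) + A(t) @ W)."""
    A = np.asarray(A0, dtype=float)
    for _ in range(steps):
        A = sigmoid(A + A @ W)
    return A

# Hypothetical 3-factor system: [public awareness, collection efficiency, landfill load].
W = np.array([[0.0,  0.6, -0.4],
              [0.0,  0.0, -0.7],
              [0.0, -0.2,  0.0]])   # connection matrix (rows influence columns)
print(run_fcm(W, [0.5, 0.5, 0.5]))  # activations after 20 update steps
```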

Keywords: content analysis, factors, integrated waste management system, time series

Procedia PDF Downloads 326
13235 Anti-Corruption in Adverse Contexts: A Strategic Approach

Authors: Mushtaq H. Khan, Antonio Andreoni, Pallavi Roy

Abstract:

Developing countries are characterized by political settlements where formal rules are generally weakly enforced and widely violated. Conventional anti-corruption strategies that focus on improving the general enforcement of the rule of law and raising the costs of corruption facing individual public officials have typically delivered poor results in these contexts. Our alternative approach is to identify anti-corruption strategies that have a high impact and that are feasible to implement in these contexts, working from the bottom up. This involves identifying the characteristics of the corruption constraining particular development outcomes. By drawing on theories of rents and rent-seeking, and theories of political settlements, we can assess the developmental impact of particular anti-corruption strategies and the feasibility of implementing them. We argue that feasible anti-corruption in these contexts cannot be based solely on conventional anti-corruption strategies. In societies that have widespread rule violations, high-impact anti-corruption is only likely to be feasible if the overall strategy succeeds in aligning the interests and capabilities of powerful organizations at the sectoral level to support the enforcement of particular sets of rules. We examine four related strategies for changing these incentives and capabilities of critical stakeholders at the local or sectoral level, and we argue that this can provide a framework for organizing research on the impact and feasibility of anti-corruption activities in different priority areas in particular countries.

Keywords: anti-corruption, development, political settlements analysis, rule of law

Procedia PDF Downloads 421
13234 Dynamic Fault Tree Analysis of Dynamic Positioning System through Monte Carlo Approach

Authors: A. S. Cheliyan, S. K. Bhattacharyya

Abstract:

The Dynamic Positioning System (DPS) is employed in marine vessels of the offshore oil and gas industry. It is a computer-controlled system that automatically maintains a ship’s position and heading by using its own thrusters. Reliability assessment of such a system can be carried out through a conventional fault tree. However, complex behaviour such as sequence-dependent failure, redundancy management, and the priority of failing events cannot be analyzed by conventional fault trees. The Dynamic Fault Tree (DFT) addresses these shortcomings of the conventional fault tree by defining additional gates called dynamic gates. A Monte Carlo-based simulation approach has been adopted for the dynamic gates. This method of realistic modeling of the DPS gives meaningful insight into the system reliability and the ability to improve it.
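
A minimal sketch of the Monte Carlo treatment of one dynamic gate, here a priority-AND (PAND) gate that fails only if component A fails before component B within the mission time; the failure rates and mission time are made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 1_000_000
mission_time = 8760.0           # hours, placeholder
lam_a, lam_b = 1e-4, 2e-4       # exponential failure rates, placeholders

# Sample component failure times.
t_a = rng.exponential(1.0 / lam_a, n_samples)
t_b = rng.exponential(1.0 / lam_b, n_samples)

# PAND gate output: fails only if A fails before B and both fail within the mission.
pand_fail = (t_a < t_b) & (t_b <= mission_time)
print("estimated PAND failure probability:", pand_fail.mean())
```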

Keywords: dynamic positioning system, dynamic fault tree, Monte Carlo simulation, reliability assessment

Procedia PDF Downloads 774
13233 A Sustainability Benchmarking Framework Based on the Life Cycle Sustainability Assessment: The Case of the Italian Ceramic District

Authors: A. M. Ferrari, L. Volpi, M. Pini, C. Siligardi, F. E. Garcia Muina, D. Settembre Blundo

Abstract:

A long tradition in ceramic manufacturing since the 18th century, primarily due to the availability of raw materials and an efficient transport system, led to the birth and development of the Italian ceramic tile district that nowadays represents a reference point for this sector even at the global level. This economic growth has been coupled with attention to environmental sustainability issues through various initiatives undertaken over the years at the level of the production sector, such as certification activities and sustainability policies. In this way, starting from an evaluation of sustainability in all its aspects, the present work aims to develop a benchmark helping both producers and consumers. In the present study, through the Life Cycle Sustainability Assessment (LCSA) framework, sustainability has been assessed in all its dimensions: environmental with the Life Cycle Assessment (LCA), economic with the Life Cycle Costing (LCC), and social with the Social Life Cycle Assessment (S-LCA). The annual district production of stoneware tiles during the 2016 reference year has been taken as the reference flow for all three assessments, and the system boundaries cover the entire life cycle of the tiles, except for the LCC, for which only the production costs have been considered at the moment. In addition, a preliminary method for the evaluation of local and indoor emissions has been introduced in order to assess the impact of atmospheric emissions on both people living in the area surrounding the factories and workers. The Life Cycle Assessment results, obtained with a modified IMPACT 2002+ assessment method, highlight that the manufacturing process is responsible for the main impact, especially because of atmospheric emissions at a local scale, followed by the distribution to end users, the installation, and the ordinary maintenance of the tiles. With regard to the economic evaluation, both internal and external costs have been considered. For the LCC, primary data from the analysis of the financial statements of Italian ceramic companies show that the highest cost items refer to expenses for goods and services and costs of human resources. The analysis of externalities with the EPS 2015dx method attributes the main damages to the distribution and installation of the tiles. The social dimension has been investigated with a preliminary approach using the Social Hotspots Database, and the results indicate that the most affected damage categories are health and safety, and labor rights and decent work. This study shows the potential of the LCSA framework applied to an industrial sector; in particular, it can be a useful tool for building a comprehensive benchmark for the sustainability of the ceramic industry, and it can help companies to actively integrate sustainability principles into their business models.

Keywords: benchmarking, Italian ceramic industry, life cycle sustainability assessment, porcelain stoneware tiles

Procedia PDF Downloads 128
13232 Sustainable Transformative Approaches to Reuse the Built Heritage of Erbil Citadel Houses as Part of Restoration

Authors: Wafaa Anwar Sulaiman Goriel

Abstract:

Heritage revival aims to breathe life back into historical buildings. This paper reflects an approach to revitalizing architectural antiquities through methodologies largely unknown elsewhere in the heritage renovation sphere, using the Erbil Citadel houses as an example. The 6000-year-old, continuously occupied site of Erbil Citadel embodies the challenges, and the mutual opportunities, in ensuring that the historical context is preserved during modern redevelopment. It shows how these principles can combine traditional construction systems with modern materials and technologies. It is an approach that champions the age and integrity of restored heritage sites, containing within its vernacular style elements which add to a sense of relevance when contextually re-set in modern settings. Some of the Citadel’s houses are discussed in the paper, and the restoration method applied to them is presented.

Keywords: Erbil Citadel houses, preservation, heritage, historical sites

Procedia PDF Downloads 18
13231 Denoising Transient Electromagnetic Data

Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen

Abstract:

Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio. Additionally, we utilize the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
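
A minimal sketch of the stack-averaging plus wavelet-thresholding steps, using PyWavelets; the wavelet family, decomposition level, and threshold rule are illustrative choices and not necessarily those used in the study.

```python
import numpy as np
import pywt

def denoise_tem(stacks, wavelet="db4", level=4):
    """stacks: 2-D array (repeated transients x time gates) from one acquisition mode."""
    signal = stacks.mean(axis=0)                       # signal averaging raises the SNR
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745     # noise estimate from finest detail
    thr = sigma * np.sqrt(2 * np.log(signal.size))     # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: signal.size]

# Example with synthetic noisy repeats of a decaying transient.
t = np.linspace(1e-5, 1e-2, 512)
clean = np.exp(-t / 2e-3)
stacks = clean + 0.05 * np.random.default_rng(0).normal(size=(64, t.size))
print(denoise_tem(stacks)[:5])
```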

Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform

Procedia PDF Downloads 85
13230 Music Note Detection and Dictionary Generation from Music Sheet Using Image Processing Techniques

Authors: Muhammad Ammar, Talha Ali, Abdul Basit, Bakhtawar Rajput, Zobia Sohail

Abstract:

Music note detection has been an area of study for the past few years and has its own influence on music file generation from sheet music. We propose a method to detect music notes on sheet music using basic thresholding and blob detection. Subsequently, we created a notes dictionary using a semi-supervised learning approach: after note detection, for each test image, the new symbols are added to the dictionary, which makes the note detection semi-automatic. The experiments are done on images from a dataset and also on captured images. The developed approach showed almost 100% accuracy on the dataset images, whereas varying results have been seen on captured images.
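
A minimal sketch of the thresholding-plus-blob-detection step with OpenCV; the parameter values are illustrative, not the ones tuned in the study.

```python
import cv2

# Load the scanned sheet and binarise it (note heads stay dark on a white background).
img = cv2.imread("sheet.png", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Blob detector tuned for roughly elliptical, note-head-sized dark regions.
params = cv2.SimpleBlobDetector_Params()
params.filterByArea = True
params.minArea, params.maxArea = 20, 300     # placeholder note-head size range in pixels
params.filterByCircularity = True
params.minCircularity = 0.6
detector = cv2.SimpleBlobDetector_create(params)

keypoints = detector.detect(binary)          # the default detector looks for dark blobs
# Each keypoint's vertical position relative to the staff lines maps to a pitch;
# unrecognised symbols would be cropped here and appended to the dictionary.
for kp in keypoints:
    print("note candidate at", kp.pt, "size", kp.size)
```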

Keywords: music note, sheet music, optical music recognition, blob detection, thresholding, dictionary generation

Procedia PDF Downloads 181
13229 RPM-Synchronous Non-Circular Grinding: An Approach to Enhance Efficiency in Grinding of Non-Circular Workpieces

Authors: Matthias Steffan, Franz Haas

Abstract:

The grinding production process is one of the last steps in a value-added manufacturing chain. Within this step, workpiece geometry and surface roughness are determined. Up to this process stage, considerable costs and energy have already been spent on components. According to the current state of the art, therefore, large safety reserves are calculated in order to guarantee process capability. Especially for non-circular grinding, this fact leads to considerable losses of process efficiency. With present technology, the various non-circular geometries on a workpiece must be ground one after another in an oscillating process in which the X- and Q-axes of the machine are coupled. With the approach of RPM-Synchronous Non-Circular Grinding, such workpieces can be machined in an ordinary plunge grinding process. For this, the rotational rates of the workpiece and the grinding wheel are kept in a fixed ratio. A non-circular grinding wheel is used to transfer its geometry onto the workpiece. The authors use a machine tool, unique worldwide, that was especially designed for this technology. Very high revolution rates on the workpiece spindle (up to 4500 rpm) are mandatory for the success of this grinding process. The grinding is performed as a two-step process. For roughing, a highly porous vitrified bonded grinding wheel with medium grain size is used. It ensures high specific material removal rates for efficiently producing the non-circular geometry on the workpiece. This process step is governed by a force control algorithm, which uses data acquired from a three-component force sensor located in the dead centre of the tailstock. For finishing, a grinding wheel with a fine grain size is used. Roughing and finishing are performed consecutively within the same clamping of the workpiece, using two locally separated grinding spindles. The approach of RPM-Synchronous Non-Circular Grinding shows great efficiency enhancement in non-circular grinding. For the first time, three-dimensional non-circular shapes can be ground, which opens up various fields of application. The automotive industry in particular shows great interest in this emerging trend in finishing machining.

Keywords: efficiency enhancement, finishing machining, non-circular grinding, rpm-synchronous grinding

Procedia PDF Downloads 283
13228 Assessment of Naturally Occurring Radionuclides in the Surface Water of the Vaal River, South Africa

Authors: Kgantsi B. T., Ochwelwang A. R., Mathuthu M., Jegede O. A.

Abstract:

Anthropogenic activities near water bodies contribute to poor water quality, which degrades the condition of the biota and elevates the risk to human health. The Vaal River is essential in supplying Gauteng and neighboring regions of South Africa with potable water for a variety of consumers and industries. Consequently, it is necessary to monitor and assess the radioactive risk in relation to the river's water quality. This study used inductively coupled plasma mass spectrometry (ICP-MS) to analyze the radionuclide activity concentrations in the Vaal River, South Africa. Along with thorium and potassium, the total uranium concentration was calculated using the isotopic content of uranium. The elemental concentrations of ²³⁸U, ²³⁵U, ²³⁴U, ²³²Th, and ⁴⁰K were translated into activity concentrations. To assess the water safety for all users and consumers, all values were compared to the world average activity concentrations of 35, 30, and 400 Bq kg⁻¹ for ²³⁸U, ²³²Th, and ⁴⁰K, respectively, according to the UNSCEAR report. The results will serve as a database for further monitoring and evaluation of the radionuclides from the river, taking cognisance of potential health hazards.
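
The translation from elemental to activity concentration mentioned above follows the standard specific-activity relation A = λN = (ln 2 / T½) × (m/M) × N_A, where m is the mass of the nuclide per kilogram of sample, M its molar mass, and N_A Avogadro's number (a textbook conversion, not a formula quoted in the abstract); for ²³⁸U, with T½ ≈ 4.47 × 10⁹ years, this gives roughly 12.4 Bq per milligram, so 1 mg of ²³⁸U per kilogram of sample corresponds to about 12.4 Bq kg⁻¹.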

Keywords: Vaal River, ICP-MS, uranium, risks

Procedia PDF Downloads 163
13227 Integrated Clean Development Mechanism and Risk Management Approach for Infrastructure Transportation Project

Authors: Debasis Sarkar

Abstract:

The clean development mechanism (CDM) can act as an effective instrument for mitigating climate change. This mechanism can effectively reduce the emission of CO2 and other greenhouse gases (GHGs). Construction of a mega infrastructure project, such as an underground corridor for metro rail operation, involves the consumption of a substantial quantity of concrete, which in turn consumes huge quantities of energy-intensive materials like cement and steel. This paper is an attempt to develop an integrated clean development mechanism and risk management approach for sustainable development for an underground corridor metro rail project in India during its construction phase. It was observed that about a 35% reduction in CO2 emissions can be obtained by adding fly ash as a partial replacement of cement. The reduced CO2 emissions, about 21,646.36 MT, would result in cost savings of approximately INR 8.5 million (USD 129,878). However, the construction and operation of such infrastructure projects are subject to huge risks and uncertainties throughout all phases of the project, reducing the probability of successful completion within the stipulated time and cost frame. Thus, an integrated approach combining CDM with risk management would enable the metro rail authorities to develop a sustainable risk mitigation framework to ensure greater cost and energy savings and less time and cost overrun.
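
A quick check of the figures quoted above (our arithmetic, derived only from the numbers given in the abstract) shows the implied value of the avoided emissions: USD 129,878 ÷ 21,646.36 t CO2 ≈ USD 6.0 per tonne (≈ INR 393 per tonne), which is the rate at which the fly-ash substitution converts avoided CO2 into cost savings in this project.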

Keywords: clean development mechanism (CDM), infrastructure transportation, project risk management, underground metro rail

Procedia PDF Downloads 475
13226 An Optimal Steganalysis Based Approach for Embedding Information in Image Cover Media with Security

Authors: Ahlem Fatnassi, Hamza Gharsellaoui, Sadok Bouamama

Abstract:

This paper deals with the fields of steganography and steganalysis. Steganography involves hiding information in a cover media to obtain the stego media, in such a way that the cover media is perceived not to have any embedded message by its unintended recipients. Steganalysis is the mechanism of detecting the presence of hidden information in the stego media, and it can lead to the prevention of disastrous security incidents. In this paper, we provide a critical review of the steganalysis algorithms available to analyze the characteristics of an image stego media against the corresponding cover media and to understand the process of embedding the information and its detection. We anticipate that this paper can also give a clear picture of the current trends in steganography, so that we can develop and improve appropriate steganalysis algorithms.

Keywords: optimization, heuristics and metaheuristics algorithms, embedded systems, low-power consumption, steganalysis heuristic approach

Procedia PDF Downloads 292
13225 Surface to the Deeper: A Universal Entity Alignment Approach Focusing on Surface Information

Authors: Zheng Baichuan, Li Shenghui, Li Bingqian, Zhang Ning, Chen Kai

Abstract:

Entity alignment (EA) tasks often play a pivotal role in the integration of knowledge graphs, where structural differences commonly exist between the source and target graphs, such as the presence or absence of attribute information and the types of attribute information (text, timestamps, images, etc.). However, most current research efforts are focused on improving alignment accuracy, often along with an increased reliance on specific structures, a dependency that inevitably diminishes their practical value and causes difficulties when facing knowledge graph alignment tasks with varying structures. Therefore, we propose a universal knowledge graph alignment approach that only utilizes the common basic structures shared by knowledge graphs. We have demonstrated through experiments that our method achieves state-of-the-art performance in fair comparisons.

Keywords: knowledge graph, entity alignment, transformer, deep learning

Procedia PDF Downloads 45
13224 Probabilistic Approach to Contrast Theoretical Predictions from a Public Corruption Game Using Bayesian Networks

Authors: Jaime E. Fernandez, Pablo J. Valverde

Abstract:

This paper presents a methodological approach that aims to contrast and validate theoretical results from a corruption network game through probabilistic analysis of simulated microdata using Bayesian Networks (BNs). The research develops a public corruption model in a game theory framework. Theoretical results suggest a series of 'optimal settings' of the model's exogenous parameters that boost the emergence of corruption. The paper contrasts these outcomes with probabilistic inference results based on BNs fitted to simulated microdata. The principal findings indicate that probabilistic reasoning based on BNs significantly improves parameter specification and causal analysis in a public corruption game.

Keywords: Bayesian networks, probabilistic reasoning, public corruption, theoretical games

Procedia PDF Downloads 210
13223 Value of Willingness to Pay for a Quality-Adjusted Life Year Gained in Iran: A Modified Chained Approach

Authors: Seyedeh-Fariba Jahanbin, Hasan Yusefzadeh, Bahram Nabilou, Cyrus Alinia

Abstract:

Background: Due to the lack of a constant willingness-to-pay value per one additional Quality-Adjusted Life Year gained based on the preferences of Iran’s general public, the cost-effectiveness of health system interventions is unclear, making it challenging to apply economic evaluation to health resources priority setting. Methods: We have measured this cost-effectiveness threshold with the participation of 2854 individuals from five provinces, each representing an income quintile, using a modified Time Trade-Off-based Chained Approach. In this online-based empirical survey, to extract the health utility value, participants were randomly assigned to one of two health scenarios, green (21121) and yellow (22222), designed based on the earlier validated EQ-5D-3L questionnaire. Results: Across the two health state versions, mean values for one QALY gain (rounded) ranged from $6740-$7400 and $6480-$7120, respectively, for the aggregate and trimmed models, which are equivalent to 1.18-1.35 times the GDP per capita. Log-linear multivariate OLS regression analysis confirmed that respondents were more likely to pay if their income, disutility, and education level were higher than those of their counterparts. Conclusions: In the health system of Iran, any intervention with an incremental cost-effectiveness ratio equal to or less than USD 7,402.12 will be considered cost-effective.

Keywords: willingness to pay, QALY, chained approach, cost-effectiveness threshold, Iran

Procedia PDF Downloads 85