Search results for: multi-sensory approach
12771 Analysis of Reflection of Elastic Waves in Three Dimensional Model Comprised with Viscoelastic Anisotropic Medium
Authors: Amares Chattopadhyay, Akanksha Srivastava
Abstract:
A unified approach is presented to study the reflection of a plane wave in a three-dimensional model composed of a triclinic viscoelastic medium. The phase velocities of the reflected qP, qSV, and qSH waves have been calculated for the concerned medium using the eigenvalue approach. A generalized method has been implemented to compute the complex form of the amplitude ratios, and the nature of the reflection coefficients of the qP, qSV, and qSH waves is discussed. The amplitude ratios are found to be strongly influenced by the viscoelastic parameter, the polar angle, and the azimuthal angle. The article focuses in particular on the effect of viscoelasticity in highly anisotropic media, which yields notable information about the reflection coefficients of the qP, qSV, and qSH waves. The outcomes may be useful for better exploration of all types of hydrocarbon reservoirs and for advances in the field of reflection seismology.
Keywords: amplitude ratios, three dimensional, triclinic, viscoelastic
Procedia PDF Downloads 230
12770 Post-Earthquake Road Damage Detection by SVM Classification from QuickBird Satellite Images
Authors: Moein Izadi, Ali Mohammadzadeh
Abstract:
Detection of damaged road segments after an earthquake is essential for coordinating rescuers. In this study, an approach is presented for the semi-automatic detection of damaged roads in a city using pre-event vector maps together with pre- and post-earthquake QuickBird satellite images. Damage is defined in this study as the debris of damaged buildings adjacent to the roads. Spectral and texture features are considered in an SVM classification step to detect the damage. Finally, the proposed method is tested on QuickBird pan-sharpened images from the Bam earthquake, and the results show that an overall accuracy of 81% and a kappa coefficient of 0.71 are achieved for the damage detection. The obtained results indicate the efficiency and accuracy of the proposed approach.
Keywords: SVM classifier, disaster management, road damage detection, QuickBird images
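A minimal sketch of the classification step described above, assuming an SVM with an RBF kernel over per-segment spectral/texture features; the feature set and the synthetic data are illustrative stand-ins, not the authors' Bam dataset:

    # Hypothetical per-segment features and labels; not the Bam dataset.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.metrics import accuracy_score, cohen_kappa_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))       # e.g. brightness change, NDVI change, GLCM contrast/homogeneity
    y = rng.integers(0, 2, size=200)    # 1 = debris on road segment, 0 = intact

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    clf.fit(X[:150], y[:150])
    pred = clf.predict(X[150:])
    print("overall accuracy:", accuracy_score(y[150:], pred))
    print("kappa coefficient:", cohen_kappa_score(y[150:], pred))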
Procedia PDF Downloads 623
12769 Item Response Calibration/Estimation: An Approach to Adaptive E-Learning System Development
Authors: Adeniran Adetunji, Babalola M. Florence, Akande Ademola
Abstract:
In this paper, we give an overview of the concept of an adaptive e-learning system and enumerate the elements of adaptive learning, e.g., a pedagogical framework; multiple learning strategies and pathways; continuous monitoring of and feedback on student performance; and statistical inference to reach, by "mass customization", the final learning strategy that works for an individual learner. We briefly highlight the motivation for this new system proposed for effective learning and teaching, review the literature on the concept of an adaptive e-learning system, and emphasize item response calibration, which is an important approach to developing an adaptive e-learning system. The paper concludes with a justification of item response calibration/estimation for designing a successful and effective adaptive e-learning system.
Keywords: adaptive e-learning system, pedagogical framework, item response, computer applications
Procedia PDF Downloads 596
12768 Distribution and Taxonomy of Marine Fungi in Nha Trang Bay and Van Phong Bay, Vietnam
Authors: Thu Thuy Pham, Thi Chau Loan Tran, Van Duy Nguyen
Abstract:
Marine fungi play an important role in marine ecosystems and also supply biomass and metabolic products of industrial value. Currently, the biodiversity of marine fungi along the coastal areas of Vietnam has not yet been studied fully. The objective of this study is to assess the spatial and temporal diversity of planktonic fungi in the coastal waters of Nha Trang Bay and Van Phong Bay in Central Vietnam using culture-dependent and culture-independent approaches. In the culture-dependent approach, filamentous fungi and yeasts were isolated on selective media and then classified by phenotype and genotype based on the sequencing of the ITS (internal transcribed spacer) regions of rDNA with two primer pairs (ITS1F_KYO2 and ITS4; NS1 and NS8). In the culture-independent approach, environmental DNA samples were isolated and amplified using fungal-specific ITS primer pairs. A total of over 160 strains were isolated from 10 seawater sampling stations at 50 cm depth. They were classified into diverse genera and species of both yeasts and molds, and at least five strains are potentially novel species. Our results also revealed that planktonic fungi were molecularly diverse, with hundreds of phylotypes recovered across the two bays. The results of the study provide data about the distribution and taxonomy of mycoplankton in this area, thereby allowing assessment of their positive role in the biogeochemical cycle of coastal ecosystems and the development of new bioactive compounds for industrial applications.
Keywords: biodiversity, ITS, marine fungi, Nha Trang Bay, Van Phong Bay
Procedia PDF Downloads 190
12767 From Mathematics Project-Based Learning to Commercial Product Using Geometer’s Sketchpad (GSP)
Authors: Krongthong Khairiree
Abstract:
The purpose of this research study is to explore a mathematics project-based learning approach and the use of technology in the context of school mathematics in Thailand. Data were collected from six sample secondary schools; the students were 6-14 years old. Research findings show that through the project-based learning approach and the use of GSP, students were able to make mathematics learning fun and challenging. In interviews, the students revealed that, with GSP, they were able to visualize and create graphical representations, which enables them to develop their mathematical thinking skills, concepts, and understanding. The students had fun creating a variety of graphs of functions that they could not produce by drawing on graph paper. In addition, there is evidence of the students’ ability to connect mathematics to real life outside the classroom and to commercial products, such as weaving, broomstick patterning, and ceramics design.
Keywords: mathematics, project-based learning, Geometer’s Sketchpad (GSP), commercial products
Procedia PDF Downloads 336
12766 Statistical Design of Synthetic VP X-bar Control Chart Using Markov Chain Approach
Authors: Ali Akbar Heydari
Abstract:
Control charts are an important tool of statistical quality control. These charts are used to detect and eliminate unwanted special causes of variation that occur during a period of time. The design and operation of control charts require the determination of three design parameters: the sample size (n), the sampling interval (h), and the width coefficient of the control limits (k). The variable-parameter (VP) x-bar control chart is an x-bar chart in which all the design parameters vary between two values. These values are a function of the most recent process information; in fact, in the VP x-bar chart, the position of each sample point on the chart establishes the size of the next sample and the time of its sampling. The synthetic x-bar control chart, which integrates the x-bar chart and the conforming run length (CRL) chart, provides a significant improvement in detection power over the basic x-bar chart for all levels of mean shift. In this paper, we introduce the synthetic VP x-bar control chart for monitoring changes in the process mean. To determine the design parameters, we used a statistical design based on the minimum out-of-control average run length (ARL) criterion. The optimal chart parameters of the proposed chart are obtained using the Markov chain approach. A numerical example is also given to show the performance of the proposed chart and to compare it with other control charts. The results show that the proposed synthetic VP x-bar control chart performs better than the synthetic x-bar control chart for all shift parameter values, and better than the VP x-bar control chart for moderate or large shift parameter values.
Keywords: control chart, Markov chain approach, statistical design, synthetic, variable parameter
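The chart's operation can be made concrete with a small simulation. The sketch below estimates the out-of-control ARL of a basic (fixed-parameter) synthetic x-bar chart by Monte Carlo; the paper instead derives the ARL analytically through a Markov chain, and the parameter values and start-up convention used here are assumptions:

    # Illustrative stand-in for the ARL computation, not the paper's Markov chain derivation.
    import numpy as np

    def synthetic_xbar_run_length(n=5, k=2.0, L=3, shift=0.5, rng=None):
        """Samples until the chart signals: a signal occurs when two non-conforming
        x-bar samples (outside mu0 +/- k*sigma/sqrt(n)) are at most L samples apart."""
        rng = rng or np.random.default_rng()
        limit = k / np.sqrt(n)                  # control limit for standardized data
        t, last_nc = 0, None
        while True:
            t += 1
            xbar = rng.normal(loc=shift, scale=1.0 / np.sqrt(n))
            if abs(xbar) > limit:               # non-conforming sample
                if last_nc is not None and (t - last_nc) <= L:
                    return t                    # CRL sub-chart signals
                last_nc = t
            if t > 10_000_000:                  # safety stop
                return t

    rng = np.random.default_rng(1)
    arl = np.mean([synthetic_xbar_run_length(rng=rng) for _ in range(5000)])
    print("estimated out-of-control ARL:", round(arl, 2))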
Procedia PDF Downloads 154
12765 Operations Research Applications in Audit Planning and Scheduling
Authors: Abdel-Aziz M. Mohamed
Abstract:
This paper presents a state-of-the-art survey of the operations research models developed for internal audit planning. Two alternative approaches have been followed in the literature for audit planning: (1) identifying the optimal audit frequency; and (2) determining the optimal audit resource allocation. The first approach identifies the elapsed time between two successive audits, which can be presented as the optimal number of audits in a given planning horizon, or the optimal number of transactions after which an audit should be performed. It also includes the optimal audit schedule. The second approach determines the optimal allocation of audit frequency among all auditable units in the firm. In our review, we discuss both the deterministic and probabilistic models developed for audit planning. In addition, game theory models are reviewed to find the optimal auditing strategy based on the interactions between the auditors and the clients.
Keywords: operations research applications, audit frequency, audit-staff scheduling, audit planning
Procedia PDF Downloads 815
12764 A Survey on Quasi-Likelihood Estimation Approaches for Longitudinal Set-ups
Authors: Naushad Mamode Khan
Abstract:
The Com-Poisson (CMP) model is one of the most popular discrete generalized linear models (GLMs) that handles equi-, over-, and under-dispersed data. In the longitudinal context, an integer-valued autoregressive (INAR(1)) process that incorporates covariate specification has been developed to model longitudinal CMP counts. However, the joint CMP likelihood function is difficult to specify and thus restricts likelihood-based estimation methodology. The joint generalized quasi-likelihood approach (GQL-I) was instead considered, but it is rather computationally intensive and may not even estimate the regression effects due to a complex and frequently ill-conditioned covariance structure. This paper proposes a new GQL approach for estimating the regression parameters (GQL-III) that is based on a single score vector representation. The performance of GQL-III is compared with GQL-I and separate marginal GQLs (GQL-II) through simulation experiments and is shown to yield estimates as efficient as those of GQL-I while being far more computationally stable.
Keywords: longitudinal, Com-Poisson, ill-conditioned, INAR(1), GLMs, GQL
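For readers unfamiliar with the underlying distribution, a minimal sketch of the CMP probability mass function is given below; the evaluation works on the log scale to keep the truncated normalizing constant numerically stable, and the parameter values are illustrative assumptions only:

    # CMP pmf: P(Y=y) = lambda^y / (y!)^nu / Z(lambda, nu), with Z truncated at a large bound.
    import math

    def cmp_pmf(y, lam, nu, truncation=200):
        log_terms = [j * math.log(lam) - nu * math.lgamma(j + 1) for j in range(truncation)]
        m = max(log_terms)
        log_z = m + math.log(sum(math.exp(t - m) for t in log_terms))  # log-sum-exp
        return math.exp(y * math.log(lam) - nu * math.lgamma(y + 1) - log_z)

    lam, nu = 2.0, 0.7                  # nu < 1: over-dispersion; nu > 1: under-dispersion
    mean = sum(y * cmp_pmf(y, lam, nu) for y in range(50))
    print("P(Y=3) =", round(cmp_pmf(3, lam, nu), 4), " E[Y] ~", round(mean, 3))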
Procedia PDF Downloads 354
12763 Financial Assets Return, Economic Factors and Investor's Behavioral Indicators Relationships Modeling: A Bayesian Networks Approach
Authors: Nada Souissi, Mourad Mroua
Abstract:
The main purpose of this study is to examine the interaction between financial asset volatility, economic factors, and investor behavioral indicators, related to both company and market stocks, for the period from January 2000 to January 2020. Using multiple linear regression and Bayesian network modeling, the results show positive and negative relationships among the investor psychology index, economic factors, and predicted stock market returns. We reveal that the application of the Bayesian discrete network helps identify the various cause-and-effect relationships between all economic and financial variables and the psychology index.
Keywords: financial asset return predictability, economic factors, investor's psychology index, Bayesian approach, probabilistic networks, parametric learning
Procedia PDF Downloads 149
12762 Solving Flowshop Scheduling Problems with Ant Colony Optimization Heuristic
Authors: Arshad Mehmood Ch, Riaz Ahmad, Imran Ali Ch, Waqas Durrani
Abstract:
This study deals with the application of the Ant Colony Optimization (ACO) approach to solve the no-wait flowshop scheduling problem (NW-FSSP). The ACO algorithm developed here has been coded in MATLAB. The paper covers the detailed steps to apply ACO and focuses on judging the strength of ACO in relation to other solution techniques previously applied to the no-wait flowshop problem. The general-purpose approach was able to find reasonably accurate solutions for almost all the problems under consideration and was able to handle a fairly large spectrum of problems with far less CPU effort. Careful scrutiny of the results reveals that the presented algorithm performs better than approaches such as genetic algorithm and tabu search heuristics previously applied to NW-FSSP data sets.
Keywords: no-wait, flowshop, scheduling, ant colony optimization (ACO), makespan
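The fitness evaluation at the heart of such a heuristic is the makespan of a candidate job sequence under the no-wait constraint. A small sketch is given below, assuming the standard consecutive-job delay formulation; the processing-time matrix is made-up example data, and the ACO search loop itself is omitted:

    # Evaluate the makespan of a job sequence in the no-wait flowshop (NW-FSSP).
    def nwfs_makespan(seq, p):
        """p[j][m] = processing time of job j on machine m; jobs flow with no waiting."""
        m = len(p[0])
        start = 0.0                              # start of current job on machine 1
        prev = None
        for j in seq:
            if prev is not None:
                # earliest start of job j so that it never waits and never overtakes prev
                delay = max(sum(p[prev][:k + 1]) - sum(p[j][:k]) for k in range(m))
                start += max(delay, 0.0)
            prev = j
        return start + sum(p[seq[-1]])

    p = [[3, 2, 4], [1, 5, 2], [4, 1, 3]]        # 3 jobs x 3 machines (assumed data)
    print(nwfs_makespan([0, 1, 2], p), nwfs_makespan([2, 0, 1], p))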
Procedia PDF Downloads 434
12761 Convectory Policing-Reconciling Historic and Contemporary Models of Police Service Delivery
Authors: Mark Jackson
Abstract:
Description: This paper is based on a theoretical analysis of the efficacy of the dominant model of policing in western jurisdictions. Those results are then compared with a similar analysis of a traditional reactive model. It is found that neither model provides for optimal delivery of services; instead, optimal service can be achieved by a synchronous hybrid model, termed the Convectory Policing approach. Methodology and Findings: For over three decades, problem oriented policing (PO) has been the dominant model for western police agencies. Initially based on the work of Goldstein during the 1970s, the problem oriented framework has spawned endless variants and approaches, most of which embrace a problem-solving rather than a reactive approach to policing. These include the Area Policing Concept (APC) applied in many smaller jurisdictions in the USA, the Scaled Response Policing Model (SRPM) currently under trial in Western Australia, and the Proactive Pre-Response Approach (PPRA), which has also seen some success. All of these, in some way or another, are largely based on a model that eschews the traditional reactive model of policing. Convectory Policing (CP) is an alternative model which challenges the underpinning assumptions that have seen the proliferation of the PO approach in the last three decades, and it commences by questioning the economics on which PO is based. It is argued that, in essence, PO relies on an unstated and often unrecognised assumption that resources will be available to meet demand for policing services while at the same time maintaining the capacity to deploy staff to develop solutions to the problems which were ultimately manifested in those same calls for service. The CP model relies on observations from numerous western jurisdictions to challenge the validity of that underpinning assumption, particularly in a fiscally tight environment. In deploying staff to pursue and develop solutions to underpinning problems, there is clearly an opportunity cost: those staff cannot be allocated to alternative duties while engaged in a problem-solution role. At the same time, resources in use responding to calls for service are unavailable, while committed to that role, to pursue solutions to the problems giving rise to those same calls for service. The two approaches, reactive and PO, are therefore dichotomous: one cannot be optimised while the other is being pursued. Convectory Policing is a pragmatic response to the schism between the competing traditional and contemporary models. If it is not possible to serve either model with any real rigour, it becomes necessary to tailor an approach to deliver specific outcomes against which success or otherwise might be measured. CP proposes that a structured, roster-driven approach to calls for service, combined with the application of what is termed a resource-effect response capacity, has the potential to resolve the inherent conflict between traditional and contemporary models of policing and the expectations of the community in terms of community-policing-based problem-solving models.
Keywords: policing, reactive, proactive, models, efficacy
Procedia PDF Downloads 483
12760 Reliable Method for Estimating Rating Curves in the Natural Rivers
Authors: Arash Ahmadi, Amirreza Kavousizadeh, Sanaz Heidarzadeh
Abstract:
The stage-discharge curve is one of the conventional methods for continuous river flow measurement. In this paper, an innovative approach is proposed for predicting the stage-discharge relationship using isovel contours. With the proposed method, it is possible to estimate the stage-discharge curve for the whole section using discharge information from only one arbitrary water level. For this purpose, multivariate relationships are used to determine the mean velocity in a cross-section. The unknown exponents of the proposed relationship have been obtained using the second version of the Strength Pareto Evolutionary Algorithm (SPEA2), and the appropriate equation was selected by applying the TOPSIS (Technique for Order of Preference by Similarity to an Ideal Solution) approach. Results showed close agreement between the estimated and observed data in the different cross-sections.
Keywords: rating curves, SPEA2, natural rivers, bed roughness distribution
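The equation-selection step can be illustrated with a compact TOPSIS sketch; the decision matrix, criteria, and weights below are assumptions chosen only for demonstration, not the values used by the authors:

    # Rank candidate equations by closeness to the ideal solution (TOPSIS).
    import numpy as np

    def topsis(matrix, weights, benefit):
        """matrix: alternatives x criteria; benefit[i]=True if larger is better."""
        m = matrix / np.linalg.norm(matrix, axis=0)          # vector normalization
        v = m * weights
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_plus  = np.linalg.norm(v - ideal, axis=1)
        d_minus = np.linalg.norm(v - anti, axis=1)
        return d_minus / (d_plus + d_minus)                  # higher score = better

    # rows: candidate equations; columns: R^2 (benefit), RMSE (cost), complexity (cost)
    matrix = np.array([[0.92, 0.15, 3.0], [0.95, 0.12, 5.0], [0.90, 0.18, 2.0]])
    scores = topsis(matrix, weights=np.array([0.5, 0.3, 0.2]),
                    benefit=np.array([True, False, False]))
    print("closeness scores:", scores.round(3), "best equation index:", int(scores.argmax()))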
Procedia PDF Downloads 158
12759 Streamlining Cybersecurity Risk Assessment for Industrial Control and Automation Systems: Leveraging the National Institute of Standards and Technology’s Risk Management Framework (RMF) Using Model-Based System Engineering (MBSE)
Authors: Gampel Alexander, Mazzuchi Thomas, Sarkani Shahram
Abstract:
The cybersecurity landscape is constantly evolving, and organizations must adapt to the changing threat environment to protect their assets. Implementation of the NIST Risk Management Framework (RMF) has become critical to ensuring the security and safety of industrial control and automation systems. However, cybersecurity professionals face challenges in implementing the RMF, leading to systems operating without authorization and being non-compliant with regulations. The current approach to RMF implementation, based on business practices, is limited and insufficient, leaving organizations vulnerable to cyberattacks that result in the loss of personal consumer data and critical infrastructure details. To address these challenges, this research proposes a Model-Based Systems Engineering (MBSE) approach to implementing cybersecurity controls and assessing risk through the RMF process. The study emphasizes the need to shift to a modeling approach, which can streamline the RMF process and eliminate the bloated structures that make it difficult to receive an Authorization-To-Operate (ATO). The study focuses on the practical application of MBSE in industrial control and automation systems to improve the security and safety of operations. It is concluded that MBSE can be used to solve the implementation challenges of the NIST RMF process and improve the security of industrial control and automation systems, providing a more effective and efficient method for implementing cybersecurity controls and assessing risk. Future work involves exploring the broader applicability of MBSE in other industries and domains beyond industrial control and automation systems.
Keywords: authorization-to-operate (ATO), industrial control systems (ICS), model-based systems engineering (MBSE), risk management framework (RMF)
Procedia PDF Downloads 95
12758 Practical Methods for Automatic MC/DC Test Cases Generation of Boolean Expressions
Authors: Sekou Kangoye, Alexis Todoskoff, Mihaela Barreau
Abstract:
Modified Condition/Decision Coverage (MC/DC) is a structural coverage criterion that aims to prove that every condition involved in a Boolean expression can influence the result of that expression. In the automotive context, MC/DC is highly recommended and even required for testing most security- and safety-related applications. However, due to the complex Boolean expressions often embedded in those applications, generating a set of MC/DC-compliant test cases for any of these expressions is a nontrivial task and can be time-consuming for testers. In this paper we present an approach to automatically generate MC/DC test cases for any Boolean expression. We introduce novel techniques, essentially based on binary trees, to quickly and optimally generate MC/DC test cases for the expressions. Thus, the approach can be used to reduce the manual testing effort of testers.
Keywords: binary trees, MC/DC, test case generation, nontrivial task
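To make the MC/DC requirement concrete, the sketch below finds, for each condition of a small Boolean expression, a pair of test cases that differ only in that condition and flip the decision outcome. It is a brute-force truth-table search, not the binary-tree technique the paper proposes, and the example expression is an assumption:

    # Find one MC/DC independence pair per condition by enumerating the truth table.
    from itertools import product

    def mcdc_pairs(expr, n):
        """expr: function taking a tuple of n Booleans; returns one pair per condition."""
        cases = list(product([False, True], repeat=n))
        pairs = {}
        for i in range(n):
            for a in cases:
                b = a[:i] + (not a[i],) + a[i + 1:]   # flip only condition i
                if expr(a) != expr(b):                # decision outcome changes
                    pairs[i] = (a, b)
                    break
        return pairs

    expr = lambda t: (t[0] and t[1]) or t[2]          # example: (A and B) or C
    for cond, (a, b) in mcdc_pairs(expr, 3).items():
        print(f"condition {cond}: {a} -> {expr(a)}, {b} -> {expr(b)}")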
Procedia PDF Downloads 447
12757 River Habitat Modeling for the Entire Macroinvertebrate Community
Authors: Pinna Beatrice, Laini Alex, Negro Giovanni, Burgazzi Gemma, Viaroli Pierluigi, Vezza Paolo
Abstract:
Habitat models rarely consider macroinvertebrates as ecological targets in rivers. Available approaches mainly focus on single macroinvertebrate species, not addressing the ecological needs and functionality of the entire community. This research aimed to provide an approach to model the habitat of the macroinvertebrate community. The approach is based on the recently developed Flow-T index together with a Random Forest (RF) regression, which is employed to apply the Flow-T index at the mesohabitat scale. Using different datasets gathered from field data collection and 2D hydrodynamic simulations, the model was calibrated in the Trebbia River (2019 campaign) and then validated in the Trebbia, Taro, and Enza Rivers (2020 campaign). The three rivers are characterized by a braiding morphology, gravel riverbeds, and summer low flows. The RF model selected 12 mesohabitat descriptors as important for the macroinvertebrate community. These descriptors belong to different frequency classes of water depth, flow velocity, substrate grain size, and connectivity to the main river channel. The cross-validation R² coefficient (R²cv) of the training dataset is 0.71 for the Trebbia River (2019), whereas the R² coefficient for the validation datasets (Trebbia, Taro, and Enza Rivers, 2020) is 0.63. The agreement between the simulated results and the experimental data shows sufficient accuracy and reliability. The outcomes of the study reveal that the model can identify the ecological response of the macroinvertebrate community to possible flow regime alterations and river morphological modifications. Lastly, the proposed approach allows the MesoHABSIM methodology, widely used for fish habitat assessment, to be extended to a different ecological target community. Further applications of the approach relate to flow design in both perennial and non-perennial rivers, including river reaches in which fish fauna is absent.
Keywords: ecological flows, macroinvertebrate community, mesohabitat, river habitat modeling
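A minimal sketch of the regression step, assuming a Random Forest trained on mesohabitat descriptors with a cross-validated R²; the descriptor columns and the synthetic response are stand-ins for the field and hydrodynamic data described above:

    # Hypothetical mesohabitat descriptors -> habitat suitability, with 5-fold CV R^2.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # columns stand in for frequency classes of depth, velocity, grain size, connectivity
    X = rng.uniform(size=(120, 12))
    y = 0.6 * X[:, 0] + 0.3 * X[:, 3] - 0.2 * X[:, 7] + rng.normal(scale=0.05, size=120)

    rf = RandomForestRegressor(n_estimators=500, random_state=0)
    r2_cv = cross_val_score(rf, X, y, cv=5, scoring="r2").mean()
    rf.fit(X, y)
    top = np.argsort(rf.feature_importances_)[::-1][:3]
    print("cross-validated R^2:", round(r2_cv, 2), "most important descriptors:", top)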
Procedia PDF Downloads 94
12756 Detection of New Attacks on Ubiquitous Services in Cloud Computing and Countermeasures
Authors: L. Sellami, D. Idoughi, P. F. Tiako
Abstract:
Cloud computing provides infrastructure to the enterprise through the Internet, allowing access to cloud services anytime and anywhere. This pervasive aspect of the services, the distributed nature of data, and the wide use of information make cloud computing vulnerable to intrusions that violate the security of the cloud. This requires the use of security mechanisms, such as intrusion detection systems (IDS), to detect malicious behavior in network communications and hosts. In this article, we focus on the detection of intrusion into the cloud using IDSs, basing ourselves on client authentication in the computing cloud. This technique allows the detection of abnormal use of ubiquitous services and prevents intrusion into cloud computing. The approach is based on client authentication data: our IDS provides intrusion detection both inside and outside the cloud computing network. It is a double-protection approach, covering security at the user node and the global security of the cloud.
Keywords: cloud computing, intrusion detection system, privacy, trust
Procedia PDF Downloads 323
12755 Fusion of Finger Inner Knuckle Print and Hand Geometry Features to Enhance the Performance of Biometric Verification System
Authors: M. L. Anitha, K. A. Radhakrishna Rao
Abstract:
With the advent of modern computing technology, there is an increased demand for recognition systems that can verify the identity of individuals. Recognition systems are required by several civilian and commercial applications for providing access to secured resources. Traditional recognition systems, which are based on physical identities, are not sufficiently reliable to satisfy the security requirements due to advances in forgery and identity impersonation methods. Recognizing individuals based on their unique physiological characteristics, known as biometric traits, is a reliable technique, since these traits are not transferable and cannot be stolen or lost. Since the performance of a biometric-based recognition system depends on the particular trait that is utilized, the present work proposes a fusion approach which combines the inner knuckle print (IKP) traits of the middle, ring, and index fingers with the geometrical features of the hand. The hand image captured from a digital camera is preprocessed to find the finger IKP regions of interest (ROI) and the hand geometry features. Geometrical features are represented as the distances between different key points, and IKP features are extracted by applying the local binary pattern (LBP) descriptor to the IKP ROI. Decision-level AND fusion was adopted, which shows an improvement in the performance of the combined scheme. The proposed approach is tested on the database collected at our institute and is of significance since both hand geometry and IKP features can be extracted from the palm region of the hand. The fusion of these features yields a false acceptance rate of 0.75% and a false rejection rate of 0.86% for the verification tests conducted, which is lower than the results obtained using the individual traits. The results confirm the usefulness of the proposed approach and the suitability of the selected features for developing a biometric recognition system based on features from the palmar region of the hand.
Keywords: biometrics, hand geometry features, inner knuckle print, recognition
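A minimal sketch of the two building blocks named above: an LBP histogram over an IKP region of interest and a decision-level AND rule combining the per-trait accept/reject decisions. The ROIs are synthetic and the distance thresholds are assumptions, not the values used in the paper:

    # LBP descriptor on an IKP ROI plus decision-level AND fusion with hand geometry.
    import numpy as np
    from skimage.feature import local_binary_pattern

    def ikp_descriptor(roi, points=8, radius=1):
        lbp = local_binary_pattern(roi, points, radius, method="uniform")
        hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2), density=True)
        return hist

    rng = np.random.default_rng(0)
    probe   = (rng.random((64, 64)) * 255).astype(np.uint8)   # stand-in IKP ROIs
    gallery = (rng.random((64, 64)) * 255).astype(np.uint8)

    ikp_dist  = np.linalg.norm(ikp_descriptor(probe) - ikp_descriptor(gallery))
    geom_dist = 0.04                                # assumed distance between key-point vectors

    ikp_accept  = ikp_dist  < 0.10                  # per-trait decisions (assumed thresholds)
    geom_accept = geom_dist < 0.05
    print("verified:", ikp_accept and geom_accept)  # decision-level AND fusion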
Procedia PDF Downloads 220
12754 The Effects of Damping Devices on Displacements, Velocities and Accelerations of Structures
Authors: Radhwane Boudjelthia
Abstract:
The most recent earthquakes that have occurred in the world, and particularly in Algeria, have killed thousands of people and caused severe damage. The example that is etched in our memory is the earthquake in the regions of Boumerdes and Algiers (Boumerdes earthquake of May 21, 2003). For all the actors involved in the building process, the earthquake is the litmus test for construction. The goal we set ourselves is to contribute to the implementation of a thoughtful approach to the seismic protection of structures. For many engineers, the most conventional approach to protecting works (buildings and bridges) from the effects of earthquakes is to increase rigidity. This approach is not always effective, especially when the context favors resonance and amplification of seismic forces. The field of earthquake engineering has therefore made significant advances, catalyzed among other things by the development of computational techniques and the use of powerful test facilities. This has led to the emergence of several innovative technologies, such as the introduction of special isolation devices between the infrastructure and the superstructure. This approach, commonly known as "seismic isolation", absorbs significant forces without damage to the structure, thus ensuring the protection of lives and property. In addition, the forces transmitted to the construction by the ground shaking are concentrated mainly at the supports. With these devices, the natural period of the construction increases and the seismic loads are reduced; thus, the seismic motion is attenuated. Likewise, base isolation may be used in combination with earthquake dampers in order to control the deformation of the isolation system and the absolute displacement of the superstructure located above the isolation interface. Alternatively, the earthquake dampers can be used on their own to reduce the oscillation amplitudes and thus reduce the seismic loads. The use of damping devices represents an effective solution for the rehabilitation of existing structures. Since all these acceleration-reducing means are considered passive, much research has been conducted for several years to develop active control systems for the response of buildings to earthquakes.
Keywords: earthquake, building, seismic forces, displacement, resonance, response
Procedia PDF Downloads 127
12753 Community-Based Settlement Environment in Malalayang Coastal Area, Manado City
Authors: Teguh R. Hakim, Frenny F. F. Kairupan, Alberta M. Mantiri
Abstract:
The face of a coastal city is generally the same as that of other cities, showing dualism: traditional and modern, rural and urban, planned and unplanned, slum and high quality. Manado City is located on the northern coast of the island of Sulawesi, Indonesia. Urban environmental problems have occurred in this city as an impact of this urban dualism. Overcrowding, inadequate infrastructure, and limited human resources are the main causes of the untidiness of the coastal settlements in Malalayang. This affects social and economic activities and public health in the Malalayang coastal environment of Manado, and it is becoming a serious problem which must be tackled jointly by the government, private parties, and the community. A community-based arrangement of the settlement environment is one solution for making the city's coastal settlements livable. This research aims to analyze the involvement of local communities in the arrangement of the settlement. A participatory model is used in this study; its application is mainly at the macro and meso scales (region, city, and neighbourhood environment), or community architecture. The participatory model leads to a more operational research approach for finding solutions to the problems of the settlement. The participatory approach is a research model that involves researchers and the community as both the object and the subject of research; in the process, besides the research itself, other forms of participation in designing and building together are developed. The expected results of this study are to educate the community about the environment and to set up a livable settlement in order to improve the quality of life. The study also provides input to the government on the pattern of development to be implemented in the future.
Keywords: arrangement of the coastal environment, community participation, urban environmental problems, livable settlement
Procedia PDF Downloads 239
12752 Mechanisms Leading to the Protective Behavior of Ethanol Vapour Drying of Probiotics
Authors: Shahnaz Mansouri, Xiao Dong Chen, Meng Wai Woo
Abstract:
A new antisolvent vapour precipitation approach was used to make ultrafine submicron probiotic encapsulates. The approach uses ethanol vapour to precipitate submicron encapsulates within relatively large droplets. Surprisingly, the probiotics (Lactobacillus delbrueckii ssp. bulgaricus, Streptococcus thermophilus) showed relatively high survival even under the destructive ethanolic conditions within the droplet. This unusual behaviour was deduced to be caused by the denaturation and aggregation of the milk protein, which forms an ethanolic protective matrix for the probiotics. Skim milk droplets, which are rich in casein and contain naturally occurring minerals, provided higher ethanolic protection compared with whey protein isolate and lactose droplets.
Keywords: whey, skim milk, probiotic, antisolvent, precipitation, encapsulation, denaturation, aggregation
Procedia PDF Downloads 522
12751 Retrospective Reconstruction of Time Series Data for Integrated Waste Management
Authors: A. Buruzs, M. F. Hatwágner, A. Torma, L. T. Kóczy
Abstract:
The development, operation, and maintenance of Integrated Waste Management Systems (IWMS) essentially affect the sustainability concerns of every region, and the features of such systems have a great influence on all of the components of sustainability. In order to optimize the processes, a comprehensive mapping of the variables affecting the future efficiency of the system is needed, including analysis of the interconnections among the components and modelling of their interactions. The planning of an IWMS is based fundamentally on technical and economic opportunities and the legal framework. Modelling the sustainability and operational effectiveness of a certain IWMS is not in the scope of the present research. The complexity of the systems and the large number of variables require a complex approach to model the outcomes and future risks. This complex method should be able to evaluate the logical framework of the factors composing the system and the interconnections between them. The authors of this paper studied the usability of the Fuzzy Cognitive Map (FCM) approach for modelling the future operation of IWMSs. The approach requires two input data sets. One is the connection matrix containing all the factors affecting the system in focus, with all the interconnections. The other input data set is the time series, a retrospective reconstruction of the weights and roles of the factors. This paper introduces a novel method to develop the time series by content analysis.
Keywords: content analysis, factors, integrated waste management system, time series
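A minimal sketch of how an FCM turns a connection matrix and an initial activation vector into a simulated future state; the three factors, weights, and sigmoid update rule used here are illustrative assumptions, not the system model developed in the paper:

    # Iterate a Fuzzy Cognitive Map until the factor activations stabilize.
    import numpy as np

    def fcm_simulate(W, a0, steps=50, lam=1.0):
        """W[i, j] = influence of factor i on factor j; activations squashed by a sigmoid."""
        a = np.array(a0, dtype=float)
        for _ in range(steps):
            a_next = 1.0 / (1.0 + np.exp(-lam * (a + a @ W)))   # common FCM update rule
            if np.allclose(a_next, a, atol=1e-6):
                break
            a = a_next
        return a

    # assumed factors: 0 = funding, 1 = collection coverage, 2 = recycling rate
    W = np.array([[0.0, 0.6, 0.3],
                  [0.0, 0.0, 0.5],
                  [0.2, 0.0, 0.0]])
    print("steady-state activations:", fcm_simulate(W, [0.8, 0.4, 0.2]).round(3))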
Procedia PDF Downloads 326
12750 Anti-Corruption in Adverse Contexts: A Strategic Approach
Authors: Mushtaq H. Khan, Antonio Andreoni, Pallavi Roy
Abstract:
Developing countries are characterized by political settlements where formal rules are generally weakly enforced and widely violated. Conventional anti-corruption strategies that focus on improving the general enforcement of a rule of law and raising the costs of corruption facing individual public officials have typically delivered poor results in these contexts. Our alternative approach is to identify anti-corruption strategies that have a high impact and that are feasible to implement in these contexts. Our alternative approach identifies anti-corruption strategies from the bottom up. This involves identifying the characteristics of the corruption constraining particular development outcomes. By drawing on theories of rents and rent seeking, and theories of political settlements, we can assess the developmental impact of particular anti-corruption strategies and the feasibility of implementing these strategies. We argue that feasible anti-corruption in these contexts cannot be solely based on conventional anti-corruption strategies. In societies that have widespread rule violations, high-impact anti-corruption is only likely to be feasible if the overall strategy succeeds in aligning the interests and capabilities of powerful organizations at the sectoral level to support the enforcement of particular sets of rules. We examine four related strategies for changing these incentives and capabilities of critical stakeholders at the local or sectoral level, and we argue that this can provide a framework for organizing research on the impact and feasibility of anti-corruption activities in different priority areas in particular countries.
Keywords: anti-corruption, development, political settlements analysis, rule of law
Procedia PDF Downloads 418
12749 Dynamic Fault Tree Analysis of Dynamic Positioning System through Monte Carlo Approach
Authors: A. S. Cheliyan, S. K. Bhattacharyya
Abstract:
Dynamic Positioning Systems (DPS) are employed in marine vessels of the offshore oil and gas industry. A DPS is a computer-controlled system that automatically maintains a ship's position and heading by using its own thrusters. Its reliability can be assessed through a conventional fault tree; however, complex behaviour such as sequence-dependent failures, redundancy management, and failure priorities cannot be analyzed by conventional fault trees. The Dynamic Fault Tree (DFT) addresses these shortcomings of the conventional fault tree by defining additional gates called dynamic gates. A Monte Carlo based simulation approach has been adopted for the dynamic gates. This realistic modeling of the DPS gives meaningful insight into the system reliability and into the ability to improve it.
Keywords: dynamic positioning system, dynamic fault tree, Monte Carlo simulation, reliability assessment
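A minimal sketch of the Monte Carlo treatment of one dynamic gate: a priority-AND (PAND) gate that fails only if its first input fails before the second within the mission time. The failure rates, mission time, and exponential lifetime assumption are illustrative, not the DPS data:

    # Estimate the unreliability of a PAND gate by sampling component failure times.
    import numpy as np

    def pand_unreliability(rate_a, rate_b, mission_time, trials=200_000, seed=0):
        rng = np.random.default_rng(seed)
        t_a = rng.exponential(1.0 / rate_a, trials)    # exponential times to failure
        t_b = rng.exponential(1.0 / rate_b, trials)
        fails = (t_a < t_b) & (t_b <= mission_time)    # A first, then B, within the mission
        return fails.mean()

    print("PAND gate unreliability:", round(pand_unreliability(1e-3, 5e-4, 8760.0), 4))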
Procedia PDF Downloads 774
12748 Sustainable Transformative Approaches to Reuse the Built Heritage of Erbil Citadel Houses as Part of Restoration
Authors: Wafaa Anwar Sulaiman Goriel
Abstract:
The historiography of heritage revival aims to breathe the wider spirit of a historical building back into life. This paper reflects an approach to revitalizing architectural antiquities through unusual methodologies largely unknown elsewhere in the heritage renovation sphere, using the Erbil Citadel houses as an example. The 6000-year-old, continuously occupied site of Erbil Citadel embodies the challenges and mutual opportunities in ensuring that historical context is preserved during modern redevelopment, and it shows how these principles can engage traditional construction systems with modern materials and technologies. It is an approach that champions the age and integrity of restored heritage sites, containing within its vernacular style elements which add a sense of relevance when contextually re-set in modern settings. Some of the Citadel's houses are discussed in the paper, along with the restoration method applied to them.
Keywords: Erbil Citadel houses, preservation, heritage, historical sites
Procedia PDF Downloads 17
12747 Denoising Transient Electromagnetic Data
Authors: Lingerew Nebere Kassie, Ping-Yu Chang, Hsin-Hua Huang, Chaw-Son Chen
Abstract:
Transient electromagnetic (TEM) data play a crucial role in hydrogeological and environmental applications, providing valuable insights into geological structures and resistivity variations. However, the presence of noise often hinders the interpretation and reliability of these data. Our study addresses this issue by utilizing a FASTSNAP system for the TEM survey, which operates in different modes (low, medium, and high) with continuous adjustments to discretization, gain, and current. We employ a denoising approach that processes the raw data obtained from each acquisition mode to improve signal quality and enhance data reliability. We use a signal-averaging technique for each mode, increasing the signal-to-noise ratio, and additionally apply the wavelet transform to suppress noise further while preserving the integrity of the underlying signals. This approach significantly improves the data quality, notably suppressing severe noise at late times. The resulting denoised data exhibit a substantially improved signal-to-noise ratio, leading to increased accuracy in parameter estimation. By effectively denoising TEM data, our study contributes to a more reliable interpretation and analysis of underground structures. Moreover, the proposed denoising approach can be seamlessly integrated into existing ground-based TEM data processing workflows, facilitating the extraction of meaningful information from noisy measurements and enhancing the overall quality and reliability of the acquired data.
Keywords: data quality, signal averaging, transient electromagnetic, wavelet transform
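A minimal sketch of the two denoising steps named above, applied to a synthetic decay: stacking repeated records (signal averaging) and soft-thresholding the wavelet coefficients. The wavelet choice, decomposition level, and universal threshold rule are assumptions, not necessarily the settings used in the study:

    # Stack repeated transients, then denoise with a wavelet soft threshold.
    import numpy as np
    import pywt

    rng = np.random.default_rng(0)
    t = np.linspace(1e-5, 1e-2, 512)
    clean = np.exp(-t / 2e-3)                              # idealized TEM decay
    stack = np.mean([clean + rng.normal(scale=0.05, size=t.size) for _ in range(32)], axis=0)

    coeffs = pywt.wavedec(stack, "db4", level=5)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745         # noise estimate from finest scale
    thr = sigma * np.sqrt(2 * np.log(stack.size))          # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    denoised = pywt.waverec(coeffs, "db4")[: stack.size]
    print("residual RMS vs clean:", np.sqrt(np.mean((denoised - clean) ** 2)).round(4))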
Procedia PDF Downloads 85
12746 Music Note Detection and Dictionary Generation from Music Sheet Using Image Processing Techniques
Authors: Muhammad Ammar, Talha Ali, Abdul Basit, Bakhtawar Rajput, Zobia Sohail
Abstract:
Music note detection has been an area of study for the past few years and has its own influence on music file generation from sheet music. We propose a method to detect music notes on sheet music using basic thresholding and blob detection. Subsequently, we create a notes dictionary using a semi-supervised learning approach: after note detection, for each test image the new symbols are added to the dictionary, which makes the note detection semi-automatic. The experiments are done on images from a dataset and also on captured images. The developed approach showed almost 100% accuracy on the dataset images, whereas varying results have been seen on captured images.
Keywords: music note, sheet music, optical music recognition, blob detection, thresholding, dictionary generation
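A minimal sketch of the thresholding and blob-detection steps using OpenCV; the input file name, blob-size limits, and circularity filter are assumptions chosen for demonstration rather than the parameters used by the authors:

    # Binarize a scanned sheet-music image and locate note-head-like blobs.
    import cv2

    img = cv2.imread("sheet.png", cv2.IMREAD_GRAYSCALE)      # assumed input image
    _, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea, params.maxArea = 20, 200                  # rough note-head size in pixels
    params.filterByCircularity = True
    params.minCircularity = 0.6                               # note heads are roughly elliptical
    detector = cv2.SimpleBlobDetector_create(params)

    # the default detector looks for dark blobs, matching the black note heads after Otsu
    keypoints = detector.detect(binary)
    print("candidate note heads:", [(int(k.pt[0]), int(k.pt[1])) for k in keypoints])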
Procedia PDF Downloads 181
12745 RPM-Synchronous Non-Circular Grinding: An Approach to Enhance Efficiency in Grinding of Non-Circular Workpieces
Authors: Matthias Steffan, Franz Haas
Abstract:
The production process of grinding is one of the last steps in a value-adding manufacturing chain. Within this step, workpiece geometry and surface roughness are determined. Up to this process stage, considerable costs and energy have already been spent on components; according to the current state of the art, large safety reserves are therefore calculated in order to guarantee process capability. Especially for non-circular grinding, this fact leads to considerable losses of process efficiency. With present technology, the various non-circular geometries on a workpiece must be ground sequentially in an oscillating process in which the X- and Q-axes of the machine are coupled. With the approach of RPM-Synchronous Non-circular Grinding, such workpieces can be machined in an ordinary plunge grinding process, in which the rotational rates of the workpiece and the grinding wheel are in a fixed ratio. A non-circular grinding wheel is used to transfer its geometry onto the workpiece. The authors use a worldwide unique machine tool that was especially designed for this technology. High revolution rates on the workpiece spindle (up to 4500 rpm) are mandatory for the success of this grinding process. The grinding is performed as a two-step process. For roughing, a highly porous vitrified-bonded grinding wheel with a medium grain size is used; it ensures high specific material removal rates for efficiently producing the non-circular geometry on the workpiece. This process step is adapted by a force-control algorithm, which uses data acquired from a three-component force sensor located in the dead centre of the tailstock. For finishing, a grinding wheel with a fine grain size is used. Roughing and finishing are performed consecutively within the same clamping of the workpiece using two locally separated grinding spindles. The approach of RPM-Synchronous Non-circular Grinding shows great efficiency enhancement in non-circular grinding. For the first time, three-dimensional non-circular shapes can be ground, which opens up various fields of application. The automotive industry in particular shows great interest in this emerging trend in finishing machining.
Keywords: efficiency enhancement, finishing machining, non-circular grinding, rpm-synchronous grinding
Procedia PDF Downloads 283
12744 Integrated Clean Development Mechanism and Risk Management Approach for Infrastructure Transportation Project
Authors: Debasis Sarkar
Abstract:
The clean development mechanism (CDM) can act as an effective instrument for mitigating climate change, as it can effectively reduce the emission of CO2 and other greenhouse gases (GHGs). Construction of a mega infrastructure project, like an underground corridor for metro rail operation, involves the consumption of a substantial quantity of concrete, which in turn consumes huge quantities of energy-intensive materials like cement and steel. This paper is an attempt to develop an integrated clean development mechanism and risk management approach for sustainable development of an underground corridor metro rail project in India during its construction phase. It was observed that about a 35% reduction in CO2 emissions can be obtained by adding fly ash as a partial replacement of cement. The reduction in CO2 emissions, about 21,646.36 MT, would result in cost savings of approximately INR 8.5 million (USD 129,878). However, construction and operation of such infrastructure projects are subject to huge risks and uncertainties throughout all phases of the project, reducing the probability of successful completion within the stipulated time and cost frame. Thus, an integrated approach combining CDM with risk management would enable the metro rail authorities to develop a sustainable risk mitigation framework that ensures more cost and energy savings and less time and cost overrun.
Keywords: clean development mechanism (CDM), infrastructure transportation, project risk management, underground metro rail
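As a back-of-the-envelope check of the figures quoted above, the sketch below derives the baseline emission and the implied carbon price consistent with a 35% reduction of about 21,646 MT and INR 8.5 million of savings; these derived quantities are assumptions for illustration, not values reported in the paper:

    # Implied baseline emission and carbon price behind the quoted CDM savings.
    baseline_co2_mt = 21646.36 / 0.35           # implied baseline CO2 of the concrete works, MT
    reduction_mt = 0.35 * baseline_co2_mt       # avoided by partial fly-ash replacement of cement
    price_inr_per_mt = 8.5e6 / reduction_mt     # implied carbon price, INR per MT CO2

    print(f"baseline: {baseline_co2_mt:,.0f} MT, avoided: {reduction_mt:,.2f} MT")
    print(f"implied price: {price_inr_per_mt:,.1f} INR/MT -> savings "
          f"{reduction_mt * price_inr_per_mt / 1e6:.1f} M INR")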
Procedia PDF Downloads 474
12743 Deep-Learning Based Approach to Facial Emotion Recognition through Convolutional Neural Network
Authors: Nouha Khediri, Mohammed Ben Ammar, Monji Kherallah
Abstract:
Recently, facial emotion recognition (FER) has become increasingly important for understanding the state of the human mind, yet accurately classifying emotion from the face is a challenging task. In this paper, we present a facial emotion recognition approach named CV-FER that benefits from deep learning, especially CNNs and VGG16. First, the data are pre-processed with data cleaning and data rotation. Then, we augment the data and feed them to our FER model, which contains five convolution layers and five pooling layers; finally, a softmax classifier is used in the output layer to recognize emotions. The paper also reviews work on facial emotion recognition based on deep learning. Experiments show that our model outperforms the other methods using the same FER2013 database and yields a recognition rate of 92%. We also put forward some suggestions for future work.
Keywords: CNN, deep learning, facial emotion recognition, machine learning
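A minimal sketch of a network matching the outline above (five convolution layers, five pooling layers, softmax output). The filter counts, dense layer, input size (48x48 grayscale, as in FER2013), and training settings are assumptions, not the exact CV-FER architecture:

    # Five conv + five pooling blocks ending in a softmax over emotion classes.
    from tensorflow import keras
    from tensorflow.keras import layers

    def build_cv_fer(num_classes=7, input_shape=(48, 48, 1)):
        model = keras.Sequential([keras.Input(shape=input_shape)])
        for filters in (32, 64, 128, 256, 256):          # five conv + five pooling blocks
            model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
            model.add(layers.MaxPooling2D(pool_size=2))
        model.add(layers.Flatten())
        model.add(layers.Dense(128, activation="relu"))
        model.add(layers.Dense(num_classes, activation="softmax"))  # emotion probabilities
        model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
        return model

    build_cv_fer().summary()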
Procedia PDF Downloads 95
12742 An Optimal Steganalysis Based Approach for Embedding Information in Image Cover Media with Security
Authors: Ahlem Fatnassi, Hamza Gharsellaoui, Sadok Bouamama
Abstract:
This paper deals with the fields of steganography and steganalysis. Steganography involves hiding information in a cover medium to obtain the stego medium in such a way that the cover medium is perceived not to have any embedded message by its unintended recipients. Steganalysis is the mechanism of detecting the presence of hidden information in the stego medium, and it can lead to the prevention of disastrous security incidents. In this paper, we provide a critical review of the steganalysis algorithms available to analyze the characteristics of an image stego medium against the corresponding cover medium and to understand the process of embedding the information and its detection. We anticipate that this paper can also give a clear picture of the current trends in steganography so that appropriate steganalysis algorithms can be developed and improved.
Keywords: optimization, heuristics and metaheuristics algorithms, embedded systems, low-power consumption, steganalysis heuristic approach
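To make the cover/stego distinction concrete, the sketch below shows the classic least-significant-bit (LSB) embedding that much image steganalysis targets; it is a textbook example on a synthetic cover image, not the embedding or steganalysis method discussed in the paper:

    # Embed and recover a message in the least significant bits of a synthetic cover image.
    import numpy as np

    def lsb_embed(cover, message_bits):
        stego = cover.copy().ravel()
        stego[: len(message_bits)] = (stego[: len(message_bits)] & 0xFE) | message_bits
        return stego.reshape(cover.shape)

    def lsb_extract(stego, n_bits):
        return stego.ravel()[:n_bits] & 1

    rng = np.random.default_rng(0)
    cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    bits = np.unpackbits(np.frombuffer(b"hidden", dtype=np.uint8))   # message as a bit stream
    stego = lsb_embed(cover, bits)
    recovered = np.packbits(lsb_extract(stego, bits.size)).tobytes()
    print(recovered, " max pixel change:",
          int(np.abs(stego.astype(int) - cover.astype(int)).max()))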
Procedia PDF Downloads 292