Search results for: sound source localization
5162 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, false positives and negatives tuning, and automated feedback. The initial approach using natural language processing techniques to extract features achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite predicted specific vulnerabilities such as OS-Command Injection, Cryptographic, and Cross-Site Scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS-Command Injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues. However, predicting vulnerabilities in source code using machine learning poses challenges such as high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
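As an illustration of the path-context idea this abstract describes, a minimal Python sketch (not the authors' implementation) might represent each function as a bag of Code2Vec-style path-context strings and train a simple classifier on them; the path-context strings, labels, and the HashingVectorizer/LogisticRegression choices below are assumptions for demonstration only.
```python
# Minimal sketch: bag-of-path-contexts + linear classifier, standing in for the
# Code2Vec embedding model described in the abstract.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Each function is reduced to whitespace-separated "leaf,path,leaf" triples,
# mimicking AST path-contexts; these strings are illustrative placeholders.
functions = [
    "cmd,NameExpr^Call_Arg,exec user_input,Param^MethodDecl_Body,cmd",  # labelled vulnerable
    "x,NameExpr^BinOp_Add,y y,Param^MethodDecl_Return,x",               # labelled benign
]
labels = [1, 0]

model = make_pipeline(
    HashingVectorizer(analyzer=str.split, n_features=2**12),  # keeps whole triples as tokens
    LogisticRegression(max_iter=1000),
)
model.fit(functions, labels)
print(model.predict(functions))
```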
Procedia PDF Downloads 109
5161 An Overview on the Effectiveness of Brand Mascot and Celebrity Endorsement
Authors: Isari Pairoa, Proud Arunrangsiwed
Abstract:
Celebrity and brand mascot endorsement have been explored for more than three decades. Both types of endorser can effectively transfer their reputation to a corporate image and can influence customers to purchase the product. However, little is known about the mediators between the level of endorsement and its effect on buying behavior. The objective of the current study is to identify the gap in previous studies and to seek possible mediators. It was found that consumers' memory and identification mediate the relationship between source credibility and the endorsement effect. A future study should confirm the model of endorsement established in the current study.
Keywords: product endorsement, memory, identification theory, source credibility, unintentional effect
Procedia PDF Downloads 228
5160 The Impact of Rising Architectural Façade in Improving Terms of the Physical Urban Ambience Inside the Free Space for Urban Fabric - the Street- Case Study the City of Biskra
Authors: Rami Qaoud, Alkama Djamal
Abstract:
This paper asks how rising architectural façades improve the physical urban ambience within the free space of the urban fabric, the street, which is considered a way of bringing life, cultural values and civilization back to these cities. The study examines the relationship between empty and built-up space in the urban fabric in terms of construction density and the ratio of façade height to street width. The methodology is based on field measurements for three types of street geometry (H ≥ 2W, H = W, H ≤ 0.5W), with the physical ambience recorded along three main axes. The first axis is thermal ambience, for which air temperature, relative humidity, wind speed and surface temperatures (outer wall and ground) were collected. The second axis is visual ambience, for which natural daylight levels were measured during the daytime. The third axis is acoustic ambience, for which sound levels were recorded over the entire day. The campaign lasted three consecutive days and used six measuring stations, one for each street type in each of two different streets. Comparison of the results shows clear differences between the three street geometries: air temperatures differ by up to 4 °C, the duration of direct natural lighting differs by up to six hours, and sound levels differ by up to 15 dB. These differences indicate the impact of rising architectural façades on improving the physical urban ambience within the free space of the urban fabric, the street.
Keywords: street, physical urban ambience, rising architectural façade, urban fabric
Procedia PDF Downloads 291
5159 A Framework for Auditing Multilevel Models Using Explainability Methods
Authors: Debarati Bhaumik, Diptish Dey
Abstract:
Multilevel models, increasingly deployed in industries such as insurance, food production, and entertainment within functions such as marketing and supply chain management, need to be transparent and ethical. Applications usually result in binary classification within groups or hierarchies based on a set of input features. Using open-source datasets, we demonstrate that popular explainability methods, such as SHAP and LIME, consistently underperform in accuracy when interpreting these models. They fail to predict the order of feature importance, the magnitudes, and occasionally even the nature of the feature contribution (negative versus positive contribution to the outcome). Besides accuracy, the computational intractability of SHAP for binomial classification is a cause for concern. For transparent and ethical applications of these hierarchical statistical models, sound audit frameworks need to be developed. In this paper, we propose an audit framework for technical assessment of multilevel regression models focusing on three aspects: (i) model assumptions & statistical properties, (ii) model transparency using different explainability methods, and (iii) discrimination assessment. To this end, we undertake a quantitative approach and compare intrinsic model methods with SHAP and LIME. The framework comprises a shortlist of KPIs, such as PoCE (Percentage of Correct Explanations) and MDG (Mean Discriminatory Gap) per feature, for each of these three aspects. A traffic-light risk assessment method is furthermore coupled to these KPIs. The audit framework will assist regulatory bodies in performing conformity assessments of AI systems using multilevel binomial classification models at businesses. It will also benefit businesses deploying multilevel models to be future-proof and aligned with the European Commission’s proposed Regulation on Artificial Intelligence.
Keywords: audit, multilevel model, model transparency, model explainability, discrimination, ethics
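A minimal sketch of the kind of consistency check the framework describes, comparing SHAP attributions with the intrinsic coefficients of a simple model; this assumes the shap package's generic Explainer interface and is not the paper's PoCE/MDG implementation.
```python
# Compare SHAP feature ranking against the model's own coefficient ranking
# (illustrative only; a real audit would use the multilevel model and its data).
import numpy as np
import shap
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=500, n_features=5, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X, y)

explainer = shap.Explainer(clf, X)      # background data X; explainer type chosen automatically
shap_values = explainer(X)

shap_rank = np.argsort(-np.abs(shap_values.values).mean(axis=0))  # rank by mean |SHAP|
coef_rank = np.argsort(-np.abs(clf.coef_[0]))                     # rank by |coefficient|
print("SHAP order:", shap_rank, "coefficient order:", coef_rank)
```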
Procedia PDF Downloads 95
5158 Radiosensitization Properties of Gold Nanoparticles in Brachytherapy of Uterus Cancer by High Dose Rate I-125 Seed: A Simulation Study by MCNPX and MCNP6 Codes
Authors: Elham Mansouri, Asghar Mesbahi
Abstract:
Purpose: In the current study, we aimed to investigate the macroscopic and microscopic dose enhancement effect of metallic nanoparticles in interstitial brachytherapy of uterus cancer by an Iodine-125 source, using a nano-lattice model in the MCNPX (5) and MCNP6.1 codes. Materials and methods: Based on a nano-lattice simulation model containing a radiation source and a tumor tissue with cellular compartments loaded with 7 mg/g spherical nanoparticles (bismuth, gold, and gadolinium), the energy deposited by the secondary electrons at the microscopic and macroscopic level was estimated. Results: The results show that the macroscopic DEF values are higher than the microscopic DEF values, and the macroscopic DEF values decrease as a function of distance from the brachytherapy source surface. Also, the results revealed a remarkable discrepancy between the DEF and secondary electron spectra calculated by the MCNPX (5) and MCNP6.1 codes, which could be justified by the difference in energy cut-off and electron transport algorithms of the two codes. Conclusion: According to both MCNPX (5) and MCNP6.1 outputs, it could be concluded that the presence of metallic nanoparticles in the tumor tissue of uterus cancer increases the physical effectiveness of brachytherapy by an I-125 source. The results presented herein give a physical view of the radiosensitization potential of different metallic nanoparticles and could be considered in the design of analytical and experimental radiosensitization studies in tumor regions using various radiotherapy modalities in the presence of heavy nanomaterials.
Keywords: MCNPX, MCNP6, nanoparticle, brachytherapy
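For reference, the dose enhancement factor (DEF) reported in such studies is conventionally the ratio of the dose scored with nanoparticles present to the dose scored without them; the sketch below uses invented tally values purely to show the arithmetic.
```python
# DEF = dose with nanoparticles / dose without nanoparticles (conventional definition).
# The numbers are hypothetical placeholders, not results from the paper.
dose_with_np = 1.32e-2     # Gy per source particle (made-up tally value)
dose_without_np = 1.05e-2  # Gy per source particle (made-up tally value)
def_value = dose_with_np / dose_without_np
print(f"DEF = {def_value:.2f}")
```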
Procedia PDF Downloads 103
5157 Acoustic Energy Harvesting Using Polyvinylidene Fluoride (PVDF) and PVDF-ZnO Piezoelectric Polymer
Authors: S. M. Giripunje, Mohit Kumar
Abstract:
Acoustic energy that exists in our everyday life and environment have been overlooked as a green energy that can be extracted, generated, and consumed without any significant negative impact to the environment. The harvested energy can be used to enable new technology like wireless sensor networks. Technological developments in the realization of truly autonomous MEMS devices and energy storage systems have made acoustic energy harvesting (AEH) an increasingly viable technology. AEH is the process of converting high and continuous acoustic waves from the environment into electrical energy by using an acoustic transducer or resonator. AEH is not popular as other types of energy harvesting methods since sound waves have lower energy density and such energy can only be harvested in very noisy environment. However, the energy requirements for certain applications are also correspondingly low and also there is a necessity to observe the noise to reduce noise pollution. So the ability to reclaim acoustic energy and store it in a usable electrical form enables a novel means of supplying power to relatively low power devices. A quarter-wavelength straight-tube acoustic resonator as an acoustic energy harvester is introduced with polyvinylidene fluoride (PVDF) and PVDF doped with ZnO nanoparticles, piezoelectric cantilever beams placed inside the resonator. When the resonator is excited by an incident acoustic wave at its first acoustic eigen frequency, an amplified acoustic resonant standing wave is developed inside the resonator. The acoustic pressure gradient of the amplified standing wave then drives the vibration motion of the PVDF piezoelectric beams, generating electricity due to the direct piezoelectric effect. In order to maximize the amount of the harvested energy, each PVDF and PVDF-ZnO piezoelectric beam has been designed to have the same structural eigen frequency as the acoustic eigen frequency of the resonator. With a single PVDF beam placed inside the resonator, the harvested voltage and power become the maximum near the resonator tube open inlet where the largest acoustic pressure gradient vibrates the PVDF beam. As the beam is moved to the resonator tube closed end, the voltage and power gradually decrease due to the decreased acoustic pressure gradient. Multiple piezoelectric beams PVDF and PVDF-ZnO have been placed inside the resonator with two different configurations: the aligned and zigzag configurations. With the zigzag configuration which has the more open path for acoustic air particle motions, the significant increases in the harvested voltage and power have been observed. Due to the interruption of acoustic air particle motion caused by the beams, it is found that placing PVDF beams near the closed tube end is not beneficial. The total output voltage of the piezoelectric beams increases linearly as the incident sound pressure increases. This study therefore reveals that the proposed technique used to harvest sound wave energy has great potential of converting free energy into useful energy.Keywords: acoustic energy, acoustic resonator, energy harvester, eigenfrequency, polyvinylidene fluoride (PVDF)
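The quarter-wavelength design rule mentioned above can be checked with the standard textbook relation f1 = c / (4L) for a tube open at one end and closed at the other; the values below are assumed, not taken from the paper.
```python
# First acoustic eigenfrequency of a quarter-wavelength tube resonator.
c = 343.0   # speed of sound in air at room temperature, m/s (approximate)
L = 0.5     # tube length in metres (hypothetical)
f1 = c / (4 * L)
print(f"First acoustic eigenfrequency: {f1:.1f} Hz")
# Each PVDF beam would then be sized so its first bending eigenfrequency matches f1.
```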
Procedia PDF Downloads 387
5156 A Mixing Matrix Estimation Algorithm for Speech Signals under the Under-Determined Blind Source Separation Model
Authors: Jing Wu, Wei Lv, Yibing Li, Yuanfan You
Abstract:
The separation of speech signals has become a research hotspot in the field of signal processing in recent years. It has many applications and influences in teleconferencing, hearing aids, speech recognition by machines and so on. The sounds received are usually noisy. The issue of identifying the sounds of interest and obtaining clear sounds in such an environment becomes a problem worth exploring, that is, the problem of blind source separation. This paper focuses on under-determined blind source separation (UBSS). Sparse component analysis is generally used for the problem of under-determined blind source separation. The method is mainly divided into two parts. Firstly, a clustering algorithm is used to estimate the mixing matrix from the observed signals. Then the signal is separated based on the known mixing matrix. In this paper, the problem of mixing matrix estimation is studied. This paper proposes an improved algorithm to estimate the mixing matrix for speech signals in the UBSS model. The traditional potential algorithm is not accurate for mixing matrix estimation, especially at a low signal-to-noise ratio (SNR). In response to this problem, this paper considers the idea of an improved potential function method to estimate the mixing matrix. The algorithm not only avoids the influence of insufficient prior information in traditional clustering algorithms, but also improves the estimation accuracy of the mixing matrix. This paper takes the mixing of four speech signals into two channels as an example. The results of simulations show that the approach in this paper not only improves the accuracy of estimation, but also applies to any mixing matrix.
Keywords: DBSCAN, potential function, speech signal, the UBSS model
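A rough illustration of the clustering stage of mixing-matrix estimation under the UBSS model, using DBSCAN on a synthetic two-channel mixture of four sparse sources; this is a generic sketch, not the improved potential-function algorithm proposed in the paper.
```python
# With sparse sources, two-channel observations line up along the columns of the
# mixing matrix, so clustering the normalised samples recovers the column directions.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
A = np.array([[0.90, 0.50, 0.10, -0.40],
              [0.44, 0.87, 0.99, 0.92]])            # true 2x4 mixing matrix (assumed values)
A /= np.linalg.norm(A, axis=0)

# Idealised sparse sources: at each instant only one source is active.
n = 4000
S = np.zeros((4, n))
active = rng.integers(0, 4, n)
S[active, np.arange(n)] = rng.laplace(size=n)
X = A @ S + 0.01 * rng.standard_normal((2, n))

# Keep high-energy samples, normalise to the upper half-plane, then cluster.
keep = np.linalg.norm(X, axis=0) > 0.1
V = X[:, keep] / np.linalg.norm(X[:, keep], axis=0)
V[:, V[1] < 0] *= -1
labels = DBSCAN(eps=0.05, min_samples=20).fit_predict(V.T)

A_hat = np.column_stack([V[:, labels == k].mean(axis=1)
                         for k in sorted(set(labels)) if k != -1])
A_hat /= np.linalg.norm(A_hat, axis=0)
print("Estimated mixing directions (columns):\n", A_hat.round(2))
```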
Procedia PDF Downloads 135
5155 Effect of Spatially Correlated Disorder on Electronic Transport Properties of Aperiodic Superlattices (GaAs/AlxGa1-xAs)
Authors: F. Bendahma, S. Bentata, S. Cherid, A. Zitouni, S. Terkhi, T. Lantri, Y. Sefir, Z. F. Meghoufel
Abstract:
We examine the electronic transport properties in AlxGa1-xAs/GaAs superlattices. Using the transfer-matrix technique and the exact Airy function formalism, we investigate theoretically the effect of structural parameters on the electronic energy spectra of trimer thickness barrier (TTB). Our numerical calculations showed that the localization length of the states becomes more extended when the disorder is correlated (trimer case). We have also found that the resonant tunneling time (RTT) is of the order of several femtoseconds.Keywords: electronic transport properties, structural parameters, superlattices, transfer-matrix technique
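As a reminder of how the transfer-matrix technique works in its simplest form, the sketch below computes transmission through a few rectangular barriers; the effective-mass value, layer parameters and the plane-wave (rather than Airy-function) treatment are assumptions and do not reproduce the paper's trimer-thickness-barrier model.
```python
# Toy 1-D transfer-matrix transmission through a sequence of rectangular barriers.
import numpy as np

hbar = 1.054571817e-34
m = 0.067 * 9.1093837015e-31          # GaAs effective mass in kg (assumed constant everywhere)
eV = 1.602176634e-19

def wavevector(E, V):
    return np.sqrt(2 * m * (E - V) * eV + 0j) / hbar     # complex for E < V (evanescent)

def interface(x, k1, k2):
    """Relate plane-wave coefficients (A, B) just right of x to those just left of x."""
    r = k1 / k2
    return 0.5 * np.array([
        [(1 + r) * np.exp(1j * (k1 - k2) * x), (1 - r) * np.exp(-1j * (k1 + k2) * x)],
        [(1 - r) * np.exp(1j * (k1 + k2) * x), (1 + r) * np.exp(-1j * (k1 - k2) * x)],
    ])

def transmission(E, layers):
    """layers: list of (thickness_m, potential_eV); the outer regions are at V = 0."""
    potentials = [0.0] + [V for _, V in layers] + [0.0]
    boundaries = np.cumsum([0.0] + [d for d, _ in layers])
    ks = [wavevector(E, V) for V in potentials]
    M = np.eye(2, dtype=complex)
    for x, k1, k2 in zip(boundaries, ks[:-1], ks[1:]):
        M = interface(x, k1, k2) @ M
    t = np.linalg.det(M) / M[1, 1]                        # transmitted amplitude
    return float(np.real(ks[-1] / ks[0]) * abs(t) ** 2)

# Two 0.3 eV barriers, 3 nm thick, separated by a 5 nm well (assumed values).
layers = [(3e-9, 0.3), (5e-9, 0.0), (3e-9, 0.3)]
for E in (0.05, 0.10, 0.15, 0.20):
    print(f"E = {E:.2f} eV  T = {transmission(E, layers):.3e}")
```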
Procedia PDF Downloads 285
5154 The Expanding Role of Islamic Law in the Current Indonesian Legal Reform
Authors: Muhammad Ilham Agus Salim, Saufa Ata Taqiyya
Abstract:
In many Muslim countries, secularization has successfully reduced the role of Islamic law as a formal legal source during the last century. The most obvious example was the transformation of the Daulah Utsmaniyah into the secular Republic of Turkey. Religion is strictly separated from state authorities in many countries today. In recent decades in Indonesia, however, a remarkable fact is apparent: Islamic law has expanded its role in the Indonesian legal system, especially in district regulations. In Aceh province, as a case in point, shariah has been the basic source of law in all regulations. More provinces in Indonesia had adopted Islamic law as a formal legal source by the end of 2014. Unlike some other countries that clearly stipulate the status of Islam in formal ways, the Indonesian constitution does not formally recognize Islam as the religion of the state. But in this Muslim-majority country, Islamic law takes its place in a democratic way, namely on the basis of the voice of the majority. This paper analyzes how this trend has increased significantly since the so-called Indonesian reformation era (end of the nineties). Some causes of this expanding role are identified, and some lessons learned are recommended as concluding remarks at the end of the paper.
Keywords: Islamic law, Indonesia, legal reform, Syariah local regulation
Procedia PDF Downloads 352
5153 Free and Open Source Software for BIM Workflow of Steel Structure Design
Authors: Danilo Di Donato
Abstract:
The continuous new releases of free and open source software (FOSS) and the high costs of proprietary software -whose monopoly is characterized by closed codes and the low level of implementation and customization of software by end-users- impose a reflection on possible tools that can be chosen and adopted for the design and the representation of new steel constructions. The paper aims to show experimentation carried out to verify the actual potential and the effective applicability of FOSS supports to the BIM modeling of steel structures, particularly considering the goal of a possible workflow in order to achieve high level of development (LOD); allow effective interchange methods between different software. To this end, the examined software packages are those with open source or freeware licenses, in order to evaluate their use in architectural praxis. The test has primarily involved the experimentation of Freecad -the only Open Source software that allows a complete and integrated BIM workflow- and then the results have been compared with those of two proprietary software, Sketchup and TeklaBim Sight, which are released with a free version, but not usable for commercial purposes. The experiments carried out on Open Source, and freeware software was then compared with the outcomes that are obtained by two proprietary software, Sketchup Pro and Tekla Structure which has special modules particularly addressed to the design of steel structures. This evaluation has concerned different comparative criteria, that have been defined on the basis of categories related to the reliability, the efficiency, the potentiality, achievable LOD and user-friendliness of the analyzed software packages. In order to verify the actual outcomes of FOSS BIM for the steel structure projects, these results have been compared with a simulation related to a real case study and carried out with a proprietary software BIM modeling. Therefore, the same design theme, the project of a shelter of public space, has been developed using different software. Therefore the purpose of the contribution is to assess what are the developments and potentialities inherent in FOSS BIM, in order to estimate their effective applicability to professional practice, their limits and new fields of research they propose.Keywords: BIM, steel buildings, FOSS, LOD
Procedia PDF Downloads 175
5152 The Effects of Source and Timing on the Acceptance of New Product Recommendation: A Lab Experiment
Abstract:
A new product is important for companies seeking to expand their customer base and demonstrate competitiveness. A new product often involves new features that consumers might not be familiar with, while it may also have a competitive advantage that attracts consumers compared to established products. However, although most online retailers employ recommendation agents (RAs) to influence consumers’ product choice decisions, recommended new products are not accepted and chosen as expected. We argue that this might also be caused by providing a new product recommendation in the wrong way at the wrong time. This study discusses how new product evaluations sourced from third parties could be employed in RAs as evidence of the superiority of the new product, and how the new product recommendation could be provided to a consumer at the right time so that it can be accepted and finally chosen during the consumer’s decision-making process. A 2*2 controlled laboratory experiment was conducted to understand the selection of new product recommendation sources and recommendation timing. Human subjects were randomly assigned to one of the four treatments to minimize the effects of individual differences on the results. Participants were told to make purchase choices from our product categories. We find that a new product recommended right after a similar existing product and supported by an expert review is more likely to be accepted. Based on this study, both theoretical and practical contributions are provided regarding new product recommendation.
Keywords: new product recommendation, recommendation timing, recommendation source, recommendation agents
Procedia PDF Downloads 155
5151 Discrete-Event Modeling and Simulation Methodologies: Past, Present and Future
Authors: Gabriel Wainer
Abstract:
Modeling and Simulation methods have been used to better analyze the behavior of complex physical systems, and it is now common to use simulation as a part of the scientific and technological discovery process. M&S advanced thanks to the improvements in computer technology, which, in many cases, resulted in the development of simulation software using ad-hoc techniques. Formal M&S appeared in order to try to improve the development task of very complex simulation systems. Some of these techniques proved to be successful in providing a sound base for the development of discrete-event simulation models, improving the ease of model definition and enhancing the application development tasks; reducing costs and favoring reuse. The DEVS formalism is one of these techniques, which proved to be successful in providing means for modeling while reducing development complexity and costs. DEVS model development is based on a sound theoretical framework. The independence of M&S tasks made possible to run DEVS models on different environments (personal computers, parallel computers, real-time equipment, and distributed simulators) and middleware. We will present a historical perspective of discrete-event M&S methodologies, showing different modeling techniques. We will introduce DEVS origins and general ideas, and compare it with some of these techniques. We will then show the current status of DEVS M&S, and we will discuss a technological perspective to solve current M&S problems (including real-time simulation, interoperability, and model-centered development techniques). We will show some examples of the current use of DEVS, including applications in different fields. We will finally show current open topics in the area, which include advanced methods for centralized, parallel or distributed simulation, the need for real-time modeling techniques, and our view in these fields.Keywords: modeling and simulation, discrete-event simulation, hybrid systems modeling, parallel and distributed simulation
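A minimal sketch of the structure of an atomic DEVS model (time advance, external and internal transitions, and an output function), written as plain Python for illustration; production DEVS environments such as CD++ or PythonPDEVS provide coupled models, ports and full simulators.
```python
# Atomic DEVS "processor": receives a job, stays busy for service_time, then emits it.
import math

class Processor:
    def __init__(self, service_time=2.0):
        self.service_time = service_time
        self.phase, self.job = "idle", None

    def ta(self):                                    # time advance function
        return self.service_time if self.phase == "busy" else math.inf

    def delta_ext(self, elapsed, job):               # external transition (input arrives)
        if self.phase == "idle":
            self.phase, self.job = "busy", job

    def delta_int(self):                             # internal transition (after ta expires)
        self.phase, self.job = "idle", None

    def output(self):                                # lambda: emitted just before delta_int
        return self.job

# Hand-driven trace: a job arrives at t = 1 and is emitted at t = 1 + ta().
p = Processor()
p.delta_ext(1.0, "job-1")
t_out = 1.0 + p.ta()
print("output at t =", t_out, ":", p.output())
p.delta_int()
```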
Procedia PDF Downloads 323
5150 Use of Cow Dung Residues of Biogas Plants for Sustainable Development of Rural Communities in Pakistan
Authors: Sumra Siddique Abbasi, Cheng Shikun
Abstract:
Biogas technology has developed rapidly in the agriculture sector to upgrade and improve the life of farmers by providing them with an alternative and cost-effective energy source. The main purpose of this study is to understand the advantages of biogas plants for livestock owners, whether household-based or farm-based. A further major purpose of this research is to examine the factors affecting the decision to adopt biogas technologies at the household level. Based on the results, both public and private sector organizations can make decisions related to the installation of biogas projects. Biogas is a major energy source which can be used as an alternative and renewable energy source. This energy production technology can contribute to uplifting the lifestyle of farmers and to the sustainable development of rural communities in Pakistan. People with livestock in any community in Pakistan can benefit from biogas plants, and such plants contribute to sustainable development programmes that generate socio-economic benefits, health improvement, a cost-effective energy source and a positive impact on climate change and environmental issues. This study was conducted using a survey method and descriptive analysis. One hundred and fifty (150) farmers participated in the survey as respondents. These farmers were from the Layyah district of Punjab and were selected using a snowball sampling technique. The results were generated using SPSS for data analysis.
Keywords: biogas plant, animal dung, renewable energy, Pakistan
Procedia PDF Downloads 73
5149 Cascade Multilevel Inverter-Based Grid-Tie Single-Phase and Three-Phase-Photovoltaic Power System Controlling and Modeling
Authors: Syed Masood Hussain
Abstract:
An effective control method, including system-level control and pulse width modulation for quasi-Z-source cascade multilevel inverter (qZS-CMI) based grid-tie photovoltaic (PV) power system is proposed. The system-level control achieves the grid-tie current injection, independent maximum power point tracking (MPPT) for separate PV panels, and dc-link voltage balance for all quasi-Z-source H-bridge inverter (qZS-HBI) modules. A recent upsurge in the study of photovoltaic (PV) power generation emerges, since they directly convert the solar radiation into electric power without hampering the environment. However, the stochastic fluctuation of solar power is inconsistent with the desired stable power injected to the grid, owing to variations of solar irradiation and temperature. To fully exploit the solar energy, extracting the PV panels’ maximum power and feeding them into grids at unity power factor become the most important. The contributions have been made by the cascade multilevel inverter (CMI). Nevertheless, the H-bridge inverter (HBI) module lacks boost function so that the inverter KVA rating requirement has to be increased twice with a PV voltage range of 1:2; and the different PV panel output voltages result in imbalanced dc-link voltages. However, each HBI module is a two-stage inverter, and many extra dc–dc converters not only increase the complexity of the power circuit and control and the system cost, but also decrease the efficiency. Recently, the Z-source/quasi-Z-source cascade multilevel inverter (ZS/qZS-CMI)-based PV systems were proposed. They possess the advantages of both traditional CMI and Z-source topologies. In order to properly operate the ZS/qZS-CMI, the power injection, independent control of dc-link voltages, and the pulse width modulation (PWM) are necessary. The main contributions of this paper include: 1) a novel multilevel space vector modulation (SVM) technique for the single phase qZS-CMI is proposed, which is implemented without additional resources; 2) a grid-connected control for the qZS-CMI based PV system is proposed, where the all PV panel voltage references from their independent MPPTs are used to control the grid-tie current; the dual-loop dc-link peak voltage control.Keywords: Quzi-Z source inverter, Photo voltaic power system, space vector modulation, cascade multilevel inverter
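One building block mentioned above, per-panel MPPT, can be sketched with the generic perturb-and-observe algorithm below; this is a textbook routine with assumed step size and signals, not the paper's space vector modulation or dc-link voltage-balance control.
```python
# Perturb-and-observe MPPT: nudge the panel voltage reference each sampling period
# and reverse direction whenever the measured power drops.
def perturb_and_observe(v_meas, i_meas, state, step=0.5):
    """Return the next voltage reference; `state` carries (v_prev, p_prev, direction)."""
    v_prev, p_prev, direction = state
    p = v_meas * i_meas
    if p < p_prev:                 # last perturbation reduced power: reverse direction
        direction = -direction
    v_ref = v_meas + direction * step
    return v_ref, (v_meas, p, direction)

# Hypothetical use inside a control loop, once per sampling period:
state = (0.0, 0.0, +1)
v_ref, state = perturb_and_observe(v_meas=30.0, i_meas=5.0, state=state)
print("next panel voltage reference:", v_ref)
```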
Procedia PDF Downloads 548
5148 Analyzing the Impact of Code Commenting on Software Quality
Authors: Thulya Premathilake, Tharushi Perera, Hansi Thathsarani, Tharushi Nethmini, Dilshan De Silva, Piyumika Samarasekara
Abstract:
One of the most efficient ways to assist developers in grasping source code is to make use of comments, which can be found throughout the code. When working in fields such as software development, having comments of good quality in your code is a fundamental requirement, especially when tackling software problems in programs that have already been built. It is essential for the intention of the source code to be made crystal clear in the comments that are added to the code. This assists programmers in better comprehending the programs they are working on and enables them to complete software maintenance jobs in a more timely manner. In spite of the fact that comments and documentation are meant to improve readability and maintainability, the vast majority of programmers place most of their focus on the actual code that is being written. This study provides a complete and comprehensive overview of the previous research that has been conducted on the topic of code comments. The study focuses on four main topics: automated comment production, comment consistency, comment classification, and comment quality rating. Analyzing the approaches used for this research issue provides more complete knowledge for use in subsequent inquiries.
Keywords: code commenting, source code, software quality, quality assurance
Procedia PDF Downloads 86
5147 Dynamics of Investor's Behaviour: An Analytical Survey Study in Indian Securities Market
Authors: Saurabh Agarwal
Abstract:
This paper attempts to formalise the effect of demographic variables such as marital status, gender, occupation and age on the source of investment advice, which, in turn, affects the herd behaviour of investors and the probability of investment in the near future. Further, postulations have been made about the most preferred investment option, the purpose of saving and the source of investment. The impact of theoretical analysis on the choice among investment alternatives has also been investigated. The analysis contributes to understanding the different investment choices made by households in India. The insights offered in the paper indirectly contribute to uncovering various unexplained asset-pricing puzzles.
Keywords: portfolio choice, investment decisions, investor’s behaviour, Indian securities market
Procedia PDF Downloads 367
5146 Production of Biogas
Authors: J. O. Alabi
Abstract:
Biogas is a clean-burning, easily produced natural fuel that is an important source of energy for cooking and heating in rural areas and third-world countries. Anaerobic bacteria inside biodigesters break down biomass to produce biogas, which is about 70% methane. Currently there is no simple way to compress and store biogas, so in order to use biogas as a source of energy, a direct feed from the biodigester to the stove or heater must be made. Any excess biogas is vented into the atmosphere, which is wasteful and can have a negative effect on the environment. We have been tasked with designing a system that can compress biogas using an off-grid power supply, making the biogas portable and enabling the use of large-scale, shared biodigesters. Our final design is a system that maximizes simplicity and safety while minimizing cost.
Keywords: biogas, biodigesters, natural fuel, bionanotechnology
Procedia PDF Downloads 367
5145 The Role of Institutions in Community Wildlife Conservation in Zimbabwe
Authors: Herbert Ntuli, Edwin Muchapondwa
Abstract:
This study used a sample of 336 households and community level data from 30 communities around the Gonarezhou National Park in Zimbabwe to analyse the association between ability to self-organize or cooperation and institutions on one hand and the relationship between success of biodiversity outcomes and cooperation on the other hand. Using both the ordinary least squares and instrumental variables estimation with heteroskedasticity-based instruments, our results confirmed that sound institutions are indeed an important ingredient for cooperation in the respective communities and cooperation positively and significantly affects biodiversity outcomes. Group size, community level trust, the number of stakeholders and punishment were found to be important variables explaining cooperation. From a policy perspective, our results show that external enforcement of rules and regulations does not necessarily translate into sound ecological outcomes but better outcomes are attainable when punishment is rather endogenized by local communities. This seems to suggest that communities should rather be supported in such a way that robust institutions that are tailor made to suit the needs of local condition will emerge that will in turn facilitate good environmental husbandry. Cooperation, training, benefits, distance from the nearest urban canter, distance from the fence, social capital average age of household head, fence and information sharing were found to be very important variables explaining the success of biodiversity outcomes ceteris paribus. Government programmes should target capacity building in terms of institutional capacity and skills development in order to have a positive impact on biodiversity. Hence, the role of stakeholders (e.g., NGOs) in capacity building and government effort should complement each other to ensure that the necessary resources are mobilized and all communities receive the necessary training and resources.Keywords: institutions, self-organize, common pool resources, wildlife, conservation, Zimbabwe
Procedia PDF Downloads 281
5144 Employing a Knime-based and Open-source Tools to Identify AMI and VER Metabolites from UPLC-MS Data
Authors: Nouf Alourfi
Abstract:
This study examines the metabolism of amitriptyline (AMI) and verapamil (VER) using a KNIME-based method. The improved KNIME workflow is built on an open-source data-analytics platform that integrates a number of open-source metabolomics tools, such as CFMID and MetFrag, to provide standard data visualisations, predict candidate metabolites, assess them against experimental data, and produce reports on identified metabolites. The use of this workflow is demonstrated by employing three types of liver microsomes (human, rat, and guinea pig) to study the in vitro metabolism of the two drugs (AMI and VER). The workflow is used to create and process UPLC-MS (Orbitrap) data. The formulas and structures of these drugs' metabolites can be assigned automatically. The key metabolic routes for amitriptyline are hydroxylation, N-dealkylation, N-oxidation, and conjugation, while N-demethylation, O-demethylation, N-dealkylation, and conjugation are the primary metabolic routes for verapamil. The identified metabolites are consistent with those published, confirming the soundness of the workflow and the usefulness of computational tools like KNIME in supporting the integration and interoperability of emerging software packages in the metabolomics area.
Keywords: KNIME, CFMID, MetFrag, Data Analysis, Metabolomics
Procedia PDF Downloads 121
5143 Magnetohydrodynamic Flow of Viscoelastic Nanofluid and Heat Transfer over a Stretching Surface with Non-Uniform Heat Source/Sink and Non-Linear Radiation
Authors: Md. S. Ansari, S. S. Motsa
Abstract:
In this paper, an analysis has been made on the flow of non-Newtonian viscoelastic nanofluid over a linearly stretching sheet under the influence of uniform magnetic field. Heat transfer characteristics is analyzed taking into the effect of nonlinear radiation and non-uniform heat source/sink. Transport equations contain the simultaneous effects of Brownian motion and thermophoretic diffusion of nanoparticles. The relevant partial differential equations are non-dimensionalized and transformed into ordinary differential equations by using appropriate similarity transformations. The transformed, highly nonlinear, ordinary differential equations are solved by spectral local linearisation method. The numerical convergence, error and stability analysis of iteration schemes are presented. The effects of different controlling parameters, namely, radiation, space and temperature-dependent heat source/sink, Brownian motion, thermophoresis, viscoelastic, Lewis number and the magnetic force parameter on the flow field, heat transfer characteristics and nanoparticles concentration are examined. The present investigation has many industrial and engineering applications in the fields of coatings and suspensions, cooling of metallic plates, oils and grease, paper production, coal water or coal–oil slurries, heat exchangers’ technology, and materials’ processing and exploiting.Keywords: magnetic field, nonlinear radiation, non-uniform heat source/sink, similar solution, spectral local linearisation method, Rosseland diffusion approximation
Procedia PDF Downloads 373
5142 Study of Composite Materials for Aisha Containment Chamber
Authors: G. Costa, F. Noto, L. Celona, F. Chines, G. Ciavola, G. Cuttone, S. Gammino, O. Leonardi, S. Marletta, G. Torrisi
Abstract:
The ion sources for accelerators devoted to medical applications must provide intense ion beams, with high reproducibility, stability and brightness. AISHa (Advanced Ion Source for Hadron-therapy) is a compact ECRIS whose hybrid magnetic system consists of a permanent Halbach-type hexapole magnet and a set of independently energized superconducting coils. These coils will be enclosed in a compact cryostat with two cryocoolers for LHe-free operation. The AISHa ion source has been designed by taking into account the typical requirements of hospital-based facilities, where the minimization of the mean time between failures (MTBF) is a key point together with the maintenance operations which should be fast and easy. It is intended to be a multipurpose device, operating at 18 GHz, in order to achieve higher plasma densities. It should provide enough versatility for future needs of the hadron therapy, including the ability to run at larger microwave power to produce different species and highly charged ion beams. The source is potentially interesting for any hadrontherapy center using heavy ions. In the paper, we designed an innovative solution for the plasma containment chamber that allows us to solve our isolation and structural problems. We analyzed the materials chosen for our aim (glass fibers and carbon fibers) and we illustrated the all process (spinning, curing and machining) of the assembly of our chamber. The glass fibers and carbon fibers are used to reinforce polymer matrices and give rise to structural composites and composites by molding.Keywords: hadron-therapy, carbon fiber, glass fiber, vacuum-bag, ECR, ion source
Procedia PDF Downloads 210
5141 Validation of the Formula for Air Attenuation Coefficient for Acoustic Scale Models
Authors: Katarzyna Baruch, Agata Szelag, Aleksandra Majchrzak, Tadeusz Kamisinski
Abstract:
The methodology for measuring the sound absorption coefficient in scaled models is based on the ISO 354 standard. The measurement is realised indirectly: the coefficient is calculated from the reverberation time of an empty chamber as well as a chamber with an inserted sample. It is crucial to keep the atmospheric conditions stable during both measurements. Possible differences may be amended based on the formulas for the atmospheric attenuation coefficient α given in ISO 9613-1. Model studies require scaling particular factors in compliance with specified characteristic numbers. For absorption coefficient measurement, these are, for example, the frequency range or the value of the attenuation coefficient m. Thanks to the possibilities of modern electroacoustic transducers, it is no longer a problem to scale the frequencies, which have to be proportionally higher. However, it may be problematic to reduce the values of the attenuation coefficient, which is practically achieved by drying the air down to a defined relative humidity. Despite the change of frequency range and relative air humidity, the ISO 9613-1 standard still allows the calculation of a correction for small differences in the atmospheric conditions in the chamber between measurements. The paper discusses a number of theoretical analyses and experimental measurements performed in order to obtain consistency between the values of the attenuation coefficient calculated from the formulas given in the standard and those obtained by measurement. The authors performed measurements of reverberation time in a chamber built at 1/8 scale, in the corresponding frequency range, i.e. 800 Hz - 40 kHz, and at different values of relative air humidity (40% ± 5%). Based on the measurements, empirical values of the attenuation coefficient were calculated and compared with theoretical ones. In general, the values correspond with each other, but for high frequencies and low values of relative air humidity the differences are significant. Those discrepancies may directly influence the values of the measured sound absorption coefficient and cause errors. Therefore, the authors made an effort to determine a correction minimizing the described inaccuracy.
Keywords: air absorption correction, attenuation coefficient, dimensional analysis, model study, scaled modelling
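For readers who want to reproduce the ISO 9613-1 correction, the widely published pure-tone absorption formulas can be coded as below; this is written from memory as an illustration and should be verified against the standard before use.
```python
# Pure-tone atmospheric absorption coefficient per the published ISO 9613-1 formulas.
import math

def alpha_iso9613(f, T=293.15, hr=50.0, pa=101.325):
    """Attenuation in dB/m for frequency f [Hz], temperature T [K],
    relative humidity hr [%] and ambient pressure pa [kPa]."""
    pr, T0, T01 = 101.325, 293.15, 273.16
    psat = pr * 10 ** (-6.8346 * (T01 / T) ** 1.261 + 4.6151)
    h = hr * (psat / pr) / (pa / pr)          # molar concentration of water vapour, %
    frO = (pa / pr) * (24 + 4.04e4 * h * (0.02 + h) / (0.391 + h))
    frN = (pa / pr) * (T / T0) ** -0.5 * (
        9 + 280 * h * math.exp(-4.170 * ((T / T0) ** (-1 / 3) - 1)))
    return 8.686 * f ** 2 * (
        1.84e-11 * (pr / pa) * (T / T0) ** 0.5
        + (T / T0) ** -2.5 * (
            0.01275 * math.exp(-2239.1 / T) * frO / (frO ** 2 + f ** 2)
            + 0.1068 * math.exp(-3352.0 / T) * frN / (frN ** 2 + f ** 2)))

# A 1 kHz band in a 1:8 model is measured at 8 kHz, typically in dried air.
print(alpha_iso9613(1000, hr=50.0), "dB/m at full scale")
print(alpha_iso9613(8000, hr=40.0), "dB/m at model scale")
```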
Procedia PDF Downloads 421
5140 A Middleware Management System with Supporting Holonic Modules for Reconfigurable Management System
Authors: Roscoe McLean, Jared Padayachee, Glen Bright
Abstract:
There is currently a gap in the technology covering the rapid establishment of control after a reconfiguration in a Reconfigurable Manufacturing System. This gap involves the detection of the factory floor state and the communication link between the factory floor and the high-level software. In this paper, a thin, hardware-supported Middleware Management System (MMS) is proposed and its design and implementation are discussed. The research found that a cost-effective localization technique can be combined with intelligent software to speed up the ramp-up of a reconfigured system. The MMS makes the process more intelligent, more efficient and less time-consuming, thus supporting the industrial implementation of the RMS paradigm.Keywords: intelligent systems, middleware, reconfigurable manufacturing, management system
Procedia PDF Downloads 676
5139 Digital Forensics Compute Cluster: A High Speed Distributed Computing Capability for Digital Forensics
Authors: Daniel Gonzales, Zev Winkelman, Trung Tran, Ricardo Sanchez, Dulani Woods, John Hollywood
Abstract:
We have developed a distributed computing capability, Digital Forensics Compute Cluster (DFORC2) to speed up the ingestion and processing of digital evidence that is resident on computer hard drives. DFORC2 parallelizes evidence ingestion and file processing steps. It can be run on a standalone computer cluster or in the Amazon Web Services (AWS) cloud. When running in a virtualized computing environment, its cluster resources can be dynamically scaled up or down using Kubernetes. DFORC2 is an open source project that uses Autopsy, Apache Spark and Kafka, and other open source software packages. It extends the proven open source digital forensics capabilities of Autopsy to compute clusters and cloud architectures, so digital forensics tasks can be accomplished efficiently by a scalable array of cluster compute nodes. In this paper, we describe DFORC2 and compare it with a standalone version of Autopsy when both are used to process evidence from hard drives of different sizes.Keywords: digital forensics, cloud computing, cyber security, spark, Kubernetes, Kafka
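A minimal PySpark sketch of the general idea, distributing one per-file forensic task (here just SHA-256 hashing) across cluster workers; the file paths are hypothetical, and the real DFORC2 pipeline additionally coordinates Autopsy ingest modules through Kafka and Kubernetes.
```python
# Parallelise a per-file task across Spark workers (illustrative, not DFORC2 itself).
import hashlib
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("parallel-file-hashing").getOrCreate()

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as fh:                       # assumes a shared filesystem
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            h.update(chunk)
    return path, h.hexdigest()

paths = ["/evidence/image1.dd", "/evidence/image2.dd"]  # hypothetical evidence files
results = spark.sparkContext.parallelize(paths, numSlices=len(paths)).map(sha256_of).collect()
for path, digest in results:
    print(path, digest)
spark.stop()
```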
Procedia PDF Downloads 394
5138 The Adoption and Use of Social Media as a Source of Information by Egyptian Government Journalists
Authors: Essam Mansour
Abstract:
This study aims to explore the adoption and use of social media as a source of information by Egyptian government journalists. It applied a survey to a total of 386 journalists representing the three official newspapers of Egypt. Findings showed that 27.2% of the journalists did not use social media; they were mainly males (69.7%), older than 40 years (77.7%) and mostly holders of a BA degree (80.4%). On the other hand, 72.8% of them used these platforms; they were also mostly males (59.1%), younger than 40 years (65.9%) and mostly holders of a BA degree (93.2%). More than two-thirds (69.9%) were long-standing users whose experience ranged from seven to ten years, and more than two-thirds (73.5%) had been using these platforms heavily (four to more than six hours a day). Such results confirm that a large number (95.7%) of users were at least advanced users. Home and work were the most significant places to access these platforms, which were found to be easy and useful to use. The types of social media most used were social news, media sharing, microblogging, blog comments and forums, social networking sites and bookmarking sites, employed to perform tasks such as finding information, communicating, keeping up to date, checking materials, sharing information and holding discussions. A large number of users tended to accept these platforms as a source of information since they are accessible, linked to updated reference sources, accurate, promote current work, convenient, secure, credible, reliable, stable, easily identified, copyrighted, build confidence and contain filtered information. However, lack of know-how in citing sources, followed by lack of credibility of news sources, lack of quality of information sources and lack of time, were at least somewhat significant concerns for journalists when using social media platforms.
Keywords: social media, social networking sites, newspapers, journalists, Egypt
Procedia PDF Downloads 258
5137 On-Road Text Detection Platform for Driver Assistance Systems
Authors: Guezouli Larbi, Belkacem Soundes
Abstract:
Automating the text detection process can assist drivers in their driving task. It can be very useful in giving drivers more information about their environment by facilitating the reading of road signs such as directional signs, events, stores, etc. In this paper, a system consisting of two stages is proposed. In the first stage, we use pseudo-Zernike moments to pinpoint areas of the image that may contain text. The architecture of this part is based on three main steps: region of interest (ROI) detection, text localization, and non-text region filtering. In the second stage, we present a convolutional neural network architecture (On-Road Text Detection Network - ORTDN) which serves as the classification phase. The results show that the proposed framework achieved ≈ 35 fps and an mAP of ≈ 90%, i.e., a low computational time with competitive accuracy.
Keywords: text detection, CNN, PZM, deep learning
Procedia PDF Downloads 84
5136 Geochemical Characterization for Identification of Hydrocarbon Generation: Implication of Unconventional Gas Resources
Authors: Yousif M. Makeen
Abstract:
This research addresses the processes of geochemical characterization and hydrocarbon generation occurring within hydrocarbon source and/or reservoir rocks. The geochemical characterization includes organic-inorganic associations that influence the storage capacity of unconventional hydrocarbon resources (e.g. shale gas) and the migration of oil/gas from petroleum source/reservoir rocks. Kerogen, i.e. the precursor of petroleum, occurs in various forms and types and may be oil-prone, gas-prone, or both. China has a number of petroleum-bearing sedimentary basins commonly associated with shale gas, oil sands, and oil shale. The Sichuan Basin, selected for this study, has recorded notable successful discoveries of shale gas, especially in the marine shale reservoirs within the area. However, notable discoveries of lacustrine shale in the north-east Fuling area indicate the accumulation of shale gas within non-marine source rock. The objective of this study is to evaluate the hydrocarbon storage capacity, generation, and retention processes in the rock matrix of hydrocarbon source/reservoir rocks within the Sichuan Basin using advanced X-ray tomography 3D imaging (Micro-CT), SEM (scanning electron microscopy) and optical microscopy, as well as organic geochemical facilities (e.g. vitrinite reflectance and UV light). The preliminary results of this study show that the lacustrine shales under investigation act as both source and reservoir rocks and are characterized by very fine grains and very low permeability and porosity. Three pore structures have also been characterized in the lacustrine shales using X-ray computed tomography (CT): organic matter pores, interparticle pores and intraparticle pores. The benefits of this study would be more successful oil and gas exploration and a higher recovery factor, with a direct economic impact on China and the surrounding region. Methodologies: the SRA TOC/TPH or Rock-Eval technique will be used to determine source rock richness (S1 and S2) and Tmax. TOC analysis will be carried out using a multi N/C 3100 analyzer. The SRA and TOC results are used to calculate other parameters such as the hydrogen index (HI) and production index (PI), which indicate the quantity of the organic matter. Minimum TOC limits generally accepted as essential for a source rock are 0.5% for shales and 0.2% for carbonates. Contributions: this research could resolve issues related to oil potential, provide targets, and serve as a pathfinder for future exploration activity in the Sichuan Basin.
Keywords: shale gas, unconventional resources, organic chemistry, Sichuan basin
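The Rock-Eval indices mentioned above follow standard definitions, HI = 100·S2/TOC and PI = S1/(S1+S2); the sketch below uses invented values purely to show the arithmetic.
```python
# Standard Rock-Eval index calculations with made-up illustrative values
# (not measurements from this study).
s1, s2, toc = 0.8, 4.5, 1.6        # S1, S2 in mg HC/g rock; TOC in wt% (hypothetical)
hi = 100 * s2 / toc                # hydrogen index, mg HC / g TOC
pi = s1 / (s1 + s2)                # production index
print(f"HI = {hi:.0f} mg HC/g TOC, PI = {pi:.2f}")
```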
Procedia PDF Downloads 40
5135 Korean Men’s Interest in Gonzo Pornography and Use of Condoms
Authors: Chyng Sun
Abstract:
This brief report examines correlations between Korean men’s interest in gonzo pornography, perceptions of pornography’s functional value, and use of condoms. The report found that neither a higher interest in gonzo pornography nor the perception that pornography is a source of sexual information was directly related to condom utilization. However, interest in gonzo pornography interacted with pornography perceptions to predict condomless sex. The findings suggest that Korean men who 1) had a higher interest in viewing gonzo pornography, and 2) had a tendency to view pornography as a source of sexual information, are more likely to have sex without condoms. That is, when viewers consider pornography to be a form of sexual education, they are more likely to use the learned pornographic script to inform their sexual behavior.
Keywords: Korean, male, pornography, sexuality
Procedia PDF Downloads 155
5134 Constructing a Bayesian Network for Solar Energy in Egypt Using Life Cycle Analysis and Machine Learning Algorithms
Authors: Rawaa H. El-Bidweihy, Hisham M. Abdelsalam, Ihab A. El-Khodary
Abstract:
In an era where machines run and shape our world, the need for a stable, unending source of energy emerges. This study focuses on solar energy in Egypt as a renewable source. The most important factors that could affect solar energy’s market share throughout its life cycle were analyzed and filtered, and the relationships between them were derived before structuring a Bayesian network. Forecasting models were also built for multiple factors to predict their states in Egypt by 2035, based on historical data and patterns, to be used as the nodes’ states in the network. Thirty-seven factors were found that might have an impact on the use of solar energy; these were reduced to the 12 factors judged most influential on solar energy’s life cycle in Egypt, based on expert surveys and data analysis, with some factors recurring in multiple stages. The presented Bayesian network could later be used for scenario and decision analysis of using solar energy in Egypt as a stable renewable source for generating any type of energy needed.
Keywords: ARIMA, auto correlation, Bayesian network, forecasting models, life cycle, partial correlation, renewable energy, SARIMA, solar energy
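A sketch of the forecasting step only: fitting a SARIMA model to a historical series and discretising the 2035 forecast into a node state. The series, model order and state thresholds are invented for illustration and are not the study's data.
```python
# Forecast a factor to 2035 with SARIMAX and map it to a Bayesian-network node state.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

years = pd.date_range("2000", periods=23, freq="YS")
installed_mw = pd.Series(range(10, 10 + 23 * 15, 15), index=years, dtype=float)  # toy series

model = SARIMAX(installed_mw, order=(1, 1, 1), trend="c").fit(disp=False)
forecast = model.forecast(steps=12)                   # forecast through the mid-2030s
value_2035 = forecast.iloc[-1]

state = "high" if value_2035 > 400 else "medium" if value_2035 > 250 else "low"
print(f"forecast for 2035: {value_2035:.0f} MW -> node state '{state}'")
```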
Procedia PDF Downloads 157
5133 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks
Authors: Mst Shapna Akter, Hossain Shahriar
Abstract:
One of the most important challenges in the field of software code audit is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and can lead to system compromise, data leakage, or denial of service. C and C++ open-source code are now available in order to create a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove redundant components and shorten dependencies. Moreover, we keep the semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning networks such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we propose a neural network model which can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure performance. We made a comparative analysis between results derived from features containing a minimal text representation and those containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when we use semantic and syntactic information as the features, but require higher execution time because the word embedding algorithm adds complexity to the overall system.
Keywords: cyber security, vulnerability detection, neural networks, feature extraction
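A minimal Keras-style sketch of the sequence-classification stage described above, assuming functions have already been tokenised into integer sequences; the architecture, placeholder data and hyper-parameters are illustrative and not the paper's models.
```python
# Bidirectional LSTM classifier over tokenised source-code sequences (toy data).
import numpy as np
import tensorflow as tf

vocab_size, max_len = 5000, 200
X = np.random.randint(1, vocab_size, size=(64, max_len))   # placeholder token ids
y = np.random.randint(0, 2, size=(64,))                    # 1 = vulnerable, 0 = benign

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(32)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", tf.keras.metrics.Precision(), tf.keras.metrics.Recall()])
model.fit(X, y, epochs=2, batch_size=16, verbose=0)
print(model.predict(X[:3], verbose=0).ravel())
```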
Procedia PDF Downloads 91