Search results for: source domain
5817 Numerical Analysis of Internal Cooled Turbine Blade Using Conjugate Heat Transfer
Authors: Bhavesh N. Bhatt, Zozimus D. Labana
Abstract:
This work focuses on the heat transfer analysis of a turbine blade cooled by an internal cooling method. Conjugate heat transfer (CHT) technology allows the cooling and heat transfer of the blade to be computed effectively. The blade temperature is limited by the melting temperature of the material. Using a CFD code, the blade cooling is analyzed with the CHT method. Two CHT approaches are considered: in the first, a coupled CHT method models all three domains at once; in the second, the external domain is modeled first, followed by the internal cooling-channel domain. Ten circular cooling channels with different mass flow rates and temperature values are used as the cooling method. The numerical simulation is applied to the NASA C3X turbine blade, and the computed results show good agreement with experimental results. Temperature and pressure are highest at the stagnation point on the leading edge of the blade, which faces the flow first. On the pressure side, a shock wave is formed, which also causes a sudden change in the heat transfer coefficient (HTC) and other parameters. After applying internal cooling, the metal temperature of the blade is reduced to some extent.
Keywords: gas turbine, conjugate heat transfer, NASA C3X blade, circular film cooling channel
Procedia PDF Downloads 335
5816 Coupling of Two Discretization Schemes for the Lattice Boltzmann Equation
Authors: Tobias Horstmann, Thomas Le Garrec, Daniel-Ciprian Mincu, Emmanuel Lévêque
Abstract:
Despite the efficiency and low dissipation of the stream-collide formulation of the Lattice Boltzmann (LB) algorithm, which is nowadays implemented in many commercial LBM solvers, there are certain situations, e.g. mesh transition, in which a classical finite-volume or finite-difference formulation of the LB algorithm still bears advantages. In this paper, we present an algorithm that combines the node-based streaming of the distribution functions with a second-order finite-volume discretization of the advection term of the BGK-LB equation on a uniform D2Q9 lattice. It is shown that such a coupling is possible for a multi-domain approach as long as the overlap, or buffer zone, between two domains spans at least 2Δx. This also implies that a direct coupling (without buffer zone) of a stream-collide and finite-volume LB algorithm on a single grid is not stable. The critical parameter in the coupling is the CFL number equal to 1 that is imposed by the stream-collide algorithm. Nevertheless, an explicit filtering step on the finite-volume domain can stabilize the solution. In a further investigation, we demonstrate how such a coupling can be used for mesh transition, resulting in an intrinsic conservation of mass over the interface.
Keywords: algorithm coupling, finite volume formulation, grid refinement, Lattice Boltzmann method
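The stream-collide BGK update this abstract builds on can be sketched in a few lines. The following is a minimal pure-Python D2Q9 illustration (uniform lattice, periodic boundaries, assumed relaxation time τ = 0.6), not the paper's coupled finite-volume scheme; it demonstrates the intrinsic mass conservation of the node-based streaming step that the abstract mentions.

```python
# Minimal D2Q9 stream-collide BGK sketch (illustration only; not the
# paper's coupled finite-volume scheme). Periodic boundaries, tau = 0.6.
NX, NY, Q = 8, 8, 9
# D2Q9 lattice velocities and weights
c = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]
w = [4/9] + [1/9]*4 + [1/36]*4
TAU = 0.6

def equilibrium(rho, ux, uy):
    """Second-order expansion of the Maxwell-Boltzmann equilibrium."""
    usq = ux*ux + uy*uy
    feq = []
    for i in range(Q):
        cu = c[i][0]*ux + c[i][1]*uy
        feq.append(w[i] * rho * (1 + 3*cu + 4.5*cu*cu - 1.5*usq))
    return feq

def step(f):
    """One collide-and-stream update; returns the new distribution field."""
    # Collision: BGK relaxation toward the local equilibrium
    post = [[None]*NY for _ in range(NX)]
    for x in range(NX):
        for y in range(NY):
            rho = sum(f[x][y])
            ux = sum(f[x][y][i]*c[i][0] for i in range(Q)) / rho
            uy = sum(f[x][y][i]*c[i][1] for i in range(Q)) / rho
            feq = equilibrium(rho, ux, uy)
            post[x][y] = [f[x][y][i] - (f[x][y][i] - feq[i]) / TAU
                          for i in range(Q)]
    # Streaming: node-based shift along each lattice direction (CFL = 1)
    g = [[[0.0]*Q for _ in range(NY)] for _ in range(NX)]
    for x in range(NX):
        for y in range(NY):
            for i in range(Q):
                g[(x + c[i][0]) % NX][(y + c[i][1]) % NY][i] = post[x][y][i]
    return g

# Initialize: uniform density with a small perturbation at one node
f = [[equilibrium(1.0 + (0.1 if (x, y) == (4, 4) else 0.0), 0.0, 0.0)
      for y in range(NY)] for x in range(NX)]
mass0 = sum(sum(sum(f[x][y]) for y in range(NY)) for x in range(NX))
for _ in range(20):
    f = step(f)
mass = sum(sum(sum(f[x][y]) for y in range(NY)) for x in range(NX))
# Stream-collide conserves mass exactly (up to floating-point round-off)
```

Both sub-steps conserve mass by construction (collision preserves the local density; streaming merely permutes populations), which is the property the paper exploits at the mesh-transition interface.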
Procedia PDF Downloads 378
5815 Finding the Optimal Meeting Point Based on Travel Plans in Road Networks
Authors: Mohammad H. Ahmadi, Vahid Haghighatdoost
Abstract:
Given a set of source locations for a group of friends, a set of trip plans for each group member as a sequence of Categories-of-Interest (COIs) (e.g., restaurant), and a specific COI as a common destination where all group members will gather, a Meeting Point Based on Trip Plans (MPTP) query aims to find a Point-of-Interest (POI) from each of the different COIs such that the aggregate travel distance for the group is minimized. In this work, we considered two aggregate functions, Sum and Max. To answer this query, we propose an efficient pruning technique for shrinking the search space. Our approach consists of three steps. In the first step, it prunes the search space around the source locations. In the second step, it prunes the search space around the centroid of the source locations. Finally, we compute the intersection of all pruned areas as the final refined search space. We prove that POIs beyond the refined area cannot be part of the optimal answer set. The paper also covers an extensive performance study of the proposed technique.
Keywords: meeting point, trip plans, road networks, spatial databases
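The objective that the pruning technique optimizes can be illustrated with a brute-force search. This sketch uses Euclidean distances in place of road-network distances and exhaustive enumeration in place of the paper's pruning, and all coordinates and COI names are invented for illustration.

```python
# Brute-force sketch of the MPTP query: pick, for each COI on the shared
# trip plan, one POI so that the aggregate (Sum or Max) travel distance
# over all group members is minimized. Euclidean distance stands in for
# road-network distance; no pruning is applied.
import math
from itertools import product

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def trip_length(src, plan):
    """Travel distance of one member: source -> POI_1 -> ... -> POI_k."""
    total, pos = 0.0, src
    for poi in plan:
        total += dist(pos, poi)
        pos = poi
    return total

def best_meeting_plan(sources, coi_pois, aggregate=sum):
    """Try every combination of one POI per COI; keep the cheapest plan."""
    best, best_cost = None, float("inf")
    for plan in product(*coi_pois):
        cost = aggregate(trip_length(s, plan) for s in sources)
        if cost < best_cost:
            best, best_cost = plan, cost
    return best, best_cost

sources = [(0, 0), (4, 0)]                  # group members' locations
restaurants = [(2, 1), (0, 5)]              # first COI on the plan
cafes = [(2, 2), (10, 10)]                  # common destination COI
plan_sum, cost_sum = best_meeting_plan(sources, [restaurants, cafes], sum)
plan_max, cost_max = best_meeting_plan(sources, [restaurants, cafes], max)
```

The pruning technique in the paper avoids enumerating the full Cartesian product by discarding POIs that provably cannot appear in the optimal answer set.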
Procedia PDF Downloads 185
5814 Multisignature Schemes for Reinforcing Trust in Cloud Software-As-A-Service Services
Authors: Mustapha Hedabou, Ali Azougaghe, Ahmed Bentajer, Hicham Boukhris, Mourad Eddiwani, Zakaria Igarramen
Abstract:
Software-as-a-service (SaaS) is emerging as a dominant approach to delivering software. It encompasses a range of business and technical opportunities, issues, and challenges. Trust in cloud services regarding the security and the privacy of the delivered data is the most critical issue with the SaaS model. In this paper, we survey the security concerns related to the SaaS model, and we propose the design of a trusted SaaS model that gives users more confidence in SaaS services by leveraging trust in a neutral source-code certifying authority. The proposed design is based on the use of a multisignature mechanism for signing the source code of the application service. In our model, the cloud provider acts as a root of trust by ensuring the integrity of the application service while it is running on its platform. The proposed design prevents insider attacks from tampering with the application service before and after it is launched on the cloud provider's platform.
Keywords: cloud computing, SaaS platform, TPM, trustiness, source code certification, multi-signature schemes
Procedia PDF Downloads 275
5813 Sizing of Hybrid Source Battery/Supercapacitor for Automotive Applications
Authors: Laid Degaa, Bachir Bendjedia, Nassim Rizoug, Abdelkader Saidane
Abstract:
Energy storage systems are a key aspect of the development of clean cars. The work proposed here deals with the modeling of hybrid storage sources composed of a combination of lithium-ion batteries and supercapacitors. Simulation results show the performance of the active model for a hybrid source and confirm the feasibility of our approach. In this context, the sizing of the electrical energy supply is carried out. The aim of this sizing is to propose an 'optimal' solution that improves the performance of electric vehicles in terms of weight, cost, and aging.
Keywords: battery, electric vehicles, energy, hybrid storage, supercapacitor
Procedia PDF Downloads 792
5812 Study of Biological Denitrification using Heterotrophic Bacteria and Natural Source of Carbon
Authors: Benbelkacem Ouerdia
Abstract:
Heterotrophic denitrification has been proven to be one of the most feasible processes for removing nitrate from wastewater and drinking water. In this process, heterotrophic bacteria use organic carbon both for growth and as an electron source. Underground water pollution by nitrates has become alarming in Algeria. A survey carried out revealed that the nitrate concentration is continually increasing. Studies in some regions revealed contamination exceeding the recommended permissible dose of 50 mg/L. Worrying values in the regions of Mascara, Ouled Saber, El Eulma, Bouira, and Algiers are respectively 72 mg/L, 75 mg/L, 97 mg/L, 102 mg/L, and 158 mg/L. A high concentration of nitrate in drinking water is associated with serious health risks. Research on nitrate removal technologies for municipal water supplies is increasing because of nitrate contamination. Biological denitrification enables the transformation of oxidized nitrogen compounds by a wide spectrum of heterotrophic bacteria into harmless nitrogen gas, with accompanying carbon removal. Globally, denitrification is commonly employed in biological nitrogen removal processes to enhance water quality. The study investigated the valorization of a vegetable residue (dates nodes) as a carbon source in water treatment using the denitrification process. Throughout the study, the effects of inoculum addition, pH, and initial nitrate concentration were also investigated. In this research, a natural organic substance, dates nodes, was investigated as a carbon source in the biological denitrification of drinking water. This material acts as both a solid substrate and a biofilm carrier. The experiments were carried out in batch processes. The extent of denitrification achieved varied between 80 and 100% according to the type of process used. Based on our results, the removal of organic matter and nitrogen compounds depended mainly on the initial concentration of nitrate. The effluent pH was mainly affected by the C/N ratio, a decrease in which increases the pH.
Keywords: biofilm, carbon source, dates nodes, heterotrophic denitrification, nitrate, nitrite
Procedia PDF Downloads 484
5811 Valorization of Dates Nodes as a Carbon Source Using Biological Denitrification
Authors: Ouerdia Benbelkacem Belouanas
Abstract:
Heterotrophic denitrification has been proven to be one of the most feasible processes for removing nitrate from wastewater and drinking water. In this process, heterotrophic bacteria use organic carbon both for growth and as an electron source. Underground water pollution by nitrates has become alarming in Algeria. A survey carried out revealed that the nitrate concentration is continually increasing. Studies in some regions revealed contamination exceeding the recommended permissible dose of 50 mg/L. Worrying values in the regions of Mascara, Ouled Saber, El Eulma, Bouira, and Algiers are respectively 72 mg/L, 75 mg/L, 97 mg/L, 102 mg/L, and 158 mg/L. A high concentration of nitrate in drinking water is associated with serious health risks. Research on nitrate removal technologies for municipal water supplies is increasing because of nitrate contamination. Biological denitrification enables the transformation of oxidized nitrogen compounds by a wide spectrum of heterotrophic bacteria into harmless nitrogen gas, with accompanying carbon removal. Globally, denitrification is commonly employed in biological nitrogen removal processes to enhance water quality. The study investigated the valorization of a vegetable residue (dates nodes) as a carbon source in water treatment using the denitrification process. Throughout the study, the effects of inoculum addition, pH, and initial nitrate concentration were also investigated. In this research, a natural organic substance, dates nodes, was investigated as a carbon source in the biological denitrification of drinking water. This material acts as both a solid substrate and a biofilm carrier. The experiments were carried out in batch processes. The extent of denitrification achieved varied between 80 and 100% according to the type of process used. Based on our results, the removal of organic matter and nitrogen compounds depended mainly on the initial concentration of nitrate. The effluent pH was mainly affected by the C/N ratio, a decrease in which increases the pH.
Keywords: biofilm, carbon source, dates nodes, heterotrophic denitrification, nitrate, nitrite
Procedia PDF Downloads 419
5810 Thermal Analysis of a Composite of Coco Fiber and Látex
Authors: Elmo Thiago Lins Cöuras Ford, Valentina Alessandra Carvalho do Vale
Abstract:
Given the unquestionable need for environmental preservation, natural fibers have been seen as a salutary alternative to synthetic, vitreous, and metallic fibers for the production of composites. In this work, the behavior of a composite made with coconut husk fiber as reinforcement and latex as matrix was analyzed when subjected to a heat source. Temperature profiles were measured at the internal and external surfaces of the composite, as well as the temperature gradient across it. The behavior of this composite when subjected to a cold source was also analyzed. Conclusions were drawn from the responses of the system.
Keywords: natural fiber, composite, temperature, latex, gradient
Procedia PDF Downloads 817
5809 The Importance of Functioning and Disability Status Follow-Up in People with Multiple Sclerosis
Authors: Sanela Slavkovic, Congor Nad, Spela Golubovic
Abstract:
Background: The diagnosis of multiple sclerosis (MS) is a major life challenge and has repercussions on all aspects of the daily functioning of those affected by it: personal activities, social participation, and quality of life. Regular follow-up of the neurological status alone is not informative enough to provide data on the sort of support and rehabilitation that is required. Objective: The aim of this study was to establish the current level of functioning of persons affected by MS and the factors that influence it. Methods: The study was conducted in Serbia, on a sample of 108 persons with the relapsing-remitting form of MS, aged 20 to 53 (mean 39.86 years; SD 8.20 years). All participants were fully ambulatory. Methods applied in the study include the Expanded Disability Status Scale (EDSS) and the World Health Organization Disability Assessment Schedule, WHODAS 2.0 (36-item version, self-administered). Results: Participants were found to experience the most problems in the domains of Participation, Mobility, Life activities, and Cognition. The least difficulty was found in the domain of Self-care. Symptom duration was the only control variable with a significant partial contribution to the prediction of the WHODAS scale score (β=0.30, p < 0.05). The total EDSS score correlated with the total WHODAS 2.0 score (r=0.34, p=0.00). Statistically significant differences in the domain of EDSS 0-5.5 were found within categories (0-1.5; 2-3.5; 4-5.5). The more pronounced a participant's EDSS score was, although not indicative of large changes in the neurological status, the more apparent the changes in the functional domain, i.e. in all areas covered by WHODAS 2.0. The Pyramidal (β=0.34, p < 0.05) and Bowel and bladder (β=0.24, p < 0.05) functional systems were found to have a significant partial contribution to the prediction of the WHODAS score.
Conclusion: Measuring functioning and disability is important in the follow-up of persons suffering from MS in order to plan rehabilitation and define areas in which additional support is needed.
Keywords: disability, functionality, multiple sclerosis, rehabilitation
Procedia PDF Downloads 119
5808 Structure of Consciousness According to Deep Systemic Constellations
Authors: Dmitry Ustinov, Olga Lobareva
Abstract:
The method of Deep Systemic Constellations is based on a phenomenological approach. Using the phenomenon of substitutive perception, it was established that human consciousness has a hierarchical structure, where deeper levels govern more superficial ones (the reactive level, the energy or ancestral level, the spiritual level, the magical level, and deeper levels of consciousness). Every human possesses a depth of consciousness down to the spiritual level; however, deeper levels of consciousness are not found in every person. It was found that the spiritual level of consciousness is not homogeneous and has its own internal hierarchy of sublevels (the level of formation of spiritual values, the level of the 'inner observer', the level of the 'path', the level of 'God', etc.). The depth of a person's spiritual level defines the paradigm of all his internal processes and the main motives of his movement through life. Disturbances can occur at any level of consciousness. Disturbances at a deeper level cause disturbances at more superficial levels and are manifested in the daily life of a person in feelings, behavioral patterns, psychosomatics, etc. Without removing the deepest source of a disturbance, it is impossible to completely correct its manifestation in the present moment. Thus, a destructive pattern of feeling and behavior in the present moment can exist because of a disturbance at, for example, the spiritual level of a person (although in most cases the source is at the energy level). Psychological work with superficial levels without removing the source of a disturbance cannot fully solve the problem. The method of Deep Systemic Constellations allows one to work effectively with the source of a problem located at any depth. The methodology has confirmed its effectiveness in work with more than a thousand people.
Keywords: constellations, spiritual psychology, structure of consciousness, transpersonal psychology
Procedia PDF Downloads 249
5807 The Combined Effect of Methane and Methanol on Growth and PHB Production in the Alphaproteobacterial Methanotroph Methylocystis Sp. Rockwell
Authors: Lazic Marina, Sugden Scott, Sharma Kanta Hem, Sauvageau Dominic, Stein Lisa
Abstract:
Methane is a highly potent greenhouse gas mostly released through anthropogenic activities. It represents a low-cost and sustainable feedstock for the biological production of value-added compounds by bacteria known as methanotrophs. In addition to methane, these organisms can utilize methanol, another cheap carbon source that is a common industrial by-product. Alphaproteobacterial methanotrophs can utilize both methane and methanol to produce the biopolymer polyhydroxybutyrate (PHB). The goal of this study was to examine the effect of methanol on polyhydroxybutyrate production in Methylocystis sp. Rockwell and to identify the optimal methane:methanol ratio that improves PHB production without reducing biomass production. Three methane:methanol ratios (4, 2.5, and 0.5) and three nitrogen source (ammonium or nitrate) concentrations (10 mM, 1 mM, and 0.1 mM) were combined to generate 18 growth conditions (9 per carbon source). Polyhydroxybutyrate and biomass production were analyzed at the end of growth. Overall, the methane:methanol ratios that promoted polyhydroxybutyrate synthesis without reducing biomass were 4 and 2.5, and the optimal nitrogen concentration was 1 mM for both ammonium and nitrate. The physiological mechanism behind the beneficial effect of combining methane and methanol as carbon sources remains to be discovered. One possibility is that methanol has a dual role, as a carbon source at lower concentrations and as a stringent response trigger at higher concentrations. Nevertheless, the beneficial effect of methanol and the optimal nitrogen concentration for PHB production were confirmed, providing a basis for future physiological analysis and conditions for process scale-up.
Keywords: methane, methanol, methanotrophs, polyhydroxybutyrate, Methylocystis sp. Rockwell, single carbon bioconversions
Procedia PDF Downloads 171
5806 Development of a French to Yorùbá Machine Translation System
Authors: Benjamen Nathaniel, Eludiora Safiriyu Ijiyemi, Egume Oneme Lucky
Abstract:
A review of machine translation systems shows that considerable computational work has been carried out to translate written or spoken texts from a source language to Yorùbá through machine translation systems. However, there is no work on a French-to-Yorùbá machine translation system; hence, the study investigated the process involved in translating French into its Yorùbá equivalent, with a view to adopting a rule-based MT approach to build a machine translation framework from simple sentences administered through a questionnaire. Articles and relevant textbooks were reviewed, and key speakers of both languages were interviewed to find out the processes involved in translating French simple sentences into their Yorùbá equivalents using home domain terminologies. To achieve this, a model was formulated using phrase grammar structure, rewrite rules, parse trees, and automata theory-based techniques, and was designed and implemented with the Unified Modeling Language (UML) and the Python programming language, respectively. Analysis of the results showed that the machine translation system performed 18.45% above the experimental subject respondents and 2.7% below the linguistics expert when evaluated on word orthography, sentence syntax, and semantic correctness of the sentences. When compared with the Google machine translation system, the developed system performed better on lexicons of the target language.
Keywords: machine translation (MT), rule-based, French language, Yorùbá language
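The rule-based pipeline the abstract describes (phrase-structure tagging, structural transfer via rewrite rules, then lexical substitution) can be sketched as follows. The lexicon entries and the target-side word order below are placeholder assumptions invented for illustration; they are not verified Yorùbá and only show the mechanism.

```python
# Toy rule-based French -> target-language pipeline in the spirit of the
# abstract: tag each word with the lexicon, apply a structural transfer
# (rewrite) rule, then substitute target-side tokens. The target tokens
# are hypothetical placeholders, NOT verified Yoruba.

# Source-side lexicon: word -> (part of speech, placeholder target token)
LEXICON = {
    "le":   ("DET",  "TGT_DET"),
    "chat": ("NOUN", "TGT_CAT"),
    "noir": ("ADJ",  "TGT_NOIR"),
    "dort": ("VERB", "TGT_DORMIR"),
}

def transfer(tagged):
    """Rewrite rule: French orders NOUN ADJ; suppose (as an assumption
    for this sketch) the target language orders ADJ NOUN, so swap."""
    out = list(tagged)
    for i in range(len(out) - 1):
        if out[i][0] == "NOUN" and out[i + 1][0] == "ADJ":
            out[i], out[i + 1] = out[i + 1], out[i]
    return out

def translate(sentence):
    tagged = [LEXICON[w.lower()] for w in sentence.split()]
    return " ".join(tgt for _, tgt in transfer(tagged))

result = translate("le chat noir dort")
# The NOUN+ADJ pair is reordered before word-for-word substitution
```

A real system, as in the study, would back this with a parse tree and a full phrase grammar rather than a flat tag sequence.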
Procedia PDF Downloads 77
5805 Analysis and Evaluation of Both AC and DC Standalone Photovoltaic Supply to Ethio-Telecom Access Layer Devices: The Case of Multi-Service Access Gateway in Adama
Authors: Frie Ayalew, Seada Hussen
Abstract:
Ethio-telecom operates a variety of telecom devices that need a consistent power source to remain operational. The company gets this power mainly from the national grid, using this source alone or with a generator and/or batteries as backup. In addition, for off-grid or remote areas, the company commonly uses generators and batteries. But unstable diesel prices, the huge expense of fuel and transportation, and high carbon emissions are the main problems associated with fuel-based energy. So, a solar power design with battery backup is a highly recommended and advantageous source for the coming years. This project designs an AC and DC standalone photovoltaic supply for Ethio-telecom access layer devices for the case of a multi-service access gateway in Adama. The design is carried out using HOMER software for both AC and DC loads. The project shows that the design of a solar-based microgrid is the best option for the designed area.
Keywords: solar power, battery, inverter, Ethio-telecom, solar radiation
Procedia PDF Downloads 82
5804 Creating a Quasi-Folklore as a Tool for Knowledge Sharing in a Family-Based Business
Authors: Chico A. E. Hindarto
Abstract:
Knowledge management practices are more contextual when they combine with the corporate culture. Each entity has a specific cultural climate that enables knowledge sharing at both the functional and individual levels. The interactions between people within an organization can be influenced by the culture and by how knowledge is transmitted. On the other hand, these interactions have an impact on culture modification as well. Storytelling is one of the methods of delivering knowledge throughout the organization. This paper aims to explore the possibility of using a quasi-folklore in a family-based business. Folklore is defined as informal traditional culture that spreads through word of mouth, without the source of the story being known. In this paper, the term quasi-folklore is used to differentiate it from the original term folklore: the story is created by somebody in the organization, unlike folklore, whose source is unknown. However, the source is not disclosed, in order to avoid preconceptions tied to the story's origin. The setting of a family-based business is deliberately chosen, since kinship is considerably strong in this type of entity. Through a thorough literature review relating to knowledge management, storytelling, and folklore, this paper determines how folklore can be an option for knowledge sharing within the organization.
Keywords: folklore, family business, organizational culture, knowledge management, storytelling
Procedia PDF Downloads 286
5803 An Automatic Model Transformation Methodology Based on Semantic and Syntactic Comparisons and the Granularity Issue Involved
Authors: Tiexin Wang, Sebastien Truptil, Frederick Benaben
Abstract:
Model transformation, as a pivotal aspect of model-driven engineering, attracts more and more attention from both researchers and practitioners. Many domains (enterprise engineering, software engineering, knowledge engineering, etc.) use model transformation principles and practices to address their domain-specific problems; furthermore, model transformation can also be used to bridge the gap between different domains by sharing and exchanging knowledge. Since model transformation has been widely used, new requirements have emerged: to define the transformation process effectively and efficiently, and to reduce the manual effort involved. This paper presents an automatic model transformation methodology based on semantic and syntactic comparisons, focusing particularly on the granularity issue that exists in the transformation process. Compared to traditional model transformation methodologies, this methodology serves a general, cross-domain purpose. Semantic and syntactic checking measurements are combined into a refined transformation process, which solves the granularity issue. Moreover, the semantic and syntactic comparisons are supported by a software tool, replacing manual effort.
Keywords: automatic model transformation, granularity issue, model-driven engineering, semantic and syntactic comparisons
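A minimal sketch of what combining semantic and syntactic comparisons to match model elements might look like is given below. The synonym table, the weights, and the threshold are all assumptions for illustration, with difflib string similarity standing in for a full syntactic measure and a thesaurus-style resource (e.g. WordNet) for the semantic one.

```python
# Sketch of combined syntactic/semantic matching between the elements of
# a source model and a target model, the kind of check the methodology
# automates. Weights, threshold, and the synonym table are assumptions.
from difflib import SequenceMatcher

# Hypothetical semantic resource: sets of mutually synonymous terms
SYNONYMS = {
    frozenset({"employee", "worker"}),
    frozenset({"firm", "company", "enterprise"}),
}

def syntactic(a, b):
    """String similarity as a cheap syntactic comparison."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def semantic(a, b):
    """1.0 if the terms are identical or listed as synonyms, else 0.0."""
    a, b = a.lower(), b.lower()
    if a == b:
        return 1.0
    return 1.0 if any({a, b} <= s for s in SYNONYMS) else 0.0

def match_score(a, b, w_syn=0.5, w_sem=0.5):
    """Weighted combination of the two checks (weights are assumed)."""
    return w_syn * syntactic(a, b) + w_sem * semantic(a, b)

def best_matches(source_elems, target_elems, threshold=0.5):
    """Pair each source element with its best-scoring target element."""
    pairs = {}
    for s in source_elems:
        best = max(target_elems, key=lambda t: match_score(s, t))
        if match_score(s, best) >= threshold:
            pairs[s] = best
    return pairs

pairs = best_matches(["Employee", "Firm"], ["Worker", "Company", "Invoice"])
```

The granularity issue the paper addresses arises when a single source element must map to several target elements (or vice versa); a refined process would apply these scores at multiple levels of the model structure rather than element-by-element.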
Procedia PDF Downloads 394
5802 Profiling Risky Code Using Machine Learning
Authors: Zunaira Zaman, David Bohannon
Abstract:
This study explores the application of machine learning (ML) for detecting security vulnerabilities in source code. The research aims to assist organizations with large application portfolios and limited security testing capabilities in prioritizing security activities. ML-based approaches offer benefits such as increased confidence scores, tuning of false positives and negatives, and automated feedback. The initial approach, using natural language processing techniques to extract features, achieved 86% accuracy during the training phase but suffered from overfitting and performed poorly on unseen datasets during testing. To address these issues, the study proposes using the abstract syntax tree (AST) for Java and C++ codebases to capture code semantics and structure and to generate path-context representations for each function. The Code2Vec model architecture is used to learn distributed representations of source code snippets for training a machine-learning classifier for vulnerability prediction. The study evaluates the performance of the proposed methodology using two datasets and compares the results with existing approaches. The Devign dataset yielded 60% accuracy in predicting vulnerable code snippets and helped resist overfitting, while the Juliet Test Suite was used to predict specific vulnerabilities such as OS command injection, cryptographic, and cross-site scripting vulnerabilities. The Code2Vec model achieved 75% accuracy and a 98% recall rate in predicting OS command injection vulnerabilities. The study concludes that even partial AST representations of source code can be useful for vulnerability prediction. The approach has the potential for automated intelligent analysis of source code, including vulnerability prediction on unseen source code. State-of-the-art models using natural language processing techniques and CNN models with ensemble modelling techniques did not generalize well on unseen data and faced overfitting issues.
However, predicting vulnerabilities in source code using machine learning poses challenges such as high dimensionality and complexity of source code, imbalanced datasets, and identifying specific types of vulnerabilities. Future work will address these challenges and expand the scope of the research.
Keywords: code embeddings, neural networks, natural language processing, OS command injection, software security, code properties
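The AST path-context representation that the study feeds to the Code2Vec-style model can be illustrated with Python's own ast module (the paper targets Java and C++ codebases, so this is only an analogy): a path context pairs two leaf tokens with the chain of AST node types connecting them through their lowest common ancestor.

```python
# Sketch of path-context extraction from an abstract syntax tree, the
# representation used for Code2Vec-style learning. Python's ast module
# stands in for the Java/C++ front ends used in the study.
import ast
from itertools import combinations

def leaves_with_paths(tree):
    """Collect (leaf_token, root-to-leaf list of node-type names)."""
    out = []
    def walk(node, path):
        path = path + [type(node).__name__]
        if isinstance(node, ast.Name):
            out.append((node.id, path))
        elif isinstance(node, ast.Constant):
            out.append((repr(node.value), path))
        for child in ast.iter_child_nodes(node):
            walk(child, path)
    walk(tree, [])
    return out

def path_contexts(source):
    """All (token_a, up-over-down path, token_b) pairs for the snippet."""
    leaves = leaves_with_paths(ast.parse(source))
    contexts = []
    for (tok_a, pa), (tok_b, pb) in combinations(leaves, 2):
        # Find the lowest common prefix (ancestor chain shared by both)
        i = 0
        while i < min(len(pa), len(pb)) and pa[i] == pb[i]:
            i += 1
        # Path: up from leaf a, through the common ancestor, down to leaf b
        path = list(reversed(pa[i:])) + [pa[i - 1]] + pb[i:]
        contexts.append((tok_a, "^".join(path), tok_b))
    return contexts

ctxs = path_contexts("def f(x):\n    return x + 1")
```

Each such triple becomes one input token stream for the model; Code2Vec then learns an attention-weighted embedding over a bag of these contexts per function.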
Procedia PDF Downloads 106
5801 Time Domain Dielectric Relaxation Microwave Spectroscopy
Authors: A. C. Kumbharkhane
Abstract:
Time domain dielectric relaxation microwave spectroscopy (TDRMS) is a technique for observing the time-dependent response of a sample after the application of a time-dependent electromagnetic field. TDRMS probes the interaction of a macroscopic sample with a time-dependent electrical field. The resulting complex permittivity spectrum characterizes the amplitude (voltage) and time scale of the charge-density fluctuations within the sample. These fluctuations may arise from the reorientation of the permanent dipole moments of individual molecules or from the rotation of dipolar moieties in flexible molecules, like polymers. The time scale of these fluctuations depends on the sample and its relaxation mechanism. Relaxation times range from a few picoseconds in low-viscosity liquids to hours in glasses; therefore, the TDRS technique covers an extensive range of dynamical processes. The corresponding frequencies range from 10⁻⁴ Hz to 10¹² Hz. This inherent ability to monitor the cooperative motion of a molecular ensemble distinguishes dielectric relaxation from methods like NMR or Raman spectroscopy, which yield information on the motions of individual molecules. Recently, we have developed and established the TDR technique in our laboratory, providing information on the dielectric permittivity in the frequency range 10 MHz to 30 GHz. The TDR method involves the generation of a step pulse with a rise time of 20 picoseconds in a coaxial line system and monitoring of the change in pulse shape after reflection from the sample placed at the end of the coaxial line. There is great interest in studying dielectric relaxation behaviour in liquid systems to understand the role of hydrogen bonding. Intermolecular interaction through hydrogen bonds in molecular liquids results in peculiar dynamical properties. The dynamics of hydrogen-bonded liquids have been studied.
The theoretical model used to explain the experimental results will be discussed.
Keywords: microwave, time domain reflectometry (TDR), dielectric measurement, relaxation time
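The complex permittivity spectra obtained by TDR are commonly interpreted with the single-relaxation-time Debye model. The sketch below evaluates that model with illustrative parameters in the range often quoted for water at room temperature; these are assumptions for the example, not values from this work.

```python
# Debye relaxation model for the complex permittivity measured by TDR:
#   eps*(w) = eps_inf + (eps_s - eps_inf) / (1 + j*w*tau)
# Parameter values below are illustrative (roughly water at room
# temperature: eps_s ~ 78.4, eps_inf ~ 5.2, tau ~ 8.3 ps), not data
# from this study.
import math

def debye(freq_hz, eps_static, eps_inf, tau_s):
    """Complex permittivity at the given frequency."""
    omega = 2 * math.pi * freq_hz
    return eps_inf + (eps_static - eps_inf) / (1 + 1j * omega * tau_s)

EPS_S, EPS_INF, TAU = 78.4, 5.2, 8.3e-12

# Static limit recovers the full permittivity; the high-frequency limit
# recovers eps_inf.
low = debye(1e3, EPS_S, EPS_INF, TAU)
high = debye(1e15, EPS_S, EPS_INF, TAU)

# The dielectric loss eps'' peaks at the relaxation frequency 1/(2*pi*tau),
# where it equals (eps_s - eps_inf) / 2.
f_relax = 1 / (2 * math.pi * TAU)
peak_loss = -debye(f_relax, EPS_S, EPS_INF, TAU).imag
```

Fitting measured spectra to this expression (or to multi-term extensions such as Cole-Cole) is how the relaxation time is extracted from the reflected-pulse data.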
Procedia PDF Downloads 336
5800 An Overview on the Effectiveness of Brand Mascot and Celebrity Endorsement
Authors: Isari Pairoa, Proud Arunrangsiwed
Abstract:
Celebrity and brand mascot endorsement have been explored for more than three decades. Both types of endorser can effectively transfer their reputation to a corporate image and can influence customers to purchase the product. However, little is known about the mediators between the level of endorsement and its effect on buying behavior. The objective of the current study is to identify the gap in the previous studies and to seek possible mediators. It was found that consumers' memory and identification are the mediators between source credibility and the endorsement effect. A future study should confirm the model of endorsement established in the current study.
Keywords: product endorsement, memory, identification theory, source credibility, unintentional effect
Procedia PDF Downloads 227
5799 Local Spectrum Feature Extraction for Face Recognition
Authors: Muhammad Imran Ahmad, Ruzelita Ngadiran, Mohd Nazrin Md Isa, Nor Ashidi Mat Isa, Mohd ZaizuIlyas, Raja Abdullah Raja Ahmad, Said Amirul Anwar Ab Hamid, Muzammil Jusoh
Abstract:
This paper presents two techniques, local feature extraction using the image spectrum and low-frequency spectrum modelling using GMMs, to capture the underlying statistical information and improve the performance of a face recognition system. Local spectrum features are extracted using overlapping sub-block windows mapped onto the face image. For each block, the spatial domain is transformed to the frequency domain using the DFT. Low-frequency coefficients are preserved, and high-frequency coefficients discarded, by applying a rectangular mask to the spectrum of the facial image. The low-frequency information is non-Gaussian in the feature space, and by using a combination of several Gaussian functions with different statistical properties, the best feature representation can be modelled as a probability density function. The recognition process is performed using the maximum likelihood value computed from pre-calculated GMM components. The method is tested using the FERET data sets and achieves 92% recognition rates.
Keywords: local features modelling, face recognition system, Gaussian mixture models, FERET
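The block-spectrum feature step described above can be sketched directly: a 2-D DFT of a sub-block followed by a rectangular low-frequency mask. The block size, mask size, and pixel values below are arbitrary illustrations (a real system would use an FFT library and overlapping blocks across the whole face image).

```python
# Sketch of the local spectrum feature step: take a small sub-block of a
# face image, apply a 2-D DFT, and keep only the low-frequency corner
# via a rectangular mask. Pure-Python direct DFT on a 4x4 block, for
# illustration only.
import cmath

def dft2(block):
    """Direct 2-D DFT of a square block (O(N^4); fine for tiny blocks)."""
    n = len(block)
    out = [[0j] * n for _ in range(n)]
    for u in range(n):
        for v in range(n):
            s = 0j
            for x in range(n):
                for y in range(n):
                    s += block[x][y] * cmath.exp(-2j * cmath.pi
                                                 * (u * x + v * y) / n)
            out[u][v] = s
    return out

def low_freq_features(block, keep=2):
    """Rectangular mask: retain the keep x keep low-frequency corner,
    returning coefficient magnitudes as the feature vector."""
    spec = dft2(block)
    return [abs(spec[u][v]) for u in range(keep) for v in range(keep)]

block = [[1, 2, 3, 4],
         [5, 6, 7, 8],
         [9, 10, 11, 12],
         [13, 14, 15, 16]]
feats = low_freq_features(block)
# feats[0] is the DC magnitude: the sum of all pixels in the block
```

Feature vectors of this kind, pooled over all sub-blocks, are what the GMM then models as a mixture density for the maximum-likelihood recognition step.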
Procedia PDF Downloads 667
5798 Radiosensitization Properties of Gold Nanoparticles in Brachytherapy of Uterus Cancer by High Dose Rate I-125 Seed: A Simulation Study by MCNPX and MCNP6 Codes
Authors: Elham Mansouri, Asghar Mesbahi
Abstract:
Purpose: In the current study, we aimed to investigate the macroscopic and microscopic dose enhancement effect of metallic nanoparticles in interstitial brachytherapy of uterus cancer by an Iodine-125 source, using a nano-lattice model in the MCNPX (5) and MCNP6.1 codes. Materials and methods: Based on a nano-lattice simulation model containing a radiation source and a tumor tissue with cellular compartments loaded with 7 mg/g spherical nanoparticles (bismuth, gold, and gadolinium), the energy deposited by the secondary electrons at the microscopic and macroscopic levels was estimated. Results: The results show that the macroscopic DEF values are higher than the microscopic DEF values and that the macroscopic DEF decreases as a function of distance from the brachytherapy source surface. The results also revealed a remarkable discrepancy between the DEF and secondary electron spectra calculated by the MCNPX (5) and MCNP6.1 codes, which could be justified by the difference in the energy cut-offs and electron transport algorithms of the two codes. Conclusion: According to both the MCNPX (5) and MCNP6.1 outputs, it could be concluded that the presence of metallic nanoparticles in the tumor tissue of uterus cancer increases the physical effectiveness of brachytherapy by an I-125 source. The results presented herein give a physical view of the radiosensitization potential of different metallic nanoparticles and could be considered in the design of analytical and experimental radiosensitization studies in tumor regions using various radiotherapy modalities in the presence of heavy nanomaterials. Keywords: MCNPX, MCNP6, nanoparticle, brachytherapy
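As a minimal illustration of the macroscopic quantity reported above, the dose enhancement factor (DEF) is simply the ratio of the dose scored with nanoparticles present to the dose scored in plain tissue, per radial bin from the source; the tally values below are invented for illustration, not taken from the study:

```python
import numpy as np

def dose_enhancement_factor(dose_with_np, dose_without_np):
    """Macroscopic DEF: dose with nanoparticles divided by dose without,
    evaluated bin by bin (e.g. per radial distance from the seed)."""
    dose_with_np = np.asarray(dose_with_np, dtype=float)
    dose_without_np = np.asarray(dose_without_np, dtype=float)
    return dose_with_np / dose_without_np

# Hypothetical tally values (dose per source particle) at increasing
# distance from the seed surface
d_np = np.array([3.1e-13, 1.4e-13, 6.0e-14])
d_0 = np.array([2.0e-13, 1.0e-13, 5.0e-14])
print(dose_enhancement_factor(d_np, d_0))  # DEF decreases with distance
```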
Procedia PDF Downloads 102
5797 Bilateral Telecontrol of AutoMerlin Mobile Robot Using Time Domain Passivity Control
Authors: Aamir Shahzad, Hubert Roth
Abstract:
This paper presents the bilateral telecontrol of the AutoMerlin mobile robot in the presence of communication delay. Passivity observers have been designed to monitor the net energy at both ports of a two-port network; if either or both ports become active, making the net energy negative, the passivity controllers dissipate the proper amount of energy to make the overall system passive despite the time delay. The environment force is modeled and sent back to the human operator so that s/he can feel it and has additional information about the environment in the vicinity of the mobile robot. Experimental results are presented to show the performance and stability of the bilateral controller. The results show that whenever the passivity observers detect active behavior, the passivity controllers come into action to neutralize it and keep the overall system passive. Keywords: bilateral control, human operator, haptic device, communication network, time domain passivity control, passivity observer, passivity controller, time delay, mobile robot, environment force
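A minimal sketch of a one-port time-domain passivity observer/controller pair of the kind described above (the series-damper form is assumed; the paper's two-port implementation runs one observer/controller per port):

```python
def passivity_observer_controller(forces, velocities, dt):
    """Sketch of a one-port time-domain passivity observer/controller.
    The observer integrates the power flow f*v; whenever the accumulated
    energy would go negative (active behaviour), the controller injects
    an adaptive damper that dissipates exactly the energy deficit."""
    energy = 0.0
    damped_forces = []
    for f, v in zip(forces, velocities):
        energy += f * v * dt
        alpha = 0.0
        if energy < 0.0 and v != 0.0:
            alpha = -energy / (dt * v * v)  # damping that restores passivity
            energy = 0.0                    # deficit dissipated by the controller
        damped_forces.append(f + alpha * v)
    return damped_forces, energy
```

With a purely passive force/velocity history the damper stays inactive; the moment the observed energy turns negative, the extra damping term removes exactly the deficit, as in the Hannaford-Ryu scheme the paper builds on.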
Procedia PDF Downloads 392
5796 Geographic Information System for District Level Energy Performance Simulations
Authors: Avichal Malhotra, Jerome Frisch, Christoph van Treeck
Abstract:
The utilization of semantic, cadastral, and topological data from geographic information systems (GIS) has increased exponentially for building- and urban-scale energy performance simulations. Urban planners, simulation scientists, and researchers use virtual 3D city models for energy analyses, algorithms, and simulation tools. For dynamic energy simulations at the city and district level, this paper provides an overview of the available GIS data models and their levels of detail. Adhering to different norms and standards, these models also intend to describe building and construction industry data. For further investigation, CityGML data models are considered for simulations. Although geographical information modelling has many different implementations, virtual city data can also be extended for domain-specific applications. Highlighting the use of extended CityGML models for energy research, a brief introduction to the Energy Application Domain Extension (ADE) and its significance is given. Consequently, addressing specific simulation input data, a workflow using Modelica is presented, underlining the usage of GIS information and quantifying its significance for the annual heating energy demand. Keywords: CityGML, EnergyADE, energy performance simulation, GIS
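As a small illustration of consuming such GIS data, the snippet below parses a minimal CityGML-like fragment with the Python standard library; the namespaces follow CityGML 2.0, but the building content itself is invented for illustration:

```python
import xml.etree.ElementTree as ET

# Minimal CityGML-like fragment with an illustrative building attribute
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<core:CityModel xmlns:core="http://www.opengis.net/citygml/2.0"
                xmlns:bldg="http://www.opengis.net/citygml/building/2.0">
  <core:cityObjectMember>
    <bldg:Building>
      <bldg:measuredHeight uom="m">12.5</bldg:measuredHeight>
    </bldg:Building>
  </core:cityObjectMember>
</core:CityModel>"""

BLDG = "{http://www.opengis.net/citygml/building/2.0}"

root = ET.fromstring(SAMPLE)
# Collect building heights, a typical geometric input for demand simulation
heights = [float(e.text) for e in root.iter(BLDG + "measuredHeight")]
print(heights)  # [12.5]
```

A real workflow would read the full geometry and EnergyADE attributes and hand them to the Modelica-based simulation as boundary conditions.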
Procedia PDF Downloads 168
5795 A Mixing Matrix Estimation Algorithm for Speech Signals under the Under-Determined Blind Source Separation Model
Authors: Jing Wu, Wei Lv, Yibing Li, Yuanfan You
Abstract:
The separation of speech signals has become a research hotspot in the field of signal processing in recent years, with many applications in teleconferencing, hearing aids, machine speech recognition, and so on. The sounds received are usually noisy. Identifying the sounds of interest and obtaining clear sounds in such an environment is a problem worth exploring, namely the problem of blind source separation. This paper focuses on under-determined blind source separation (UBSS). Sparse component analysis is generally used for this problem; the method is divided into two parts. First, a clustering algorithm estimates the mixing matrix from the observed signals. Then, the signals are separated based on the known mixing matrix. This paper studies the problem of mixing matrix estimation and proposes an improved algorithm to estimate the mixing matrix for speech signals in the UBSS model. The traditional potential-function algorithm is not accurate for mixing matrix estimation, especially at a low signal-to-noise ratio (SNR). In response to this problem, this paper considers an improved potential-function method to estimate the mixing matrix. The algorithm not only avoids the influence of insufficient prior information in traditional clustering algorithms but also improves the estimation accuracy of the mixing matrix. This paper takes the mixing of four speech signals into two channels as an example. Simulation results show that the approach not only improves the estimation accuracy but also applies to any mixing matrix. Keywords: DBSCAN, potential function, speech signal, the UBSS model
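A simple sketch of the clustering stage for the 4-source, 2-channel case described above, using a smoothed angle histogram as a crude stand-in for the potential function (all parameter values are illustrative assumptions):

```python
import numpy as np

def estimate_mixing_directions(x1, x2, n_sources=4, bins=360):
    """Clustering-based mixing-matrix estimation for the 2-mixture UBSS
    case: map observation points to directions on the half circle, build
    a smoothed angle histogram (a crude potential function), and take its
    strongest well-separated peaks as the mixing-matrix columns."""
    angles = np.mod(np.arctan2(x2, x1), np.pi)  # sign-invariant direction
    hist, edges = np.histogram(angles, bins=bins, range=(0.0, np.pi))
    kernel = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
    smooth = np.convolve(hist, kernel, mode="same")
    peaks = []
    for k in np.argsort(smooth)[::-1]:            # strongest bins first
        theta = 0.5 * (edges[k] + edges[k + 1])
        if all(min(abs(theta - p), np.pi - abs(theta - p)) > 0.1 for p in peaks):
            peaks.append(theta)
        if len(peaks) == n_sources:
            break
    return np.array([[np.cos(t) for t in sorted(peaks)],
                     [np.sin(t) for t in sorted(peaks)]])  # 2 x n_sources

# Demo: four sparse sources mixed into two channels at known angles
rng = np.random.default_rng(1)
theta = np.array([0.3, 0.9, 1.6, 2.4])
k = rng.integers(0, 4, 5000)          # one active source per sample (sparsity)
a = rng.standard_normal(5000)
A_est = estimate_mixing_directions(np.cos(theta[k]) * a, np.sin(theta[k]) * a)
print(np.sort(np.arctan2(A_est[1], A_est[0])))  # close to theta
```

With perfectly sparse sources the scatter points align exactly along the mixing-column directions, so the histogram peaks recover the columns up to sign and scale, which is all UBSS separation needs.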
Procedia PDF Downloads 135
5794 Vibration Transmission across Junctions of Walls and Floors in an Apartment Building: An Experimental Investigation
Authors: Hugo Sampaio Libero, Max de Castro Magalhaes
Abstract:
The perception of sound radiated from a building floor is greatly influenced by the rooms in which it is immersed and by the positions of both listener and source. The main question that remains unanswered is the influence of the source position on the sound power radiated by a complex wall-floor system in buildings. This research investigates vibration transmission across walls and floors in buildings, primarily through the determination of the vibration reduction index via experimental tests. Knowledge of this parameter may help in predicting noise and vibration propagation in building components. First, the physical mechanisms involved in vibration transmission across structural junctions are described, aided by an experimental setup. The experimental tests have shown that the vibration generated in the walls and floors is directly related to their size and boundary conditions. It is also shown that the vibration source position can affect the overall vibration spectrum significantly. Second, the characteristics of the noise spectra inside the rooms due to an impact source (tapping machine) are presented. Conclusions are drawn on the general trends of the vibration and noise spectra of the structural components and rooms, respectively. In summary, the aim of this paper is to investigate the vibro-acoustic behavior of building floors and walls under floor impact excitation, with the impact applied at distinct positions on the slab. The analysis highlights the main physical characteristics of the vibration transmission mechanism. Keywords: vibration transmission, vibration reduction index, impact excitation, experimental tests
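For reference, the vibration reduction index determined in such tests is commonly defined (in the form standardised in ISO 10848; the paper may use an equivalent formulation) as

```latex
K_{ij} = \frac{D_{v,ij} + D_{v,ji}}{2}
       + 10\,\log_{10}\!\left(\frac{l_{ij}}{\sqrt{a_i\, a_j}}\right)
```

where \(D_{v,ij}\) is the velocity level difference measured from element \(i\) to element \(j\) (averaged over both directions), \(l_{ij}\) is the common junction length, and \(a_i\), \(a_j\) are the equivalent absorption lengths of the two coupled elements.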
Procedia PDF Downloads 93
5793 The Expanding Role of Islamic Law in the Current Indonesian Legal Reform
Authors: Muhammad Ilham Agus Salim, Saufa Ata Taqiyya
Abstract:
In many Muslim countries, secularization successfully reduced the role of Islamic law as a formal legal source during the last century. The most obvious instance was the transformation of the Daulah Utsmaniyah into the secular Republic of Turkey, and religion is strictly separated from state authority in many countries today. But in recent decades in Indonesia, a remarkable fact is apparent: Islamic law has expanded its role in the Indonesian legal system, especially in district regulations. In Aceh province, as a case in point, shariah has been the basic source of law in all regulations, and more provinces in Indonesia had adopted Islamic law as a formal legal source by the end of 2014. Different from some other countries that clearly stipulate the status of Islam in formal ways, the Indonesian constitution does not formally recognize Islam as the religion of the state. But in this Muslim-majority country, Islamic law takes its place in a democratic way, namely on the basis of the voice of the majority. This paper will analyze how this reality has increased significantly since the so-called Indonesian reformation era (the end of the nineties). Some causes of this tendency of expansion will be identified, and some lessons learned will be recommended as concluding remarks at the end of the paper. Keywords: Islamic law, Indonesia, legal reform, Syariah local regulation
Procedia PDF Downloads 350
5792 About Multi-Resolution Techniques for Large Eddy Simulation of Reactive Multi-Phase Flows
Authors: Giacomo Rossi, Bernardo Favini, Eugenio Giacomazzi, Franca Rita Picchia, Nunzio Maria Salvatore Arcidiacono
Abstract:
A numerical technique for mesh refinement in the HeaRT (Heat Release and Transfer) numerical code is presented. In the CFD framework, the Large Eddy Simulation (LES) approach is gaining importance as a tool for simulating turbulent combustion processes, although it has a high computational cost due to the complexity of the turbulence modeling and the large number of grid points necessary to obtain a good numerical solution. In particular, when a numerical simulation of a big domain is performed with a structured grid, the number of grid points can increase so much that the simulation becomes infeasible; this problem can be overcome with a mesh refinement technique. The mesh refinement technique developed for the HeaRT code (a staggered finite-difference code) is based on a high-order reconstruction of the variables at the grid interfaces by means of a least-squares quasi-ENO interpolation. The code is written in modern Fortran (the 2003 standard or newer) and is parallelized using domain decomposition and the message passing interface (MPI) standard. Keywords: LES, multi-resolution, ENO, fortran
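The stencil-selection idea behind the quasi-ENO interpolation can be sketched in one dimension as follows; this is a simplified second-order illustration, not the HeaRT implementation (which uses a least-squares variant in Fortran):

```python
import numpy as np

def eno2_interface(u):
    """Second-order ENO reconstruction of the value at interface i+1/2 on a
    1D uniform grid. For each interior cell, the smoother of the two
    candidate stencils {i-1, i} and {i, i+1} is selected, so the scheme
    never interpolates across a discontinuity."""
    u = np.asarray(u, dtype=float)
    faces = np.empty(len(u) - 2)
    for i in range(1, len(u) - 1):
        d_left = u[i] - u[i - 1]    # divided difference of the left stencil
        d_right = u[i + 1] - u[i]   # divided difference of the right stencil
        slope = d_left if abs(d_left) < abs(d_right) else d_right
        faces[i - 1] = u[i] + 0.5 * slope  # extrapolate to face i+1/2
    return faces

# Smooth (linear) data is reconstructed exactly; near a jump the stencil
# choice keeps the reconstruction one-sided
u_lin = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
print(eno2_interface(u_lin))  # [1.5 2.5 3.5]
```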
Procedia PDF Downloads 365
5791 Free and Open Source Software for BIM Workflow of Steel Structure Design
Authors: Danilo Di Donato
Abstract:
The continuous new releases of free and open source software (FOSS) and the high costs of proprietary software, whose monopoly is characterized by closed code and a low level of end-user implementation and customization, prompt a reflection on the tools that can be chosen and adopted for the design and representation of new steel constructions. The paper presents experimentation carried out to verify the actual potential and effective applicability of FOSS support for the BIM modeling of steel structures, particularly considering the goals of a possible workflow: achieving a high level of development (LOD) and allowing effective interchange between different software. To this end, the examined software packages are those with open source or freeware licenses, in order to evaluate their use in architectural praxis. The test primarily involved FreeCAD, the only open source software that allows a complete and integrated BIM workflow, and the results were then compared with those of two proprietary packages, SketchUp and Tekla BIMsight, which are released in free versions not usable for commercial purposes. The experiments carried out on open source and freeware software were then compared with the outcomes obtained with two proprietary packages, SketchUp Pro and Tekla Structures, the latter of which has special modules particularly addressed to the design of steel structures. The evaluation concerned different comparative criteria, defined on the basis of categories related to the reliability, efficiency, potential, achievable LOD, and user-friendliness of the analyzed software packages. In order to verify the actual outcomes of FOSS BIM for steel structure projects, these results were compared with a simulation of a real case study carried out with proprietary BIM modeling software.
Therefore, the same design theme, the project of a shelter for a public space, was developed using the different software packages. The purpose of the contribution is to assess the developments and potential inherent in FOSS BIM, in order to estimate its effective applicability to professional practice, its limits, and the new fields of research it suggests. Keywords: BIM, steel buildings, FOSS, LOD
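Interoperability between the packages tested in such a workflow ultimately rests on open exchange formats such as IFC. As a toy illustration of what travels between the tools, the helper below formats a single IFC-SPF (STEP) entity line; the entity name and attribute list are schematic, not a schema-complete IFCCOLUMN record:

```python
def ifc_entity(eid, name, *attrs):
    """Format one IFC-SPF (STEP physical file) entity line.
    Strings are quoted, None becomes the unset marker '$', and values that
    already look like references or enumerations pass through unchanged."""
    def fmt(a):
        if a is None:
            return "$"
        if isinstance(a, str):
            return a if a.startswith(("#", ".", "(")) else f"'{a}'"
        return str(a)
    return f"#{eid}={name}({','.join(fmt(a) for a in attrs)});"

# A hypothetical (truncated) steel column entity, as exchanged between tools
line = ifc_entity(101, "IFCCOLUMN",
                  "2O2Fr$t4X7Zf8NOew3FLOH", None, "Steel column HEA200")
print(line)
```

In the real workflow, each package's IFC exporter writes thousands of such lines, and the achievable LOD depends on how faithfully each exporter and importer round-trips them.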
Procedia PDF Downloads 174
5790 Graphics Processing Unit-Based Parallel Processing for Inverse Computation of Full-Field Material Properties Based on Quantitative Laser Ultrasound Visualization
Authors: Sheng-Po Tseng, Che-Hua Yang
Abstract:
Motivation and Objective: Ultrasonic guided waves have become an important tool for the nondestructive evaluation of structures and components, used to identify defects or evaluate material properties nondestructively. When guided waves are applied to evaluate material properties, the properties are not obtained directly; preliminary signals such as time-domain signals or frequency-domain spectra are revealed first, and inversion calculations on the measured ultrasound data then yield the desired mechanical properties. Methods: This research develops a high-speed inversion calculation technique for obtaining full-field mechanical properties from the quantitative laser ultrasound visualization system (QLUVS). The QLUVS employs a mirror-controlled scanning pulsed laser to generate guided acoustic waves traveling in a two-dimensional target, with the waves detected by a piezoelectric transducer at a fixed location. With a gyro-scanning generation source, the QLUVS has the advantage of fast, full-field, and quantitative inspection. Results and Discussions: This research introduces two tools to improve computational efficiency. First, graphics processing units (GPUs) with large numbers of cores are introduced. Second, combining the CPU and GPU cores, a parallel processing scheme is developed for the inversion of full-field mechanical properties from the QLUVS data. The newly developed inversion scheme is applied to single-layered and double-layered plate-like samples, and the computation is shown to be 80 times faster than the unparallelized scheme. Conclusions: This research demonstrates a high-speed inversion technique for the characterization of full-field material properties based on the quantitative laser ultrasound visualization system.
Significant computational efficiency is demonstrated, although the limit has not yet been reached; further improvement is possible by refining the parallel computation. With this full-field inspection technology, full-field mechanical properties can be obtained by nondestructive, high-speed, and high-precision measurements, in both qualitative and quantitative terms. The developed high-speed computation scheme is ready for applications where full-field mechanical properties are needed in a nondestructive and nearly real-time way. Keywords: guided waves, material characterization, nondestructive evaluation, parallel processing
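The data-parallel structure of such an inversion can be sketched with a vectorised grid search, where every scan point is evaluated simultaneously, the same pattern a GPU kernel parallelises over threads. The arrival-time model and all values below are invented for illustration:

```python
import numpy as np

def invert_wavespeed_grid(arrival_times, distances, candidate_speeds):
    """Data-parallel stand-in for a GPU inversion: for every scan point,
    pick the candidate wave speed whose predicted arrival time best matches
    the measured one. All points and candidates are evaluated at once."""
    # predicted times: shape (n_points, n_candidates)
    pred = distances[:, None] / candidate_speeds[None, :]
    err = np.abs(pred - arrival_times[:, None])
    best = np.argmin(err, axis=1)     # independent per point -> parallelisable
    return candidate_speeds[best]

# Hypothetical full-field data: each scan point at a known source distance
speeds_true = np.array([3000.0, 3200.0, 3100.0])  # m/s
dist = np.array([0.05, 0.06, 0.045])              # m
t_meas = dist / speeds_true
grid = np.linspace(2500.0, 3500.0, 101)           # 10 m/s resolution
print(invert_wavespeed_grid(t_meas, dist, grid))  # recovers speeds_true
```

Because each point's search is independent, the same loop maps directly onto thousands of GPU threads, which is the source of the reported speed-up.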
Procedia PDF Downloads 202
5789 The Effects of Source and Timing on the Acceptance of New Product Recommendation: A Lab Experiment
Abstract:
A new product is important for companies to reach new consumers and demonstrate competitiveness. A new product often involves features that consumers are not familiar with, while it may also hold a competitive advantage over established products in attracting consumers. However, although most online retailers employ recommendation agents (RAs) to influence consumers' product choices, recommended new products are not accepted and chosen as expected. We argue that this may also be caused by providing the new product recommendation in the wrong way at the wrong time. This study discusses how new product evaluations sourced from third parties could be employed in RAs as evidence of the superiority of a new product, and how the recommendation could be provided to a consumer at the right time so that it is accepted and finally chosen during the decision-making process. A 2×2 controlled laboratory experiment was conducted to examine the selection of recommendation sources and recommendation timing. Human subjects were randomly assigned to one of the four treatments to minimize the effects of individual differences on the results. Participants were told to make purchase choices from our product categories. We find that a new product recommended right after a similar existing product, and with an expert review as the source, is more likely to be accepted. Based on this study, both theoretical and practical contributions are provided regarding new product recommendation. Keywords: new product recommendation, recommendation timing, recommendation source, recommendation agents
Procedia PDF Downloads 154
5788 Active Contours for Image Segmentation Based on Complex Domain Approach
Authors: Sajid Hussain
Abstract:
A complex-domain approach for image segmentation based on active contours is designed, in which the contour deforms step by step to partition an image into expedient regions. A novel region-based trigonometric complex pressure force function is proposed, which propagates around the region of interest driven by image forces. The signed trigonometric force function controls the propagation of the active contour so that it stops accurately on the exact edges of the object. The proposed model makes the level set function binary and uses a Gaussian smoothing kernel to regularize it and avoid the re-initialization procedure. The working principle of the proposed model is as follows: the real image data is transformed into complex data by adding iota (i) times the image data, and the average of iota (i) times the horizontal and vertical components of the gradient of the image data is inserted into the model to capture the complex gradient of the image data. A simple finite-difference technique has been used to implement the model. The efficiency and robustness of the proposed model have been verified and compared with other state-of-the-art models. Keywords: image segmentation, active contour, level set, Mumford and Shah model
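The binary level set with Gaussian regularisation can be sketched as below; the evolution force is a generic placeholder, not the paper's trigonometric complex pressure force, and the kernel size is an assumption:

```python
import numpy as np

def gaussian_kernel(sigma=1.0, radius=3):
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def smooth2d(phi, sigma=1.0):
    """Separable Gaussian smoothing, used in place of re-initialisation."""
    k = gaussian_kernel(sigma)
    phi = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, phi)
    phi = np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, phi)
    return phi

def evolve(phi, force, dt=0.5, steps=10):
    """Sketch of the binary level-set scheme: advance phi with a region
    force, re-binarise to +/-1, then regularise with a Gaussian kernel
    instead of re-initialising a signed distance function. The force term
    is a placeholder for the trigonometric complex pressure force."""
    for _ in range(steps):
        phi = phi + dt * force
        phi = np.where(phi >= 0.0, 1.0, -1.0)  # keep the level set binary
        phi = smooth2d(phi)
    return phi
```

Starting from a binary initialisation (+1 inside a seed region, -1 outside), the repeated force step, re-binarisation, and smoothing keep the interface regular without ever rebuilding a signed distance function, which is what lets the scheme skip re-initialisation.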
Procedia PDF Downloads 113