Search results for: illumination techniques
6187 A Fully-Automated Disturbance Analysis Vision for the Smart Grid Based on Smart Switch Data
Authors: Bernardo Cedano, Ahmed H. Eltom, Bob Hay, Jim Glass, Raga Ahmed
Abstract:
The deployment of smart grid devices such as smart meters and smart switches (SS) supported by a reliable and fast communications system makes automated distribution possible, and thus, provides great benefits to electric power consumers and providers alike. However, more research is needed before the full utility of smart switch data is realized. This paper presents new automated switching techniques using SS within the electric power grid. A concise background of the SS is provided, and operational examples are shown. Organization and presentation of data obtained from SS are shown in the context of the future goal of total automation of the distribution network. The description of application techniques, the examples of success with SS, and the vision outlined in this paper serve to motivate future research pertinent to disturbance analysis automation. Keywords: disturbance automation, electric power grid, smart grid, smart switches
Procedia PDF Downloads 309
6186 Evaluation of Short-Term Load Forecasting Techniques Applied for Smart Micro-Grids
Authors: Xiaolei Hu, Enrico Ferrera, Riccardo Tomasi, Claudio Pastrone
Abstract:
Load forecasting plays a key role in making today's and tomorrow's smart energy grids sustainable and reliable. Accurate power consumption prediction allows utilities to organize their resources in advance or to execute Demand Response strategies more effectively, which enables features such as higher sustainability, better quality of service, and affordable electricity tariffs. While load forecasting is comparatively easy and effective at larger geographic scales, in Smart Micro Grids the lower available grid flexibility makes accurate prediction more critical for Demand Response applications. This paper analyses the application of short-term load forecasting in a concrete scenario, proposed within the EU-funded GreenCom project, which collects load data from single loads and households belonging to a Smart Micro Grid. Three short-term load forecasting techniques, i.e. linear regression, artificial neural networks, and radial basis function networks, are considered, compared, and evaluated through absolute forecast errors and training time. The influence of weather conditions on load forecasting is also evaluated. A new definition of Gain is introduced in this paper, which serves as a novel indicator of short-term prediction capability and time-span consistency. Two models, for 24- and 1-hour-ahead forecasting, are built to comprehensively compare the three techniques. Keywords: short-term load forecasting, smart micro grid, linear regression, artificial neural networks, radial basis function network, gain
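Illustrative sketch (not the GreenCom implementation): the snippet below shows how a 1-hour-ahead and a 24-hour-ahead model can be compared for linear regression and a feed-forward neural network. The synthetic hourly load profile, the lag features and the use of mean absolute error are assumptions made purely for demonstration.

```python
# Hedged sketch: compares linear regression and an MLP (a stand-in for the ANN/RBF
# models of the paper) on synthetic hourly load data for 1h- and 24h-ahead forecasts.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
hours = np.arange(24 * 120)                       # ~4 months of hourly samples
load = 10 + 3 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.5, hours.size)

def make_dataset(series, horizon, n_lags=48):
    """Build lagged features X and target y for a given forecast horizon."""
    X, y = [], []
    for t in range(n_lags, series.size - horizon):
        X.append(series[t - n_lags:t])
        y.append(series[t + horizon - 1])
    return np.array(X), np.array(y)

for horizon in (1, 24):                           # 1-hour- and 24-hour-ahead models
    X, y = make_dataset(load, horizon)
    split = int(0.8 * len(X))
    X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]
    for name, model in [("linear regression", LinearRegression()),
                        ("neural network", MLPRegressor(hidden_layer_sizes=(32,),
                                                        max_iter=500, random_state=0))]:
        model.fit(X_tr, y_tr)
        mae = mean_absolute_error(y_te, model.predict(X_te))
        print(f"{horizon:>2}h ahead, {name}: MAE = {mae:.3f}")
```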
Procedia PDF Downloads 468
6185 Modern Spectrum Sensing Techniques for Cognitive Radio Networks: Practical Implementation and Performance Evaluation
Authors: Antoni Ivanov, Nikolay Dandanov, Nicole Christoff, Vladimir Poulkov
Abstract:
Spectrum underutilization has made cognitive radio a promising technology for both current and future telecommunications. This is due to its ability to exploit unused spectrum in bands dedicated to other wireless communication systems and thus increase their occupancy. The essential function that allows a cognitive radio device to perceive the occupancy of the spectrum is spectrum sensing. In this paper, the performance of modern adaptations of the four most widely used spectrum sensing techniques, namely energy detection (ED), cyclostationary feature detection (CSFD), matched filter (MF) and eigenvalue-based detection (EBD), is compared. The implementation has been accomplished through the PlutoSDR hardware platform and the GNU Radio software package in very low Signal-to-Noise Ratio (SNR) conditions. The optimal detection performance of the examined methods in a realistic, implementation-oriented model is found for the common relevant parameters (number of observed samples, sensing time and required probability of false alarm). Keywords: cognitive radio, dynamic spectrum access, GNU Radio, spectrum sensing
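Illustrative sketch (independent of the PlutoSDR/GNU Radio implementation): classical energy detection with the decision threshold set from a required probability of false alarm. The noise model, sample count and SNR are assumed values.

```python
# Hedged sketch of energy detection (ED): decide "occupied" when the average
# received energy exceeds a threshold chosen for a target false-alarm probability.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
N = 1024                 # observed samples per sensing slot (assumed)
pfa_target = 0.1         # required probability of false alarm (assumed)
snr_db = -10             # very low SNR regime, as in the paper
noise_var = 1.0

# Gaussian (CLT) approximation of the test statistic under H0 for complex noise.
threshold = noise_var * (1 + norm.isf(pfa_target) / np.sqrt(N))

def energy_statistic(x):
    return np.mean(np.abs(x) ** 2)

trials, detections, false_alarms = 2000, 0, 0
signal_amp = np.sqrt(10 ** (snr_db / 10) * noise_var)
for _ in range(trials):
    noise = (rng.normal(0, np.sqrt(noise_var / 2), N)
             + 1j * rng.normal(0, np.sqrt(noise_var / 2), N))
    false_alarms += energy_statistic(noise) > threshold            # H0: noise only
    signal = signal_amp * np.exp(1j * 2 * np.pi * 0.1 * np.arange(N))
    detections += energy_statistic(signal + noise) > threshold     # H1: signal present

print(f"empirical Pfa = {false_alarms / trials:.3f}, Pd = {detections / trials:.3f}")
```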
Procedia PDF Downloads 245
6184 Intrusion Detection Using Dual Artificial Techniques
Authors: Rana I. Abdulghani, Amera I. Melhum
Abstract:
With the rapid growth in the use of networked computers, and given the broad agreement among computer security experts that the goal of building a fully secure system is never achieved in practice, intrusion detection systems (IDS) have become an essential line of defence. This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), which falls within the field of Swarm Intelligence; here, the algorithm is enhanced to obtain a minimum error rate by amending the cluster centers whenever a better fitness value is found during the training stages. Results show that this modification gives more efficient exploration than the original algorithm. The second technique uses a back-propagation neural network. Finally, the results of the two methods are compared using the NSL-KDD data sets for the construction and evaluation of intrusion detection systems. This research is only interested in clustering the given connection records into two categories (normal and abnormal). Practical experiments result in an intrusion detection rate of 99.183818% for the enhanced PSO (EPSO) and 69.446416% for the back-propagation neural network. Keywords: IDS, SI, BP, NSL_KDD, PSO
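Illustrative sketch (not the authors' enhanced PSO): a plain particle swarm optimizer searching for two cluster centers (normal/abnormal) on synthetic two-dimensional data standing in for NSL-KDD records.

```python
# Hedged sketch: PSO searches for two cluster centers (normal / abnormal) by
# minimizing the sum of squared distances of points to their nearest center.
import numpy as np

rng = np.random.default_rng(2)
normal = rng.normal([0, 0], 0.5, size=(200, 2))      # stand-in for normal records
abnormal = rng.normal([3, 3], 0.5, size=(200, 2))    # stand-in for attack records
X = np.vstack([normal, abnormal])
labels = np.array([0] * 200 + [1] * 200)

def fitness(centers):
    """Sum of squared distances of each point to its nearest of the two centers."""
    c = centers.reshape(2, 2)
    d = ((X[:, None, :] - c[None, :, :]) ** 2).sum(axis=2)
    return d.min(axis=1).sum()

n_particles, dims, iters = 20, 4, 100
pos = rng.uniform(-1, 4, size=(n_particles, dims))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dims))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([fitness(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

centers = gbest.reshape(2, 2)
pred = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)
acc = max((pred == labels).mean(), (pred != labels).mean())  # cluster labels may be swapped
print(f"detection/clustering accuracy on synthetic data: {acc:.3f}")
```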
Procedia PDF Downloads 382
6183 Development of National Scale Hydropower Resource Assessment Scheme Using SWAT and Geospatial Techniques
Authors: Rowane May A. Fesalbon, Greyland C. Agno, Jodel L. Cuasay, Dindo A. Malonzo, Ma. Rosario Concepcion O. Ang
Abstract:
The Department of Energy of the Republic of the Philippines estimates that the country's energy reserves for 2015 are dwindling, as observed in the rotating power outages in several localities. To aid in the energy crisis, a national hydropower resource assessment scheme is developed. Hydropower is a resource derived from flowing water and a difference in elevation. It is a renewable energy resource deemed abundant in the Philippines, an archipelagic country rich in bodies of water and water resources. The objective of this study is to develop a methodology for a national hydropower resource assessment using hydrologic modeling and geospatial techniques in order to generate resource maps for future reference and use by the government and other stakeholders. The methodology developed for this purpose is focused on two models: the implementation of the Soil and Water Assessment Tool (SWAT) for the river discharge, and the use of geospatial techniques to analyze the topography, obtain the head, and generate the theoretical hydropower potential sites. The methodology is tightly coupled with Geographic Information Systems to maximize the use of geodatabases and the spatial significance of the determined sites. The hydrologic model used in this workflow is SWAT integrated in the GIS software ArcGIS. The head is determined by a developed algorithm that utilizes a Synthetic Aperture Radar (SAR)-derived digital elevation model (DEM) with a resolution of 10 meters. The initial results of the developed workflow indicate hydropower potential in the river reaches ranging from pico (less than 5 kW) to mini (1-3 MW) theoretical potential. Keywords: ArcSWAT, renewable energy, hydrologic model, hydropower, GIS
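Illustrative sketch (not part of the described workflow): once SWAT supplies the discharge and the DEM-based algorithm supplies the head, the theoretical potential of a reach follows P = ρ·g·Q·H. The site values and the intermediate classification bands below are placeholders; only the pico (<5 kW) and mini limits are taken from the abstract.

```python
# Hedged sketch: theoretical hydropower potential of candidate river reaches,
# P = rho * g * Q * H, classified into assumed size bands.
RHO = 1000.0   # water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def theoretical_power_kw(discharge_m3s, head_m, efficiency=1.0):
    """Theoretical (or turbine-adjusted) hydropower potential in kW."""
    return RHO * G * discharge_m3s * head_m * efficiency / 1000.0

def classify(p_kw):
    if p_kw < 5:
        return "pico"
    if p_kw < 100:
        return "micro (assumed band)"
    if p_kw < 1000:
        return "mini, lower range (assumed band)"
    return "mini/small (>= 1 MW)"

# Hypothetical sites: (SWAT-simulated discharge in m^3/s, DEM-derived head in m)
sites = {"reach_A": (0.05, 6.0), "reach_B": (1.2, 15.0), "reach_C": (12.0, 20.0)}
for name, (q, h) in sites.items():
    p = theoretical_power_kw(q, h)
    print(f"{name}: Q={q} m^3/s, H={h} m -> {p:.1f} kW ({classify(p)})")
```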
Procedia PDF Downloads 313
6182 The Journey of a Malicious HTTP Request
Authors: M. Mansouri, P. Jaklitsch, E. Teiniker
Abstract:
SQL injection on web applications is a very common kind of attack. There are mechanisms such as intrusion detection systems to detect this attack. These strategies often rely on techniques implemented at the higher layers of the application but do not consider the low level of system calls. The problem with only considering the high-level perspective is that an attacker can circumvent the detection tools using certain techniques such as URL encoding. One technique currently used for detecting low-level attacks on privileged processes is the tracing of system calls. System calls act as a single gate to the Operating System (OS) kernel; they allow catching the critical data at an appropriate level of detail. Our basic assumption is that any type of application, be it a system service, utility program or web application, “speaks” the language of system calls when having a conversation with the OS kernel. At this level we can see the actual attack while it is happening. We conduct an experiment in order to demonstrate the suitability of system call analysis for detecting SQL injection, and we are able to detect the attack. Therefore we conclude that system calls are not only powerful in detecting low-level attacks but that they also enable us to detect high-level attacks such as SQL injection. Keywords: Linux system calls, web attack detection, interception, SQL
Procedia PDF Downloads 359
6181 Concept Drifts Detection and Localisation in Process Mining
Authors: M. V. Manoj Kumar, Likewin Thomas, Annappa
Abstract:
Process mining provides methods and techniques for analyzing event logs recorded in modern information systems that support real-world operations. While analyzing an event log, state-of-the-art techniques available in process mining assume that the operational process is a static (stationary) entity. This is often not the case due to the possible occurrence of a phenomenon called concept drift. During the period of execution, the process can experience concept drift and can evolve with respect to any of its associated perspectives, exhibiting various patterns of change at a different pace. The work presented in this paper discusses the main aspects to consider while addressing the concept drift phenomenon and proposes a method for detecting and localizing sudden concept drifts in the control-flow perspective of the process by using features extracted by processing the traces in the process log. Our experimental results are promising in the direction of efficiently detecting and localizing concept drift in the context of the process mining research discipline. Keywords: abrupt drift, concept drift, sudden drift, control-flow perspective, detection and localization, process mining
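Illustrative sketch (a generic stand-in, not the authors' feature set or statistical test): a sudden control-flow drift can be detected and localized by comparing the distribution of directly-follows relations in two adjacent sliding windows of traces.

```python
# Hedged sketch: detect and localize a sudden control-flow drift by comparing the
# distribution of directly-follows relations in two adjacent sliding windows.
from collections import Counter

def directly_follows(trace):
    return list(zip(trace, trace[1:]))

def window_distribution(traces):
    counts = Counter(pair for t in traces for pair in directly_follows(t))
    total = sum(counts.values()) or 1
    return {k: v / total for k, v in counts.items()}

def distance(p, q):
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0) - q.get(k, 0)) for k in keys)  # total variation

# Synthetic event log: behaviour changes abruptly after trace 100 (assumed drift point).
log = [["a", "b", "c", "d"]] * 100 + [["a", "c", "b", "d"]] * 100

window, threshold = 20, 0.3
indices = list(range(window, len(log) - window))
distances = []
for i in indices:
    ref = window_distribution(log[i - window:i])      # traces just before point i
    cur = window_distribution(log[i:i + window])      # traces just after point i
    distances.append(distance(ref, cur))

peak = max(range(len(distances)), key=distances.__getitem__)
if distances[peak] > threshold:
    print(f"sudden drift detected, localized around trace index {indices[peak]}")
```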
Procedia PDF Downloads 345
6180 Temperature Dependent Current-Voltage (I-V) Characteristics of CuO-ZnO Nanorods Based Heterojunction Solar Cells
Authors: Venkatesan Annadurai, Kannan Ethirajalu, Anu Roshini Ramakrishnan
Abstract:
Copper oxide (CuO) and zinc oxide (ZnO) based coaxial (CuO-ZnO nanorod) heterojunctions have attracted the interest of various research communities for solar cell, light emitting diode (LED) and photodetector applications. Copper oxide (CuO) is a p-type material with a band gap of 1.5 eV, and it is considered an attractive absorber material in solar cell applications due to its high absorption coefficient and long minority carrier diffusion length. Similarly, n-type ZnO nanorods possess many attractive advantages over thin films, such as light trapping ability and photosensitivity owing to the presence of oxygen-related hole traps at the surface. Moreover, the abundant availability, non-toxicity, and inexpensiveness of these materials make them suitable for potentially cheap, large-area, and stable photovoltaic applications. However, the efficiency of CuO-ZnO nanorod heterojunction based devices is greatly affected by interface defects, which generally lead to poor performance. In spite of having much potential, not much work has been carried out to understand the interface quality and transport mechanism involved across the CuO-ZnO nanorod heterojunction. Therefore, a detailed investigation of the CuO-ZnO heterojunction is needed to understand the interface that affects its photovoltaic performance. Herein, we have fabricated a CuO-ZnO nanorod based heterojunction by simple hydrothermal and electrodeposition techniques and investigated its interface quality by carrying out temperature-dependent (300-10 K) current-voltage (I-V) measurements in the dark and under illumination with visible light. Activation energies extracted from the temperature-dependent I-V characteristics reveal that recombination and tunneling mechanisms across the interfacial barrier play a significant role in the current flow. Keywords: heterojunction, electrical transport, nanorods, solar cells
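Illustrative sketch (synthetic data, not the measured device): an activation energy of the kind quoted in such studies is typically extracted from an Arrhenius fit of the current at a fixed bias, ln(I) = ln(I0) - Ea/(kB*T).

```python
# Hedged sketch: extract an activation energy Ea from temperature-dependent
# current data at a fixed bias using an Arrhenius fit, ln(I) = ln(I0) - Ea/(kB*T).
import numpy as np

KB = 8.617e-5                      # Boltzmann constant, eV/K
Ea_true = 0.25                     # assumed activation energy for the synthetic data, eV

T = np.array([300, 275, 250, 225, 200, 175, 150], dtype=float)   # temperatures, K
I0 = 1e-3                                                         # assumed prefactor, A
rng = np.random.default_rng(3)
I = I0 * np.exp(-Ea_true / (KB * T)) * rng.normal(1.0, 0.02, T.size)

# Linear fit of ln(I) against 1/(kB*T); the slope gives -Ea.
x = 1.0 / (KB * T)
slope, intercept = np.polyfit(x, np.log(I), 1)
print(f"extracted Ea = {-slope:.3f} eV (prefactor I0 = {np.exp(intercept):.2e} A)")
```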
Procedia PDF Downloads 224
6179 Exhaustive Study of Essential Constraint Satisfaction Problem Techniques Based on N-Queens Problem
Authors: Md. Ahsan Ayub, Kazi A. Kalpoma, Humaira Tasnim Proma, Syed Mehrab Kabir, Rakib Ibna Hamid Chowdhury
Abstract:
The Constraint Satisfaction Problem (CSP) arises in various applications, e.g., scheduling problems, timetabling problems, assignment problems, etc. Researchers adopt a CSP technique to tackle a certain problem; however, each technique follows different approaches and ways to solve the problem network. In our exhaustive study, it has been possible to visualize the processes of the essential CSP algorithms on a very concrete constraint satisfaction example, the N-Queens Problem, in order to gain a deep understanding of how a particular constraint satisfaction problem is dealt with by the studied and implemented techniques. Besides, benchmark results - time vs. value of N in N-Queens - have been generated from our implemented approaches, which help in understanding at what rate each algorithm produces solutions, especially for the N-Queens puzzle. Thus, better-informed decisions can be made when instantiating a real-life problem within the CSP framework. Keywords: arc consistency (AC), backjumping algorithm (BJ), backtracking algorithm (BT), constraint satisfaction problem (CSP), forward checking (FC), least constrained values (LCV), maintaining arc consistency (MAC), minimum remaining values (MRV), N-Queens problem
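Illustrative sketch (not the authors' benchmark code): a compact N-Queens solver combining backtracking (BT), the minimum remaining values (MRV) heuristic and forward checking (FC), three of the techniques listed in the keywords.

```python
# Hedged sketch: N-Queens as a CSP solved with backtracking (BT), the minimum
# remaining values (MRV) heuristic and forward checking (FC).
def solve_nqueens(n):
    domains = {row: set(range(n)) for row in range(n)}   # row -> candidate columns
    assignment = {}

    def consistent(row, col):
        return all(col != c and abs(row - r) != abs(col - c)
                   for r, c in assignment.items())

    def forward_check(row, col):
        """Prune inconsistent columns from unassigned rows; return (ok, pruned)."""
        pruned, ok = [], True
        for r in domains:
            if r in assignment or r == row:
                continue
            for c in list(domains[r]):
                if c == col or abs(row - r) == abs(col - c):
                    domains[r].remove(c)
                    pruned.append((r, c))
            if not domains[r]:          # a domain was wiped out: dead end
                ok = False
                break
        return ok, pruned

    def backtrack():
        if len(assignment) == len(domains):
            return True
        # MRV: pick the unassigned row with the fewest remaining candidate columns.
        row = min((r for r in domains if r not in assignment),
                  key=lambda r: len(domains[r]))
        for col in sorted(domains[row]):
            if consistent(row, col):
                assignment[row] = col
                ok, pruned = forward_check(row, col)
                if ok and backtrack():
                    return True
                for r, c in pruned:     # undo the forward-checking prunes
                    domains[r].add(c)
                del assignment[row]
        return False

    return [assignment[r] for r in range(n)] if backtrack() else None

print(solve_nqueens(8))   # a list of column positions, one per row
```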
Procedia PDF Downloads 364
6178 The Ethics of Corporate Social Responsibility Statements in Undercutting Sustainability: A Communication Perspective
Authors: Steven Woods
Abstract:
The use of Corporate Social Responsibility (CSR) statements has become ubiquitous in society. Appealing to consumers as a well-behaved social entity has become a strategy not just to ensure brand loyalty but also to further larger-scale projects of corporate interest. Specifically, the use of CSR to position corporations as good planetary citizens involves not just self-promotion but also a way of transferring responsibility from systems to individuals. By using techniques labeled as “greenwashing” and emphasizing ethical consumption choices as the solution, corporations present themselves as good members of the community pursuing sustainability. Ultimately, the primary function of Corporate Social Responsibility statements is to maintain the economic status quo of ongoing growth and consumption while presenting an environmentally progressive image to the public, as well as reassuring it that corporate behavior is superior to government intervention. By analyzing the communication techniques through content analysis of specific examples, along with an analysis of the frames of meaning constructed in the CSR statements, the practices of corporate responsibility and sustainability will be addressed from an ethical perspective. Keywords: corporate social responsibility, ethics, greenwashing, sustainability
Procedia PDF Downloads 71
6177 Using Data Mining Techniques to Evaluate the Different Factors Affecting the Academic Performance of Students at the Faculty of Information Technology in Hashemite University in Jordan
Authors: Feras Hanandeh, Majdi Shannag
Abstract:
This research studies the different factors that could affect the cumulative grade average of students at the Faculty of Information Technology at Hashemite University. The paper examines student information, background and academic records, and how this information affects a student's ability to obtain high grades. The student information used in the study is extracted from the students' academic records. Data mining tools and techniques are used to decide which attribute(s) affect the students' cumulative average. The results show that the most important factor affecting the students' cumulative average is the student's acceptance type. A decision tree model and rules were then built to determine how students can obtain high grades in their courses. The overall accuracy of the model is 44%, which is an accepted rate. Keywords: data mining, classification, extracting rules, decision tree
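Illustrative sketch (fabricated records, not the Hashemite University data): a small decision tree with rule extraction over hypothetical student attributes, including an acceptance-type attribute.

```python
# Hedged sketch: a decision tree over fabricated student records, with the
# learned rules printed, mirroring the classification / rule-extraction step.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)
n = 300
# Hypothetical attributes: acceptance type (0=regular, 1=parallel, 2=international),
# high-school average, and weekly study hours.
acceptance = rng.integers(0, 3, n)
hs_average = rng.uniform(60, 100, n)
study_hours = rng.uniform(0, 30, n)
# Fabricated rule: regular acceptance with a strong background, or heavy study,
# leads to a "high" cumulative average.
high_gpa = ((acceptance == 0) & (hs_average > 80) | (study_hours > 20)).astype(int)

X = np.column_stack([acceptance, hs_average, study_hours])
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, high_gpa)
print(f"training accuracy: {clf.score(X, high_gpa):.2f}")
print(export_text(clf, feature_names=["acceptance_type", "hs_average", "study_hours"]))
```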
Procedia PDF Downloads 416
6176 [Keynote Talk]: Software Reliability Assessment and Fault Tolerance: Issues and Challenges
Authors: T. Gayen
Abstract:
Although several software reliability models exist today, there is still no versatile model that can be used for the reliability assessment of software. Complex software has a large number of states (unlike hardware), so it becomes practically difficult to test the software completely. Irrespective of the amount of testing one does, it sometimes becomes extremely difficult to ensure that the final software product is fault free. Black-box software reliability models are found to be quite uncertain for the reliability assessment of various systems. Mission-critical applications need to be highly reliable, yet it is not always possible to ensure the development of a highly reliable system. Hence, in order to achieve fault-free operation of software, one develops some mechanism to handle faults remaining in the system even after development. Although several such techniques are currently in use to achieve fault tolerance, these mechanisms may not always be very suitable for various systems. Hence, this discussion is focused on analyzing the issues and challenges faced with the existing techniques for reliability assessment and fault tolerance of various software systems. Keywords: black box, fault tolerance, failure, software reliability
Procedia PDF Downloads 426
6175 Secret Sharing in Visual Cryptography Using NVSS and Data Hiding Techniques
Authors: Misha Alexander, S. B. Waykar
Abstract:
Visual cryptography is a special unbreakable encryption technique that transforms a secret image into random noisy pixels. These shares are transmitted over the network, and because of their noisy texture they attract hackers. To address this issue, a Natural Visual Secret Sharing (NVSS) scheme was introduced that uses natural shares, either in digital or printed form, to generate the noisy secret share. This scheme greatly reduces the transmission risk but causes distortion in the retrieved secret image through variation in the settings and properties of the digital devices used to capture the natural image during the encryption/decryption phase. This paper proposes a new NVSS scheme that extracts the secret key from randomly selected, unaltered multiple natural images. To further improve the security of the shares, data hiding techniques such as steganography and alpha channel watermarking are proposed. Keywords: decryption, encryption, natural visual secret sharing, natural images, noisy share, pixel swapping
Procedia PDF Downloads 404
6174 Achieving Success in NPD Projects
Authors: Ankush Agrawal, Nadia Bhuiyan
Abstract:
The new product development (NPD) literature emphasizes the importance of introducing new products to the market for continuing business success. New products are responsible for employment, economic growth, technological progress, and high standards of living. Therefore, the study of NPD and the processes through which new products emerge is important. The goal of our research is to propose a framework of critical success factors, metrics, and tools and techniques for implementing metrics at each stage of the new product development (NPD) process. An extensive literature review was undertaken to investigate decades of studies on NPD success and how it can be achieved. These studies were scanned for factors common to firms that enjoyed success with new products on the market. The paper summarizes the NPD success factors, suggests metrics that should be used to measure these factors, and proposes tools and techniques to make use of these metrics. This was done for each stage of the NPD process and brought together in a framework that the authors propose should be followed for complex NPD projects. While many studies have been conducted on critical success factors for NPD, these studies tend to be fragmented and to focus on only one or a few phases of the NPD process. Keywords: new product development, performance, critical success factors, framework
Procedia PDF Downloads 398
6173 Special Features Of Phacoemulsification Technique For Dense Cataracts
Authors: Shilkin A.G., Goncharov D.V., Rotanov D.A., Voitecha M.A., Kulyagina Y.I., Mochalova U.E.
Abstract:
Context: Phacoemulsification is a surgical technique used to remove cataracts, but it has a higher number of complications when dense cataracts are present. The risk factors include a thin posterior capsule, dense nucleus fragments, and prolonged exposure to high-power ultrasound. To minimize these complications, various methods are used. Research aim: The aim of this study is to develop and implement optimal methods of ultrasound phacoemulsification for dense cataracts in order to minimize postoperative complications. Methodology: The study involved 36 eyes of dogs with dense cataracts over a period of 5 years. The surgeries were performed using a LEICA 844 surgical microscope and an Oertli Faros phacoemulsifier. The surgical techniques included the optimal technique for breaking the nucleus, bimanual surgery, and the use of Akahoshi prechoppers. Findings: The complications observed during surgery included rupture of the posterior capsule and the need for anterior vitrectomy. Complications in the postoperative period included corneal edema and uveitis. Theoretical importance: This study contributes to the field by providing insights into the special features of phacoemulsification for dense cataracts. It highlights the importance of using specific techniques and settings to minimize complications. Data collection and analysis procedures: The data for the study were collected from surgeries performed on dogs with dense cataracts. The complications were documented and analyzed. Question addressed: The study addressed the question of how to minimize complications during phacoemulsification surgery for dense cataracts. Conclusion: By following the optimal techniques and settings and using prechoppers, surgery for dense cataracts can be made safer and faster, minimizing the risks and complications. Keywords: dense cataracts, phacoemulsification, phacoemulsification of cataracts in elderly dogs, complications of phacoemulsification
Procedia PDF Downloads 62
6172 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult to perform, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area and its prominent features, and the significant problems or issue domains that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high-accuracy prediction, with a focus on the most important characteristics. Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
Procedia PDF Downloads 90
6171 Employing Visual Culture to Enhance Initial Adult Maltese Language Acquisition
Authors: Jacqueline Żammit
Abstract:
Recent research indicates that the utilization of right-brain strategies holds significant implications for the acquisition of language skills. Nevertheless, the utilization of visual culture as a means to stimulate these strategies and amplify language retention among adults engaging in second language (L2) learning remains a relatively unexplored area. This investigation delves into the impact of visual culture on activating right-brain processes during the initial stages of language acquisition, particularly in the context of teaching Maltese as a second language (ML2) to adult learners. By employing a qualitative research approach, this study convenes a focus group comprising twenty-seven educators to delve into a range of visual culture techniques integrated within language instruction. The collected data is subjected to thematic analysis using NVivo software. The findings underscore a variety of impactful visual culture techniques, encompassing activities such as drawing, sketching, interactive matching games, orthographic mapping, memory palace strategies, wordless picture books, picture-centered learning methodologies, infographics, Face Memory Game, Spot the Difference, Word Search Puzzles, the Hidden Object Game, educational videos, the Shadow Matching technique, Find the Differences exercises, and color-coded methodologies. These identified techniques hold potential for application within ML2 classes for adult learners. Consequently, this study not only provides insights into optimizing language learning through specific visual culture strategies but also furnishes practical recommendations for enhancing language competencies and skills. Keywords: visual culture, right-brain strategies, second language acquisition, maltese as a second language, visual aids, language-based activities
Procedia PDF Downloads 61
6170 ViraPart: A Text Refinement Framework for Automatic Speech Recognition and Natural Language Processing Tasks in Persian
Authors: Narges Farokhshad, Milad Molazadeh, Saman Jamalabbasi, Hamed Babaei Giglou, Saeed Bibak
Abstract:
The Persian language is an inflectional subject-object-verb language, which makes Persian more ambiguous. However, using techniques such as Zero-Width Non-Joiner (ZWNJ) recognition, punctuation restoration, and Persian Ezafe construction will lead us to a more understandable and precise language. In most works on Persian, these techniques are addressed individually. Despite that, we believe that for text refinement in Persian, all of these tasks are necessary. In this work, we propose the ViraPart framework, which uses an embedded ParsBERT at its core for text clarification. First, we use the BERT variant for Persian, followed by a classifier layer for the classification procedures. Next, we combine the models' outputs to produce clear text. In the end, the proposed model for ZWNJ recognition, punctuation restoration, and Persian Ezafe construction achieves averaged macro F1 scores of 96.90%, 92.13%, and 98.50%, respectively. Experimental results show that our proposed approach is very effective for text refinement in the Persian language. Keywords: Persian Ezafe, punctuation, ZWNJ, NLP, ParsBERT, transformers
Procedia PDF Downloads 216
6169 Using Machine Learning Techniques for Autism Spectrum Disorder Analysis and Detection in Children
Authors: Norah Mohammed Alshahrani, Abdulaziz Almaleh
Abstract:
Autism Spectrum Disorder (ASD) is a condition related to brain development that affects how a person recognises and communicates with others, resulting in difficulties with social interaction and communication, and its prevalence is constantly growing. Early recognition of ASD allows children to lead safe and healthy lives and helps doctors with accurate diagnoses and management of the condition. Therefore, it is crucial to develop a method that will achieve good results with high accuracy for the detection of ASD in children. In this paper, ASD datasets of toddlers and children have been analyzed. We employed the following machine learning techniques to explore ASD: Random Forest (RF), Decision Tree (DT), Naïve Bayes (NB) and Support Vector Machine (SVM). Feature selection was then used to reduce the number of attributes from the ASD datasets while preserving model performance. As a result, we found that the best result was provided by the Support Vector Machine (SVM), achieving an accuracy of 0.98 on the toddler dataset and 0.99 on the children dataset. Keywords: autism spectrum disorder, machine learning, feature selection, support vector machine
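Illustrative sketch (synthetic screening-style data, not the actual ASD datasets or tuned models): the four classifiers compared with univariate feature selection applied before each model.

```python
# Hedged sketch: compare RF, DT, NB and SVM on synthetic screening-style data,
# with univariate feature selection to reduce the attribute count.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Stand-in for the toddler/children ASD datasets (binary screening outcome).
X, y = make_classification(n_samples=600, n_features=20, n_informative=8,
                           random_state=0)

models = {
    "Random Forest": RandomForestClassifier(random_state=0),
    "Decision Tree": DecisionTreeClassifier(random_state=0),
    "Naive Bayes": GaussianNB(),
    "SVM": SVC(kernel="rbf"),
}
for name, model in models.items():
    pipe = make_pipeline(SelectKBest(f_classif, k=10), model)   # keep 10 attributes
    score = cross_val_score(pipe, X, y, cv=5).mean()
    print(f"{name:>13}: mean CV accuracy = {score:.3f}")
```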
Procedia PDF Downloads 152
6168 Condition Assessment of Reinforced Concrete Bridge Deck Using Ground Penetrating Radar
Authors: Azin Shakibabarough, Mojtaba Valinejadshoubi, Ashutosh Bagchi
Abstract:
Catastrophic bridge failures happen due to lack of inspection, design deficiencies, and extreme events like flooding or earthquakes. A Bridge Management System (BMS) is utilized to diminish such accidents through proper design and frequent inspection. Visual inspection cannot detect subsurface defects, so using Non-Destructive Evaluation (NDE) techniques removes these barriers as far as possible. Among all NDE techniques, Ground Penetrating Radar (GPR) has proved to be a highly effective device for detecting internal defects in a reinforced concrete bridge deck. GPR is used for detecting rebar location and rebar corrosion in the reinforced concrete deck. A GPR profile is composed of a series of hyperbolas, in which a clear hyperbola denotes sound rebar while a blurred hyperbola or signal attenuation indicates corroded rebar. Interpretation of GPR images is implemented by numerical analysis or visualization. Researchers recently found that interpretation through visualization is more precise than interpretation through numerical analysis, but visualization is a time-consuming and highly subjective process. Automating the interpretation of GPR images through visualization can solve these problems. After interpretation of all scans of a bridge, condition assessment is conducted based on the generated corrosion map. However, such a condition assessment is neither objective nor precise. Condition assessment based on structural integrity and strength parameters can make it more objective and precise. The main purpose of this study is to present an automated interpretation method for a reinforced concrete bridge deck through a visualization technique. In the end, a combined analysis of the structural condition of the bridge is implemented. Keywords: bridge condition assessment, ground penetrating radar, GPR, NDE techniques, visualization
Procedia PDF Downloads 148
6167 Standardization Of Miniature Neutron Research Reactor And Occupational Safety Analysis
Authors: Raymond Limen Njinga
Abstract:
The comparator factors (Fc) for miniature research reactors are of great importance in the field of nuclear physics, as they provide an accurate basis for the evaluation of elements in all forms of samples via k0-NAA techniques. The Fc was initially simulated theoretically; thereafter, a series of experiments was performed to validate the results. The experimental values were obtained using an alloy of Au(0.1%)-Al monitor foil and a neutron flux setting of 5.00E+11 cm-2.s-1. In the inner irradiation position, an average experimental value of 7.120E+05 was reported against the theoretical value of 7.330E+05, a percentage deviation of 2.86 from the theoretical value. In the case of the outer irradiation position, an experimental value of 1.170E+06 was recorded against the theoretical value of 1.210E+06, a percentage deviation of 3.31 from the theoretical value. The equivalent dose rates at 5 m from a neutron flux of 5.00E+11 cm-2.s-1 for neutron energies of 1 keV, 10 keV, 100 keV, 500 keV, 1 MeV, 5 MeV and 10 MeV were calculated to be 0.01 Sv/h, 0.01 Sv/h, 0.03 Sv/h, 0.15 Sv/h, 0.21 Sv/h and 0.25 Sv/h respectively, with a total dose over a period of one hour of 0.66 Sv. Keywords: neutron flux, comparator factor, NAA techniques, neutron energy, equivalent dose
Procedia PDF Downloads 183
6166 Application of Artificial Neural Network for Prediction of High Tensile Steel Strands in Post-Tensioned Slabs
Authors: Gaurav Sancheti
Abstract:
This study presents an approach based on Artificial Neural Networks (ANNs) for determining the quantity of High Tensile Steel (HTS) strands required in post-tensioned (PT) slabs. Various PT slab configurations were generated by varying the span and depth of the slab. For each of these slab configurations, the quantity of required HTS strands was recorded. ANNs with a backpropagation algorithm and varying architectures were developed, and their performance was evaluated in terms of Mean Square Error (MSE). The recorded data for the quantity of HTS strands was used as a feeder database for training the developed ANNs. The networks were validated using various validation techniques. The results show that the proposed ANNs have great potential, with good prediction and generalization capability. Keywords: artificial neural networks, back propagation, conceptual design, high tensile steel strands, post tensioned slabs, validation techniques
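Illustrative sketch (the span/depth-to-strand relation below is fabricated): a backpropagation-trained network predicting strand quantity from slab span and depth, evaluated with MSE on a held-out set.

```python
# Hedged sketch: a backpropagation ANN predicting HTS strand quantity from slab
# span and depth, evaluated with Mean Square Error (MSE) on a held-out set.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(5)
span = rng.uniform(6.0, 12.0, 400)      # m (assumed range of PT slab spans)
depth = rng.uniform(0.20, 0.35, 400)    # m (assumed range of slab depths)
# Fabricated design relation: more strands for longer spans and shallower slabs.
strands = 0.2 * span / depth + rng.normal(0, 0.3, 400)

X = np.column_stack([span, depth])
X_tr, X_te, y_tr, y_te = train_test_split(X, strands, test_size=0.2, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000,
                                 random_state=0))
ann.fit(X_tr, y_tr)
print(f"validation MSE: {mean_squared_error(y_te, ann.predict(X_te)):.3f}")
```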
Procedia PDF Downloads 221
6165 Safety of Built Infrastructure: Single Degree of Freedom Approach to Blast Resistant RC Wall Panels
Authors: Muizz Sanni-Anibire
Abstract:
The 21st century has witnessed growing concerns over the protection of built facilities against natural and man-made disasters. Studies on earthquake-resistant, fire-resistant, and explosion-resistant buildings now dominate the arena. To protect people and facilities from the effects of explosions, reinforced concrete walls have been designed to be blast resistant. Understanding the performance of these walls is a key step in ensuring the safety of built facilities. Blast walls are mostly designed using simple techniques such as the single degree of freedom (SDOF) method, despite the increasing use of multi-degree-of-freedom techniques such as the finite element method. This study is the first stage of continuing research into the safety and reliability of blast walls. It presents the SDOF approach applied to the analysis of a concrete wall panel under three representative bomb situations: a motorcycle carrying 50 kg, a car carrying 400 kg, and a van carrying 1500 kg of TNT explosive. Keywords: blast wall, safety, protection, explosion
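Illustrative sketch (placeholder mass, stiffness and load values, not the study's design charges): an elastic SDOF idealization of a blast-loaded wall panel under an idealized triangular pressure pulse.

```python
# Hedged sketch: elastic SDOF idealization of a blast-loaded RC wall panel,
# m*u'' + c*u' + k*u = F(t), with an idealized triangular blast pulse.
import numpy as np
from scipy.integrate import solve_ivp

m = 2000.0        # equivalent mass, kg (assumed)
k = 4.0e6         # equivalent stiffness, N/m (assumed)
c = 0.05 * 2 * np.sqrt(k * m)   # 5% of critical damping (assumed)
F_peak = 2.0e5    # peak reflected blast force, N (assumed)
t_d = 0.01        # positive phase duration, s (assumed)

def blast_force(t):
    """Idealized triangular pulse: peak at t=0, decaying linearly to zero at t_d."""
    return F_peak * (1 - t / t_d) if t < t_d else 0.0

def rhs(t, y):
    u, v = y
    return [v, (blast_force(t) - c * v - k * u) / m]

sol = solve_ivp(rhs, (0.0, 0.2), [0.0, 0.0], max_step=1e-4)
u_max = np.max(np.abs(sol.y[0]))
print(f"peak displacement of the equivalent SDOF system: {u_max * 1000:.2f} mm")
```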
Procedia PDF Downloads 263
6164 Effects of Different Processing Methods on Composition, Physicochemical and Morphological Properties of MR263 Rice Flour
Authors: R. Asmeda, A. Noorlaila, M. H. Norziah
Abstract:
This research work was conducted to investigate the effects of different grinding techniques during the milling process of rice grains on the physicochemical characteristics of the rice flour produced. Dry grinding, semi-wet grinding, and wet grinding were employed to produce the rice flour. The results indicated that the different grinding methods significantly (p ≤ 0.05) affected the physicochemical and functional properties of the starch, except for the carbohydrate content, x-ray diffraction pattern and breakdown viscosity. The dry grinding technique caused the highest percentage of starch damage compared to semi-wet and wet grinding. Protein, fat and ash content were highest in rice flour obtained by dry grinding. It was found that wet grinding produces flour with the smallest average particle size (8.52 µm), resulting in the highest process yield (73.14%). Pasting profiles revealed that dry grinding produces rice flour with the significantly lowest pasting temperature and the highest setback viscosity. Keywords: average particle size, grinding techniques, physicochemical characteristics, rice flour
Procedia PDF Downloads 191
6163 3D Point Cloud Model Color Adjustment by Combining Terrestrial Laser Scanner and Close Range Photogrammetry Datasets
Authors: M. Pepe, S. Ackermann, L. Fregonese, C. Achille
Abstract:
3D models obtained with advanced survey techniques such as close-range photogrammetry and laser scanning are nowadays particularly appreciated in the Cultural Heritage and Archaeology fields. In order to produce high quality models representing archaeological evidence and anthropological artifacts, the appearance of the model (i.e. color), beyond the geometric accuracy, is not a negligible aspect. The integration of close-range photogrammetry survey techniques with the laser scanner is still a topic of study and research. Combining point cloud data sets of the same object generated with both technologies, or with the same technology but registered at different moments and/or under different natural light conditions, can produce a final point cloud with accentuated color dissimilarities. In this paper, a methodology to make the different data sets uniform, to improve the chromatic quality and to highlight further details by balancing the point color will be presented. Keywords: color models, cultural heritage, laser scanner, photogrammetry
Procedia PDF Downloads 280
6162 Comparative DNA Binding of Iron and Manganese Complexes by Spectroscopic and ITC Techniques and Antibacterial Activity
Authors: Maryam Nejat Dehkordi, Per Lincoln, Hassan Momtaz
Abstract:
The interaction with DNA of Schiff base complexes of iron and manganese (iron [N,N'-bis(5-(triphenylphosphonium methyl)salicylidene)-1,2-ethanediamine] chloride, [Fe Salen]Cl, and manganese [N,N'-bis(5-(triphenylphosphonium methyl)salicylidene)-1,2-ethanediamine] acetate) was investigated by spectroscopic and isothermal titration calorimetry (ITC) techniques. The absorbance spectra of the complexes showed hyper- and hypochromism in the presence of DNA, which is an indication of the interaction of the complexes with DNA. The linear dichroism (LD) measurements confirmed the bending of DNA in the presence of the complexes. Furthermore, isothermal titration calorimetry experiments confirmed that the complexes bind to DNA through both electrostatic and hydrophobic interactions. The ITC profile also exhibits the existence of two binding phases for the complex. The antibacterial activity of the ligand and complexes was tested in vitro to evaluate their activity against gram-positive and gram-negative bacteria. Keywords: Schiff base complexes, ct-DNA, linear dichroism (LD), isothermal titration calorimetry (ITC), antibacterial activity
Procedia PDF Downloads 471
6161 Create a Brand Value Assessment Model to Choosing a Cosmetic Brand in Tehran Combining DEMATEL Techniques and Multi-Stage ANFIS
Authors: Hamed Saremi, Suzan Taghavy, Seyed Mohammad Hanif Sanjari, Mostafa Kahali
Abstract:
One of the challenges for manufacturing and service companies providing a product or service is having their brand recognized by consumers in target markets, since most of their processes are run at comparable capacity. At the same time, the constant threat of damage to internal and external resources can prevent the rise of a brand, and more companies are finding themselves on the path to bankruptcy. This paper attempts to identify and analyze the effective indicators of brand equity and presents an intelligent model to prevent possible damage. In this study, the indicators of brand equity are identified based on a literature study and expert opinions; the set of indicators is then analyzed with the DEMATEL technique, and a Multi-Stage Adaptive Neuro-Fuzzy Inference System (ANFIS) is used to design a multi-stage intelligent system for the assessment of brand equity. Keywords: brand, cosmetic product, ANFIS, DEMATEL
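Illustrative sketch (hypothetical indicator names and influence scores): the DEMATEL step reduces to simple matrix algebra, normalizing the direct-influence matrix and computing the total-relation matrix to rank indicators by prominence and cause/effect role.

```python
# Hedged sketch of the DEMATEL step: normalize a direct-influence matrix A,
# compute the total-relation matrix T = N(I - N)^-1, and derive prominence
# (D + R) and relation (D - R) for each brand-equity indicator.
import numpy as np

indicators = ["brand awareness", "perceived quality", "brand loyalty", "brand image"]
# Hypothetical expert-averaged direct influence scores (0 = none ... 4 = very high).
A = np.array([[0, 3, 2, 3],
              [2, 0, 3, 2],
              [1, 2, 0, 1],
              [3, 2, 2, 0]], dtype=float)

N = A / max(A.sum(axis=1).max(), A.sum(axis=0).max())   # normalization
T = N @ np.linalg.inv(np.eye(len(A)) - N)               # total-relation matrix

D = T.sum(axis=1)   # total influence dispatched by each indicator
R = T.sum(axis=0)   # total influence received by each indicator
for name, d, r in zip(indicators, D, R):
    role = "cause" if d - r > 0 else "effect"
    print(f"{name:>17}: prominence D+R = {d + r:.2f}, relation D-R = {d - r:+.2f} ({role})")
```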
Procedia PDF Downloads 417
6160 Automatic Lead Qualification with Opinion Mining in Customer Relationship Management Projects
Authors: Victor Radich, Tania Basso, Regina Moraes
Abstract:
Lead qualification is one of the main procedures in Customer Relationship Management (CRM) projects. Its main goal is to identify potential consumers who have the ideal characteristics to establish a profitable and long-term relationship with a certain organization. Social networks can be an important source of data for identifying and qualifying leads since interest in specific products or services can be identified from the users’ expressed feelings of (dis)satisfaction. In this context, this work proposes the use of machine learning techniques and sentiment analysis as an extra step in the lead qualification process in order to improve it. In addition to machine learning models, sentiment analysis or opinion mining can be used to understand the evaluation that the user makes of a particular service, product, or brand. The results obtained so far have shown that it is possible to extract data from social networks and combine the techniques for a more complete classification. Keywords: lead qualification, sentiment analysis, opinion mining, machine learning, CRM, lead scoring
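Illustrative sketch (made-up posts, labels and threshold): a tiny TF-IDF plus logistic-regression sentiment model used as an extra lead-qualification signal.

```python
# Hedged sketch: a small sentiment classifier used as an extra lead-qualification
# signal; posts expressing satisfaction/interest score higher as potential leads.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled posts: 1 = positive / interested, 0 = negative / dissatisfied.
train_posts = [
    "I love this product, best purchase of the year",
    "great service, would definitely recommend it",
    "looking for something exactly like this, where can I buy it?",
    "terrible experience, the support never answered",
    "very disappointed, it broke after two days",
    "worst brand I have ever dealt with",
]
train_labels = [1, 1, 1, 0, 0, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_posts, train_labels)

new_posts = ["the demo looked great, I want a quote",
             "still waiting for a refund, terrible support"]
for post, score in zip(new_posts, model.predict_proba(new_posts)[:, 1]):
    label = "qualify as lead" if score > 0.5 else "do not qualify"
    print(f"{score:.2f} -> {label}: {post}")
```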
Procedia PDF Downloads 85
6159 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of a high volume and broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. Therefore, today's decentralized data management environment relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of this technique is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions for managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets. Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)
Procedia PDF Downloads 432
6158 Examples from a Traditional Sismo-Resistant Architecture
Authors: Amira Zatir, Abderahmane Mokhtari, Amina Foufa, Sara Zatir
Abstract:
Numerous historic monuments, buildings and housing environments exist in several regions of the world, built in traditional ways that have survived earthquakes, even in zones where the seismic risk is particularly high. These constructions, stemming from vernacular architecture, make it possible, through their resistance to earthquakes over time, to identify various "local" earthquake-resistant techniques. Through the examples and experiences presented, it can be observed that two major, broadly opposite, principles govern traditional earthquake-resistant construction. The first is great flexibility, corresponding to very light constructions, such as Japanese, Turkish and even Chinese wooden buildings; the second is great rigidity, corresponding to more or less heavy and massive masonry constructions, in particular stone, found notably in the Mediterranean Basin and in the historic sanctuary of Machu Picchu. To these are added sensible and well-thought-out construction techniques, including the use of humble materials such as earth and adobe. Ancient communities were able to face seismic risks thanks to know-how reflected in their intelligently designed constructions, testifying to a local seismic culture. Keywords: earthquake, architecture, traditional, construction, resistance
Procedia PDF Downloads 420