Search results for: Rule Based Architecture
28286 Long Distance Aspirating Smoke Detection for Large Radioactive Areas
Authors: Michael Dole, Pierre Ninin, Denis Raffourt
Abstract:
Most of CERN’s facilities hosting particle accelerators are large, underground, radioactive areas. All fire detection systems installed in such areas must be carefully designed to cope with the particularities of this stringent environment. The detection equipment usually chosen by CERN to secure these underground facilities is based on air-sampling technology. The electronic equipment is located in non-radioactive areas, whereas the air-sampling networks are deployed in the radioactive areas where fire detection is required. Air-sampling technology provides very good detection performance, prevents "radiation-to-electronics" effects, reduces the radiation exposure of maintenance workers, and remains permanently available during accelerator operation. In order to protect the Super Proton Synchrotron and its 7 km of tunnels, a specific long-distance aspirating smoke detector has been developed to detect smoke at up to 700 meters between the electronic equipment and the last air-sampling hole. This paper describes the architecture, performance, and operational feedback of the long-distance fire detection system developed and installed to secure the CERN Super Proton Synchrotron tunnels.
Keywords: air sampling, fire detection, long distance, radioactive areas
Procedia PDF Downloads 164
28285 A Distinct Method Based on Mamba-Unet for Brain Tumor Image Segmentation
Authors: Djallel Bouamama, Yasser R. Haddadi
Abstract:
Accurate brain tumor segmentation is crucial for diagnosis and treatment planning, yet it remains a challenging task due to the variability in tumor shapes and intensities. This paper introduces a distinct approach to brain tumor image segmentation by leveraging an advanced architecture known as Mamba-Unet. Building on the well-established U-Net framework, Mamba-Unet incorporates distinct design enhancements to improve segmentation performance. Our proposed method integrates a multi-scale attention mechanism and a hybrid loss function to effectively capture fine-grained details and contextual information in brain MRI scans. Using a comprehensive dataset of annotated brain MRI scans, we demonstrate that Mamba-Unet significantly enhances segmentation accuracy compared to conventional U-Net models. Quantitative evaluations reveal that Mamba-Unet surpasses traditional U-Net architectures and other contemporary segmentation models in terms of Dice coefficient, sensitivity, and specificity. The improvements are attributed to the method's ability to better manage class imbalance and resolve complex tumor boundaries. This work advances the state of the art in brain tumor segmentation and holds promise for improving clinical workflows and patient outcomes through more precise and reliable tumor detection.
Keywords: brain tumor classification, image segmentation, CNN, U-Net
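The Dice coefficient used in the evaluation above is a standard overlap measure for segmentation masks. A minimal, generic sketch (not the authors' evaluation code), treating each binary mask as a set of foreground pixel indices:

```python
def dice_coefficient(pred, truth):
    """Dice similarity between two binary masks given as sets of pixel indices.

    Dice = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap, 0.0 means none.
    """
    pred, truth = set(pred), set(truth)
    if not pred and not truth:
        return 1.0  # both masks empty: define as perfect agreement
    return 2 * len(pred & truth) / (len(pred) + len(truth))
```

For example, masks {1, 2, 3} and {2, 3, 4} share two pixels out of six total, giving a Dice score of 2/3.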
Procedia PDF Downloads 41
28284 A Survey on Constraint Solving Approaches Using Parallel Architectures
Authors: Nebras Gharbi, Itebeddine Ghorbel
Abstract:
In recent years, with the advances of the multicore computing world, the constraint programming community has tried to benefit from the capacity of new machines and make the best use of them through several parallel schemes for constraint solving. In this paper, we propose a survey of the different approaches proposed for solving Constraint Satisfaction Problems using parallel architectures. These approaches exploit a parallel architecture in different ways: the same problem may be solved concurrently by several solvers, or the problem may be split across solvers.
Keywords: constraint programming, parallel programming, constraint satisfaction problem, speed-up
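As a toy illustration of the search-space-splitting scheme mentioned above, the following sketch partitions an N-queens constraint problem by the value assigned to the first variable and hands each sub-problem to a separate worker. It is our own minimal example, not drawn from any surveyed solver:

```python
from concurrent.futures import ThreadPoolExecutor

def count_solutions(n, row=0, cols=frozenset(), diag1=frozenset(), diag2=frozenset()):
    """Count completions of a partial N-queens assignment by backtracking."""
    if row == n:
        return 1
    total = 0
    for col in range(n):
        if col not in cols and row - col not in diag1 and row + col not in diag2:
            total += count_solutions(n, row + 1, cols | {col},
                                     diag1 | {row - col}, diag2 | {row + col})
    return total

def parallel_count(n):
    """Split the search space on the first variable: each worker explores
    one value of the first queen's column independently."""
    def subproblem(col):
        # fix queen 0 at column `col`, then search rows 1..n-1
        return count_solutions(n, 1, frozenset({col}),
                               frozenset({-col}), frozenset({col}))
    with ThreadPoolExecutor() as pool:
        return sum(pool.map(subproblem, range(n)))
```

Because the sub-problems share no state, the same decomposition also works with processes or distributed solvers; the splitting variable is chosen here purely for simplicity.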
Procedia PDF Downloads 321
28283 Principles to Design Urbanism in Cinema; An Aesthetic Study on Identity and Representation of a City in a Movie
Authors: Dorsa Moayedi
Abstract:
‘The Cities’ and Cinema have a history going as far back as silent films; however, the standards for picturing a city in a film are somewhat vague. The ‘genius loci’ of a city can easily be described with parameters that architects have identified; nevertheless, the genius loci of an ‘urban movie’ remains untouched. Cities have been among the provocative subjects that push filmmakers to ponder them and to picture them thoroughly, along with their urban identity, in their artworks; yet the impact of urban life on plot and characters is neglected, and so a city in a movie is usually reduced to ‘the place where the story happens’. Cities and urban life are in constant change and ongoing expansion; therefore, they are always fresh and ready to challenge people with their existence. Thus, the relationship between the city and cinema is metamorphic, though it can be defined and explored. The dominant research on urbanism has been conducted by outstanding scholars of architecture, like Christian Norberg-Schulz, and studies on cinema have been done by theorists of cinema, like Christian Metz, each having mastered their own realm; still, the idea of mingling the two domains to reach a unified theory applicable to ‘urban movies’ is barely worked on. In this research, we seek mutual grounds on which to discuss ‘urbanism in cinema’: grounds from which cinema could benefit to reach a more accurate audio-visual representation of a city, in accordance with the ideas of Christopher Alexander and the term he coined, ‘The Timeless Way of Building.’ We concentrate on movies that depend on urban life, mainly those that bear the names of cities, such as Nashville (1975), Manhattan (1979), Fargo (1996), Midnight in Paris (2011), or Roma (2018), in light of the ideas of urban design and the narratives of cinema.
Contrary to what has often been assumed, cinema and architecture can be defined along similar parameters, and architectural terms can be applied to research on movies. Our findings indicate that the theories of Christopher Alexander best fit the paradigm for studying an ‘urban movie’: the definitions of a timeless building elaborate the characteristics of a design that can be applied to the definition of an urban movie, and they set a prototype for further filmmaking regarding urban life.
Keywords: city, urbanism, urban movies, identity, representation
Procedia PDF Downloads 67
28282 Closed Forms of Trigonometric Series in Terms of Riemann’s ζ Function and Dirichlet η, λ, β Functions or the Hurwitz Zeta Function and Harmonic Numbers
Authors: Slobodan B. Tričković
Abstract:
We present results concerning trigonometric series that include sine and cosine functions with a parameter appearing in the denominator. We derive two types of closed-form formulas for such trigonometric series. First, for some integer values of the parameter, since Riemann’s ζ function and the Dirichlet η and λ functions vanish at negative even integers, whereas the Dirichlet β function vanishes at negative odd integers, the series terminates after a certain number of terms. Thus, the trigonometric series becomes a polynomial with coefficients involving Riemann’s ζ function and the Dirichlet η, λ, and β functions. On the other hand, in some cases one cannot immediately replace the parameter with an arbitrary positive integer, because singularities arise. It is then necessary to take a limit; in the process, we apply L’Hospital’s rule and, after a series of rearrangements, bring the trigonometric series to a form suitable for the application of the Choi-Srivastava theorem dealing with the Hurwitz zeta function and harmonic numbers. In this way, we express a trigonometric series as a polynomial over the derivative of the Hurwitz zeta function.
Keywords: Dirichlet eta lambda beta functions, Riemann's zeta function, Hurwitz zeta function, harmonic numbers
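The vanishing properties invoked above follow from the standard relations between these Dirichlet functions and ζ; a brief sketch of the facts being used:

```latex
\eta(s) = \left(1 - 2^{1-s}\right)\zeta(s), \qquad
\lambda(s) = \left(1 - 2^{-s}\right)\zeta(s),
```

so the trivial zeros of ζ at the negative even integers, ζ(−2n) = 0 for n = 1, 2, …, give η(−2n) = λ(−2n) = 0 as well, while the Dirichlet β function satisfies β(−(2n − 1)) = 0 at the negative odd integers. These are the terms whose vanishing truncates the series into a polynomial.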
Procedia PDF Downloads 104
28281 Multi-Modal Feature Fusion Network for Speaker Recognition Task
Authors: Xiang Shijie, Zhou Dong, Tian Dan
Abstract:
Speaker recognition is a crucial task in the field of speech processing, aimed at identifying individuals based on their vocal characteristics. However, existing speaker recognition methods face numerous challenges. Traditional methods primarily rely on audio signals, which often suffer from limitations in noisy environments, variations in speaking style, and insufficient sample sizes. Additionally, relying solely on audio features can sometimes fail to capture the unique identity of the speaker comprehensively, impacting recognition accuracy. To address these issues, we propose a multi-modal network architecture that simultaneously processes both audio and text signals. By gradually integrating audio and text features, we leverage the strengths of both modalities to enhance the robustness and accuracy of speaker recognition. Our experiments demonstrate significant improvements with this multi-modal approach, particularly in complex environments, where recognition performance has been notably enhanced. Our research not only highlights the limitations of current speaker recognition methods but also showcases the effectiveness of multi-modal fusion techniques in overcoming these limitations, providing valuable insights for future research.
Keywords: feature fusion, memory network, multimodal input, speaker recognition
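A minimal sketch of the feature-fusion idea, assuming each modality has already been encoded as a fixed-length feature vector; the fusion rule, weighting, and dimensions here are illustrative, not the paper's architecture:

```python
def fuse(audio_feat, text_feat, alpha=0.5):
    """Fuse per-modality feature vectors by weighted concatenation.

    alpha controls the relative contribution of the audio modality
    (an illustrative choice; a learned gating would be used in practice).
    """
    return [alpha * a for a in audio_feat] + [(1 - alpha) * t for t in text_feat]

def cosine_similarity(u, v):
    """Score two fused embeddings; same-speaker pairs should score near 1."""
    dot = sum(x * y for x, y in zip(u, v))
    norm = lambda w: sum(x * x for x in w) ** 0.5
    return dot / (norm(u) * norm(v))
```

An enrolled speaker would be represented by a stored fused embedding, and a trial utterance accepted when the cosine score exceeds a decision threshold.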
Procedia PDF Downloads 39
28280 Optimal Bayesian Chart for Controlling Expected Number of Defects in Production Processes
Abstract:
In this paper, we develop an optimal Bayesian chart to control the expected number of defects per inspection unit in production processes with long production runs. We formulate this control problem in the optimal stopping framework. The objective is to determine the optimal stopping rule minimizing the long-run expected average cost per unit time considering partial information obtained from the process sampling at regular epochs. We prove the optimality of the control limit policy, i.e., the process is stopped and the search for assignable causes is initiated when the posterior probability that the process is out of control exceeds a control limit. An algorithm in the semi-Markov decision process framework is developed to calculate the optimal control limit and the corresponding average cost. Numerical examples are presented to illustrate the developed optimal control chart and to compare it with the traditional u-chart.
Keywords: Bayesian u-chart, economic design, optimal stopping, semi-Markov decision process, statistical process control
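The control-limit policy described above can be illustrated with a simple Bayesian update. This sketch assumes a two-state process with Poisson defect counts per inspection unit and an illustrative per-epoch shift probability; the means, shift probability, and limit are placeholders, not the paper's model or cost structure:

```python
from math import exp, factorial

def poisson_pmf(k, mean):
    return mean ** k * exp(-mean) / factorial(k)

def update_posterior(p_out, count, mean_in, mean_out, p_shift):
    """One Bayesian step: prior probability of being out of control,
    a new defect count, in-/out-of-control Poisson means, and the
    probability of a shift occurring during the sampling interval."""
    prior = p_out + (1 - p_out) * p_shift  # process may shift before sampling
    num = prior * poisson_pmf(count, mean_out)
    den = num + (1 - prior) * poisson_pmf(count, mean_in)
    return num / den

def stop_time(counts, mean_in=2.0, mean_out=6.0, p_shift=0.05, limit=0.9):
    """Stop (initiate the search for assignable causes) at the first epoch
    where the posterior out-of-control probability exceeds the limit."""
    p = 0.0
    for t, c in enumerate(counts, start=1):
        p = update_posterior(p, c, mean_in, mean_out, p_shift)
        if p > limit:
            return t
    return None
```

In the paper, the limit itself is not fixed a priori as here but is optimized within the semi-Markov decision process framework.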
Procedia PDF Downloads 573
28279 Design and Optimization of Spoke Rotor Type Brushless Direct Current Motor for Electric Vehicles Using Different Flux Barriers
Authors: Ismail Kurt, Necibe Fusun Oyman Serteller
Abstract:
Today, with the reduction in semiconductor system costs, Brushless Direct Current (BLDC) motors have become widely preferred. Based on rotor architecture, BLDC structures are divided into interior permanent magnet (IPM) and surface permanent magnet (SPM) types. Permanent magnet (PM) motors in electric vehicles (EVs) are still predominantly based on IPM motors, as their rotors do not require sleeves, the PMs are better protected by the rotor cores, and the air-gap lengths can be much smaller. This study discusses the IPM rotor structure in detail, highlighting its higher torque levels, reluctance torque, wide speed range, and production advantages. IPM rotor structures are particularly preferred in EVs due to their high-speed capability, torque density, and field weakening (FW) features. In FW operation, the motor becomes suitable for running at torques lower than the rated torque but at speeds above the rated speed. Although V-type and triangular IPM rotor structures are generally preferred in EV applications, the spoke-type rotor structure offers distinct advantages, making it a competitive option for these systems. The flux barriers in the rotor significantly affect motor performance, providing notable benefits in both motor efficiency and cost. This study utilizes ANSYS/Maxwell simulation software to analyze the spoke-type IPM motor and examine its key design parameters. Through analytical and 2D analysis, a preliminary motor design and parameter optimization have been carried out. During the parameter optimization phase, torque ripple, a common issue especially for IPM motors, has been investigated, along with the associated changes in motor parameters.
Keywords: electric vehicle, field weakening, flux barrier, spoke rotor
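Torque ripple, the quantity investigated in the optimization phase above, is commonly quantified from a simulated torque waveform as the peak-to-peak variation relative to the mean. A generic sketch of that metric (not the authors' ANSYS/Maxwell post-processing):

```python
def torque_ripple_percent(torque_samples):
    """Peak-to-peak torque ripple as a percentage of mean torque.

    torque_samples: torque values over one electrical period,
    e.g. exported from a finite-element simulation.
    """
    t_max, t_min = max(torque_samples), min(torque_samples)
    t_mean = sum(torque_samples) / len(torque_samples)
    return 100.0 * (t_max - t_min) / t_mean
```

For a waveform oscillating between 9.5 and 10.5 N·m about a 10 N·m mean, this gives a ripple of 10%.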
Procedia PDF Downloads 12
28278 Nutrient Foramina in the Shaft of Long Bones of Upper Limb
Authors: Madala Venkateswara Rao
Abstract:
The major blood supply to the long bones occurs through the nutrient arteries, which enter through the nutrient foramina. This is a study of the nutrient foramina in the shafts of upper limb long bones from the Department of Anatomy at Narayana Medical College, Nellore. Nutrient foramina play an important role in the nutrition and growth of bones. Most nutrient arteries follow the rule 'to the elbow I go, from the knee I flee', but they are very variable in position. Their number, location, and direction, and their importance for the growing end of long bones, were studied in the long bones of the upper limb. The present study found variations in the position and direction of the foramina, especially in the radius and ulna, as most of the nutrient foramina are found on the anterior surface of the upper and middle thirds of these bones. The study of nutrient foramina is not only of academic interest but also relevant to medico-legal practice in relation to their position. Careful observation has also been made of the position of the nutrient foramina in relation to the upper ends of the long bones. The lengths of the long bones can also be used to estimate the height of an individual. With knowledge of the variations in the nutrient foramen, internal fixation devices can be placed appropriately.
Keywords: nutrient artery, nutrient foramina, shaft of long bones, upper limb bones
Procedia PDF Downloads 503
28277 Cobb Angle Measurement from Coronal X-Rays Using Artificial Neural Networks
Authors: Andrew N. Saylor, James R. Peters
Abstract:
Scoliosis is a complex 3D deformity of the thoracic and lumbar spines, clinically diagnosed by measurement of a Cobb angle of 10 degrees or more on a coronal X-ray. The Cobb angle is the angle made by the lines drawn along the proximal and distal endplates of the respective proximal and distal vertebrae comprising the curve. Traditionally, Cobb angles are measured manually using either a marker, straight edge, and protractor or image measurement software. The task of measuring the Cobb angle can also be represented by a function taking the spine geometry rendered using X-ray imaging as input and returning the approximate angle. Although the form of such a function may be unknown, it can be approximated using artificial neural networks (ANNs). The performance of ANNs is affected by many factors, including the choice of activation function and network architecture; however, the effects of these parameters on the accuracy of scoliotic deformity measurements are poorly understood. Therefore, the objective of this study was to systematically investigate the effect of ANN architecture and activation function on Cobb angle measurement from the coronal X-rays of scoliotic subjects. The data set for this study consisted of 609 coronal chest X-rays of scoliotic subjects divided into 481 training images and 128 test images. These data, which included labeled Cobb angle measurements, were obtained from the SpineWeb online database. In order to normalize the input data, each image was resized using bi-linear interpolation to a size of 500 × 187 pixels, and the pixel intensities were scaled to be between 0 and 1. A fully connected (dense) ANN with a fixed cost function (mean squared error), batch size (10), and learning rate (0.01) was developed using Python Version 3.7.3 and TensorFlow 1.13.1. 
The activation functions (sigmoid, hyperbolic tangent [tanh], or rectified linear units [ReLU]), number of hidden layers (1, 3, 5, or 10), and number of neurons per layer (10, 100, or 1000) were varied systematically to generate a total of 36 network conditions. Stochastic gradient descent with early stopping was used to train each network. Three trials were run per condition, and the final mean squared errors and mean absolute errors were averaged to quantify the network response for each condition. The network that performed best used ReLU neurons, three hidden layers, and 100 neurons per layer. The average mean squared error of this network was 222.28 ± 30 degrees², and the average mean absolute error was 11.96 ± 0.64 degrees. It is also notable that while most of the networks performed similarly, the networks using ReLU neurons, 10 hidden layers, and 1000 neurons per layer, and those using tanh neurons, one hidden layer, and 10 neurons per layer, performed markedly worse, with average mean squared errors greater than 400 degrees² and average mean absolute errors greater than 16 degrees. From the results of this study, it can be seen that the choice of ANN architecture and activation function has a clear impact on Cobb angle inference from coronal X-rays of scoliotic subjects.
Keywords: scoliosis, artificial neural networks, Cobb angle, medical imaging
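The 36 network conditions described above are simply the Cartesian product of the three design factors; a sketch of how that grid can be enumerated, with the hyperparameter values taken from the text:

```python
from itertools import product

# Design factors stated in the study
activations = ["sigmoid", "tanh", "relu"]
hidden_layers = [1, 3, 5, 10]
neurons_per_layer = [10, 100, 1000]

# 3 activations x 4 depths x 3 widths = 36 network conditions
conditions = [
    {"activation": a, "layers": l, "neurons": n}
    for a, l, n in product(activations, hidden_layers, neurons_per_layer)
]
```

Each condition would then be trained three times and its averaged mean squared and mean absolute errors recorded, as the study describes.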
Procedia PDF Downloads 131
28276 Roadmaps as a Tool of Innovation Management: System View
Authors: Matich Lyubov
Abstract:
Today, roadmaps are becoming commonly used tools for detecting and designing a desired future for companies, states, and the international community. The growing popularity of this method puts on the agenda tasks such as identifying basic roadmapping principles, creating concepts, and determining the characteristics of roadmap use depending on the objectives, as well as the restrictions and opportunities specific to the area under study. However, the system approach, i.e., the elements recognized as essential for high-quality roadmapping, remains one of the main fields for improving the methodology and practice of roadmap development, as limited research has been devoted to detailed analysis of roadmaps from the viewpoint of the system approach. Therefore, this article is an attempt to examine roadmaps from the viewpoint of systems analysis and to compare the areas where roadmaps and systems analysis are, as a rule, considered the most effective tools. In comparing the structure and composition of roadmaps and systems models, the identification of common points between the construction stages of roadmaps and system modeling, and the determination of future directions for researching roadmaps from a systems perspective, are of special importance.
Keywords: technology roadmap, roadmapping, systems analysis, system modeling, innovation management
Procedia PDF Downloads 311
28275 Growth and Development Parameters of Saanen Kids Raised under Intensive Conditions in Konya/Turkey
Authors: Vahdetti̇n Sariyel, Bi̇rol Dağ
Abstract:
In this research, growth and development parameters of Saanen kids reared under intensive conditions at a private company in Konya province were examined. Average birth weights were 3.42, 2.96, 3.57, 3.23, and 2.77 kg for male, female, single-born, twin, and triplet kids, respectively. Average weaning weights (at three months of age) were 12.65, 12.09, 12.80, 12.65, and 11.68 kg for male, female, single-born, twin, and triplet kids, respectively. Average body weights at seven months of age were 20.55, 18.98, 20.12, 20.12, and 19.05 kg for male, female, single-born, twin, and triplet kids, respectively. The effect of sex on live weight was significant at birth and at the first control (P < 0.01) but was no longer statistically significant by the second control (P > 0.05). The effect of dam age was significant on birth weight and on live weight in the first month (P < 0.01), while its effect on the kid's live weight in the second month was not significant.
Keywords: Saanen kids, growth, development, body weight
Procedia PDF Downloads 272
28274 New Methods to Acquire Grammatical Skills in A Foreign Language
Authors: Indu ray
Abstract:
In today’s digital world, the internet is already flooded with information on how to master grammar in a foreign language. It is well known that one cannot master a language without grammar. Grammar is the backbone of any language. Without grammar, there would be no structure to help you speak/write or listen/read. Successful communication is only possible if the form and function of linguistic utterances are firmly related to one another. Grammar has its own rules of use to formulate an easier-to-understand language. Like a tool, grammar formulates our thoughts and knowledge in a meaningful way. Every language has its own grammar. With grammar, we can quickly analyze whether a text refers to the present, past, or future. Knowledge of grammar is an important prerequisite for mastering a foreign language. What is most important is how teachers can make grammar lessons more interesting for students and thus promote grammar skills more successfully. In this paper, we discuss a few important methods: interactive grammar exercises between students, interactive grammar exercises between student and teacher, the grammar translation method, the audio-visual method, the deductive method, and the inductive method. The paper is divided into two sections. In the first part, brief definitions and principles of these approaches are provided, and the possibility of combining these approaches is analyzed. In the last section of the paper, we present the results of a survey conducted at our university on methods to quickly learn grammar in a foreign language. We divided grammatical skills into six parts: (1) grammatical competence, (2) speaking skills, (3) phonology, (4) syntax and semantics, (5) rules, and (6) cognitive function, and conducted a survey among students.
From our survey results, we observe that phonology, speaking ability, and syntax and semantics can be improved by the inductive method, the audio-visual method, and the grammar translation method, while for grammar rules and cognitive functions the IGE (teacher-student) and IGE (pupil-pupil) methods should be chosen. The study's findings revealed that the teacher's delivery methods should be blended, or fused, based on the content of the grammar.
Keywords: innovative method, grammatical skills, audio-visual, translation
Procedia PDF Downloads 77
28273 Random Vertical Seismic Vibrations of the Long Span Cantilever Beams
Authors: Sergo Esadze
Abstract:
Seismic resistance norms require calculation of cantilevers for the vertical components of the base seismic acceleration. Long-span cantilevers, as a rule, must be calculated as separate construction elements. Depending on the architectural-planning solution, functional purpose, and environmental conditions of the building/structure being designed, long-span cantilever constructions may be of very different types, both by main bearing element (beam, truss, slab) and by material (reinforced concrete, steel). The choice among these is always linked to the bearing construction system of the building. Research on the vertical seismic vibration of these constructions requires an individual approach for each type (which is not specified in the norms) in correlation with the model of the seismic load. The latter may be given either as a deterministic load or as a random process. A loading model given as a random process is more adequate to this problem. In the present paper, two types of long-span (from 6 m up to 12 m) reinforced concrete cantilever beams are considered: (a) cantilevers whose bearing elements, i.e., the elements in which they are fixed, have cross-sections of large size, with the cantilevers made with a haunch; (b) cantilever beams with a load-bearing rod element. Calculation models are suggested separately for types (a) and (b). They are presented as systems with a finite number of degrees of freedom (concentrated masses). The conditions for fixing the ends correspond to the respective types. Vertical acceleration and the vertical component of angular acceleration act on the masses. The model is based on the assumption of translational-rotational motion of the building in the vertical plane, caused by vertical seismic acceleration. The seismic accelerations are considered as random processes and represented by the multiplication of a deterministic envelope function with a stationary random process. The problem is solved within the framework of the correlation theory of random processes. Solved numerical examples are given.
The method is effective for solving these specific problems.
Keywords: cantilever, random process, seismic load, vertical acceleration
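The loading model described, a deterministic envelope multiplying a stationary random process, can be sketched numerically as follows. The envelope shape and all parameter values here are illustrative assumptions, not taken from the paper:

```python
import math
import random

def synthetic_accelerogram(n_steps, dt, peak=1.0, rise=2.0, seed=0):
    """Nonstationary vertical acceleration a(t) = e(t) * w(t), where
    e(t) = peak * (t/rise) * exp(1 - t/rise) is a deterministic envelope
    (zero at t=0, maximum `peak` at t=rise, decaying afterwards) and
    w(t) is stationary Gaussian white noise."""
    rng = random.Random(seed)
    record = []
    for i in range(n_steps):
        t = i * dt
        envelope = peak * (t / rise) * math.exp(1.0 - t / rise)
        record.append(envelope * rng.gauss(0.0, 1.0))
    return record
```

In the correlation-theory setting of the paper, the stationary factor would be characterized by its correlation function or spectral density rather than generated sample by sample; this sketch only illustrates the envelope-times-stationary-process structure.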
Procedia PDF Downloads 192
28272 Automatic Adjustment of Thresholds via Closed-Loop Feedback Mechanism for Solder Paste Inspection
Authors: Chia-Chen Wei, Pack Hsieh, Jeffrey Chen
Abstract:
Surface Mount Technology (SMT) is widely used in the area of electronic assembly, in which electronic components are mounted on the surface of the printed circuit board (PCB). Most of the defects in the SMT process are related to the quality of solder paste printing. These defects lead to considerable manufacturing costs in the electronics assembly industry. Therefore, the solder paste inspection (SPI) machine for controlling and monitoring the amount of solder paste printed has become an important part of the production process. So far, the setting of SPI thresholds has been based on statistical analysis and experts' experience. Because the production data are not normally distributed and there are various sources of variation in the production processes, defects related to solder paste printing still occur. In order to solve this problem, this paper proposes an online machine learning algorithm, called the automatic threshold adjustment (ATA) algorithm, together with a closed-loop architecture in the SMT process, to determine the best threshold settings. Simulation experiments show that our proposed threshold settings improve the accuracy from 99.85% to 100%.
Keywords: big data analytics, Industry 4.0, SPI threshold setting, surface mount technology
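The abstract does not detail the ATA update rule, but the closed-loop idea, adjusting a threshold online from downstream inspection feedback, can be sketched with a simple illustrative update. The function name, parameters, and update rule below are entirely our assumptions, not the authors' algorithm:

```python
def adjust_threshold(threshold, measured_volume, is_defect, step=0.5):
    """One closed-loop step: tighten the threshold when a board that passed
    SPI later proves defective, relax it when a rejected board turns out good.
    All names and the update rule are illustrative placeholders."""
    passed = measured_volume >= threshold
    if passed and is_defect:          # escaped defect: raise the threshold
        return threshold + step
    if not passed and not is_defect:  # false reject: lower the threshold
        return threshold - step
    return threshold                  # correct decision: leave unchanged
```

The point of the closed loop is that each downstream outcome feeds back into the SPI limits, instead of the limits being fixed once from historical statistics.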
Procedia PDF Downloads 117
28271 A Study of Recent Contribution on Simulation Tools for Network-on-Chip
Authors: Muthana Saleh Alalaki, Michael Opoku Agyeman
Abstract:
The growth in the number of Intellectual Properties (IPs), or cores, on the same chip has become a critical issue in System-on-Chip (SoC) design due to the intra-communication problem between the chip elements. As a result, Network-on-Chip (NoC) has emerged as a system architecture to overcome intra-communication issues. This paper presents a study of recent contributions on simulation tools for NoC. Furthermore, an overview of NoC is given, as well as a comparison between some NoC simulators, to help facilitate research in on-chip communication.
Keywords: WiNoC, simulation tool, network-on-chip, SoC
Procedia PDF Downloads 498
28270 Wireless Backhauling for 5G Small Cell Networks
Authors: Abdullah A. Al Orainy
Abstract:
Small cell backhaul solutions need to be cost-effective, scalable, and easy to install. This paper presents an overview of small cell backhaul technologies. Wireless solutions including TV white space, satellite, sub-6 GHz radio wave, microwave and mmWave with their backhaul characteristics are discussed. Recent research on issues like beamforming, backhaul architecture, precoding and large antenna arrays, and energy efficiency for dense small cell backhaul with mmWave communications is reviewed. Recent trials of 5G technologies are summarized.
Keywords: backhaul, small cells, wireless, 5G
Procedia PDF Downloads 516
28269 A Distributed Smart Battery Management System – sBMS, for Stationary Energy Storage Applications
Authors: António J. Gano, Carmen Rangel
Abstract:
Currently, electric energy storage systems for stationary applications have attracted increasing interest, namely with the integration of local renewable energy power sources into energy communities. Li-ion batteries are considered the leading electric storage devices for achieving this integration, and Battery Management Systems (BMS) are decisive for their control and optimum performance. In this work, the development of a smart BMS (sBMS) prototype with a modular distributed topology is described. The system, still under development, has a distributed architecture with modular characteristics to operate with different battery pack topologies and charge capacities, integrating adaptive algorithms for real-time monitoring of functional states and management of multicellular Li-ion batteries, and is intended for application in the context of a local energy community fed by renewable energy sources. This sBMS system includes several developed hardware units: (1) cell monitoring units (CMUs) for interfacing with each individual cell or module monitored within the battery pack; (2) a battery monitoring and switching unit (BMU) for global battery pack monitoring, thermal control, and functional operating state switching; (3) a main management and local control unit (MCU) for local sBMS management and control, also serving as a communications gateway to external systems and devices. This architecture is fully expandable to battery packs with a large number of cells or modules interconnected in series, as the units have local data acquisition and processing capabilities, communicate over a standard CAN bus, and will be able to operate almost autonomously. The CMU units are intended to be used with Li-ion cells but can be used with other cell chemistries with output voltages within the 2.5 to 5 V range. The characteristics and specifications of the different units are described, including the implemented hardware solutions.
The developed hardware supports both passive and active methods for charge equalization, considered fundamental functionalities for optimizing the performance and useful lifetime of a Li-ion battery pack. The functional characteristics of the different units of this sBMS system, including the acquisition of different process variables through a flexible set of sensors, can support the development of custom algorithms for estimating the parameters defining the functional states of the battery pack (State-of-Charge, State-of-Health, etc.) as well as different charge-equalizing strategies and algorithms. This sBMS system is intended to interface with other systems and devices using standard communication protocols, like those used by the Internet of Things. In the future, this sBMS architecture can evolve to a fully decentralized topology, with all the units using Wi-Fi protocols and forming a mesh network, making the MCU unit unnecessary. The status of the work in progress is reported, leading to conclusions on the system executed so far, considering the implemented hardware solution not only as a fully functional, advanced, and configurable battery management system but also as a platform for developing custom algorithms and optimization strategies to achieve better performance of stationary electric energy storage devices.
Keywords: Li-ion battery, smart BMS, stationary electric storage, distributed BMS
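As an illustration of the passive charge-equalization functionality mentioned above, here is a sketch of the bleed-resistor selection logic a cell monitoring unit could apply; the function name, threshold, and voltage figures are our assumptions, not the prototype's firmware:

```python
def cells_to_bleed(cell_voltages, tolerance=0.01):
    """Passive balancing: return indices of cells whose voltage exceeds the
    weakest cell by more than `tolerance` volts. Those cells would have
    their bleed resistors switched in until the pack equalizes."""
    v_min = min(cell_voltages)
    return [i for i, v in enumerate(cell_voltages) if v - v_min > tolerance]
```

Passive balancing dissipates the excess charge as heat; the active methods the hardware also supports would instead transfer charge between cells, which is more efficient but needs more complex circuitry.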
Procedia PDF Downloads 103
28268 Sequence Component-Based Adaptive Protection for Microgrids Connected Power Systems
Authors: Isabelle Snyder
Abstract:
Microgrid protection presents challenges to conventional protection techniques due to the low induced fault current. Protection relays in microgrid applications require a combination of settings groups to adjust based on the architecture of the microgrid in islanded and grid-connected modes. In a radial system where the microgrid is at the other end of the feeder, directional elements can be used to identify the direction of the fault current and switch settings groups accordingly (grid-connected or microgrid-connected). However, with multiple microgrid connections, this concept becomes more challenging, and the direction of the current alone is not sufficient to identify the source of the fault current contribution. ORNL has previously developed adaptive relaying schemes through other DOE-funded research projects that will be evaluated and used as a baseline for this research. The four protection techniques in this study are the following: (1) Adaptive Current-only Protection System (ACPS), (2) Intentional Unbalanced Control for Protection Control (IUCPC), (3) Adaptive Protection System with Communication Controller (APSCC), and (4) Adaptive Model-Driven Protective Relay (AMDPR). The first two methods focus on identifying the islanded mode without communication, by monitoring the current sequence component generated by the system (ACPS) or induced by inverter control during islanded mode (IUCPC), so that the relay can identify the islanding condition and adjust its settings without communication. These two methods serve as a backup to the APSCC, which relies on a communication network to communicate the islanded configuration to the system components.
The fourth method relies on a short-circuit model inside the relay, used in conjunction with communication to track the system configuration, compute the fault current, and adjust the settings accordingly.
Keywords: adaptive relaying, microgrid protection, sequence components, islanding detection, communication controlled protection, integrated short circuit model
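The sequence components that the ACPS and IUCPC methods monitor are obtained from the three phase currents via the standard Fortescue transform; a short sketch of that decomposition (standard textbook math, not the authors' relay code):

```python
import cmath

def sequence_components(ia, ib, ic):
    """Fortescue transform: phase-current phasors (complex numbers) to
    (zero, positive, negative) sequence components."""
    a = cmath.exp(2j * cmath.pi / 3)  # 120-degree rotation operator
    i0 = (ia + ib + ic) / 3
    i1 = (ia + a * ib + a * a * ic) / 3
    i2 = (ia + a * a * ib + a * ic) / 3
    return i0, i1, i2
```

For a balanced three-phase set only the positive-sequence term survives; zero- and negative-sequence content is what flags an unbalanced fault or, in the IUCPC scheme, an intentionally induced islanding signature.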
Procedia PDF Downloads 95
28267 Frequent Itemset Mining Using Rough-Sets
Authors: Usman Qamar, Younus Javed
Abstract:
Frequent pattern mining is the process of finding a pattern (a set of items, subsequences, substructures, etc.) that occurs frequently in a data set. It was proposed in the context of frequent itemsets and association rule mining. Frequent pattern mining is used to find inherent regularities in data, such as which products are often purchased together. Its applications include basket data analysis, cross-marketing, catalog design, sale campaign analysis, Web log (click stream) analysis, and DNA sequence analysis. However, one of the bottlenecks of frequent itemset mining is that as the data grow, the time and resources required to mine them increase at an exponential rate. In this investigation, a new algorithm is proposed which can be used as a pre-processor for frequent itemset mining. FASTER (FeAture SelecTion using Entropy and Rough sets) is a hybrid pre-processor algorithm which utilizes entropy and rough sets to carry out record reduction and feature (attribute) selection, respectively. For frequent itemset mining, FASTER can produce a speed-up of 3.1 times compared to the original algorithm while maintaining an accuracy of 71%.
Keywords: rough-sets, classification, feature selection, entropy, outliers, frequent itemset mining
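The abstract does not spell out how FASTER scores attributes; assuming the usual Shannon entropy over discrete attribute values, the entropy half of such a pre-processor could be sketched as follows (an illustration, not FASTER's actual internals):

```python
import math
from collections import Counter

def shannon_entropy(column):
    """Shannon entropy (bits) of one discrete attribute column; a
    low-entropy attribute carries little information for selection."""
    counts = Counter(column)
    n = len(column)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def rank_attributes(dataset):
    """Rank attribute indices by entropy, most informative first.
    `dataset` is a list of equal-length records."""
    cols = list(zip(*dataset))
    return sorted(range(len(cols)), key=lambda i: shannon_entropy(cols[i]), reverse=True)
```

Attributes at the bottom of the ranking are candidates for removal before the (exponential-cost) itemset mining step runs.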
Procedia PDF Downloads 437
28266 A Theoretical Model for Pattern Extraction in Large Datasets
Authors: Muhammad Usman
Abstract:
Pattern extraction has been used in the past to extract hidden and interesting patterns from large datasets. Recently, these techniques have been advanced by the addition of multi-level mining, effective dimension reduction, advanced evaluation, and visualization support. This paper reviews the current techniques in the literature against these parameters. The literature review suggests that most techniques which provide multi-level mining and dimension reduction do not handle mixed-type data during the process. Patterns are not extracted using advanced algorithms for large datasets. Moreover, the evaluation of patterns is not done using advanced measures suited for high-dimensional data. Techniques which provide visualization support are unable to handle a large number of rules in a small space. We present a theoretical model to handle these issues; its implementation is beyond the scope of this paper.
Keywords: association rule mining, data mining, data warehouses, visualization of association rules
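Since the model under review is built around association rule mining, the two basic rule-evaluation measures it presupposes, support and confidence, can be sketched in a few lines (toy code, not from the paper):

```python
def support(itemset, transactions):
    """Fraction of transactions that contain every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Conditional frequency of `consequent` among transactions
    that contain `antecedent`."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)
```

The abstract's point is precisely that for high-dimensional data, these simple frequency-based measures are not enough on their own, which is the gap the proposed model's advanced evaluation component targets.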
Procedia PDF Downloads 224
28265 An Exploratory Study Applied to the Accessibility of Museums in the UK
Authors: Sifan Guo, Xuesen Zheng
Abstract:
Visitors, as a vital research group, are mentioned frequently in the field of museum studies. With the rise of the New Museology Movement, new challenges appeared for museums, ranging from how to eliminate clichéd class prejudices to how to make visitor-oriented museums more welcoming. In line with this new situation, creating a successful visiting experience is the focus of museums today. National museums, as tourist attractions, always draw large crowds; local museums, however, may face a different situation. Residents can be among the main visitors to local museums, so how to attract them to return should be considered here. Because people come with varied cultural, educational, and religious backgrounds, it is necessary to balance education and entertainment to meet visitors' expectations. Regarding these questions, a mixed-methods research approach has been adopted: observations, tracking, and questionnaires. Based on an analysis of museum cases in the UK, it can be argued that: 1) audiences' accessibility supports their choices and judgments during the visit; 2) highly inclusive architecture and narrative expression can encourage visitors to develop a deeper understanding and alleviate conflicts. In addition, the main characteristics of local museums and the links between museums and urban renaissance are clarified. The conclusion offers not only practical suggestions for accessible, characterful design but also potential subjects for future research.
Keywords: accessibility, challenging visitors, new museology movement, visiting experience
Procedia PDF Downloads 118
28264 The Significance of Islamic Concept of Good Faith to Cure Flaws in Public International Law
Authors: M. A. H. Barry
Abstract:
The concepts of good faith (husn al-niyyah) and fair dealing (nadl) are the fundamental guiding elements in all contracts and other agreements under Islamic law. The teachings of the Qur'an and of the Prophet Muhammad (Peace Be upon Him) firmly command people to act in good faith in all dealings, and several Qur'anic verses and Prophetic sayings stress the significance of dealing honestly and fairly in all transactions. Under English law, good faith is not considered a fundamental requirement for the formation of a legal contract. However, the concept of good faith in private contracts is recognized by the civil law system and in Article 7(1) of the Convention on Contracts for the International Sale of Goods (CISG, Vienna Convention, 1980). It took several centuries for the international trading community to recognize the significance of the concept of good faith for international sale-of-goods transactions; even so, the recognition of good faith in civil law is confined to commercial contracts. Subsequent to the CISG, the concept has made inroads into private international law, and there are submissions in favour of applying it to public international law based on its tacit recognition by international conventions and tribunals. However, under public international law the concept of good faith is not recognized as a source of rights or obligations. This weakens the spirit of the good-faith concept, particularly when international disputes are determined, and creates a fundamental flaw: in the absence of good faith, breaches tainted by bad faith are tolerated. The objective of this research is to evaluate, examine, and analyze the application of the concept of good faith in modern laws, and to identify its limitations in comparison with the Islamic concept of good faith.
This paper also identifies the problems and issues connected with the non-application of this concept to public international law. The research consists of three key components: (1) the preliminary inquiry, (2) subject analysis and discovery of research results, and (3) examination of the challenging problems, concluding with proposals. The preliminary inquiry is based on both primary and secondary sources, and the same sources are used for the subject analysis. The research has both inductive and deductive features. The Islamic concept of good faith covers all situations and circumstances in which bad faith causes unfairness to the affected parties, especially weaker parties. Under Islamic law, the concept of good faith is a source of rights and obligations, as Islam prohibits any person from committing wrongful or delinquent acts in any dealing, whether in private or public life. This rule applies not only to individuals but also to institutions, states, and international organizations. The paper explains how unfairness is caused by the non-recognition of the good-faith concept as a source of rights or obligations under public international law and provides legal and non-legal reasons to show why the Islamic formulation is important.
Keywords: good faith, the civil law system, the Islamic concept, public international law
Procedia PDF Downloads 149
28263 A Journey to the Past: Hoşap Castle in Van
Authors: Muhammet Kurucu
Abstract:
Hoşap Castle, located in Gürpınar, Van, is one of the most important symbols of the city because it hosted sacred memories of its time. Standing in Güzelsu, in the resort city of Van, and notable for its location and construction features, Hoşap Castle is a remarkable complex whose architecture consists of an outer fortress and an inner fortress. It is an Ottoman castle, built in the 17th century by Sarı Süleyman, who was known as the bey of Mahmudi. Although some parts of Hoşap Castle have been destroyed by natural disasters, it has survived to the present day without total collapse, and most of its spaces have been revealed through excavations. In this study, the present condition of Hoşap Castle is observed and briefly introduced.
Keywords: Güzelsu, Hoşap Castle, natural disasters, restoration, Van
Procedia PDF Downloads 277
28262 Emily Dickinson's Green Aesthetics: Mode Gakuen Cocoon Tower as the Anthropomorphic Architectural Representation in the Age of Anthropocene
Authors: Chia-Wen Kuo
Abstract:
Jesse Curran states that there is a "breath awareness" that "facilitates a present-minded capability" to catalyse an "epistemological rupture" in Emily Dickinson's poetry, particularly in the age of the Anthropocene. In Dickinson's "Nature", non-humans are subjectified as nature ceases to be subordinated to human interests, and Dickinson's eco-humility drives us, her readers, to mimic nature for the making of a better world. In terms of sustainable architecture, Norman Foster is among the representatives who utilise BIM to reduce architectural waste while satiating users' aesthetic craving for a spectacular skyline, most notably with the Gherkin (30 St. Mary Axe) in east London. In 2019, Foster and his team aspired to reshape the London skyline with a new design, the Tulip, certified under LEED as a legitimate green building and conceived as a complementary extension of the Gherkin. However, Foster's proposal was rejected repeatedly by Mayor Sadiq Khan and the city council, since the Tulip cannot blend into the surrounding public space, while its observatory functions like a surveillance platform. Apart from its aesthetic idiosyncrasy, the Tulip fails to serve the public good beyond being another ostentatious tourist attraction in London. By contrast, the architectural team for the Mode Gakuen Cocoon Tower, completed in 2008, intended to honour nature through the symbolism of the building's aesthetic design. It serves as an architectural cocoon that nurtures the students of the "Special Technology and Design College" inside. The building itself becomes a Dickinsonian anthropomorphism, in which humans are made humble enough to learn from entomological beings for self-betterment in the age of the Anthropocene.
Despite its resemblance to a tulip and its LEED credential, Norman Foster's Tulip pays tribute to nature only in a relatively superficial manner, without providing an apparatus that substantially benefits Londoners; all green cities should embrace Emily Dickinson's "breath awareness" and be built and treated as an extensive as well as expansive form of biomimicry.
Keywords: green city, sustainable architecture, London, Tokyo
Procedia PDF Downloads 155
28261 Security Architecture for Cloud Networking: A Survey
Authors: Vishnu Pratap Singh Kirar
Abstract:
In the cloud computing hierarchy, IaaS is the lowest layer; all other layers are built over it. It is thus the most important layer of the cloud and requires the most attention. Along with its advantages, IaaS faces some serious security-related issues. Security mainly focuses on integrity, confidentiality, and availability. Cloud computing facilitates the sharing of resources inside as well as outside of the cloud; on the other hand, the cloud is still not in a state to guarantee 100% data security. Cloud providers must ensure that end users/clients receive an adequate quality of service. In this report, we describe possible aspects of cloud-related security.
Keywords: cloud computing, cloud networking, IaaS, PaaS, SaaS, cloud security
Procedia PDF Downloads 531
28260 Unreality of Real: Debordean Reading of Gillian Flynn's Gone Girl
Authors: Sahand Hamed Moeel Ardebil, Zohreh Taebi Noghondari, Mahmood Reza Ghorban Sabbagh
Abstract:
Gillian Flynn's Gone Girl depicts a society in which, as a result of media dominance, reality is precarious and difficult to grasp. In Gone Girl, reality and the image of reality represented on TV are challenging to differentiate. Along with reality, individuals' agency and independence before the media and capitalist rule are called into question in the novel. In order to expose the unstable nature of reality and the individual's complicated relationship with media, this study deploys the ideas of the Marxist media theorist Guy Debord (1931-1994). In his book The Society of the Spectacle (1967), Debord delineates a society in which images replace objective reality and people are incapable of making real changes. The results of the current study show that despite their efforts, Nick and Amy, the two main characters of the novel, are no more than spectators with very little agency before the media. Moreover, following Debord's argument about the replacement of reality with images, everyone and every institution in Gone Girl projects an image that does not necessarily embody objective reality, a fact that makes it very hard to differentiate the real from the unreal.
Keywords: agency, Debord, Gone Girl, media studies, society of spectacle, reality
Procedia PDF Downloads 125
28259 Soil Moisture Control System: A Product Development Approach
Authors: Swapneel U. Naphade, Dushyant A. Patil, Satyabodh M. Kulkarni
Abstract:
In this work, we propose the concept and geometrical design of a soil moisture control system (SMCS) module, following a product development approach to create an inexpensive, easy-to-use, and quick-to-install product targeted at agriculture practitioners. The module delivers water to agricultural land efficiently by sensing the soil moisture and activating the delivery valve. We start by identifying the general needs of the potential customer. Then, based on customer needs, we establish product specifications and identify the important quantities to measure when evaluating the product. Keeping the specifications in mind, we develop various conceptual solutions and select the best one through concept screening and selection matrices. We then develop the product architecture by integrating the subsystems into the final product. Finally, the geometric design is carried out using human factors engineering concepts such as heuristic analysis, task analysis, and human error reduction analysis. The human factors analysis reveals the remedies that should be applied while designing the geometry and software components of the product. We find that, for a power-type grip, a grip diameter of 35 mm is ideal in terms of comfort and applied force.
Keywords: agriculture, human factors, product design, soil moisture control
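The abstract describes the SMCS as sensing soil moisture and activating the delivery valve but does not give the control law; a minimal hysteresis (dead-band) sketch, with purely illustrative threshold values, shows the simplest control logic such a module could run:

```python
def valve_command(moisture, valve_open, low=30.0, high=45.0):
    """Hysteresis control for the delivery valve: open below `low`, close
    above `high`, otherwise hold the current state. Thresholds (percent
    volumetric water content) are illustrative assumptions, not the paper's."""
    if moisture < low:
        return True   # soil too dry: open valve
    if moisture > high:
        return False  # soil wet enough: close valve
    return valve_open  # inside the dead band: keep current state
```

The dead band between the two thresholds prevents the valve from chattering when the sensor reading hovers near a single set point.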
Procedia PDF Downloads 172
28258 Cellular Automata Using Fractional Integral Model
Authors: Yasser F. Hassan
Abstract:
In this paper, a proposed model of cellular automata is studied by means of a fractional integral function. A cellular automaton is a decentralized computing model that provides an excellent platform for performing complex computation with the help of only local information. The paper discusses how a fractional integral function can be used to represent cellular automaton memory or state. The architecture of the computing and learning model is given, along with the results of calibrating the approach.
Keywords: fractional integral, cellular automata, memory, learning
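The fractional-integral state representation is the paper's contribution; the baseline it generalizes is the conventional cellular automaton, which keeps only the current configuration. One synchronous update of an elementary CA can be sketched as follows (the rule number here, Rule 110, is an arbitrary illustrative choice):

```python
def ca_step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton with
    periodic boundary. `cells` is a list of 0/1; `rule` is the Wolfram
    rule number, decoded into a lookup table over 3-cell neighborhoods."""
    n = len(cells)
    table = [(rule >> i) & 1 for i in range(8)]  # bit i = output for neighborhood i
    return [table[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
            for i in range(n)]
```

A fractional-integral memory, as the abstract describes it, would replace the plain `cells[i]` read with a weighted accumulation over past states; the local-neighborhood update structure stays the same.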
Procedia PDF Downloads 415
28257 Pavement Management for a Metropolitan Area: A Case Study of Montreal
Authors: Luis Amador Jimenez, Md. Shohel Amin
Abstract:
Pavement performance models are based on projections of observed traffic loads, which makes it uncertain to study funding strategies in the long run if history does not repeat itself. Neural networks can be used to estimate deterioration rates, but the learning rate and momentum have not been properly investigated; in addition, economic developments could change traffic flows. This study addresses both issues through a case study of the roads of Montreal that simulates traffic for a period of 50 years and deals with the measurement error of the pavement deterioration model. Travel demand models are applied to simulate annual average daily traffic (AADT) every 5 years. Accumulated equivalent single axle loads (ESALs) are calculated from the predicted AADT and locally observed truck distributions combined with truck factors. A back-propagation neural network (BPN) with a Generalized Delta Rule (GDR) learning algorithm is applied to estimate pavement deterioration models capable of overcoming measurement errors. Linear programming for lifecycle optimization is applied to identify M&R strategies that ensure good pavement condition while minimizing the budget. It was found that CAD 150 million is the minimum annual budget needed to keep arterial and local roads in Montreal in good condition. Montreal drivers prefer public transportation for work and education trips. Vehicle traffic is expected to double within 50 years, and the number of ESALs is expected to double every 15 years. Roads on the island of Montreal need to undergo a stabilization period of about 25 years, after which a steady state seems to be reached.
Keywords: pavement management system, traffic simulation, backpropagation neural network, performance modeling, measurement errors, linear programming, lifecycle optimization
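The deterioration models are fit with a back-propagation network trained by the Generalized Delta Rule, i.e., gradient descent with a momentum term. The per-weight update can be sketched as follows (the learning rate and momentum values are illustrative defaults, not the study's calibrated ones):

```python
def gdr_update(weights, grads, prev_deltas, lr=0.1, momentum=0.9):
    """One Generalized Delta Rule step: each weight moves against its error
    gradient, plus a momentum fraction of its previous change."""
    deltas = [-lr * g + momentum * d for g, d in zip(grads, prev_deltas)]
    new_w = [w + d for w, d in zip(weights, deltas)]
    return new_w, deltas
```

The momentum term smooths successive updates, which is one reason a GDR-trained network can tolerate the measurement errors in observed pavement condition that the study highlights.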
Procedia PDF Downloads 461