Search results for: OpenSees software
3998 Grading Fourteen Zones of Isfahan in Terms of the Impact of Globalization on the Urban Fabric of the City, Using the TOPSIS Model
Authors: A. Zahedi Yeganeh, A. Khademolhosseini, R. Mokhtari Malekabadi
Abstract:
Undoubtedly, one of the most far-reaching and controversial topics of the past few decades has been globalization, which lies at the essence of modern culture. It is a complex and rapidly expanding network of links and mutual interdependence that is an aspect of modern life, though some argue that this link has existed since the beginning of human history. If we consider globalization as a dynamic social process in which the geographical constraints governing political, economic, social, and cultural relationships have been undermined, it might not be possible to simply describe its impact on the urban fabric. But since this phenomenon concerns the increase in communication between societies (while preserving their main cultural-regional characteristics) and the increased possibility of influencing other societies, the need for more studies is felt. The main objective of this study is to grade the fourteen municipal zones of Isfahan based on the impact of globalization factors on the urban fabric, applying the TOPSIS model. The research method is descriptive-analytical and survey-based. For data analysis, the TOPSIS model and SPSS software were used, and the results are shown on a map of the fourteen zones using GIS software. The results show that the fourteen zones of Isfahan have not been influenced by the globalization of the urban fabric uniformly, and there are large differences between city zones in this respect; the most affected areas are municipal zones 5, 6, and 9, and the least affected are zones 2, 3, and 4.
Keywords: grading, globalization, urban fabric, 14 zones of Isfahan, TOPSIS model
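For reference, a minimal Python sketch of the TOPSIS procedure applied in the study (vector normalization, weighting, ideal solutions, closeness coefficient). The decision matrix, weights, and benefit directions below are invented placeholders, not the study's indicator data.

```python
# Minimal TOPSIS ranking sketch: rows would be municipal zones,
# columns globalization indicators (hypothetical values and weights).
import numpy as np

def topsis(matrix, weights, benefit):
    X = np.asarray(matrix, dtype=float)
    # 1) vector-normalize each criterion column
    norm = X / np.sqrt((X ** 2).sum(axis=0))
    # 2) apply criterion weights
    V = norm * weights
    # 3) ideal best/worst per column (direction depends on benefit/cost)
    best = np.where(benefit, V.max(axis=0), V.min(axis=0))
    worst = np.where(benefit, V.min(axis=0), V.max(axis=0))
    # 4) Euclidean separation from the ideal solutions
    d_best = np.sqrt(((V - best) ** 2).sum(axis=1))
    d_worst = np.sqrt(((V - worst) ** 2).sum(axis=1))
    # 5) closeness coefficient in [0, 1]; higher = closer to ideal
    return d_worst / (d_best + d_worst)

scores = topsis([[0.6, 120, 3], [0.9, 300, 7], [0.4, 80, 2]],
                weights=[0.5, 0.3, 0.2], benefit=[True, True, True])
print(np.argsort(-scores) + 1)  # zone ranking, best first
```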
Procedia PDF Downloads 313
3997 Transient Level in the Surge Chamber at the Robert-Bourassa Generating Station
Authors: Maryam Kamali Nezhad
Abstract:
The Robert-Bourassa development (LG-2), the first to be built on the Grande Rivière, comprises two powerhouses, East and West, of eight turbine-generator units each. Each powerhouse has two tailrace tunnels with an average length of about 1178 m. The LG-2A powerhouse houses 6 turbine-generator units, whose water is discharged through two tailrace tunnels with a length of about 1330 m. The objectives of this work at RB (LG-2) are: 1) to establish a new maximum transient level in the surge chamber, 2) to define the new maximum equipment flow rate for the future turbine-generator units, and 3) to ensure safe access to various intervention locations in the surge chamber. The transient levels under normal operating conditions at the RB plant were determined in 2001 by the Hydraulics Unit of HQE using the "Chamber" software, a one-dimensional mass-oscillation calculation tool. It is used to determine the variation of the water level in the surge chamber located downstream of a power plant during load rejection of the power plant units; it can also be used for a surge chamber located upstream of a power plant. The RB (LG-2) study is based on the theoretical nominal geometry of the chamber and the tailrace tunnels and on the flow-level relationship at the outlet of the galleries established during design. The software is used in such a way that the results have an acceptable margin of safety, especially with respect to the maximum transient level (e.g., resumption of flow at an inopportune time), to account for the turbulent and three-dimensional aspects of the actual flow in the chamber. Note that the transient levels depend on the water levels in the river and in the surge chambers at steady state; these data are kept in the HQP CRP database and updated from time to time. The maximum transient levels in the RB-East and RB-West surge chambers were revised based on the latest update (set 4) of the in-river rating curves and steady-state surge chamber water levels. The results of the revision were also used to update the technical advice on the operating conditions for access to the aforementioned surge chamber, taking the revised calculated water levels into account.
Keywords: generating station, surge chamber, maximum transient level, hydroelectric power station, turbine-generator, reservoir
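The core of a "Chamber"-type calculation is a one-dimensional mass oscillation between tunnel and chamber. The following is a rough sketch of such a calculation after full load rejection; the geometry, loss coefficient, initial flow, and the simple explicit integration are all assumed for illustration, not LG-2 data.

```python
# One-dimensional mass-oscillation sketch for a surge chamber after full
# load rejection (turbine flow drops to zero). All values are hypothetical.
g, L = 9.81, 1178.0        # gravity (m/s^2), tunnel length (m)
A_t, A_s = 80.0, 500.0     # tunnel and chamber cross-sections (m^2)
c = 0.0004                 # lumped head-loss coefficient (s^2/m^5)
Q, z = 300.0, 0.0          # initial tunnel flow (m^3/s), level offset (m)

dt, z_max = 1.0, z
for step in range(3600):   # one hour of 1 s explicit Euler steps
    dQ = (g * A_t / L) * (-z - c * Q * abs(Q)) * dt   # momentum balance
    dz = (Q / A_s) * dt    # continuity: flow imbalance fills the chamber
    Q, z = Q + dQ, z + dz
    z_max = max(z_max, z)
print(f"max transient level above steady state: {z_max:.2f} m")
```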
Procedia PDF Downloads 80
3996 Modeling of the Attitude Control Reaction Wheels of a Spacecraft in Software in the Loop Test Bed
Authors: Amr AbdelAzim Ali, G. A. Elsheikh, Moutaz M. Hegazy
Abstract:
Reaction wheels (RWs) are generally used as the main actuators in the attitude control system (ACS) of a spacecraft (SC) for fast orientation and high pointing accuracy. In order to achieve the required accuracy for the RW model, the main characteristics of the RWs that necessitate analysis during the ACS design phase, namely the technical features, the operating sequence, and the RW control logic, are included in a function (behavior) model. A mathematical model is developed that includes the various error sources. The errors in control torque include relative error, absolute error, and error due to time delay, while the errors in angular velocity are due to differences between average and real speed, resolution error, looseness in the installation of the angular sensor, and synchronization errors. The friction torque represented in the model includes the different features of friction phenomena: steady-velocity friction, static friction and break-away torque, and frictional lag. The model response is compared with the experimental torque and frequency-response characteristics of tested RWs. Based on the created RW model, some criteria for the optimization-based control torque allocation problem can be recommended, such as avoiding zero-speed crossing, biasing the angular velocity, or preventing the wheel from dwelling at the same angular velocity.
Keywords: friction torque, reaction wheels modeling, software in the loop, spacecraft attitude control
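A sketch of how the friction phenomena listed above (static friction and break-away torque, steady-velocity friction) can be combined into a single friction torque function; all coefficients are illustrative assumptions, not the tested wheels' parameters.

```python
# Reaction-wheel friction torque sketch: static/break-away regime plus a
# Stribeck-shaped sliding regime with Coulomb and viscous terms.
import math

def friction_torque(omega, t_cmd,
                    t_break=0.012,   # break-away torque (N*m), assumed
                    t_coulomb=0.008, # Coulomb level (N*m), assumed
                    b=1.5e-5,        # viscous coefficient (N*m*s/rad)
                    omega_s=2.0):    # Stribeck velocity (rad/s), assumed
    if abs(omega) < 1e-6:
        # static regime: friction opposes commanded torque up to break-away
        return -math.copysign(min(abs(t_cmd), t_break), t_cmd)
    # sliding regime: decay from break-away to Coulomb level, plus a
    # viscous term proportional to wheel speed
    stribeck = t_coulomb + (t_break - t_coulomb) * math.exp(-(omega / omega_s) ** 2)
    return -math.copysign(stribeck, omega) - b * omega

print(friction_torque(omega=150.0, t_cmd=0.02))
```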
Procedia PDF Downloads 265
3995 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning
Authors: Pooja Khanal, Huaming Zhang
Abstract:
Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when a reporter reports a bug, they try to assign some predefined label to it. Those issues are reported for a project, and each project is a repository in GitHub/GitLab, which contains multiple issues. There are many software project repositories, ranging from individual to commercial projects. The labels assigned in different repositories may depend on various factors, such as human instinct, generalization of labels, the label assignment policy followed by the reporter, etc. While the reporter of an issue may instinctively give it one label, another person reporting the same issue may label it differently. Thus, it is not known mathematically whether a label in one repository is similar to or different from the same label in another repository. Hence, the primary goal of this research is to find the semantic differences between the bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first, using the text features from the reported issues. An optimal classifier may be a combination of multiple classifiers stacked together. Then, those classifiers are used to cross-test other repositories, which allows the similarity to be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels pertaining to the different repositories.
Keywords: bug classification, bug labels, GitHub issues, semantic differences
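The cross-testing idea can be sketched as one text classifier per repository, each scored on the other repositories' labeled issues. The repository names, issue texts, and pipeline below are placeholders and a simplification of the stacked classifiers described, not the paper's dataset or models.

```python
# Per-repository classifiers cross-tested on each other's labeled issues.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

repos = {
    "repo_a": (["login button misaligned", "XSS in comment field"],
               ["User Interface", "Security"]),
    "repo_b": (["broken endpoint returns 500", "panel CSS overflows"],
               ["API", "User Interface"]),
}

models = {}
for name, (texts, labels) in repos.items():
    clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    models[name] = clf.fit(texts, labels)

# cross-test: how well one repo's notion of each label transfers to another
for src in repos:
    for dst in repos:
        if src != dst:
            texts, labels = repos[dst]
            acc = models[src].score(texts, labels)
            print(f"{src} -> {dst}: agreement {acc:.2f}")
```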
Procedia PDF Downloads 198
3994 Teaching Tools for Web Processing Services
Authors: Rashid Javed, Hardy Lehmkuehler, Franz Josef-Behr
Abstract:
Web Processing Services (WPS) have attracted growing interest in geoinformation research. However, teaching about them is difficult because of the generally complex circumstances of their use, which limit the possibilities for hands-on exercises on Web Processing Services. To support understanding, however, a Training Tools Collection was initiated at the University of Applied Sciences Stuttgart (HFT). It is limited to the scope of geostatistical interpolation of sample point data, where different algorithms can be used, such as IDW, Nearest Neighbor, etc. The Tools Collection aims to support understanding of the scope, definition, and deployment of Web Processing Services. For example, it is necessary to characterize the input to interpolation by the data set, the parameters of the algorithm, and the interpolation results (here a grid of interpolated values is assumed). This paper reports on first experiences with a pilot installation, which was intended to find suitable software interfaces for later full implementations and to draw conclusions on potential user interface characteristics. Experience was gained with the Deegree software, one of several service suites (collections). Being strictly programmed in Java, Deegree offers several OGC-compliant service implementations that also promise to be of benefit for the project. The mentioned parameters of a WPS were formalized following the paradigm that any meaningful component is defined in terms of suitable standards; e.g., the data output can be defined as a GML file. However, the choice of meaningful information pieces and user interactions is not free but partially determined by the selected WPS processing suite.
Keywords: deegree, interpolation, IDW, web processing service (WPS)
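For reference, the IDW algorithm named above, as a WPS-style process would evaluate it at a single grid node; the sample points and the power parameter are arbitrary illustrative values.

```python
# Inverse Distance Weighting (IDW) at one query location.
import numpy as np

def idw(points, values, query, power=2.0):
    pts = np.asarray(points, dtype=float)
    vals = np.asarray(values, dtype=float)
    d = np.linalg.norm(pts - np.asarray(query, dtype=float), axis=1)
    if np.any(d < 1e-12):            # query coincides with a sample point
        return float(vals[np.argmin(d)])
    w = 1.0 / d ** power             # closer samples weigh more
    return float(np.sum(w * vals) / np.sum(w))

# three sample points and one interpolated grid node
print(idw([(0, 0), (10, 0), (0, 10)], [1.0, 3.0, 5.0], query=(2, 3)))
```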
Procedia PDF Downloads 354
3993 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing
Authors: Arjun Kumar Rath, Titus Dhanasingh
Abstract:
Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As the components are getting integrated, the devices are tested for their full functionality using advanced software tools, and benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, these tests are custom built for every product and remain unusable for other variants, and a majority of them go undocumented, are not updated, and become unusable when the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks and tools (Fuego, LAVA, Autotest, KernelCI, etc.) designed for testing embedded system devices, each with several unique good features, but no single tool or framework satisfies all of the testing needs for embedded systems; hence the need for an extensible framework that integrates a multitude of tools. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods include developing test libraries and support components for every new hardware platform that belongs to the same domain with identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused; the need to maintain source infrastructure for individual hardware platforms; and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in environment setup for testing, scalability, and maintenance. A desirable strategy is certainly one focused on maximizing reusability, continuous integration, and leveraging artifacts across the complete development cycle, during the phases of testing and across a family of products. To overcome the stated challenges of the conventional method and to deliver the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed, which can be deployed in embedded-system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing of different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) time-to-market: it accelerates board bring-up time with prepackaged test suites supporting all necessary peripherals, which can speed up the design and development stages (board bring-up, manufacturing, and device drivers); (2) reusability: framework components isolated from the platform-specific hardware initialization and configuration make the adaptation of test cases across various platforms quick and simple; (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integration with Jenkins enables continuous testing and an automated software update feature.
Applying the embedded test framework accelerator throughout the design and development phases enables the development of well-tested systems before functional verification and improves time to market to a large extent.
Keywords: board diagnostics software, embedded system, hardware testing, test frameworks
Procedia PDF Downloads 143
3992 Domain-Specific Languages Evaluation: A Literature Review and Experience Report
Authors: Sofia Meacham
Abstract:
In this paper, Domain-Specific Language (DSL) evaluation is presented based on the existing literature and years of experience developing DSLs for several domains, ranging from AI, business applications, and finance/accounting to health. In general, DSLs have been utilised in many domains to provide tailored and efficient solutions to specific problems. Although they are a reputable method in highly technical circles and have also been used by non-technical experts with success, to our knowledge there is no commonly accepted method for evaluating them. Some methods define criteria that are adaptations of general software engineering quality criteria. Other literature focuses on the usability aspect of DSL evaluation and applies methods such as Human-Computer Interaction (HCI) and goal modeling. All these approaches are either hard to introduce, such as goal modeling, or seem to ignore the domain-specific focus of DSLs. In our experience, DSLs have domain-specificity at their core, and consequently, the methods used to evaluate them should also include domain-specific criteria at their core. Such criteria would require synergy between the domain experts and the DSL developers, in the same way that DSLs cannot be developed without the involvement of domain experts. Methods from agile and other software engineering practices, such as co-creation workshops, should be further emphasised and explored to facilitate this direction. In conclusion, our latest experience and plans for DSL evaluation will be presented and opened for discussion.
Keywords: domain-specific languages, DSL evaluation, DSL usability, DSL quality metrics
Procedia PDF Downloads 101
3991 Visual Template Detection and Compositional Automatic Regular Expression Generation for Business Invoice Extraction
Authors: Anthony Proschka, Deepak Mishra, Merlyn Ramanan, Zurab Baratashvili
Abstract:
Small and medium-sized businesses receive over 160 billion invoices every year. Since these documents exhibit many subtle differences in layout and text, extracting structured fields such as sender name, amount, and VAT rate from them automatically is an open research question. In this paper, existing work in template-based document extraction is extended, and a system is devised that is able to reliably extract all required fields for up to 70% of all documents in the data set, more than any previously reported method. Approaches are described for 1) detecting, through visual features, which template a given document belongs to, 2) automatically generating extraction rules for a new template by composing regular expressions from multiple components, and 3) computing confidence scores that indicate the accuracy of the automatic extractions. The system can generate templates from as little as one training sample and only requires the ground-truth field values instead of detailed annotations, such as bounding boxes, that are hard to obtain. The system is deployed and used inside commercial accounting software.
Keywords: data mining, information retrieval, business, feature extraction, layout, business data processing, document handling, end-user trained information extraction, document archiving, scanned business documents, automated document processing, F1-measure, commercial accounting software
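A toy sketch of the compositional idea for extraction rules: reusable regex components joined into one field rule. The component patterns and the invoice text are illustrative assumptions, not the system's actual rule set.

```python
# Composing an extraction rule for an amount field from regex components.
import re

COMPONENTS = {
    "label":    r"(?:Total|Amount due)",   # field label variants
    "sep":      r"\s*[:\s]\s*",            # label/value separator
    "currency": r"(?:EUR|\$)?\s*",
    "number":   r"(?P<amount>\d{1,3}(?:[.,]\d{3})*[.,]\d{2})",
}

def compose(*names):
    # concatenate named components into one compiled rule
    return re.compile("".join(COMPONENTS[n] for n in names))

rule = compose("label", "sep", "currency", "number")
match = rule.search("Invoice 4711\nTotal: EUR 1.234,56\nVAT 19%")
if match:
    print(match.group("amount"))   # -> 1.234,56
```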
Procedia PDF Downloads 129
3990 A Verification Intellectual Property for Multi-Flow Rate Control on Any Single Flow Bus Functional Model
Authors: Pawamana Ramachandra, Jitesh Gupta, Saranga P. Pogula
Abstract:
In the verification of high-volume, complex packet-processing IPs, fine control of flow management aspects (for example, rate, bits/sec, etc.) per flow class (a virtual channel or a software thread) is needed. When any software/Universal Verification Methodology (UVM) thread arbitration is left to the simulator (e.g., Verilog Compiler Simulator (VCS) or the Incisive Enterprise Simulator core simulation engine (NCSIM)), the resulting distribution of bandwidth is hard to predict. In many cases, the patterns desired in a test scenario may not be accomplished, as the simulator might produce a different distribution than what was required. This can lead to missing multiple traffic scenarios, specifically deadlock- and starvation-related ones. We invented a component (namely, the Flow Manager Verification IP) that intervenes between the application (test case) and the protocol VIP (with the UVM sequencer) to control the bandwidth per thread/virtual channel/flow. The Flow Manager has knobs, visible to the UVM sequence/test, to configure the required distribution of rate per thread/virtual channel/flow. This works seamlessly and produces rate stimuli to further harness the Design Under Test (DUT) with asymmetric inputs compared to the bandwidth/Quality of Service (QoS) distributions programmed in the DUT.
Keywords: flow manager, UVM sequencer, rated traffic generation, quality of service
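A behavioral Python model of the Flow Manager idea (the actual VIP is UVM/SystemVerilog): arbitration is taken away from the simulator and driven by configured bandwidth knobs. The flow names and weights below are assumed values for illustration only.

```python
# Weighted flow arbitration: the long-run stimulus split tracks the knobs.
import random

class FlowManager:
    def __init__(self, weights):
        self.flows = list(weights)           # e.g. virtual channel ids
        self.weights = list(weights.values())

    def next_flow(self):
        # weighted draw reproduces the programmed distribution over time
        return random.choices(self.flows, weights=self.weights, k=1)[0]

fm = FlowManager({"vc0": 50, "vc1": 30, "vc2": 20})   # percent knobs
counts = {f: 0 for f in fm.flows}
for _ in range(10_000):                  # stimulus generation loop
    counts[fm.next_flow()] += 1
print(counts)   # observed split approximates the 50/30/20 configuration
```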
Procedia PDF Downloads 98
3989 Establishment of Precision System for Underground Facilities Based on 3D Absolute Positioning Technology
Authors: Yonggu Jang, Jisong Ryu, Woosik Lee
Abstract:
The study aims to address the limitations of existing underground facility exploration equipment in terms of exploration depth range, relative depth measurement, data processing time, and human-centered interpretation of ground penetrating radar (GPR) images. It proposes the use of 3D absolute positioning technology to develop a precision underground facility exploration system that can accurately survey up to a depth of 5 m and measure the 3D absolute location of underground facilities. The study developed software and hardware technologies to build the precision exploration system. The software technologies developed include absolute positioning technology, ground surface location synchronization technology for the GPR exploration equipment, AI interpretation technology for GPR exploration images, and composite data processing technology based on an integrated underground space map. The hardware systems developed include a vehicle-type exploration system and a cart-type exploration system. Data were collected using the developed exploration system, which employs 3D absolute positioning technology; the GPR exploration images were analyzed using AI technology, and the three-dimensional location information of the surveyed underground facilities was compared to the integrated underground space map. The resulting system comprises software that builds a precise 3D DEM, synchronizes the GPR sensor's 3D ground surface location coordinates, automatically analyzes and detects underground facility information in GPR exploration images, and improves accuracy through comparative analysis of the three-dimensional location information, together with the vehicle-type and cart-type hardware systems. The study's findings and technological advancements are essential for underground safety management in Korea: the proposed precision exploration system contributes significantly to establishing precise location information for underground facilities, which is crucial for underground safety management, and improves the accuracy and efficiency of exploration.
Keywords: 3D absolute positioning, AI interpretation of GPR exploration images, complex data processing, integrated underground space maps, precision exploration system for underground facilities
Procedia PDF Downloads 60
3988 Development and Verification of the IDOM Shielding Optimization Tool
Authors: Omar Bouhassoun, Cristian Garrido, César Hueso
Abstract:
Radiation shielding design is an optimization problem with multiple constrained objective functions (radiation dose, weight, price, etc.) that depend on several parameters (material, thickness, position, etc.). The classical approach to shielding design consists of a brute-force trial-and-error process guided by the designer's previous experience. The result is therefore an empirical but not optimal solution, which can degrade the overall performance of the shielding. In order to automate the shielding design procedure, the IDOM Shielding Optimization Tool (ISOT) has been developed. This software combines optimization algorithms with the capabilities to read/write input files, run calculations, and parse output files for different radiation transport codes. In the first stage, the software was set up to adjust the input files for two well-known Monte Carlo codes (MCNP and Serpent) and to optimize the result (weight, volume, price, dose rate) using multi-objective genetic algorithms. Nevertheless, its modular implementation easily allows the inclusion of more radiation transport codes and optimization algorithms. The work related to the development of ISOT and its verification on a simple 3D multi-layer shielding problem using both MCNP and Serpent will be presented. ISOT looks very promising for achieving optimal solutions to complex shielding problems.
Keywords: optimization, shielding, nuclear, genetic algorithm
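A toy genetic-algorithm loop over layer thicknesses in the spirit of ISOT. In the real tool the fitness would come from parsing an MCNP/Serpent run; here a made-up analytic attenuation stand-in replaces it, and the single weighted-sum objective is a simplification of the multi-objective search described.

```python
# GA sketch: evolve two shielding layer thicknesses against a fake
# dose/weight trade-off standing in for a transport-code evaluation.
import math, random

LAYERS = [(0.5, 7.9), (0.2, 11.3)]   # (attenuation mu [1/cm], density rho)

def fitness(thicknesses):
    dose, weight = 100.0, 0.0
    for t, (mu, rho) in zip(thicknesses, LAYERS):
        dose *= math.exp(-mu * t)    # exponential attenuation stand-in
        weight += rho * t
    return dose + 0.5 * weight       # weighted sum of the two objectives

pop = [[random.uniform(0.0, 10.0) for _ in LAYERS] for _ in range(20)]
for generation in range(50):
    pop.sort(key=fitness)                 # lower fitness is better
    parents = pop[:10]                    # truncation selection
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        child = [(x + y) / 2 for x, y in zip(a, b)]       # crossover
        children.append([max(0.0, c + random.gauss(0, 0.3)) for c in child])
    pop = parents + children              # children carry the mutations
print("best layer thicknesses (cm):",
      [round(t, 2) for t in min(pop, key=fitness)])
```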
Procedia PDF Downloads 109
3987 Lockit: A Logic Locking Automation Software
Authors: Nemanja Kajtez, Yue Zhan, Basel Halak
Abstract:
The significant rise in the cost of manufacturing nanoscale integrated circuits (ICs) has led the majority of IC design companies to outsource the fabrication of their products to other companies, often located in different countries. This multinational nature of the hardware supply chain has led to a host of security threats, including IP piracy, IC overproduction, and Trojan insertion. To combat these, researchers have proposed logic locking techniques to protect the intellectual property of the design and increase the difficulty of malicious modification of its functionality. However, the adoption of logic locking approaches has been rather slow due to the lack of integration with the IC production process and the limited efficacy of existing algorithms. This work automates the logic locking process by developing Python software that performs the locking on a gate-level netlist and can be integrated with existing digital synthesis tools. Analysis of the latest logic locking algorithms demonstrated that the SFLL-HD algorithm is one of the most secure and versatile in trading off levels of protection against different types of attacks, and it was thus selected for implementation. The presented tool can also be extended to incorporate the latest locking mechanisms to keep up with the fast-paced development in this field. The paper also presents a case study to demonstrate the functionality of the tool and how it could be used to explore the design space and compare different locking solutions. The source code of this tool is available freely from (https://www.researchgate.net/publication/353195333_Source_Code_for_The_Lockit_Tool).
Keywords: design automation, hardware security, IP piracy, logic locking
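The basic key-gate principle behind logic locking can be shown on a toy netlist; note that this illustrates plain XOR locking only, not the SFLL-HD algorithm that Lockit implements, and the gates are an invented example.

```python
# XOR-based logic locking on a tiny netlist: each key gate restores the
# original signal only when its key bit is correct.
def locked_circuit(a, b, key):
    k0, k1 = key
    n1 = a & b              # original AND gate
    n1_locked = n1 ^ k0     # key gate: transparent only if k0 == 0
    n2 = n1_locked | b      # original OR gate
    out = n2 ^ k1           # second key gate: transparent only if k1 == 0
    return out

correct_key = (0, 0)
for a in (0, 1):
    for b in (0, 1):
        # with the correct key, the locked netlist matches the original
        assert locked_circuit(a, b, correct_key) == ((a & b) | b)
print("locked netlist matches the original only under the correct key")
```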
Procedia PDF Downloads 180
3986 Diversity for Safety and Security of Autonomous Vehicles against Accidental and Deliberate Faults
Authors: Anil Ranjitbhai Patel, Clement John Shaji, Peter Liggesmeyer
Abstract:
The safety and security of autonomous vehicles (AVs) is a growing concern: first, due to the increased number of safety-critical functions taken over by automotive embedded systems; second, due to the increased exposure of these software-intensive systems to potential attackers; and third, due to dynamic interaction in an uncertain and unknown environment at runtime, which changes the functional and non-functional properties of the system. Frequently occurring environmental uncertainties, random component failures, and compromised security of the AVs might result in hazardous events, sometimes even in accidents, if left undetected. Beyond these technical issues, we argue that the safety and security of AVs against accidental and deliberate faults are poorly understood and rarely implemented. One possible way to overcome this is through the well-known diversity approach. As an effective means of increasing safety and security, diversity has been widely used in the aviation, railway, and aerospace industries. Thus, the paper proposes a fault tolerance-by-diversity model that takes into consideration the mitigation of accidental and deliberate faults by the application of structural and variant redundancy. The model can be used to design AVs with various types of diversity in hardware- and software-based multi-version systems. The paper evaluates the presented approach on an example from adaptive cruise control, followed by a discussion of the case study with initial findings.
Keywords: autonomous vehicles, diversity, fault-tolerance, adaptive cruise control, safety, security
Procedia PDF Downloads 127
3985 Urban Development Criteria with a Focus on Resilience to Pandemics: A Case Study of Corona Virus (Covid-19)
Authors: Elham Zabetian Targhi, Niusha Fardnava, Saba Saghafi
Abstract:
Urban resilience to the coronavirus has become a major concern for cities these days. Our country has not been safe from the destructive effects of this virus in its social, economic, physical, governance, and management dimensions; according to official statistics, hundreds of thousands of people in Iran have been infected with this virus, and tens of thousands have died so far. Therefore, to measure urban resilience to this pandemic, criteria and sub-criteria were developed based on the authors' documentary and field studies, and their significance (weights) was determined by an analytical-comparative research method, using a questionnaire of pairwise (Saaty) comparisons completed by experts in urban sciences and urban development and analyzed with AHP hierarchical analysis in the Expert Choice software. Then, using a questionnaire with a five-point Likert scale, the satisfaction of Tehran residents with the extracted criteria and sub-criteria was measured, and the correlation between the important criteria in each dimension was assessed using correlation tests in the SPSS 16 software. According to the results of the AHP analysis and the scores of each sub-criterion, the weights of all the criteria were normalized. In the next stage, according to the pairwise correlation tests between the important criteria in each dimension, from the viewpoints of both urban science experts and Tehran residents, it was concluded that the correlations between the criteria are reliable at the 99% level: in all cases, the P-value (significance level) was less than 0.05, indicating significant pairwise relations between the variables.
Keywords: urban resilience, pandemics, corona virus (COVID-19), criteria
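For reference, a sketch of the AHP weight derivation from one pairwise (Saaty) comparison matrix, including the consistency check; the 3x3 matrix below is a made-up example, not the study's expert judgments.

```python
# AHP: criterion weights from the principal eigenvector of a pairwise
# comparison matrix, plus the consistency ratio.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # normalized criterion weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # random index RI = 0.58 for n = 3
print("weights:", w.round(3), "CR:", round(cr, 3))  # CR < 0.1 is acceptable
```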
Procedia PDF Downloads 81
3984 3D Steady and Transient Centrifugal Pump Flow within Ansys CFX and OpenFOAM
Authors: Clement Leroy, Guillaume Boitel
Abstract:
This paper presents a comparative benchmarking review of steady and transient three-dimensional (3D) flow computations in a centrifugal pump using commercial (Ansys CFX) and open-source (OpenFOAM) computational fluid dynamics (CFD) software. In a centrifugal rotordynamic pump, the fluid enters the impeller along the rotating axis and is accelerated in order to increase the pressure, flowing radially outward into another stage, a vaned diffuser, or a volute casing, from where it finally exits into a downstream pipe. Simulations are carried out at the best efficiency point (BEP) and at part load, for single-phase flow, with several turbulence models. The results are compared with the overall performance report from experimental data. The use of CFD technology in industry is still limited by the high computational costs, and even more by the high cost of commercial CFD software and high-performance computing (HPC) licenses. The main objective of the present study is to define an OpenFOAM methodology for high-quality 3D steady and transient turbomachinery CFD simulation, in order to conduct a thorough, time-accurate performance analysis. In addition, a detailed comparison between the computational methods and features of the latest Ansys release (18) and OpenFOAM is carried out to assess the accuracy and industrial applicability of those solvers. Finally, an automated connected workflow (IoT) for turbine blade applications is presented.
Keywords: benchmarking, CFX, internet of things, openFOAM, time-accurate, turbomachinery
Procedia PDF Downloads 203
3983 Study of the Diaphragm Flexibility Effect on the Inelastic Seismic Response of Thin Wall Reinforced Concrete Buildings (TWRCB): A Purpose to Reduce the Uncertainty in the Vulnerability Estimation
Authors: A. Zapata, Orlando Arroyo, R. Bonett
Abstract:
Over the last two decades, the growing demand for housing in Latin American countries has led to the development of construction projects based on low- and medium-rise buildings with thin reinforced concrete walls. This system, known as Thin Wall Reinforced Concrete Buildings (TWRCB), uses walls with thicknesses from 100 to 150 millimetres, with flexural reinforcement formed by welded wire mesh (WWM) with diameters between 5 and 7 millimetres, arranged in one or two layers. These walls often have irregular structural configurations, including combinations of rectangular shapes. Experimental and numerical research conducted in regions where this structural system is commonplace indicates inherent weaknesses, such as limited ductility due to the WWM reinforcement and the thin element dimensions. Because of its complexity, numerical analyses have relied on two-dimensional models that do not explicitly account for the floor system, even though it plays a crucial role in distributing seismic forces among the resisting elements; instead, the numerical analyses assume a rigid diaphragm hypothesis. For this study, two case-study buildings were selected, representative of low-rise and mid-rise TWRCB in Colombia. The buildings were analyzed in OpenSees using MVLEM-3D elements for the walls and shell elements for the slabs, so as to include the effect of the coupling diaphragm in the nonlinear behaviour. Three cases are considered: a) models without a slab, b) models with rigid slabs, and c) models with flexible slabs. Incremental static (pushover) and nonlinear dynamic analyses were carried out using a set of 44 far-field ground motions from FEMA P-695, scaled by factors of 1.0 and 1.5 to consider the probability of collapse under the design basis earthquake (DBE) and the maximum considered earthquake (MCE), according to the site locations and hazard zones of the archetypes in the Colombian NSR-10 code. Base shear capacity, maximum roof displacement, individual wall base shear demands, and probabilities of collapse were calculated to evaluate the effect of absent, rigid, and flexible slabs on the nonlinear behaviour of the archetype buildings. The pushover results show that the buildings exhibit an overstrength between 1.1 and 2 when the slab is considered explicitly, depending on the plan configuration of the structural walls; additionally, the nonlinear behaviour without a slab is more conservative than when the slab is represented. Including the flexible slab in the analysis underscores the importance of the slab's contribution to the distribution of shear forces between structural elements according to their design strength and stiffness. The dynamic analysis revealed that including the slab reduces the collapse probability of this system, owing to lower displacements and deformations, enhancing the seismic performance and the safety of residents. Including the slab in the model is thus important to capture the real effect of coupling on the distribution of shear forces in the walls, to estimate the correct nonlinear behaviour of this system, and to proportion the strength and stiffness of the elements in design so as to reduce the possibility of damage during an earthquake.
Keywords: thin wall reinforced concrete buildings, coupling slab, rigid diaphragm, flexible diaphragm
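A sketch of the collapse-probability bookkeeping implied above: run each scaled record, flag collapses by a drift threshold, and report the fraction. The analysis call is a placeholder for the OpenSees nonlinear time-history run, and the drift threshold and response statistics are assumed values, not the study's results.

```python
# Fragility bookkeeping over the 44 FEMA P-695 far-field records.
import random

def run_nltha(record_id, scale):
    # placeholder for one OpenSees run; would return the peak
    # interstory drift ratio of the archetype under this record
    random.seed(hash((record_id, scale)))
    return random.gauss(0.01, 0.004) * scale

def collapse_probability(n_records=44, scale=1.0, drift_limit=0.025):
    collapses = sum(
        1 for r in range(n_records) if run_nltha(r, scale) > drift_limit
    )
    return collapses / n_records

for scale, label in [(1.0, "DBE"), (1.5, "MCE")]:
    print(f"{label}: P(collapse) = {collapse_probability(scale=scale):.2f}")
```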
Procedia PDF Downloads 72
3982 Insight2OSC: Using Electroencephalography (EEG) Rhythms from the Emotiv Insight for Musical Composition via Open Sound Control (OSC)
Authors: Constanza Levicán, Andrés Aparicio, Rodrigo F. Cádiz
Abstract:
The artistic use of brain-computer interfaces (BCIs), initially intended for medical purposes, has increased in the past few years as they become more affordable and available to the general population. One interesting question that arises from this practice is whether it is possible to compose or perform music using only the brain as a musical instrument. To approach this question, we propose a BCI for musical composition based on the representation of certain mental states as the musician thinks about sounds. We developed software, called Insight2OSC, that allows the Emotiv Insight device to be used as a musical instrument by sending its EEG data to audio processing software such as MaxMSP through the OSC protocol. We provide two compositional applications bundled with the software, which we call Mapping your Mental State and Thinking On. The signals produced by the brain have different frequencies (or rhythms) depending on the level of activity, and they are classified into the following waves: delta (0.5-4 Hz), theta (4-8 Hz), alpha (8-13 Hz), beta (13-30 Hz), and gamma (30-50 Hz). These rhythms have been found to be related to recognizable mental states. For example, the delta rhythm is predominant in deep sleep, while the beta and gamma rhythms have higher amplitudes when the person is awake and highly concentrated. Our first application (Mapping your Mental State) produces different sounds representing the mental state of the person, namely focused, active, relaxed, or in a state similar to deep sleep, according to the dominant rhythms provided by the EEG device. The second application relies on the physiology of the brain, which is divided into several lobes: frontal, temporal, parietal, and occipital. The frontal lobe is related to abstract thinking and high-level functions, the parietal lobe processes the stimuli of the body senses, the occipital lobe contains the primary visual cortex and processes visual stimuli, and the temporal lobe processes auditory information and is important for memory tasks. Consequently, our second application (Thinking On) processes the audio output depending on the user's brain activity, as it activates specific areas of the brain that can be measured using the Insight device.
Keywords: BCI, music composition, emotiv insight, OSC
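A minimal sketch of such a pipeline: per-band power estimated with Welch's method and forwarded over OSC. The OSC addresses and port are assumptions for illustration, not Insight2OSC's documented interface, and the random array stands in for one EEG channel.

```python
# Band powers for the five EEG rhythms, sent over OSC to MaxMSP.
import numpy as np
from scipy.signal import welch
from pythonosc.udp_client import SimpleUDPClient

BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 50)}

def band_powers(samples, fs=128):      # Emotiv Insight samples at 128 Hz
    freqs, psd = welch(samples, fs=fs, nperseg=fs * 2)
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(np.trapz(psd[mask], freqs[mask]))
    return powers

client = SimpleUDPClient("127.0.0.1", 9000)   # MaxMSP port (assumed)
eeg = np.random.randn(256)                    # stand-in for one channel
for band, power in band_powers(eeg).items():
    client.send_message(f"/eeg/{band}", power)
```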
Procedia PDF Downloads 322
3981 The Design Optimization for Sound Absorption Material of Multi-Layer Structure
Authors: Un-Hwan Park, Jun-Hyeok Heo, In-Sung Lee, Tae-Hyeon Oh, Dae-Kyu Park
Abstract:
Sound-absorbing materials are used as automotive interior materials, and their sound absorption coefficient must be predicted in order to design them. This prediction is difficult, however, because such a material is composed of several layers, so its targets are usually achieved through extensive experimental tuning, which costs a great deal of time and money. In this paper, we propose a process for estimating the sound absorption coefficient of a multi-layer structure. To estimate the coefficient, the physical properties of each material are used. These properties are themselves values predicted by the Foam-X software from sound absorption coefficient data measured with an impedance tube; since there are many physical properties and the measurement equipment is expensive, the software-predicted values are used. By measuring the sound absorption coefficient of each material, its physical properties are calculated inversely, and the properties of each material are then used to calculate the sound absorption coefficient of the multi-layer material. Since the absorption coefficient of the multi-layer structure can be calculated, an optimized design is possible through simulation. We then compare and analyze the calculated sound absorption coefficient with data measured in a scaled reverberation chamber and with impedance tubes for a prototype. If this method is used when developing automotive interior materials with multi-layer structures, the development effort can be reduced because the design can be optimized by simulation, saving cost and time.
Keywords: sound absorption material, sound impedance tube, sound absorption coefficient, optimization design
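One common way to compute a normal-incidence absorption coefficient from layer properties is the transfer-matrix approach with an empirical impedance model. The sketch below uses Delany-Bazley relations for a single porous layer on a rigid wall, with assumed flow resistivity and thickness; this is a generic illustration and may differ from the paper's inverse-characterization method.

```python
# Normal-incidence absorption of a rigid-backed porous layer.
import numpy as np

rho0, c0 = 1.21, 343.0                 # air density (kg/m^3), sound speed (m/s)

def delany_bazley(f, sigma):
    X = rho0 * f / sigma               # dimensionless frequency parameter
    Zc = rho0 * c0 * (1 + 0.0571 * X**-0.754 - 1j * 0.087 * X**-0.732)
    kc = (2 * np.pi * f / c0) * (1 + 0.0978 * X**-0.700 - 1j * 0.189 * X**-0.595)
    return Zc, kc                      # characteristic impedance, wavenumber

def alpha_rigid_backed(f, sigma=20000.0, d=0.02):   # 20 mm layer, assumed
    Zc, kc = delany_bazley(f, sigma)
    Zs = -1j * Zc / np.tan(kc * d)     # surface impedance, rigid backing
    R = (Zs - rho0 * c0) / (Zs + rho0 * c0)
    return 1 - np.abs(R) ** 2          # absorption coefficient

for f in (250, 500, 1000, 2000, 4000):
    print(f, round(float(alpha_rigid_backed(f)), 3))
```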
Procedia PDF Downloads 287
3980 The Status and Role of Women in Indian IT Industry and Relevant Role and Scope of HRM
Authors: Shivani Kolarkar
Abstract:
Splendid growth in Indian IT has generated women's employment on a large scale in India and continues to do so. The Indian IT industry has achieved this in spite of the near-total masculine dominance of other Indian engineering industries, where the ratio of women's employment is almost negligible compared to men's. Indian IT today proudly enjoys a strong pool of technically educated, intellectual, and skillful women employees. The IT industry has encouraged technical education for women in India to a great extent, and the software industry has definitely contributed to developing a positive and dignified role and status for women employees in Indian IT, promoting women's social and economic role and status. In spite of all this, gender discrimination still persists in Indian IT as well; it is lower than in other industries, but it remains a matter of concern. An Indian woman is bound to carry dual roles, which are especially over-stressed for women IT employees. Long working hours, night shifts, work pressure, and insufficient safety measures and necessary facilities for women contribute to making her physical and mental life, family life, and married life troublesome. This forces her either to cluster in low-end jobs in IT or elsewhere, or to sacrifice her career. The nature, role, and status of HRM need to be broadened, deepened, and shaped into a research-oriented, multidimensional perspective in the context of really enhancing the role and status of Indian IT women, with high appreciation of women employees' dignity and identity.
Keywords: attrition, gender discrimination, HRM, Indian IT, software industry, job satisfaction, safety, technical education, women employment
Procedia PDF Downloads 645
3979 Study of Polychlorinated Dibenzo-P-Dioxins and Dibenzofurans Dispersion in the Environment of a Municipal Solid Waste Incinerator
Authors: Gómez R. Marta, Martín M. Jesús María
Abstract:
The general aim of this paper is to identify the areas of highest concentration of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) around an incinerator through the use of dispersion models. Atmospheric dispersion models are useful tools for estimating and preventing the impact of emissions from a particular source on air quality. These models allow different factors that influence air pollution to be considered, namely the source characteristics, the topography of the receiving environment, and the weather conditions, in order to predict pollutant concentrations. After their emission into the atmosphere, PCDD/Fs are deposited on water or land, near or far from the emission source, depending on the size of the associated particles and the meteorology; in this way, they are transferred and mobilized through environmental compartments. The modelling of PCDD/Fs was carried out with the following tools: the Atmospheric Dispersion Modelling System (ADMS) and Surfer. ADMS is a Gaussian plume dispersion model used to model the air quality impact of industrial facilities, and Surfer is a surface-mapping program used to represent the dispersion of pollutants on a map. For the modelling of emissions, the ADMS software mainly requires the following input parameters: the characterization of the emission sources (source type, height, diameter, release temperature, flow rate, etc.) and meteorological and topographical data (coordinate system). The study area was set at 5 km around the incinerator, and the population center nearest to the PCDD/F emission source is approximately 2.5 km away. Data were collected during one year (2013) on both the PCDD/F emissions of the incinerator and the meteorology of the study area. The study was carried out over the averaging periods that legislation establishes; that is to say, the output parameters take the current legislation into account. Once all the data required by the ADMS software, described previously, were entered, the modelling was run in order to represent the spatial distribution of the PCDD/F concentrations and the areas they affect. In general, the dispersion plume follows the direction of the predominant winds (southwest and northeast). Total levels of PCDD/Fs usually found in air samples are <2 pg/m³ for remote rural areas, 2-15 pg/m³ in urban areas, and 15-200 pg/m³ for areas near important sources, such as an incinerator. The dispersion maps show that the maximum concentrations are of the order of 10⁻⁸ ng/m³, well below the values considered typical for areas close to an incinerator, as in this case.
Keywords: atmospheric dispersion, dioxin, furan, incinerator
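For reference, the ground-level Gaussian plume formula that models such as ADMS are built around; the emission rate, stack height, wind speed, and the dispersion-coefficient fits below are illustrative assumptions, not the study's inputs.

```python
# Ground-level Gaussian plume concentration with full ground reflection.
import numpy as np

def plume_ground(x, y, Q=1e-6, u=3.0, H=30.0):
    # Q: emission rate (g/s), u: wind speed (m/s), H: effective stack height (m)
    # rough sigma parameterization for one stability class (assumed)
    sy = 0.08 * x / np.sqrt(1 + 0.0001 * x)
    sz = 0.06 * x / np.sqrt(1 + 0.0015 * x)
    return (Q / (2 * np.pi * u * sy * sz)
            * np.exp(-y**2 / (2 * sy**2))
            * 2 * np.exp(-H**2 / (2 * sz**2)))   # receptor at ground level

# concentration (g/m^3) on the plume axis at 0.5, 1, 2.5 and 5 km downwind
for x in (500.0, 1000.0, 2500.0, 5000.0):
    print(int(x), f"{plume_ground(x, 0.0):.3e}")
```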
Procedia PDF Downloads 215
3978 FEM and Experimental Modal Analysis of Computer Mount
Authors: Vishwajit Ghatge, David Looper
Abstract:
Over the last few decades, oilfield service rolling equipment has significantly increased in weight, primarily because of emissions regulations, which require larger/heavier engines, larger cooling systems, and emissions after-treatment systems, among other things. Larger engines cause more vibration and shock loads, leading to failures of electronics and control systems. If the vibrating frequency of the engine matches the system frequency, high resonance is observed in structural parts and mounts. One such existing automated control equipment system, comprising wire rope mounts used for mounting computers, was designed approximately 12 years ago. It includes an industrial-grade computer to control the system operation. The original computer had a smaller, lighter enclosure; after a few years, a newer computer version was introduced, which was 10 lbm heavier. Some failures of internal computer parts have been documented for cases in which the old mounts were used. Because of the added weight, there is a possibility of the two brackets impacting each other under off-road conditions, which transmits a high shock input to the computer parts. This added failure mode requires validating the existing mount design for the new, heavier computer. This paper discusses the modal finite element method (FEM) analysis and the experimental modal analysis conducted to study the effects of vibration on the wire rope mounts and the computer. The existing mount was modelled in the ANSYS software, and the resulting mode shapes and frequencies were obtained. The experimental modal analysis was conducted, and the actual frequency responses were observed and recorded. The results clearly revealed that, at the resonance frequency, the brackets collide, potentially causing damage to computer parts. To solve this issue, spring mounts of different stiffnesses were modelled in ANSYS, and the resonant frequency was determined. Increasing the stiffness of the system shifted the resonant frequency away from the frequency window in which the engine showed heavy vibration or resonance. After multiple iterations in ANSYS, the stiffness of the spring mount was finalized and then experimentally validated.
Keywords: experimental modal analysis, FEM modal analysis, frequency, modal analysis, resonance, vibration
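Both the FEM and experimental procedures reduce to the generalized eigenproblem K*phi = omega^2 * M*phi. A 2-DOF mass-spring stand-in with assumed mass and stiffness values shows how natural frequencies and mode shapes are extracted; the real mount model would have many more degrees of freedom.

```python
# Natural frequencies and mode shapes from the generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh

M = np.diag([12.0, 12.0])                  # lumped masses (kg), assumed
K = np.array([[3.0e5, -1.5e5],
              [-1.5e5, 1.5e5]])            # stiffness matrix (N/m), assumed

omega_sq, modes = eigh(K, M)               # symmetric generalized eigensolve
freqs_hz = np.sqrt(omega_sq) / (2 * np.pi)
print("natural frequencies (Hz):", freqs_hz.round(1))
print("mode shapes (columns):\n", modes.round(3))
```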
Procedia PDF Downloads 320
3977 Optimal Design of Linear Generator to Recharge the Smartphone Battery
Authors: Jin Ho Kim, Yujeong Shin, Seong-Jin Cho, Dong-Jin Kim, U-Syn Ha
Abstract:
Due to the development of the information industry and its technologies, cellular phones must not only provide communication but also functions such as Internet access, e-banking, entertainment, etc. Such phones are called smartphones. The performance of smartphones has improved because of these various functions, and battery capacity has been increased gradually. Recently, linear generators have been embedded in smartphones in order to recharge the smartphone's battery. In this study, optimization is performed, and a change in the array of permanent magnets is examined, in order to increase efficiency. We propose an optimal design using design of experiments (DOE) to maximize the generated induced voltage. The thickness of the pole shoe and permanent magnet (PM), the height of the pole shoe and PM, and the thickness of the coil are chosen as the design variables. We generated 25 sampling points using an orthogonal array over the design variables and performed electromagnetic finite element analysis to predict the generated induced voltage, using the commercial electromagnetic analysis software ANSYS Maxwell. We then built an approximate model using the Kriging algorithm and derived optimal values of the design variables using an evolutionary algorithm. The commercial optimization software PIAnO (Process Integration, Automation, and Optimization) was used with these algorithms. The result of the optimization shows that the generated induced voltage is improved.
Keywords: smartphone, linear generator, design of experiment, approximate model, optimal design
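A sketch of the surrogate-based step: fit a Kriging (Gaussian process) model to sampled responses, then search it for the best design. The 25-point sample and the response function are fabricated stand-ins for the ANSYS Maxwell results, and the random search is a crude stand-in for the evolutionary algorithm used in PIAnO.

```python
# Kriging surrogate over a DOE sample, then a search for the optimum.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.uniform(1.0, 5.0, size=(25, 4))      # 25 DOE points, 4 variables (mm)
y = -((X - 3.0) ** 2).sum(axis=1) + 10.0     # stand-in induced-voltage response

gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                              normalize_y=True).fit(X, y)

# evaluate a large random population on the surrogate, keep the best design
candidates = rng.uniform(1.0, 5.0, size=(5000, 4))
pred = gp.predict(candidates)
best = candidates[np.argmax(pred)]
print("best design variables (mm):", best.round(2))
```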
Procedia PDF Downloads 344
3976 Analysis of Advancements in Process Modeling and Reengineering at Fars Regional Electric Company, Iran
Authors: Mohammad Arabi
Abstract:
Business Process Reengineering (BPR) is a systematic approach to fundamentally redesign organizational processes to achieve significant improvements in organizational performance. At Fars Regional Electric Company, implementing BPR is deemed essential to increase productivity, reduce costs, and improve service quality. This article examines how BPR can help enhance the performance of Fars Regional Electric Company. The objective of this research is to evaluate and analyze the advancements in process modeling and reengineering at Fars Regional Electric Company and to provide solutions for improving the productivity and efficiency of organizational processes. This study aims to demonstrate how BPR can be used to improve organizational processes and enhance the overall performance of the company. This research employs both qualitative and quantitative research methods and includes interviews with senior managers and experts at Fars Regional Electric Company. The analytical tools include process modeling software such as Bizagi and ARIS, and statistical analysis software such as SPSS and Minitab. Data analysis was conducted using advanced statistical methods. The results indicate that the use of BPR techniques can lead to a significant reduction in process execution time and overall improvement in quality. Implementing BPR at Fars Regional Electric Company has led to increased productivity, reduced costs, and improved overall performance of the company. This study shows that with proper implementation of BPR and the use of modeling tools, the company can achieve significant improvements in its processes. Recommendations: (1) Continuous Training for Staff: Invest in continuous training of staff to enhance their skills and knowledge in BPR. (2) Use of Advanced Technologies: Utilize modeling and analysis software to improve processes. (3) Implementation of Effective Management Systems: Employ knowledge and information management systems to enhance organizational performance. (4) Continuous Monitoring and Review of Processes: Regularly review and revise processes to ensure ongoing improvements. This article highlights the importance of improving organizational processes at Fars Regional Electric Company and recommends that managers and decision-makers at the company seriously consider reengineering processes and utilizing modeling technologies to achieve developmental goals and continuous improvement.
Keywords: business process reengineering, electric company, Fars province, process modeling advancements
Procedia PDF Downloads 47
3975 Analysis of Some Solutions to Protect the Tombolo of Giens
Authors: Yves Lacroix, Van Van Than, Didier Léandri, Pierre Liardet
Abstract:
The western tombolo of the Giens peninsula in southern France, known as Almanarre beach, is subject to coastal erosion. We are trying to use computer simulation in order to propose solutions to stop this erosion. Our aim was first to determine the main factors behind this erosion and then to apply a coupled hydro-sedimentological numerical model based on observations and measurements that have been performed on the site for decades. We gathered all available information and data about waves, winds, currents, tides, bathymetry, the coastline, and sediments concerning the site. These were divided into two sets: one devoted to calibrating a numerical model using the Mike 21 software, the other serving as a reference for numerically comparing the present situation to what it could be if different types of underwater constructions were implemented. This paper presents the first part of the study: selecting and merging the different sources into a coherent database, identifying the main erosion factors, and calibrating the coupled software model against the selected reference period. Our results show calibration of the numerical model with good fitting coefficients. They also show that the winter south-western storm events, combined with low-pressure weather conditions, constitute a major erosion factor, mainly through wave impact in the northern part of Almanarre beach. In contrast, the combined impact of currents and wind is shown to be negligible.
Keywords: Almanarre beach, coastal erosion, hydro-sedimentological, numerical model
Procedia PDF Downloads 318
3974 Cryptocurrency Forensics: Analysis on Bitcoin E-Wallet from Computer Source Evidence
Authors: Muhammad Nooraiman bin Noorashid, Mohd Sharizuan bin Mohd Omar, Mohd Zabri Adil bin Talib, Aswami Fadillah bin Mohd Ariffin
Abstract:
Nowadays, cryptocurrency has become a global phenomenon known to most people, who use this alternative digital money for transactions in many ways (e.g., online shopping, wealth management, and fundraising). However, this digital asset is also widely used in criminal activities: since it relies on decentralized control, as opposed to centralized electronic money and central banking systems, its users remain largely invisible. The high-value exchange of these digital currencies has also been a target of criminal activity. Cryptocurrency crimes have become a challenge for law enforcement to analyze and to prove in evidence. In this paper, our focus is on the Bitcoin cryptocurrency and the possible artifacts that can be obtained from different types of digital wallets, namely software- and browser-based applications. The process memory and the physical hard disk are examined with the aim of identifying and recovering potential digital evidence. Data acquisition is divided into three states: the initial creation of the wallet; transactions, consisting of transferring and receiving coins; and the state after the wallet has been deleted. Findings from this study suggest that the process memory of both software and browser wallets is a valuable source of evidence, and many of the artifacts found in process memory are also available from the application and wallet files on the client computer's storage.
Keywords: cryptocurrency, bitcoin, digital wallet, digital forensics
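A sketch of one basic carving step such a study might use: scanning a raw memory dump for strings matching the legacy Base58 address pattern and the WIF private-key pattern. The dump file name is a placeholder, and the patterns cover common legacy formats only, not SegWit or other encodings.

```python
# Carve Bitcoin address and WIF key candidates from a memory image.
import re

ADDRESS = re.compile(rb"\b[13][a-km-zA-HJ-NP-Z1-9]{25,34}\b")
WIF_KEY = re.compile(rb"\b[5KL][1-9A-HJ-NP-Za-km-z]{50,51}\b")

def carve(dump_path):
    with open(dump_path, "rb") as f:
        data = f.read()
    return {
        "addresses": sorted({m.group().decode() for m in ADDRESS.finditer(data)}),
        "wif_keys": sorted({m.group().decode() for m in WIF_KEY.finditer(data)}),
    }

if __name__ == "__main__":
    hits = carve("wallet_process.dmp")     # memory image acquired earlier
    for kind, values in hits.items():
        print(kind, len(values))
```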
Procedia PDF Downloads 339
3973 On Consolidated Predictive Model of the Natural History of Breast Cancer Considering Primary Tumor and Primary Distant Metastases Growth
Authors: Ella Tyuryumina, Alexey Neznanov
Abstract:
Finding algorithms to predict the growth of tumors has piqued the interest of researchers ever since the early days of cancer research, and a number of studies have been carried out in an attempt to obtain reliable data on the natural history of breast cancer growth. Mathematical modeling can play a very important role in the prognosis of the tumor process in breast cancer; however, existing mathematical models describe primary tumor growth and metastases growth separately. Consequently, we propose a mathematical growth model for the primary tumor and primary metastases, which may help to improve the predictive accuracy for breast cancer progression, using an original model referred to as CoM-IV and corresponding software. We are interested in: 1) modelling the whole natural history of the primary tumor and primary metastases; 2) developing an adequate and precise CoM-IV that reflects the relations between the primary tumor (PT) and metastases (MTS); 3) analyzing the scope of application of CoM-IV; and 4) implementing the model as a software tool. CoM-IV is based on an exponential tumor growth model, consists of a system of deterministic nonlinear and linear equations, and corresponds to the TNM classification. It allows different growth periods of the primary tumor and primary metastases to be calculated: 1) the 'non-visible period' of the primary tumor; 2) the 'non-visible period' of the primary metastases; and 3) the 'visible period' of the primary metastases. The new predictive tool: 1) is a solid foundation for future studies of breast cancer models; 2) does not require any expensive diagnostic tests; and 3) is the first predictor that makes its forecast using only current patient data, whereas the others are based on additional statistical data. Thus, the CoM-IV model and predictive software: a) detect the different growth periods of the primary tumor and primary metastases; b) forecast the period of primary metastases appearance; c) have a higher average prediction accuracy than other tools; and d) can improve forecasts of breast cancer survival and facilitate the optimization of diagnostic tests. CoM-IV calculates the number of doublings for the 'non-visible' and 'visible' growth periods of the primary metastases, as well as the tumor volume doubling time (in days) for those periods. CoM-IV enables, for the first time, prediction of the whole natural history of primary tumor and primary metastases growth at each stage (pT1, pT2, pT3, pT4), relying only on the primary tumor size. In summary: a) CoM-IV correctly describes primary tumor and primary distant metastases growth at stage IV (T1-4N0-3M1), with (N1-3) or without (N0) regional metastases in lymph nodes; b) it facilitates understanding of the appearance period and manifestation of primary metastases.
Keywords: breast cancer, exponential growth model, mathematical modelling, primary metastases, primary tumor, survival
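The doubling arithmetic behind an exponential model such as CoM-IV can be illustrated as follows; the diameters and the doubling time are illustrative values, not patient data or the CoM-IV equations themselves.

```python
# Number of volume doublings and the implied 'non-visible' period for an
# exponentially growing tumor, from diameters via spherical volumes.
import math

def sphere_volume(diameter_mm):
    return math.pi / 6 * diameter_mm ** 3

v_cell = sphere_volume(0.02)        # ~one tumor cell, 20 um across
v_found = sphere_volume(15.0)       # tumor diameter at detection (pT1)

n_doublings = math.log2(v_found / v_cell)
dt_days = 120.0                     # assumed tumor volume doubling time
print(f"doublings before detection: {n_doublings:.1f}")
print(f"estimated non-visible period: {n_doublings * dt_days / 365:.1f} years")
```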
Procedia PDF Downloads 3323972 Numerical Simulation of a Point Absorber Wave Energy Converter Using OpenFOAM in Indian Scenario
Authors: Pooja Verma, Sumana Ghosh
Abstract:
There is a growing need for alternative ways of generating power worldwide. The reasons can be attributed to the limited reserves of fossil fuels, environmental pollution, the increasing cost of conventional fuels, and the low energy-conversion efficiency of existing systems. In this context, one of the potential alternatives for power generation is wave energy. However, it is difficult to estimate the amount of electrical energy that can be generated in irregular sea conditions by experimental or analytical methods. Therefore, in this work, a numerical wave tank is developed using the computational fluid dynamics software OpenFOAM. Within this software, the waves2Foam utility is used to carry out the simulations. The computational domain is a tank of dimensions 5 m × 1.5 m × 1 m with a floating object of dimensions 0.5 m × 0.2 m × 0.2 m. Regular waves are generated at the inlet of the wave tank according to Stokes second-order theory. The main objective of the present study is to validate the numerical model against existing experimental data, and the model shows good agreement with existing experimental data on floater displacement. The model is then exploited to estimate the energy extracted by the movement of such a point absorber in real sea conditions. Scaled-down wave properties, such as wave height and wavelength, are used as input parameters, and seasonal variations are also considered.Keywords: OpenFOAM, numerical wave tank, regular waves, floating object, point absorber
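To make the wave input concrete, the sketch below solves the linear dispersion relation for the wavenumber and evaluates the Stokes second-order free-surface elevation of the kind imposed at the inlet. This is textbook wave theory, not waves2Foam internals, and the wave height, period, and depth are hypothetical lab-scale values rather than the paper's.

```python
# Sketch: wavenumber from the dispersion relation omega^2 = g*k*tanh(k*h),
# then the second-order Stokes surface elevation eta(x, t).
import math

G = 9.81  # gravitational acceleration, m/s^2

def wavenumber(period_s: float, depth_m: float) -> float:
    """Solve omega^2 = g*k*tanh(k*h) for k by fixed-point iteration."""
    omega = 2.0 * math.pi / period_s
    k = omega ** 2 / G  # deep-water initial guess
    for _ in range(100):
        k = omega ** 2 / (G * math.tanh(k * depth_m))
    return k

def stokes2_elevation(x: float, t: float, height: float,
                      period: float, depth: float) -> float:
    """Second-order Stokes surface elevation (Dean & Dalrymple form)."""
    k = wavenumber(period, depth)
    omega = 2.0 * math.pi / period
    theta = k * x - omega * t
    first = 0.5 * height * math.cos(theta)
    second = (k * height ** 2 / 16.0) \
        * (math.cosh(k * depth) / math.sinh(k * depth) ** 3) \
        * (2.0 + math.cosh(2.0 * k * depth)) * math.cos(2.0 * theta)
    return first + second

if __name__ == "__main__":
    # Hypothetical lab-scale wave in a 1 m deep numerical tank.
    H, T, h = 0.1, 1.5, 1.0
    print(f"k = {wavenumber(T, h):.3f} rad/m")
    print(f"eta(0, 0) = {stokes2_elevation(0.0, 0.0, H, T, h):.4f} m")
```

The second-order term sharpens the crests and flattens the troughs relative to a pure sinusoid, which is why Stokes theory rather than linear theory is the common choice for regular-wave generation at this steepness.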
Procedia PDF Downloads 3523971 The Effect of Tool Path Strategy on Surface and Dimension in High Speed Milling
Authors: A. Razavykia, A. Esmaeilzadeh, S. Iranmanesh
Abstract:
Many orthopedic implants, such as proximal humerus implants, require low surface roughness and short lead times for near-immediate surgery; thus, a rapid response from the manufacturer is crucial. The tool path strategy of the milling process has a direct influence on the surface roughness and the lead time of a medical implant. High-speed milling is a promising process for improving machined surface quality, but conventional or super-abrasive grinding is still required, which imposes drawbacks such as additional cost and time. Currently, many CAD/CAM software packages offer different tool path strategies for milling free-form surfaces. Nevertheless, users must understand how to choose among these strategies according to the cutting tool geometry and the complexity of the part geometry, and how each strategy affects the machined surface. This study investigates the effect of different tool path strategies for milling a proximal humerus head during the finishing operation on stainless steel 316L. Experiments were performed using a MAHO MH700 S vertical milling machine and four machining strategies, namely spiral outward, spiral inward, radial, and zig-zag. In all cases, the obtained surfaces were analyzed in terms of roughness and dimensional accuracy and compared with those obtained by simulation. The findings provide evidence that surface roughness, dimensional accuracy, and machining time are all affected by the chosen tool path strategy.Keywords: CAD/CAM software, milling, orthopedic implants, tool path strategy
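As a rough illustration of how strategies differ, the sketch below generates waypoints for two of the four strategies studied, zig-zag and spiral inward, over a flat rectangular patch and compares total path length, one simple proxy for machining time. The patch size and stepover are hypothetical, and real CAM toolpaths on a free-form humerus head are far more involved.

```python
# Illustrative toolpath waypoint generators for a flat rectangular patch.
import math

def zigzag_path(width, height, stepover):
    """Back-and-forth parallel passes separated by the stepover distance."""
    points, y, direction = [], 0.0, 1
    while y <= height + 1e-9:
        xs = (0.0, width) if direction > 0 else (width, 0.0)
        points += [(xs[0], y), (xs[1], y)]
        y += stepover
        direction *= -1
    return points

def spiral_inward_path(width, height, stepover):
    """Concentric rectangular loops shrinking toward the patch center."""
    points, x0, y0, x1, y1 = [], 0.0, 0.0, width, height
    while x1 - x0 > stepover and y1 - y0 > stepover:
        points += [(x0, y0), (x1, y0), (x1, y1), (x0, y1), (x0, y0 + stepover)]
        x0 += stepover; y0 += stepover; x1 -= stepover; y1 -= stepover
    return points

def path_length(points):
    """Sum of straight-line distances between consecutive waypoints."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

if __name__ == "__main__":
    # Hypothetical 40 mm x 40 mm patch with a 0.5 mm finishing stepover.
    for name, fn in (("zig-zag", zigzag_path), ("spiral inward", spiral_inward_path)):
        print(f"{name}: {path_length(fn(40.0, 40.0, 0.5)):.0f} mm")
```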
Procedia PDF Downloads 2113970 Application of GPRS in Water Quality Monitoring System
Authors: V. Ayishwarya Bharathi, S. M. Hasker, J. Indhu, M. Mohamed Azarudeen, G. Gowthami, R. Vinoth Rajan, N. Vijayarangan
Abstract:
Identification of water quality conditions in a river system based on limited observations is an essential task for meeting the goals of environmental management. The traditional method of water quality testing is to collect samples manually and then send them to a laboratory for analysis. However, this approach can no longer meet the demands of water quality monitoring today. Therefore, an automatic water quality measurement and reporting system has been developed. In this project, the water quality parameters collected by a multi-parameter water quality probe are transmitted to a data processing and monitoring center through the GPRS wireless mobile communication network. The multi-parameter sensor is placed directly above the water level. The monitoring center consists of a GPRS module and a microcontroller that monitor the data, and the collected data can be monitored at any instant. At the pollution control board, the water quality sensor data are monitored on a computer using Visual Basic software. The system collects, transmits, and processes water quality parameters automatically, so production efficiency and economic benefit are greatly improved. GPRS technology performs well in complex environments where water quality would otherwise go unmonitored; it is particularly applicable at the collection point, enabling automatic data transmission from field water-analysis equipment and remote monitoring.Keywords: multiparameter sensor, GPRS, visual basic software, RS232
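The telemetry loop at the heart of such a system is simple: read a frame from the probe over RS232 and forward it to the monitoring center. The sketch below shows this shape in Python using pyserial; the port name, baud rate, frame format, and server address are all hypothetical, and in the real system the uplink is a GPRS modem rather than a plain TCP socket.

```python
# Minimal sketch of a sensor-to-center relay: RS232 in, network uplink out.
import socket
import serial  # pyserial

PROBE_PORT = "/dev/ttyUSB0"         # hypothetical RS232 port of the probe
CENTER_ADDR = ("192.0.2.10", 5000)  # hypothetical monitoring-center endpoint

def relay_readings() -> None:
    """Forward each line-based frame from the probe to the monitoring center."""
    probe = serial.Serial(PROBE_PORT, baudrate=9600, timeout=5)
    with socket.create_connection(CENTER_ADDR) as uplink:
        while True:
            frame = probe.readline()  # e.g. b"pH=7.2,DO=6.8,turb=3.1\n"
            if frame:
                uplink.sendall(frame)

if __name__ == "__main__":
    relay_readings()
```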
Procedia PDF Downloads 4103969 Design and Modeling of a Green Building Energy Efficient System
Authors: Berhane Gebreslassie
Abstract:
Conventional commercial buildings are among the largest and least efficient consumers of energy and, as a consequence, produce a significant amount of carbon dioxide (CO2). Traditional buildings have been built for years without consideration of their impact on global warming or of their CO2 contributions. Simulation of green buildings (GB) for energy efficiency began in 1973, and many countries, the US in particular, have responded positively by minimizing energy usage in order to reduce CO2 emissions. As a consequence, many software companies have developed their own building energy efficiency simulation software, interoperating with Building Information Modeling (BIM). The last decade has witnessed a rapidly growing number of studies on GB energy efficiency systems. However, the literature also indicates that the results of current GB simulations are not yet satisfactory for meeting the objectives of GB; in addition, most of these previous studies have excluded simulation of ultimate building energy efficiency. The aim of this project is to meet the objectives of GB through the design, modeling, and simulation of an ultimate building energy efficiency system. This research project presents a multi-level, L-shaped office building in which every part of the building materials has been tested for energy efficiency. Overall, 78.62% of the energy is saved, approaching net-zero energy consumption. Furthermore, the building is equipped with distributed energy resources such as renewables and is integrated with a Smart Building Automation System (SBAS) for controlling and monitoring energy usage.Keywords: ultimate energy saving, optimum energy saving, green building, sustainable materials and renewable energy
Procedia PDF Downloads 273