Search results for: software metrics
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5281

4201 Assessment of Rock Masses Performance as a Support of Lined Rock Cavern for Isothermal Compressed Air Energy Storage

Authors: Vathna Suy, Ki-Il Song

Abstract:

In order to store highly pressurized gas, as in isothermal compressed air energy storage, Lined Rock Caverns (LRC) are constructed underground and supported by layers of concrete, steel and rock masses. This study aims to numerically investigate the performance of the rock masses that serve as a support of a Lined Rock Cavern subjected to high cyclic pressure loadings. The FLAC3D finite difference software is used for the simulation, since it can effectively model the behavior of the concrete lining and steel plate with its built-in structural elements. Cyclic pressure loadings are applied onto the inner surface of the cavern and are transmitted through the concrete and steel to the surrounding rock masses. Changes in stress and strain are monitored continuously throughout the loading operations. The results at various monitoring locations are then extracted and analyzed to assess the response of the rock masses, specifically their ability to absorb energy during the loadings induced by the cyclic pressure changes inside the cavern. By analyzing the obtained stress-strain data and taking into account the strain-dependent behavior of the materials, conclusions on the performance of rock masses subjected to high cyclic loading conditions are drawn.

Keywords: cyclic loading, FLAC3D, lined rock cavern (LRC), strain-dependency

Procedia PDF Downloads 241
4200 Evaluation of the Factors Affecting Violence Against Women (Case Study: Couples Referring to Family Counseling Centers in Tehran)

Authors: Hassan Manouchehri

Abstract:

The present study aimed to identify and evaluate the factors affecting violence against women. The statistical population included all couples referring to family counseling centers in Tehran due to domestic violence during the past year. A sample of 305 people was selected using simple random sampling and Cochran's formula for an unlimited population. A researcher-made questionnaire including 110 items was used for data collection. The face and content validity of the questionnaire were confirmed by 30 experts, and in a preliminary test with 30 subjects its reliability exceeded 0.7 for all studied variables, which is acceptable. Data were analyzed with descriptive statistics in SPSS version 22 and with inferential statistics, namely structural equation modeling, in SmartPLS version 2. A review of the theoretical framework and of domestic and foreign studies indicated that, in general, four main factors underlie violence against women: cultural and social factors, economic factors, legal factors, and medical factors. In addition, the structural equation modeling findings indicated that cultural and social, economic, legal, and medical factors all affect violence against women.

Keywords: violence against women, cultural and social factors, economic factors, legal factors, medical factors

Procedia PDF Downloads 136
4199 Bayesian Analysis of Topp-Leone Generalized Exponential Distribution

Authors: Najrullah Khan, Athar Ali Khan

Abstract:

The Topp-Leone distribution was introduced by Topp and Leone in 1955. In this paper, an attempt has been made to fit the Topp-Leone generalized exponential (TPGE) distribution. A real survival data set is used for illustration. Implementation is done using R and JAGS, and appropriate illustrations are made. R and JAGS code is provided to implement the censoring mechanism using both optimization and simulation tools. The main aim of this paper is to describe and illustrate the Bayesian modelling approach to the analysis of survival data. Emphasis is placed on the modeling of the data and the interpretation of the results. Crucial to this is an understanding of the nature of the incomplete or 'censored' data encountered. Analytic approximation and simulation tools are covered here, but most of the emphasis is on Markov chain Monte Carlo methods, including the independence Metropolis algorithm, which is currently the most popular technique. For analytic approximation, among the various optimization algorithms considered, the trust region method is found to perform best. In this paper, the TPGE model is also used to analyze the lifetime data in the Bayesian paradigm. Results are evaluated on the above-mentioned real survival data set. The analytic approximation and simulation methods are implemented using several software packages. It is clear from our findings that simulation tools provide better results than those obtained by asymptotic approximation.
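As a rough sketch of the simulation side of such an analysis, the Python code below runs a random-walk Metropolis sampler on right-censored survival data. It is not the authors' R/JAGS code: the TPGE likelihood is replaced by a simple exponential survival likelihood, and the data and prior are placeholders.

```python
import numpy as np

# Hypothetical right-censored survival data: observed times and censoring flags
# (1 = event observed, 0 = right-censored). Placeholder values, not the paper's data.
t = np.array([2.1, 3.5, 0.8, 5.2, 7.9, 1.4, 6.3, 4.0])
d = np.array([1,   1,   1,   0,   1,   1,   0,   1])

def log_post(log_lam):
    """Log-posterior of a simple exponential survival model with a vague
    normal prior on log(rate); stands in for the TPGE likelihood."""
    lam = np.exp(log_lam)
    loglik = np.sum(d * np.log(lam) - lam * t)   # events contribute f(t), censored contribute S(t)
    logprior = -0.5 * (log_lam / 10.0) ** 2      # N(0, 10^2) prior on log(rate)
    return loglik + logprior

# Random-walk Metropolis sampler
rng = np.random.default_rng(0)
n_iter, step = 20000, 0.3
chain = np.empty(n_iter)
cur = 0.0
cur_lp = log_post(cur)
for i in range(n_iter):
    prop = cur + step * rng.normal()
    prop_lp = log_post(prop)
    if np.log(rng.uniform()) < prop_lp - cur_lp:   # accept/reject step
        cur, cur_lp = prop, prop_lp
    chain[i] = cur

rate_samples = np.exp(chain[5000:])                # discard burn-in
print("posterior mean rate:", rate_samples.mean())
```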

Keywords: Bayesian inference, JAGS, Laplace approximation, LaplacesDemon, posterior, R software, simulation

Procedia PDF Downloads 530
4198 Numerical Investigation of Beam-Columns Subjected to Non-Proportional Loadings under Ambient Temperature Conditions

Authors: George Adomako Kumi

Abstract:

The response of structural members, when subjected to various forms of non-proportional loading, plays a major role in the overall stability and integrity of a structure. This research presents the outcome of a finite element investigation, conducted with the finite element software ABAQUS, to validate against experimental results the elastic and inelastic behavior and strength of beam-columns subjected to axial loading, biaxial bending, and torsion under ambient temperature conditions. The ABAQUS model accounts for material and geometric nonlinearity, deformations, and, more specifically, the contact behavior between the beam-columns and support surfaces. The three-dimensional model is compared with the results of physical tests and with a solution algorithm developed using the finite difference method in order to verify the developed model. The results of this research will provide structural engineers with much-needed knowledge about the behavior of steel beam-columns and their response to various non-proportional loading conditions under ambient temperature conditions.
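As a minimal illustration of the finite difference approach referred to above (not the authors' beam-column algorithm), the Python sketch below solves the linear-elastic deflection of a simply supported beam under a uniform load with central differences; all values are assumed.

```python
import numpy as np

# EI * y''(x) = M(x) for a simply supported beam under uniform load q (assumed values).
E, I, L, q, n = 200e9, 8.0e-6, 6.0, 10e3, 101
x = np.linspace(0.0, L, n)
h = x[1] - x[0]
M = q * x * (L - x) / 2.0                  # bending moment (sagging positive)

# Central difference: (y[i-1] - 2*y[i] + y[i+1]) / h^2 = M[i] / (E*I)
A = np.zeros((n, n))
b = M / (E * I) * h**2
A[0, 0] = A[-1, -1] = 1.0                  # boundary conditions y(0) = y(L) = 0
b[0] = b[-1] = 0.0
for i in range(1, n - 1):
    A[i, i - 1], A[i, i], A[i, i + 1] = 1.0, -2.0, 1.0

y = np.linalg.solve(A, b)
print("midspan deflection [m]:", y[n // 2])
print("theory               :", -5 * q * L**4 / (384 * E * I))   # -5qL^4/(384EI)
```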

Keywords: beam-columns, axial loading, biaxial bending, torsion, ABAQUS, finite difference method

Procedia PDF Downloads 175
4197 Grading Fourteen Zones of Isfahan in Terms of the Impact of Globalization on the Urban Fabric of the City, Using the TOPSIS Model

Authors: A. Zahedi Yeganeh, A. Khademolhosseini, R. Mokhtari Malekabadi

Abstract:

Undoubtedly, one of the most far-reaching and controversial topics of the past few decades has been globalization. Globalization lies at the essence of modern culture. It is a complex and rapidly expanding network of links and mutual interdependence that is an aspect of modern life, though some argue that this link has existed since the beginning of human history. If we consider globalization as a dynamic social process in which the geographical constraints governing political, economic, social and cultural relationships have been undermined, it might not be possible to simply describe its impact on the urban fabric. But since this phenomenon involves increased communication between societies (while preserving the main cultural-regional characteristics) and an increased possibility of influencing other societies, the need for further study is felt. The main objective of this study is to grade the fourteen municipal zones of Isfahan based on globalization factors affecting the urban fabric, applying the TOPSIS model. The research method is descriptive-analytical and survey-based. For data analysis, the TOPSIS model and SPSS software were used, and the results for the fourteen zones are shown on a map produced with GIS software. The results show that the process of being influenced by globalization was not similar across the fourteen zones of Isfahan's urban fabric, and there are large differences between city zones in this respect; the most affected areas are municipal zones 5, 6 and 9, and the least affected are zones 4, 3 and 2.
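The TOPSIS ranking itself is straightforward to reproduce; the Python sketch below applies the standard steps (vector normalization, weighting, ideal and anti-ideal solutions, relative closeness) to a hypothetical decision matrix, not the study's actual indicator data.

```python
import numpy as np

# rows = city zones, columns = globalization indicators (illustrative values/weights).
# All criteria are treated as "benefit" type.
X = np.array([[7.0, 3.2, 5.1],
              [4.5, 6.8, 2.9],
              [6.1, 5.5, 7.4],
              [3.3, 2.1, 4.0]])
w = np.array([0.5, 0.3, 0.2])              # criteria weights (sum to 1)

R = X / np.sqrt((X**2).sum(axis=0))        # vector normalization
V = R * w                                  # weighted normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0) # positive and negative ideal solutions

d_pos = np.sqrt(((V - ideal)**2).sum(axis=1))
d_neg = np.sqrt(((V - anti)**2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)        # relative closeness to the ideal

ranking = np.argsort(-closeness)           # zones ordered from most to least affected
print("closeness:", np.round(closeness, 3), "ranking:", ranking)
```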

Keywords: grading, globalization, urban fabric, 14 zones of Isfahan, TOPSIS model

Procedia PDF Downloads 310
4196 Support Vector Regression Combined with Different Optimization Algorithms to Predict Global Solar Radiation on Horizontal Surfaces in Algeria

Authors: Laidi Maamar, Achwak Madani, Abdellah El Ahdj Abdellah

Abstract:

The aim of this work is to use support vector regression (SVR) combined with the dragonfly, firefly, bee colony, and particle swarm optimization algorithms to predict global solar radiation on horizontal surfaces in some cities in Algeria. Combining these optimization algorithms with SVR aims principally to enhance accuracy by fine-tuning the parameters, speeding up the convergence of the SVR model, and exploring a larger search space efficiently; these parameters are the regularization parameter (C), the kernel parameters, and the epsilon parameter. By doing so, the aim is to improve the generalization and predictive accuracy of the SVR model. Overall, the goal is to leverage the strengths of both SVR and the optimization algorithms to create a more powerful and effective regression model for various cities and under different climate conditions. Results demonstrate close agreement between predicted and measured data in terms of different metrics. In summary, SVR has proven to be a valuable tool in modeling global solar radiation, offering accurate predictions and demonstrating versatility when combined with other algorithms or used in hybrid forecasting models.
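A minimal sketch of the parameter tuning described above is given below in Python; a random search over (C, epsilon, gamma) stands in for the dragonfly/firefly/bee colony/PSO optimizers, and the data are synthetic placeholders rather than measured solar radiation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import RandomizedSearchCV
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(42)
X = rng.uniform(size=(300, 4))             # e.g., temperature, humidity, sunshine hours, day of year
y = 2.0 * X[:, 0] + np.sin(6 * X[:, 3]) + 0.1 * rng.normal(size=300)

param_dist = {"C": np.logspace(-1, 3, 50),
              "epsilon": np.logspace(-3, 0, 50),
              "gamma": np.logspace(-3, 1, 50)}
search = RandomizedSearchCV(SVR(kernel="rbf"), param_dist, n_iter=40, cv=5,
                            scoring="neg_root_mean_squared_error", random_state=0)
search.fit(X[:250], y[:250])               # tune C, epsilon, gamma on the training split

pred = search.predict(X[250:])
print("best params:", search.best_params_)
print("RMSE:", np.sqrt(mean_squared_error(y[250:], pred)), "R2:", r2_score(y[250:], pred))
```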

Keywords: support vector regression (SVR), optimization algorithms, global solar radiation prediction, hybrid forecasting models

Procedia PDF Downloads 30
4195 Transient Level in the Surge Chamber at the Robert-Bourassa Generating Station

Authors: Maryam Kamali Nezhad

Abstract:

The Robert-Bourassa development (LG-2), the first to be built on the Grande Rivière, comprises two powerhouses, East and West, each with eight turbine-generator units. Each powerhouse has two tailrace tunnels with an average length of about 1178 m. The LG-2A powerhouse houses 6 turbine-generator units; its water is discharged through two tailrace tunnels with a length of about 1330 m. The objectives of this work at RB (LG-2) are: 1) to establish a new maximum transient level in the surge chamber, 2) to define the new maximum equipment flow rate for the future turbine-generator units, and 3) to ensure safe access to various intervention locations in the surge chamber. The transient levels under normal operating conditions at the RB plant were determined in 2001 by the Hydraulics Unit of HQE using the "Chamber" software, a one-dimensional mass oscillation calculation tool. It is used to determine the variation of the water level in the surge chamber located downstream of a power plant during load shedding of the plant units; it can also be used for a surge chamber upstream of a power plant. The RB (LG-2) study is based on the theoretical nominal geometry of the chamber and the tailrace tunnels and on the flow-level relationship at the outlet of the tunnels established during design. The software is used in such a way that the results have an acceptable margin of safety, especially with respect to the maximum transient level (e.g., resumption of flow at an inopportune time), to take into account the turbulent and three-dimensional aspects of the actual flow in the chamber. Note that the transient levels depend on the water levels in the river and in the surge chambers at steady state. These data are established in the HQP CRP database and updated from time to time. The maximum transient levels in the RB-East and RB-West powerhouse surge chambers were revised based on the latest update (set 4) of the in-river rating curves and the steady-state surge chamber water levels. The results of the revision were also used to update the technical advice on the operating conditions for access to the aforementioned surge chambers while considering the revised calculated water levels.
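For context, a one-dimensional mass oscillation calculation of the kind performed by such software can be sketched with the textbook rigid-column model; the Python example below integrates it for a full load rejection with illustrative geometry and friction values, not the Robert-Bourassa data.

```python
# Rigid-column (1D mass oscillation) sketch of a surge chamber after full load rejection.
g   = 9.81
L   = 1300.0      # tunnel length [m]
A_t = 80.0        # tunnel cross-section [m^2]
A_s = 600.0       # surge chamber cross-section [m^2]
c   = 0.0015      # lumped head-loss coefficient [s^2/m]
Q0  = 400.0       # steady discharge before load rejection [m^3/s]

v = Q0 / A_t                   # initial tunnel velocity
z = -c * v * abs(v)            # initial chamber level (below reservoir by the head loss)
dt, t_end = 0.1, 600.0
z_max = z
for _ in range(int(t_end / dt)):
    # after rejection the turbine flow is zero; tunnel flow fills/empties the chamber
    dv = -(g / L) * (z + c * v * abs(v))   # tunnel momentum (rigid water column)
    dz = (A_t * v) / A_s                   # chamber continuity with Q_turbine = 0
    v += dv * dt
    z += dz * dt
    z_max = max(z_max, z)

print("maximum transient level above reservoir [m]:", round(z_max, 2))
```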

Keywords: generating station, surge chamber, maximum transient level, hydroelectric power station, turbine-generator, reservoir

Procedia PDF Downloads 80
4194 Modeling of the Attitude Control Reaction Wheels of a Spacecraft in Software in the Loop Test Bed

Authors: Amr AbdelAzim Ali, G. A. Elsheikh, Moutaz M. Hegazy

Abstract:

Reaction wheels (RWs) are generally used as the main actuators in the attitude control system (ACS) of a spacecraft (SC) for fast orientation and high pointing accuracy. In order to achieve the required accuracy of the RW model, the main characteristics of the RWs that must be analyzed during the ACS design phase, namely the technical features, the operating sequence and the RW control logic, are included in a functional (behavioral) model. A mathematical model is developed that includes the various error sources. The errors in control torque include relative error, absolute error, and error due to time delay, while the errors in angular velocity include differences between average and real speed, resolution error, play in the installation of the angular sensor, and synchronization errors. The friction torque presented in the model includes the different features of friction phenomena: steady-velocity friction, static friction and break-away torque, and frictional lag. The model response is compared with the experimental torque and frequency-response characteristics of tested RWs. Based on the created RW model, some criteria for the optimization-based control torque allocation problem can be recommended, such as avoiding zero-speed crossing, biasing the angular velocity, or preventing wheels from running at the same angular velocity.
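A hedged sketch of the friction part of such a model is shown below in Python: a Stribeck-type curve captures the static/break-away and steady-velocity terms, and a first-order lag approximates the frictional lag; all parameter values are assumed, not the tested wheels' data.

```python
import numpy as np

T_coulomb  = 2.0e-3     # [N*m] steady-velocity (Coulomb) friction
T_static   = 3.5e-3     # [N*m] break-away torque
w_stribeck = 0.5        # [rad/s] Stribeck velocity
b_viscous  = 1.0e-5     # [N*m*s/rad] viscous coefficient
tau_lag    = 0.05       # [s] friction lag time constant

def friction_steady(w):
    """Steady-state friction torque (Stribeck curve) at wheel speed w [rad/s]."""
    return np.sign(w) * (T_coulomb
                         + (T_static - T_coulomb) * np.exp(-(w / w_stribeck) ** 2)) \
           + b_viscous * w

def friction_lagged(w_profile, dt):
    """Apply a first-order lag to the steady friction curve over a speed profile."""
    T, out = 0.0, []
    for w in w_profile:
        T += dt / tau_lag * (friction_steady(w) - T)
        out.append(T)
    return np.array(out)

t = np.arange(0.0, 5.0, 0.01)
w = 50.0 * np.sin(0.5 * t)                  # example wheel speed profile [rad/s]
print("peak friction torque [N*m]:", friction_lagged(w, 0.01).max())
```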

Keywords: friction torque, reaction wheels modeling, software in the loop, spacecraft attitude control

Procedia PDF Downloads 262
4193 Semantic Differences between Bug Labeling of Different Repositories via Machine Learning

Authors: Pooja Khanal, Huaming Zhang

Abstract:

Labeling of issues/bugs, also known as bug classification, plays a vital role in software engineering. Some known labels/classes of bugs are 'User Interface', 'Security', and 'API'. Most of the time, when reporters report a bug, they try to assign some predefined label to it. Those issues are reported for a project, and each project is a repository in GitHub/GitLab, which contains multiple issues. There are many software project repositories, ranging from individual projects to commercial projects. The labels assigned in different repositories may depend on various factors like human instinct, generalization of labels, the label assignment policy followed by the reporter, etc. While the reporter of an issue may instinctively give that issue a label, another person reporting the same issue may label it differently. As a result, it is not known mathematically whether a label in one repository is similar to or different from a label in another repository. Hence, the primary goal of this research is to find the semantic differences between bug labeling of different repositories via machine learning. Independent optimal classifiers for individual repositories are built first using text features from the reported issues; the optimal classifiers may include a combination of multiple classifiers stacked together. Then, those classifiers are used to cross-test other repositories, which allows the similarity of labels to be deduced mathematically. The product of this ongoing research includes a formalized open-source GitHub issues database that is used to deduce the similarity of the labels pertaining to the different repositories.
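The cross-testing idea can be sketched as follows in Python: a classifier tuned on one repository's issue texts is evaluated on another repository's labeled issues. The tiny issue texts and labels are placeholders, not the formalized GitHub issues database.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.metrics import classification_report

repo_a_texts  = ["button misaligned on login page", "XSS vulnerability in comment form",
                 "endpoint returns 500 on empty payload", "dark mode colours are wrong"]
repo_a_labels = ["User Interface", "Security", "API", "User Interface"]

repo_b_texts  = ["token leaks in debug logs", "REST call times out for large queries"]
repo_b_labels = ["Security", "API"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(repo_a_texts, repo_a_labels)                 # classifier built on repository A

pred_b = clf.predict(repo_b_texts)                   # cross-test on repository B
print(classification_report(repo_b_labels, pred_b, zero_division=0))
```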

Keywords: bug classification, bug labels, GitHub issues, semantic differences

Procedia PDF Downloads 192
4192 Performance Evaluation of Hierarchical Location-Based Services Coupled to the Greedy Perimeter Stateless Routing Protocol for Wireless Sensor Networks

Authors: Rania Khadim, Mohammed Erritali, Abdelhakim Maaden

Abstract:

Nowadays, wireless sensor networks have attracted worldwide research and industrial interest because they can be applied in various areas. Geographic routing protocols are very suitable for those networks because they use location information when they need to route packets. This location information is maintained by location-based services provided by network nodes in a distributed way. In this paper, we evaluate the performance of two hierarchical rendezvous location-based services, GLS (Grid Location Service) and HLS (Hierarchical Location Service), coupled to the GPSR routing protocol (Greedy Perimeter Stateless Routing) for wireless sensor networks. The simulations were performed using the NS2 simulator to evaluate the performance and power of the two services in terms of location overhead, request travel time (RTT) and query success ratio (QSR). This work also presents a new scalability study of both GLS and HLS, specifically of what happens when the number of nodes N increases. The study focuses on three metrics: the location maintenance cost, the location query cost and the storage cost.

Keywords: location based-services, routing protocols, scalability, wireless sensor networks

Procedia PDF Downloads 368
4191 Contemplating Charge Transport by Modeling of DNA Nucleobases Based Nano Structures

Authors: Rajan Vohra, Ravinder Singh Sawhney, Kunwar Partap Singh

Abstract:

Electrical charge transport through two basic DNA nucleobases, thymine and adenine, has been investigated and analyzed using the jellium model approach. The FFT-2D computations have been performed with the semi-empirical extended Huckel theory using an atomistic toolkit to evaluate charge transport metrics such as current and conductance. The obtained data are further evaluated in terms of the transmission spectrum, the HOMO-LUMO gap (HLG) and the number of electrons. We have scrutinized the behavior of the devices in the range of -2 V to 2 V with a step size of 0.2 V. We observe that both thymine and adenine can act as molecular devices when sandwiched between two gold probes. A prominent observation is a drop in the HLGs of adenine and thymine when working as a device compared to their intrinsic values, and this is comparatively more visible in the case of adenine. The current in the thymine-based device increases linearly with voltage in spite of its low conductance. Further, the broader transmission peaks represent the strong coupling of the electrodes to the scattering molecule (thymine). Moreover, the observed current in the case of thymine is almost 3-4 times that observed for adenine. A negative differential resistance (NDR) effect is observed in the adenine-based device at higher bias voltages and can be utilized in various future electronics applications.

Keywords: adenine, DNA, extended Huckel, thymine, transmission spectra

Procedia PDF Downloads 153
4190 An Assessment of the Hip Muscular Imbalance for Patients with Rheumatism

Authors: Anthony Bawa, Konstantinos Banitsas

Abstract:

Rheumatism is a muscular disorder that affects the muscles of the upper and lower limbs. This condition could potentially progress to impair the movement of patients. This study aims to investigate hip muscular imbalance in patients with chronic rheumatism. A clinical trial involving a total of 15 participants, made up of 10 patients and 5 control subjects, took place in KATH Hospital between August and September. Participants recruited for the study were of age 54 ± 8 years, weight 65 ± 8 kg, and height 176 ± 8 cm. Muscle signals were recorded from the rectus femoris and vastus lateralis on the right and left hip of each participant. The parameters used in determining the hip muscular imbalances were the maximum voluntary contraction (MVC%), the mean difference, and hip muscle fatigue levels. The mean signals were compared using a t-test, and the metrics for muscle fatigue assessment were the root mean square (RMS), mean absolute value (MAV) and mean frequency (MEF), computed for the hip muscles of the participants. The results indicated that there were significant imbalances in muscle coactivity between the right and left hip muscles of patients. The patients' MVC values were observed to differ by more than 10% when compared with control subjects. Furthermore, the mean difference was higher among patients (p > 0.002), which indicated clear differences in the hip muscle contraction activities. The findings indicate significant hip muscular imbalances for patients with rheumatism compared with control subjects. Information about these imbalances will be useful for clinicians in designing therapeutic muscle-strengthening exercises.
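For reference, the three fatigue-assessment metrics named above can be computed from an EMG channel as sketched below in Python; the signal is synthetic and serves only as a placeholder.

```python
import numpy as np
from scipy.signal import welch

fs = 1000.0                                   # sampling rate [Hz]
rng = np.random.default_rng(1)
emg = rng.normal(scale=0.1, size=int(5 * fs)) # 5 s of a hypothetical EMG recording

rms = np.sqrt(np.mean(emg ** 2))              # root mean square
mav = np.mean(np.abs(emg))                    # mean absolute value

f, pxx = welch(emg, fs=fs, nperseg=1024)      # power spectral density
mef = np.sum(f * pxx) / np.sum(pxx)           # spectral centroid = mean frequency

print(f"RMS={rms:.4f}  MAV={mav:.4f}  MEF={mef:.1f} Hz")
```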

Keywords: muscular, imbalances, rheumatism, hip

Procedia PDF Downloads 110
4189 Teaching Tools for Web Processing Services

Authors: Rashid Javed, Hardy Lehmkuehler, Franz Josef-Behr

Abstract:

Web Processing Services (WPS) have attracted growing interest in geoinformation research. However, teaching about them is difficult because of the generally complex circumstances of their use, which limit the possibilities for hands-on exercises on Web Processing Services. To support understanding, a Training Tools Collection was therefore initiated at the University of Applied Sciences Stuttgart (HFT). It is limited to the scope of geostatistical interpolation of sample point data, where different algorithms can be used, such as IDW, nearest neighbor, etc. The Tools Collection aims to support understanding of the scope, definition and deployment of Web Processing Services. For example, it is necessary to characterize the input of the interpolation by the data set, the parameters of the algorithm and the interpolation results (here a grid of interpolated values is assumed). This paper reports on first experiences using a pilot installation. This was intended to find suitable software interfaces for later full implementations and to draw conclusions on potential user interface characteristics. Experience was gained with the Deegree software, one of several service suites (collections). Being programmed strictly in Java, Deegree offers several OGC-compliant service implementations that also promise to be of benefit for the project. The mentioned parameters for a WPS were formalized following the paradigm that any meaningful component is defined in terms of suitable standards, e.g., the data output can be defined as a GML file. However, the choice of meaningful information pieces and user interactions is not free but is partially determined by the selected WPS processing suite.
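A minimal sketch of the IDW interpolation exposed by such a training tool is given below in Python; sample coordinates and values are illustrative placeholders.

```python
import numpy as np

# Sample points in, a grid of interpolated values out (the WPS input/output described above).
pts  = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 5.0]])
vals = np.array([1.0, 3.0, 2.0, 4.0, 2.5])

def idw(xy, pts, vals, power=2.0, eps=1e-12):
    """Interpolate the value at xy from sample points using inverse distance weights."""
    d = np.linalg.norm(pts - xy, axis=1)
    if d.min() < eps:                       # exactly on a sample point
        return vals[d.argmin()]
    w = 1.0 / d ** power
    return np.sum(w * vals) / np.sum(w)

xs, ys = np.linspace(0, 10, 21), np.linspace(0, 10, 21)
grid = np.array([[idw(np.array([x, y]), pts, vals) for x in xs] for y in ys])
print("interpolated value at (2.5, 7.5):", round(idw(np.array([2.5, 7.5]), pts, vals), 3))
```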

Keywords: deegree, interpolation, IDW, web processing service (WPS)

Procedia PDF Downloads 353
4188 Embedded Test Framework: A Solution Accelerator for Embedded Hardware Testing

Authors: Arjun Kumar Rath, Titus Dhanasingh

Abstract:

Embedded product development requires software to test hardware functionality during development and to find issues during manufacturing in larger quantities. As the components are getting integrated, the devices are tested for their full functionality using advanced software tools. Benchmarking tools are used to measure and compare the performance of product features. At present, these tests are based on a variety of methods involving varying hardware and software platforms. Typically, these tests are custom built for every product and remain unusable for other variants. A majority of the tests go undocumented, are not updated, and become unusable when the product is released. To bridge this gap, a solution accelerator in the form of a framework can address these issues by running all these tests from one place, using an off-the-shelf test library in a continuous integration environment. There are many open-source test frameworks or tools (Fuego, LAVA, AutoTest, KernelCI, etc.) designed for testing embedded system devices, each having several unique good features, but no single tool and framework may satisfy all of the testing needs for embedded systems; hence an extensible framework incorporating a multitude of tools is needed. Embedded product testing includes board bring-up testing, testing during manufacturing, firmware testing, application testing, and assembly testing. Traditional test methods involve developing test libraries and support components for every new hardware platform that belongs to the same domain with identical hardware architecture. This approach has drawbacks such as non-reusability, where platform-specific libraries cannot be reused, the need to maintain source infrastructure for individual hardware platforms, and, most importantly, the time taken to re-develop test cases for new hardware platforms. These limitations create challenges in environment set-up for testing, scalability, and maintenance. A desirable strategy is certainly one focused on maximizing reusability and continuous integration and on leveraging artifacts across the complete development cycle, across phases of testing and across a family of products. To overcome the stated challenges of the conventional method and offer the benefits of embedded testing, an embedded test framework (ETF), a solution accelerator, is designed that can be deployed in embedded system-related products with minimal customization and maintenance to accelerate hardware testing. The embedded test framework supports testing of different hardware, including microprocessors and microcontrollers. It offers benefits such as (1) time-to-market: it accelerates board bring-up time with prepackaged test suites supporting all necessary peripherals, which speeds up the design and development stages (board bring-up, manufacturing and device drivers); (2) reusability: framework components are isolated from the platform-specific HW initialization and configuration, which makes adapting test cases across various platforms quick and simple (see the sketch after this abstract); (3) an effective build and test infrastructure with multiple test interface options, pre-integrated with the Fuego framework; (4) continuous integration: pre-integration with Jenkins enables continuous testing and an automated software update feature. Applying the embedded test framework accelerator throughout the design and development phase enables the development of well-tested systems before functional verification and improves time to market to a large extent.
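The reusability point (2) can be sketched as follows: test cases are written against an abstract board interface so that only the platform-specific initialization is re-implemented per hardware platform. The Python classes and names below are illustrative, not part of the ETF itself.

```python
from abc import ABC, abstractmethod

class Board(ABC):
    """Abstract hardware platform; only this part is re-implemented per board."""
    @abstractmethod
    def init(self): ...
    @abstractmethod
    def read_gpio(self, pin: int) -> int: ...

class VendorXBoard(Board):
    def init(self):
        print("configuring clocks and pinmux for VendorX")   # platform-specific part
    def read_gpio(self, pin: int) -> int:
        return 1                                              # stub for illustration

class GpioLoopbackTest:
    """Reusable test case: knows nothing about the concrete hardware platform."""
    def __init__(self, board: Board):
        self.board = board
    def run(self) -> bool:
        self.board.init()
        return self.board.read_gpio(7) == 1

if __name__ == "__main__":
    result = GpioLoopbackTest(VendorXBoard()).run()
    print("GPIO loopback:", "PASS" if result else "FAIL")
```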

Keywords: board diagnostics software, embedded system, hardware testing, test frameworks

Procedia PDF Downloads 137
4187 Visual Template Detection and Compositional Automatic Regular Expression Generation for Business Invoice Extraction

Authors: Anthony Proschka, Deepak Mishra, Merlyn Ramanan, Zurab Baratashvili

Abstract:

Small and medium-sized businesses receive over 160 billion invoices every year. Since these documents exhibit many subtle differences in layout and text, extracting structured fields such as sender name, amount, and VAT rate from them automatically is an open research question. In this paper, existing work in template-based document extraction is extended, and a system is devised that is able to reliably extract all required fields for up to 70% of all documents in the data set, more than any other previously reported method. Approaches are described for 1) detecting through visual features which template a given document belongs to, 2) automatically generating extraction rules for a given new template by composing regular expressions from multiple components, and 3) computing confidence scores that indicate the accuracy of the automatic extractions. The system can generate templates from as little as one training sample and only requires the ground truth field values instead of detailed annotations such as bounding boxes that are hard to obtain. The system is deployed and used inside commercial accounting software.
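The second approach, composing extraction rules from regular expression components, can be sketched as below in Python; the component patterns, field name, confidence heuristic and sample text are illustrative and not the deployed system's actual templates.

```python
import re

COMPONENTS = {
    "currency": r"(?:EUR|USD|\$)",
    "amount":   r"\d{1,3}(?:[.,]\d{3})*[.,]\d{2}",
    "vat_rate": r"\d{1,2}(?:[.,]\d)?\s?%",
}

def compose(label, *parts):
    """Join component patterns (or raw snippets) into one named capture group."""
    body = ''.join(COMPONENTS[p] if p in COMPONENTS else p for p in parts)
    return rf"(?P<{label}>{body})"

total_rule = re.compile(r"Total\s+amount\s*:?\s*" + compose("total", "currency", r"\s?", "amount"),
                        re.IGNORECASE)

invoice_text = "Invoice 2021-0042\nVAT 19 %\nTotal amount: EUR 1.234,56"
match = total_rule.search(invoice_text)
if match:
    # crude confidence heuristic: fraction of expected context characters actually matched
    confidence = min(1.0, len(match.group(0)) / 30.0)
    print("total =", match.group("total"), "confidence =", round(confidence, 2))
```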

Keywords: data mining, information retrieval, business, feature extraction, layout, business data processing, document handling, end-user trained information extraction, document archiving, scanned business documents, automated document processing, F1-measure, commercial accounting software

Procedia PDF Downloads 126
4176 The Usefulness of Medical Scribes in the Emergency Department

Authors: Victor Kang, Sirene Bellahnid, Amy Al-Simaani

Abstract:

Efficient documentation and completion of clerical tasks are pillars of efficient, patient-centered care in acute settings such as the emergency department (ED). Medical scribes aid physicians with documentation, navigation of electronic health records, results gathering, and coordination of communication with other healthcare teams. However, the use of medical scribes is not widespread, and some hospitals have even discontinued their programs. One reason for this could be the lack of studies that have outlined concrete improvements in efficiency and in patient and provider satisfaction in emergency departments before and after incorporating scribes. Methods: We conducted a review of the literature concerning the implementation of medical scribe programs and emergency department performance. For this review, a narrative synthesis accompanied by textual commentaries was chosen to present the selected papers. PubMed was searched exclusively. Initially, no date limits were set, but seeing as the electronic medical record was officially implemented in Canada in 2013, studies published after this date were preferred as they provided insight into the interplay between its implementation and scribes on quality improvement. Results: Throughput, efficiency, and cost-effectiveness were the most commonly used parameters in evaluating scribes in the emergency department. Important throughput metrics, specifically door-to-doctor and disposition time, were significantly decreased in emergency departments that utilized scribes. Of note, this was shown to be the case in community hospitals, where the burden of documentation and clerical tasks falls directly upon the attending physician. Academic centers differ in that they rely heavily on residents and students, so the implementation of scribes has been shown to have a limited effect on these metrics. However, unique to academic centers was the providers' perception of increased time for teaching. Consequently, providers express increased work satisfaction in relation to time spent with patients and on teaching. Patients, on the other hand, did not demonstrate a decrease in satisfaction with the care provided, but no significant increase was observed either. Of the studies we reviewed, one of the biggest limitations was the lack of significance in the data. While many individual studies reported that medical scribes in emergency rooms improved relative value units, patient satisfaction, provider satisfaction, and the number of patients seen, there was no statistically significant improvement in the above criteria when the data were compiled in a systematic review. There is also a clear publication bias; very few studies with negative results were published. To prove significance, data from more emergency rooms with scribe programs would need to be compiled, including emergency rooms that did not report noticeable benefits. Furthermore, most data sets focused only on scribes in academic centers. Conclusion: Ultimately, the literature suggests that while emergency room physicians who have access to medical scribes report higher satisfaction due to lower clerical burdens and can see more patients per shift, there is still variability in terms of patient and provider satisfaction. Whether this variability is due to differences in training (in-house trainees versus contractors), population profile (adult versus pediatric), setting (academic versus community), or the shifts scribes work cannot be determined from the existing studies. Ultimately, more scribe programs need to be evaluated to determine whether these variables affect outcomes and whether scribes significantly improve emergency room efficiency.

Keywords: emergency medicine, medical scribe, scribe, documentation

Procedia PDF Downloads 88
4185 A Verification Intellectual Property for Multi-Flow Rate Control on Any Single Flow Bus Functional Model

Authors: Pawamana Ramachandra, Jitesh Gupta, Saranga P. Pogula

Abstract:

In the verification of high-volume and complex packet processing IPs, finer control of flow management aspects (for example, rate, bits/sec, etc.) per flow class (or virtual channel or software thread) is needed. When Software/Universal Verification Methodology (UVM) thread arbitration is left to the simulator (e.g., the Verilog Compiler Simulator (VCS) or the Incisive Enterprise Simulator core simulation engine (NCSIM)), it is hard to predict the resulting distribution of bandwidth across threads. In many cases, the patterns desired in a test scenario may not be accomplished, as the simulator might give a different distribution than what was required. This can lead to missing multiple traffic scenarios, specifically deadlock- and starvation-related ones. We invented a component (namely, the Flow Manager Verification IP) that intervenes between the application (test case) and the protocol VIP (with the UVM sequencer) to control the bandwidth per thread/virtual channel/flow. The Flow Manager has knobs visible to the UVM sequence/test to configure the required distribution of rate per thread/virtual channel/flow. This works seamlessly and produces rate stimuli to further harness the Design Under Test (DUT) with asymmetric inputs compared to the programmed bandwidth/Quality of Service (QoS) distributions in the DUT.
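The knob the Flow Manager exposes can be illustrated in a language-agnostic way (Python rather than SystemVerilog/UVM) by a deterministic weighted arbiter that grants transactions in proportion to configured bandwidth shares instead of leaving the choice to simulator thread arbitration; the flow names and weights below are assumptions.

```python
import itertools

weights = {"flow0": 0.5, "flow1": 0.3, "flow2": 0.2}   # required bandwidth fractions

def weighted_arbiter(weights):
    """Yield the flow to grant next, using accumulating credits (smooth weighted round-robin)."""
    credits = {f: 0.0 for f in weights}
    while True:
        for f, w in weights.items():
            credits[f] += w
        winner = max(credits, key=credits.get)          # most accumulated credit wins
        credits[winner] -= 1.0                          # total weight is 1.0
        yield winner

grants = list(itertools.islice(weighted_arbiter(weights), 1000))
for f in weights:
    print(f, "granted", grants.count(f) / len(grants))  # ~0.5 / 0.3 / 0.2
```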

Keywords: flow manager, UVM sequencer, rated traffic generation, quality of service

Procedia PDF Downloads 93
4184 Analysing Waste Management Options in the Printing Industry: Case of a South African Company

Authors: Stanley Fore

Abstract:

The case study company is one of the leading newsprint companies in South Africa. The company has achieved this status through operational expansion, diversification and investment in cutting-edge technology. It has a reputation for the highest quality and personalised service that transcends borders and industries. The company offers a wide variety of small- and large-scale printing services. It faces the challenge of significant waste production during normal operations, generating 1200 kg of plastic waste and 60-70 tonnes of paper waste per month. The company currently operates a waste management process whereby waste paper is sold, at low cost, to recycling firms for further processing. Having considered the quantity of waste being generated, the company has embarked on a venture to find a more profitable solution to its current waste production. As waste management and recycling are not the company's core business, the aim of the venture is to implement a secondary, profitable waste processing business. The venture will be expedited as a strategic project. This research aims to estimate the financial feasibility of a selected solution as well as the impact of non-financial considerations thereof. The financial feasibility is analysed using metrics such as payback period, internal rate of return and net present value.
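For reference, the three financial feasibility metrics named above can be computed as sketched below in Python for a hypothetical cash flow series; the figures are illustrative, not the company's data.

```python
cash_flows = [-500_000, 120_000, 150_000, 180_000, 200_000, 220_000]  # year 0..5, illustrative

def npv(rate, flows):
    """Net present value at a given discount rate."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(flows))

def irr(flows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return by bisection on the NPV sign change."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(flows):
    """First year in which cumulative cash flow turns non-negative."""
    cumulative = 0.0
    for year, cf in enumerate(flows):
        cumulative += cf
        if cumulative >= 0:
            return year
    return None                      # never recovered within the horizon

rate = 0.10                          # assumed discount rate
print("NPV @10%:", round(npv(rate, cash_flows)))
print("IRR     :", round(irr(cash_flows), 4))
print("Payback :", payback_period(cash_flows), "years")
```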

Keywords: waste, printing industry, up-cycling, management

Procedia PDF Downloads 259
4183 Exergetic and Sustainability Evaluation of a Building Heating System in Izmir, Turkey

Authors: Nurdan Yildirim, Arif Hepbasli

Abstract:

Heating, cooling and lighting appliances in buildings account for more than one third of the world's primary energy demand. Therefore, the main components of building heating systems play an essential role in terms of energy consumption. In this context, efficient energy and exergy utilization in HVAC-R systems has become essential, especially for developing energy policies aimed at increasing efficiency. The main objective of the present study is to assess the performance of a family house with a volume of 326.7 m3 and a net floor area of 121 m2, located in the city of Izmir, Turkey, in terms of energetic, exergetic and sustainability aspects. The indoor and exterior air temperatures are taken as 20°C and 1°C, respectively. In the analysis and assessment, various metrics (indices or indicators) such as exergetic efficiency, exergy flexibility ratio and sustainability index are utilized. Two heating options (Case 1: condensing boiler; Case 2: air heat pump) are considered for comparison purposes. The total heat loss rate of the family house is determined to be 3770.72 W. The overall energy efficiencies of the studied cases are calculated to be 49.4% for Case 1 and 54.7% for Case 2. The overall exergy efficiencies, the flexibility factor and the sustainability index of Cases 1 and 2 are computed to be around 3.3%, 0.17 and 1.034, respectively.
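For context, a commonly used definition relates the sustainability index SI to the overall exergy efficiency ψ as SI = 1 / (1 - ψ); assuming this relation is the one used here, ψ = 0.033 gives SI = 1 / (1 - 0.033) ≈ 1.034, consistent with the value reported above.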

Keywords: buildings, exergy, low exergy, sustainability, efficiency, heating, renewable energy

Procedia PDF Downloads 338
4182 Data Science-Based Key Factor Analysis and Risk Prediction of Diabetes

Authors: Fei Gao, Rodolfo C. Raga Jr.

Abstract:

This research proposal aims to ascertain the major risk factors for diabetes and to design a predictive model for risk assessment. The project aims to improve early detection and management of diabetes by utilizing data science techniques, which may improve patient outcomes and healthcare efficiency. The phase relation values of each attribute were used to analyze and choose the attributes that might influence the examinee's survival probability, using the Diabetes Health Indicators Dataset from Kaggle as the research data. We compare and evaluate eight machine learning algorithms. Our investigation begins with comprehensive data preprocessing, including feature engineering and dimensionality reduction, aimed at enhancing data quality. The dataset, comprising health indicators and medical data, serves as a foundation for training and testing these algorithms. A rigorous cross-validation process is applied, and we assess performance using five key metrics: accuracy, precision, recall, F1-score, and area under the receiver operating characteristic curve (AUC-ROC). After analyzing the data characteristics, we investigate their impact on the likelihood of diabetes and develop corresponding risk indicators.
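The evaluation protocol described above can be sketched in Python as follows; a synthetic dataset and a single candidate classifier stand in for the Kaggle data and the eight algorithms compared in the study.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate

# Synthetic, class-imbalanced placeholder for the Diabetes Health Indicators data
X, y = make_classification(n_samples=2000, n_features=21, weights=[0.85, 0.15],
                           random_state=0)

scoring = ["accuracy", "precision", "recall", "f1", "roc_auc"]
cv_results = cross_validate(RandomForestClassifier(n_estimators=200, random_state=0),
                            X, y, cv=5, scoring=scoring)   # 5-fold cross-validation

for metric in scoring:
    scores = cv_results[f"test_{metric}"]
    print(f"{metric:9s} {scores.mean():.3f} +/- {scores.std():.3f}")
```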

Keywords: diabetes, risk factors, predictive model, risk assessment, data science techniques, early detection, data analysis, Kaggle

Procedia PDF Downloads 71
4181 Establishment of Precision System for Underground Facilities Based on 3D Absolute Positioning Technology

Authors: Yonggu Jang, Jisong Ryu, Woosik Lee

Abstract:

The study aims to address the limitations of existing underground facility exploration equipment in terms of exploration depth range, relative depth measurement, data processing time, and human-centered interpretation of ground penetrating radar (GPR) images. To this end, it proposes and establishes a precision underground facility exploration system based on 3D absolute positioning technology that can accurately survey up to a depth of 5 m and measure the 3D absolute location of underground facilities. Both software and hardware technologies were developed to build the system. The software technologies include absolute positioning, ground-surface location synchronization for the GPR exploration equipment, AI-based interpretation of GPR exploration images, and composite data processing based on integrated underground space maps; the software builds a precise 3D DEM, synchronizes the GPR sensor's 3D ground-surface coordinates, automatically analyzes and detects underground facility information in GPR exploration images, and improves accuracy through comparative analysis of the three-dimensional location information. The hardware systems comprise a vehicle-type exploration system and a cart-type exploration system. Data were collected with the developed system, the GPR exploration images were analyzed using AI technology, and the three-dimensional location information of the detected facilities was compared against the integrated underground space map. The proposed precision exploration system significantly contributes to establishing precise location information of underground facilities, which is crucial for underground safety management in Korea, and improves the accuracy and efficiency of exploration.

Keywords: 3D absolute positioning, AI interpretation of GPR exploration images, complex data processing, integrated underground space maps, precision exploration system for underground facilities

Procedia PDF Downloads 58
4180 Development and Verification of the Idom Shielding Optimization Tool

Authors: Omar Bouhassoun, Cristian Garrido, César Hueso

Abstract:

Radiation shielding design is an optimization problem with multiple, constrained objective functions (radiation dose, weight, price, etc.) that depend on several parameters (material, thickness, position, etc.). The classical approach to shielding design consists of a brute-force trial-and-error process guided by previous designer experience. The result is therefore an empirical solution, not an optimal one, which can degrade the overall performance of the shielding. In order to automate the shielding design procedure, the IDOM Shielding Optimization Tool (ISOT) has been developed. This software combines optimization algorithms with the capability to read/write input files, run calculations, and parse output files for different radiation transport codes. In the first stage, the software was set up to adjust the input files for two well-known Monte Carlo codes (MCNP and Serpent) and to optimize the result (weight, volume, price, dose rate) using multi-objective genetic algorithms. Nevertheless, its modular implementation easily allows the inclusion of further radiation transport codes and optimization algorithms. The work related to the development of ISOT and its verification on a simple 3D multi-layer shielding problem using both MCNP and Serpent will be presented. ISOT looks very promising for achieving optimal solutions to complex shielding problems.
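A toy sketch of the optimization loop that ISOT automates is given below in Python; a simple exponential attenuation proxy replaces the actual MCNP/Serpent run-and-parse step, a weighted sum replaces the multi-objective treatment, and all material data are illustrative.

```python
import math
import random

MU  = {"steel": 0.65, "concrete": 0.15}     # illustrative attenuation coefficients [1/cm]
RHO = {"steel": 7.85, "concrete": 2.3}      # illustrative densities [g/cm^3]

def evaluate(ind):
    """Score a candidate shield (steel, concrete thicknesses in cm); lower is better."""
    t_steel, t_conc = ind
    dose   = 100.0 * math.exp(-(MU["steel"] * t_steel + MU["concrete"] * t_conc))
    weight = RHO["steel"] * t_steel + RHO["concrete"] * t_conc
    return 1.0 * dose + 0.05 * weight       # weighted sum of the two objectives

def mutate(ind):
    return [max(0.0, t + random.gauss(0, 1.0)) for t in ind]

random.seed(0)
pop = [[random.uniform(0, 20), random.uniform(0, 60)] for _ in range(30)]
for _ in range(50):                          # generations
    pop.sort(key=evaluate)
    parents = pop[:10]                       # elitist selection
    children = [mutate(random.choice(parents)) for _ in range(20)]
    pop = parents + children

best = min(pop, key=evaluate)
print("best layers [cm] (steel, concrete):", [round(t, 1) for t in best],
      "score:", round(evaluate(best), 2))
```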

Keywords: optimization, shielding, nuclear, genetic algorithm

Procedia PDF Downloads 104
4179 Analyzing Migration Patterns Using Public Disorder Event Data

Authors: Marie E. Docken

Abstract:

At some point in the lifecycle of a country, patterns of political and social unrest of varying degrees are observed. Events involving public disorder or civil disobedience may produce effects that span a wide spectrum of outcomes, depending on the level of unrest. Many previous studies, primarily theoretical in nature, have attempted to measure public disorder and to answer why or how it occurs in society by examining causal factors or underlying issues in the social or political position of a population. The main objective in doing so is to understand how these activities evolve or to obtain some predictive capability for the events. In contrast, this research fuses analytics and social studies to provide more knowledge of the intensity of public disorder and civil disobedience in populations. With a greater understanding of the magnitude of these events, it is believed that we may learn how they relate to extreme actions such as mass migration or violence. Upon establishing a model for measuring civil unrest based upon empirical data, a case study of various Latin American countries is performed. Interpretations of historical events are combined with analytical results to provide insights regarding the magnitude and effect of social and political activism.

Keywords: public disorder, civil disobedience, Latin America, metrics, data analysis

Procedia PDF Downloads 142
4178 Lockit: A Logic Locking Automation Software

Authors: Nemanja Kajtez, Yue Zhan, Basel Halak

Abstract:

The significant rise in the cost of manufacturing nanoscale integrated circuits (ICs) has led the majority of IC design companies to outsource the fabrication of their products to other companies, often located in different countries. This multinational nature of the hardware supply chain has led to a host of security threats, including IP piracy, IC overproduction, and Trojan insertion. To combat these, researchers have proposed logic locking techniques to protect the intellectual property of the design and increase the difficulty of malicious modification of its functionality. However, the adoption of logic locking approaches is rather slow due to the lack of integration with the IC production process and the limited efficacy of existing algorithms. This work automates the logic locking process by developing software in Python that performs the locking on a gate-level netlist and can be integrated with existing digital synthesis tools. Analysis of the latest logic locking algorithms has shown that the SFLL-HD algorithm is one of the most secure and versatile in trading off levels of protection against different types of attacks and was thus selected for implementation. The presented tool can also be expanded to incorporate the latest locking mechanisms to keep up with the fast-paced development in this field. The paper also presents a case study to demonstrate the functionality of the tool and how it could be used to explore the design space and compare different locking solutions. The source code of this tool is available freely from (https://www.researchgate.net/publication/353195333_Source_Code_for_The_Lockit_Tool).
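To illustrate what key-gate insertion on a gate-level netlist looks like, the Python sketch below applies basic random XOR/XNOR logic locking to a toy netlist; this is a simplified stand-in, not the SFLL-HD algorithm implemented by the tool, and the netlist format is an assumption.

```python
import random

# Toy netlist: (output wire, gate type, input wires)
netlist = [
    ("n1", "AND",  ["a", "b"]),
    ("n2", "OR",   ["n1", "c"]),
    ("y",  "NAND", ["n2", "d"]),
]

def lock(netlist, n_keys, seed=0):
    """Insert n_keys XOR/XNOR key gates on randomly chosen wires (random logic locking)."""
    random.seed(seed)
    key = [random.randint(0, 1) for _ in range(n_keys)]
    locked = list(netlist)
    targets = random.sample(range(len(netlist)), n_keys)
    for k, idx in enumerate(targets):
        out, gate, ins = locked[idx]
        locked[idx] = (f"{out}_pre", gate, ins)            # rename the original driver
        # correct key bit 0 -> XOR passes the signal through; 1 -> XNOR does
        kgate = "XOR" if key[k] == 0 else "XNOR"
        locked.append((out, kgate, [f"{out}_pre", f"key{k}"]))
    return locked, key

locked_netlist, correct_key = lock(netlist, n_keys=2)
print("correct key:", correct_key)
for line in locked_netlist:
    print(line)
```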

Keywords: design automation, hardware security, IP piracy, logic locking

Procedia PDF Downloads 173
4177 Diversity for Safety and Security of Autonomous Vehicles against Accidental and Deliberate Faults

Authors: Anil Ranjitbhai Patel, Clement John Shaji, Peter Liggesmeyer

Abstract:

The safety and security of autonomous vehicles (AVs) are a growing concern: first, due to the increased number of safety-critical functions taken over by automotive embedded systems; second, due to the increased exposure of these software-intensive systems to potential attackers; and third, due to dynamic interaction in an uncertain and unknown environment at runtime, which changes the functional and non-functional properties of the system. Frequently occurring environmental uncertainties, random component failures, and compromised security of the AVs might result in hazardous events, sometimes even in an accident, if left undetected. Beyond these technical issues, we argue that the safety and security of AVs against accidental and deliberate faults are poorly understood and rarely implemented. One possible way to overcome this is the well-known diversity approach. As an effective approach to increasing safety and security, diversity has been widely used in the aviation, railway, and aerospace industries. Thus, this paper proposes a fault tolerance by diversity model that takes into consideration the mitigation of accidental and deliberate faults through the application of structural and variant redundancy. The model can be used to design AVs with various types of diversity in a hardware- and software-based multi-version system. The paper evaluates the presented approach by employing an example from adaptive cruise control, followed by a discussion of the case study with initial findings.
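A minimal sketch of variant redundancy with voting is shown below in Python for an adaptive cruise control braking command: three diverse implementations are compared by a 2-out-of-3 voter that masks a single faulty or compromised channel. The controller gains and tolerance are illustrative assumptions, not a validated ACC design.

```python
# rel_speed = speed of the lead vehicle relative to ego (positive = pulling away)
def variant_a(gap_m, rel_speed): return max(0.0, 0.4 * (30.0 - gap_m) - 0.8 * rel_speed)
def variant_b(gap_m, rel_speed): return max(0.0, 0.38 * (30.0 - gap_m) - 0.82 * rel_speed)
def variant_c(gap_m, rel_speed): return 9.9   # a faulty/compromised channel for the demo

def vote_2oo3(values, tol=0.5):
    """Return the mean of the first pair that agrees within tol, else a safe fallback."""
    for i, j in [(0, 1), (0, 2), (1, 2)]:
        if abs(values[i] - values[j]) <= tol:
            return (values[i] + values[j]) / 2.0
    return max(values)                        # fail-safe: brake hardest if no agreement

gap, rel_v = 18.0, 1.5                        # 18 m gap, lead vehicle pulling away at 1.5 m/s
cmds = [f(gap, rel_v) for f in (variant_a, variant_b, variant_c)]
print("channel outputs:", [round(c, 2) for c in cmds])
print("voted brake command:", round(vote_2oo3(cmds), 2))   # faulty channel is masked
```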

Keywords: autonomous vehicles, diversity, fault-tolerance, adaptive cruise control, safety, security

Procedia PDF Downloads 125
4176 A Graph-Based Retrieval Model for Passage Search

Authors: Junjie Zhong, Kai Hong, Lei Wang

Abstract:

Passage retrieval (PR) plays an important role in many natural language processing (NLP) tasks. Traditional efficient retrieval models relying on exact term matching, such as TF-IDF or BM25, have nowadays been surpassed by pre-trained language models, which match by semantics. Though they gain effectiveness, deep language models often require large memory and incur high time costs. To tackle the trade-off between efficiency and effectiveness in PR, this paper proposes Graph Passage Retriever (GraphPR), a graph-based model inspired by the development of graph learning techniques. Different from existing works, GraphPR is end-to-end and integrates both term-matching information and semantics. GraphPR constructs a passage-level graph from BM25 retrieval results and trains a GCN-like model on the graph with graph-based objectives. Passages are regarded as nodes in the constructed graph and are embedded in dense vectors. PR can then be implemented using the embeddings and a fast vector-similarity search. Experiments on a variety of real-world retrieval datasets show that the proposed model outperforms related models in several evaluation metrics (e.g., mean reciprocal rank, accuracy, F1-score) while maintaining relatively low query latency and memory usage.
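Under strong simplifying assumptions, the core idea can be sketched as below in Python: a passage graph is built from (stand-in) BM25 scores, passage embeddings are smoothed by one untrained mean-aggregation step in place of the learned GCN-like layers, and passages are ranked by cosine similarity to the query embedding.

```python
import numpy as np

rng = np.random.default_rng(0)
n_passages, dim, k = 50, 64, 5
emb = rng.normal(size=(n_passages, dim))          # placeholder passage embeddings
query = rng.normal(size=dim)                      # placeholder query embedding
bm25 = rng.random((n_passages, n_passages))       # stand-in for pairwise BM25 scores
np.fill_diagonal(bm25, -1.0)                      # exclude self-matches

# adjacency: connect each passage to its top-k BM25 neighbours (term-matching signal)
adj = np.zeros_like(bm25)
for i in range(n_passages):
    nbrs = np.argsort(-bm25[i])[:k]
    adj[i, nbrs] = 1.0
adj = np.maximum(adj, adj.T)                      # symmetrize
adj += np.eye(n_passages)                         # self-loops

# one propagation step: each node becomes the degree-normalized mean of its neighbours
deg = adj.sum(axis=1, keepdims=True)
smoothed = (adj @ emb) / deg

# dense retrieval over the smoothed embeddings
def cosine(a, b): return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
scores = np.array([cosine(p, query) for p in smoothed])
print("top-5 passages:", np.argsort(-scores)[:5])
```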

Keywords: efficiency, effectiveness, graph learning, language model, passage retrieval, term-matching model

Procedia PDF Downloads 136
4175 Urban Development Criteria with a Focus on Resilience to Pandemics: A Case Study of Corona Virus (Covid-19)

Authors: Elham Zabetian Targhi, Niusha Fardnava, Saba Saghafi

Abstract:

Urban resilience to the coronavirus has become a major concern for cities these days. Our country has also not been safe from the destructive social, economic, physical, governance, and management effects of this virus; according to official statistics, hundreds of thousands of people in Iran have been infected with the virus and tens of thousands have died so far. Therefore, to measure urban resilience to this pandemic, criteria and sub-criteria were developed based on the authors' documentary and field studies, and their significance (weights) was determined with an analytical-comparative research method, using a questionnaire of pairwise (Saaty) comparisons completed by experts in urban sciences and urban development and processed with AHP hierarchical analysis in Expert Choice software. Then, using a questionnaire with a five-point Likert scale, the satisfaction of Tehran residents with the extracted criteria and sub-criteria was measured, and the correlation between the important criteria in each dimension was assessed using correlation tests in SPSS 16. According to the results of the AHP analysis and the scores of each sub-criterion, the weights of all criteria were normal. In the next stage, according to the pairwise correlation tests between the important criteria in each dimension, from the viewpoint of both urban science experts and Tehran residents, it was concluded that the correlations between the criteria are reliable at the 99% level. In all cases, the p-value (significance level) was less than 0.05, which indicates the significance of the pairwise relations between the variables.
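For reference, the way AHP turns a pairwise (Saaty-scale) comparison matrix into criterion weights, including the consistency check, can be sketched as below in Python; the judgments in the matrix are illustrative, not the experts' actual responses.

```python
import numpy as np

# Hypothetical 4x4 pairwise comparison matrix for four main criteria (Saaty 1-9 scale)
A = np.array([[1.0, 3.0, 5.0, 4.0],
              [1/3, 1.0, 3.0, 2.0],
              [1/5, 1/3, 1.0, 1/2],
              [1/4, 1/2, 2.0, 1.0]])

col_sums = A.sum(axis=0)
weights = (A / col_sums).mean(axis=1)            # normalized column-mean approximation

lam_max = (A @ weights / weights).mean()         # principal eigenvalue estimate
n = A.shape[0]
ci = (lam_max - n) / (n - 1)                     # consistency index
ri = 0.90                                        # Saaty's random index for n = 4
print("weights:", np.round(weights, 3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```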

Keywords: urban resilience, pandemics, corona virus (COVID-19), criteria

Procedia PDF Downloads 80
4174 3D Steady and Transient Centrifugal Pump Flow within Ansys CFX and OpenFOAM

Authors: Clement Leroy, Guillaume Boitel

Abstract:

This paper presents a comparative benchmarking review of steady and transient three-dimensional (3D) flow computations in a centrifugal pump using commercial (Ansys CFX) and open-source (OpenFOAM) computational fluid dynamics (CFD) software. In a centrifugal rotordynamic pump, the fluid enters the impeller along the rotating axis and is accelerated in order to increase the pressure, flowing radially outward into another stage, a vaned diffuser or a volute casing, from where it finally exits into a downstream pipe. Simulations are carried out at the best efficiency point (BEP) and at part load, for single-phase flow with several turbulence models. The results are compared with the overall performance report from experimental data. The use of CFD technology in industry is still limited by the high computational costs, and even more by the high cost of commercial CFD software and high-performance computing (HPC) licenses. The main objective of the present study is to define an OpenFOAM methodology for high-quality 3D steady and transient turbomachinery CFD simulation in order to conduct a thorough time-accurate performance analysis. In addition, a detailed comparison of the computational methods and features of the latest Ansys release 18 and OpenFOAM is carried out to assess the accuracy and industrial applicability of these solvers. Finally, an automated connected (IoT) workflow for turbine blade applications is presented.

Keywords: benchmarking, CFX, internet of things, openFOAM, time-accurate, turbomachinery

Procedia PDF Downloads 201
4173 Performance Evaluation of Clustered Routing Protocols for Heterogeneous Wireless Sensor Networks

Authors: Awatef Chniguir, Tarek Farah, Zouhair Ben Jemaa, Safya Belguith

Abstract:

Optimal routing allows minimizing energy consumption in wireless sensor networks (WSNs). Clustering has proven its effectiveness in organizing WSNs by reducing channel contention and packet collisions and enhancing network throughput under heavy load. Moreover, with the emergence of the Internet of Things, heterogeneity has become essential. The stable election protocol (SEP), which increased the network stability period and lifetime, was the first clustering protocol for heterogeneous WSNs. SEP and its descendants, namely Threshold Sensitive SEP (TSEP), Enhanced TSEP (ETSSEP) and Current Energy Allotted TSEP (CEATSEP), were studied. The performance of these algorithms was evaluated based on different metrics, especially first node death (FND), to compare their stability. Simulations were conducted in MATLAB considering two scenarios: the first varies the fraction of advanced nodes while fixing the total number of nodes; the second varies the total number of nodes while keeping the number of advanced nodes constant. CEATSEP outperforms its antecedents by increasing stability while, at the same time, keeping a low throughput. It also operates very well in a large-scale network. Consequently, CEATSEP has a useful lifespan and energy efficiency compared to the other routing protocols for heterogeneous WSNs.

Keywords: clustering, heterogeneous, stability, scalability, IoT, WSN

Procedia PDF Downloads 128
4172 Digital Wellbeing: A Multinational Study and Global Index

Authors: Fahad Al Beyahi, Justin Thomas, Md Mamunur Rashid

Abstract:

Various definitions of digital well-being have emerged in recent years, most of which center on the impacts, beneficial and detrimental, of digital technology on health and well-being (psychological, social, and financial). Other definitions go further, emphasizing the attainment of balance and viewing digital well-being as wholly subjective: the individual's perception of an optimal balance between the benefits and ills associated with online connectivity. Based on this broad conceptualization of digital well-being, we undertook a global survey measuring various dimensions of this emerging construct. The survey was administered across 35 nations and 7 world regions, with 1000 participants in each territory (N = 35,000). Along with attitudinal, behavioral, and sociodemographic variables, the survey included measures of depression, anxiety, problematic social media use, gaming disorder, and other relevant metrics. Coupled with nation-level policy audits, these data were used to create a multinational (global) digital well-being index. Nations are ranked based on various dimensions of digital well-being, and predictive models are used to identify resilience and risk factors for problematic technology use. In this paper, we discuss key findings from the survey and the index. This work can inform public policy and shape our responses to the emerging implications of lives increasingly lived online and interconnected with digital technology.

Keywords: technology, health, behavioral addiction, digital wellbeing

Procedia PDF Downloads 74