Search results for: analysis and real time information about liquefaction
45074 Real Time Detection of Application Layer DDoS Attack Using Log Based Collaborative Intrusion Detection System
Authors: Farheen Tabassum, Shoab Ahmed Khan
Abstract:
The brutality of attacks on networks and critical infrastructures has been on the rise in recent years and appears set to continue. The Distributed Denial of Service (DDoS) attack is the most prevalent and easiest attack on the availability of a service, due to the easy availability of large botnets of computers at a cheap price and the general lack of protection against these attacks. An application layer DDoS attack is a DDoS attack targeted at a web server, application server, or database server. These types of attacks are much more sophisticated and challenging, as they get around most conventional network security devices: the attack traffic often impersonates normal traffic and cannot be recognized through network layer anomalies. Conventional techniques of single-host security systems are becoming gradually less effective in the face of such complicated and synchronized multi-front attacks. In order to protect against such attacks and intrusions, cooperation among all network devices is essential. To overcome this issue, a collaborative intrusion detection system (CIDS) is proposed in which multiple network devices share valuable information to identify attacks, as a single device might not be capable of sensing any malevolent action on its own. This helps us to take a decision after analyzing the information collected from different sources. This novel attack detection technique helps to detect seemingly benign packets that target the availability of the critical infrastructure, and the proposed solution methodology shall enable incident response teams to detect and react to DDoS attacks at the earliest stage to ensure that the uptime of the service remains unaffected. Experimental evaluation shows that the proposed collaborative detection approach is much more effective and efficient than previous approaches.
Keywords: Distributed Denial-of-Service (DDoS), Collaborative Intrusion Detection System (CIDS), Slowloris, OSSIM (Open Source Security Information Management tool), OSSEC HIDS
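A minimal sketch of the log-correlation idea behind such a CIDS, assuming each sensor (web server, HIDS, firewall) exports per-source request counts; the threshold values, sensor names, and log format below are hypothetical placeholders, and the real system described here builds on OSSIM/OSSEC rather than on this standalone script.

```python
from collections import Counter, defaultdict

# Hypothetical per-sensor event streams: (source_ip, event_type) tuples
# harvested from web-server logs, OSSEC alerts, firewall logs, etc.
def aggregate_alerts(sensor_events, window_threshold=500, min_sensors=2):
    """Flag source IPs that look benign to any single sensor but are
    reported by several sensors with a combined volume above threshold."""
    per_sensor = defaultdict(Counter)
    for sensor_id, events in sensor_events.items():
        for src_ip, _event in events:
            per_sensor[sensor_id][src_ip] += 1

    combined = Counter()
    seen_by = defaultdict(set)
    for sensor_id, counts in per_sensor.items():
        for src_ip, n in counts.items():
            combined[src_ip] += n
            seen_by[src_ip].add(sensor_id)

    return [ip for ip, total in combined.items()
            if total >= window_threshold and len(seen_by[ip]) >= min_sensors]

# Example: three sensors each see modest traffic from 10.0.0.7,
# but the combined picture crosses the detection threshold.
events = {
    "web01":   [("10.0.0.7", "GET /")] * 200,
    "web02":   [("10.0.0.7", "GET /")] * 200,
    "ossec01": [("10.0.0.7", "slow-header")] * 150,
}
print(aggregate_alerts(events))  # ['10.0.0.7']
```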
Procedia PDF Downloads 354
45073 Robotic Arm Control with Neural Networks Using Genetic Algorithm Optimization Approach
Authors: Arbnor Pajaziti, Hasan Cana
Abstract:
In this paper, a structural genetic algorithm is used to optimize the neural network that controls the joint movements of a robotic arm. The robotic arm has also been modeled in 3D and simulated in real time in MATLAB. It is found that Neural Networks provide a simple and effective way to control the robot tasks. Computer simulation examples are given to illustrate the significance of this method. By combining the Genetic Algorithm optimization method and Neural Networks for the given robotic arm with 5 D.O.F., the obtained results show that the overshooting time of the base joint movement without a controller was about 0.5 seconds, while with the Neural Network controller (optimized with the Genetic Algorithm) it was about 0.2 seconds; a population size of 150 gave the best results.
Keywords: robotic arm, neural network, genetic algorithm, optimization
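A toy sketch of the genetic-algorithm-over-neural-network idea: a one-hidden-layer network's weights form the chromosome, and a surrogate fitness penalizes overshoot on a step response. The 5-DOF arm dynamics, the MATLAB model, and the paper's actual fitness function are not reproduced here; the plant, network size, and GA settings below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_HID, N_OUT = 3, 8, 1                     # joint error, velocity, target -> torque
N_W = N_IN * N_HID + N_HID * N_OUT               # chromosome length

def controller(weights, x):
    w1 = weights[:N_IN * N_HID].reshape(N_IN, N_HID)
    w2 = weights[N_IN * N_HID:].reshape(N_HID, N_OUT)
    return np.tanh(np.tanh(x @ w1) @ w2)         # bounded torque command

def fitness(weights):
    # Surrogate plant: a unit-mass joint driven toward a step target.
    pos, vel, target, cost = 0.0, 0.0, 1.0, 0.0
    for _ in range(200):                         # 2 s at 10 ms steps
        u = controller(weights, np.array([target - pos, -vel, target]))[0]
        vel += 0.01 * (5.0 * u - 0.5 * vel)
        pos += 0.01 * vel
        cost += abs(target - pos) + 5.0 * max(0.0, pos - target)  # penalize overshoot
    return -cost                                 # higher is better

def evolve(pop_size=150, generations=60, sigma=0.2):
    pop = rng.normal(0, 1, size=(pop_size, N_W))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]            # truncation selection
        moms = parents[rng.integers(0, len(parents), pop_size)]
        dads = parents[rng.integers(0, len(parents), pop_size)]
        mask = rng.random((pop_size, N_W)) < 0.5                      # uniform crossover
        pop = np.where(mask, moms, dads) + rng.normal(0, sigma, (pop_size, N_W))  # mutation
    return pop[np.argmax([fitness(ind) for ind in pop])]

best = evolve(pop_size=40, generations=20)       # small run for illustration
print("best fitness:", fitness(best))
```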
Procedia PDF Downloads 523
45072 Analysis of a Discrete-time Geo/G/1 Queue Integrated with (s, Q) Inventory Policy at a Service Facility
Authors: Akash Verma, Sujit Kumar Samanta
Abstract:
This study examines a discrete-time Geo/G/1 queueing-inventory system operating under an (s, Q) inventory policy. Customer arrivals follow a Bernoulli process. Each customer demands a single item with an arbitrarily distributed service time. The inventory is replenished by an outside supplier, and the lead time for replenishment follows a geometric distribution. There is a single server and infinite waiting space in this facility. Demands must wait in the specified waiting area during a stock-out period. The customers are served on a first-come-first-served basis. With the help of the embedded Markov chain technique, we determine the joint probability distribution of the number of customers in the system and the number of items in stock at the post-departure epoch using the matrix analytic approach. We relate the system length distributions at the post-departure and outside observer's epochs to determine the joint probability distribution at the outside observer's epoch. We use probability distributions at random epochs to determine the waiting time distribution. We obtain the performance measures to construct the cost function. The optimum values of the order quantity and the reordering point are found numerically for a variety of model parameters.
Keywords: discrete-time queueing inventory model, matrix analytic method, waiting-time analysis, cost optimization
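The paper's analysis proceeds via an embedded Markov chain and the matrix-analytic method; as a complementary illustration, here is a small discrete-time simulation of the same kind of Geo/G/1 + (s, Q) setting that can be used to sanity-check performance measures numerically. The arrival, service, and lead-time parameters and the reorder point s and order quantity Q below are placeholders, and geometric service is used only for simplicity (the model itself allows a general service distribution).

```python
import random

def geometric(q):
    """Number of slots until the first success, success probability q (>= 1)."""
    n = 1
    while random.random() > q:
        n += 1
    return n

def simulate(p=0.3, q_service=0.5, q_lead=0.1, s=3, Q=10, slots=200_000, seed=1):
    """Discrete-time queue with Bernoulli(p) arrivals and an (s, Q) policy:
    when on-hand stock drops to s and no order is outstanding, order Q units."""
    random.seed(seed)
    queue = 0                 # waiting + in-service customers
    inventory = Q             # items on hand
    service_left = 0          # slots left for the customer in service
    lead_left = 0             # slots until an outstanding order arrives
    cum_queue = cum_inv = 0

    for _ in range(slots):
        if random.random() < p:                       # Bernoulli arrival
            queue += 1
        if lead_left:                                 # replenishment order in transit
            lead_left -= 1
            if lead_left == 0:
                inventory += Q
        if service_left == 0 and queue > 0 and inventory > 0:
            service_left = geometric(q_service)       # start the next service
        if service_left:
            service_left -= 1
            if service_left == 0:                     # departure: one demand met
                queue -= 1
                inventory -= 1
                if inventory <= s and lead_left == 0:
                    lead_left = geometric(q_lead)     # place an (s, Q) order
        cum_queue += queue
        cum_inv += inventory

    return cum_queue / slots, cum_inv / slots

print("mean customers, mean stock:", simulate())
```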
Procedia PDF Downloads 44
45071 A Preparatory Method for Building Construction Implemented in a Case Study in Brazil
Authors: Aline Valverde Arroteia, Tatiana Gondim do Amaral, Silvio Burrattino Melhado
Abstract:
During the last twenty years, the construction field in Brazil has evolved significantly in response to its market growth and competitiveness. However, this evolution has faced many obstacles, such as cultural barriers and the lack of effort to achieve quality at the construction site. At the same time, the greatest amount of information generated in the design or construction phases is lost due to the lack of effective coordination of these activities. Facing this problem, the aim of this research was to implement a French method named PEO, meaning preparation for building construction (in Portuguese), seeking to understand the design management process and its interface with the building construction phase. The research method applied was qualitative, and it was carried out through two case studies in the city of Goiania, in Goias, Brazil. The research was divided into two stages, called the pilot study at Company A and the implementation of PEO at Company B. After the implementation, the results demonstrated the PEO method's effectiveness and feasibility as a booster of quality improvement in design management. The analysis showed that the method aims to improve the design and allows the reduction of failures, errors, and rework commonly found in the production of buildings. Therefore, it can be concluded that PEO is feasible to apply to real estate and building companies. However, companies need to believe in the contribution they can make to the discovery of design failures in conjunction with other stakeholders forming a construction team. The results of PEO can be maximized when adopting the principles of simultaneous engineering and the insertion of new computer technologies, which use a three-dimensional model of the building within a BIM process.
Keywords: communication, design and construction interface management, preparation for building construction (PEO), proactive coordination (CPA)
Procedia PDF Downloads 162
45070 Application of Universal Distribution Factors for Real-Time Complex Power Flow Calculation
Authors: Abdullah M. Alodhaiani, Yasir A. Alturki, Mohamed A. Elkady
Abstract:
Complex power flow distribution factors, which relate line complex power flows to the bus injected complex powers, have been widely used in various power system planning and analysis studies. In particular, AC distribution factors have been used extensively in recent power and energy pricing studies in the free electricity market field. As was demonstrated in the existing literature, many of the electricity market related costing studies rely on the use of distribution factors. These known distribution factors, whether the injection shift factors (ISF's) or the power transfer distribution factors (PTDF's), are linear approximations of the first order sensitivities of the active power flows with respect to various variables. This paper presents a novel model for evaluating the universal distribution factors (UDF's), which are appropriate for an extensive range of power system analysis and free electricity market studies. These distribution factors are used for the calculation of line complex power flows, are independent of bus power injections, and are compact matrix-form expressions with total flexibility in determining the position on the line at which line flows are measured. The proposed approach was tested on the IEEE 9-bus system. Numerical results demonstrate that the proposed approach is very accurate compared with the exact method.
Keywords: distribution factors, power system, sensitivity factors, electricity market
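The paper's universal distribution factors extend the idea to complex power flows; as background, a compact numpy sketch of the classical DC injection shift factors (ISF) and PTDFs that the abstract contrasts against is given below, for a hypothetical 3-bus, 3-line network with bus 0 as slack (not the IEEE 9-bus case used in the paper).

```python
import numpy as np

# Hypothetical 3-bus test system: lines (from, to, susceptance in p.u.)
lines = [(0, 1, 10.0), (1, 2, 8.0), (0, 2, 5.0)]
n_bus, slack = 3, 0

# Branch-bus incidence matrix A and diagonal branch susceptance matrix Bd.
A = np.zeros((len(lines), n_bus))
Bd = np.zeros((len(lines), len(lines)))
for k, (i, j, b) in enumerate(lines):
    A[k, i], A[k, j] = 1.0, -1.0
    Bd[k, k] = b

B = A.T @ Bd @ A                                   # bus susceptance matrix
keep = [b for b in range(n_bus) if b != slack]     # remove slack row/column
B_red = B[np.ix_(keep, keep)]

# Injection shift factors: line-flow sensitivity to injections at non-slack buses.
ISF = Bd @ A[:, keep] @ np.linalg.inv(B_red)       # shape: (n_lines, n_bus - 1)

# PTDF for a transfer from bus 1 to bus 2 is the difference of the ISF columns.
ptdf_1_to_2 = ISF[:, keep.index(1)] - ISF[:, keep.index(2)]
print(np.round(ISF, 3))
print("PTDF (1 -> 2):", np.round(ptdf_1_to_2, 3))
```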
Procedia PDF Downloads 473
45069 Moving Target Defense against Various Attack Models in Time Sensitive Networks
Authors: Johannes Günther
Abstract:
Time Sensitive Networking (TSN), standardized in the IEEE 802.1 standard, has received increasing attention in the context of mission critical systems. Such mission critical systems, e.g., in the automotive domain, aviation, industrial, and smart factory domain, are responsible for coordinating complex functionalities in real time. In many of these contexts, a reliable data exchange fulfilling hard time constraints and quality of service (QoS) conditions is of critical importance. TSN standards are able to provide guarantees for deterministic communication behaviour, which is in contrast to common best-effort approaches. Therefore, the superior QoS guarantees of TSN may aid in the development of new technologies, which rely on low latencies and specific bandwidth demands being fulfilled. TSN extends existing Ethernet protocols with numerous standards, providing means for synchronization, management, and overall real-time focussed capabilities. These additional QoS guarantees, as well as management mechanisms, lead to an increased attack surface for potential malicious attackers. As TSN guarantees certain deadlines for priority traffic, an attacker may degrade the QoS by delaying a packet beyond its deadline or even execute a denial of service (DoS) attack if the delays lead to packets being dropped. However, thus far, security concerns have not played a major role in the design of such standards. Thus, while TSN does provide valuable additional characteristics to existing common Ethernet protocols, it leads to new attack vectors on networks and allows for a range of potential attacks. One answer to these security risks is to deploy defense mechanisms according to a moving target defense (MTD) strategy. The core idea relies on the reduction of the attackers' knowledge about the network. Typically, mission-critical systems suffer from an asymmetric disadvantage. DoS or QoS-degradation attacks may be preceded by long periods of reconnaissance, during which the attacker may learn about the network topology, its characteristics, traffic patterns, priorities, bandwidth demands, periodic characteristics on links and switches, and so on. Here, we implemented and tested several MTD-like defense strategies against different attacker models of varying capabilities and budgets, as well as collaborative attacks of multiple attackers within a network, all within the context of TSN networks. We modelled the networks and tested our defense strategies on an OMNET++ testbench, with networks of different sizes and topologies, ranging from a couple dozen hosts and switches to significantly larger set-ups.
Keywords: network security, time sensitive networking, moving target defense, cyber security
Procedia PDF Downloads 73
45068 Estimating Knowledge Flow Patterns of Business Method Patents with a Hidden Markov Model
Authors: Yoonjung An, Yongtae Park
Abstract:
Knowledge flows are a critical source of faster technological progress and stronger economic growth. Knowledge flows have been accelerated dramatically with the establishment of a patent system in which each patent is required by law to disclose sufficient technical information for the invention to be recreated. Patent analysis, thus, has been widely used to help investigate technological knowledge flows. However, the existing research is limited in terms of both subject and approach. In particular, in most of the previous studies, business method (BM) patents were not covered, although they are as important a driver of knowledge flows as other patents. In addition, these studies usually focus on the static analysis of knowledge flows. Some use approaches that incorporate the time dimension, yet they still fail to trace a true dynamic process of knowledge flows. Therefore, we investigate dynamic patterns of knowledge flows driven by BM patents using a Hidden Markov Model (HMM). An HMM is a popular statistical tool for modeling a wide range of time series data, with no general theoretical limit in regard to statistical pattern classification. Accordingly, it enables characterizing knowledge patterns that may differ by patent, sector, country, and so on. We run the model on sets of backward citations and forward citations to compare the patterns of knowledge utilization and knowledge dissemination.
Keywords: business method patents, dynamic pattern, Hidden-Markov Model, knowledge flow
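A sketch of fitting a hidden Markov model to yearly citation counts with the third-party hmmlearn package (assumed to be installed); the data here are synthetic stand-ins for the backward/forward citation series of BM patents, and the two hidden states only loosely correspond to "knowledge absorption" and "knowledge dissemination" regimes.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM   # third-party package, assumed installed

rng = np.random.default_rng(42)

# Synthetic yearly forward-citation counts for one patent cohort:
# a quiet regime followed by a high-dissemination regime.
quiet = rng.poisson(2, size=10)
busy = rng.poisson(12, size=10)
series = np.concatenate([quiet, busy]).astype(float).reshape(-1, 1)

model = GaussianHMM(n_components=2, covariance_type="diag", n_iter=200, random_state=0)
model.fit(series)                       # EM (Baum-Welch) estimation

states = model.predict(series)          # Viterbi state sequence
print("estimated state means:", model.means_.ravel())
print("transition matrix:\n", np.round(model.transmat_, 2))
print("decoded states:", states)
```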
Procedia PDF Downloads 328
45067 Process Modeling in an Aeronautics Context
Authors: Sophie Lemoussu, Jean-Charles Chaudemar, Robertus A. Vingerhoeds
Abstract:
Many innovative projects exist in the field of aeronautics, each addressing specific areas so as to reduce weight, increase autonomy, reduce CO2 emissions, etc. In many cases, such innovative developments are being carried out by very small enterprises (VSE's) or small and medium-sized enterprises (SME's). A good example concerns airships, which are being studied as a real alternative for passenger and cargo transportation. Today, no international regulations propose a precise and sufficiently detailed framework for the development and certification of airships. The absence of such a regulatory framework requires very close contact with regulatory instances. However, VSE's/SME's do not always have sufficient resources and internal knowledge to handle this complexity and to discuss these issues. This poses an additional challenge for those VSE's/SME's, in particular those that have system integration responsibilities and that must provide all the necessary evidence to demonstrate their ability to design, produce, and operate airships with the expected level of safety and reliability. The main objective of this research is to provide a methodological framework enabling VSE's/SME's with limited resources to organize the development of airships while taking into account the constraints of safety, cost, time, and performance. This paper contributes to this problem by proposing a Model-Based Systems Engineering approach. Through a comprehensive process modeling approach applied to the development processes, the regulatory constraints, existing best practices, etc., a good image can be obtained of the process landscape that may influence the development of airships. To this effect, not only is the necessary regulatory information taken on board, but other international standards and norms on systems engineering and project management are also modeled and taken into account. In a next step, the model can be used for analysis of the specific situation for given developments, to derive critical paths for the development, to identify eventual conflicting aspects between the norms, standards, and regulatory expectations, or also to identify those areas where not enough information is available. Once critical paths are known, optimization approaches can be used and decision support techniques can be applied so as to better support VSE's/SME's in their innovative developments. This paper reports on the adopted modeling approach, the retained modeling languages, and how they all fit together.
Keywords: aeronautics, certification, process modeling, project management, regulation, SME, systems engineering, VSE
Procedia PDF Downloads 161
45066 Spatial Information and Urbanizing Futures
Authors: Mohammad Talei, Neda Ranjbar Nosheri, Reza Kazemi Gorzadini
Abstract:
Today, municipalities are searching for new tools to increase public participation in different levels of urban planning. This approach to urban planning involves the community in the planning process using participatory approaches instead of the long-standing traditional top-down planning methods. These tools can be used to obtain the particular problems of urban furniture from the residents' point of view. One of the tools designed with this goal is public participation GIS (PPGIS), which enables citizens to record and follow up on their feelings and spatial knowledge regarding the main problems of the city, specifically urban furniture, in the form of maps. However, despite the good intentions of PPGIS, its practical implementation in developing countries faces many problems, including the lack of basic supporting infrastructure and services and the unavailability of sophisticated public participatory models. In this research, we develop a PPGIS using Web 2.0 to collect volunteered geodata and to perform spatial analysis based on Spatial OnLine Analytical Processing (SOLAP) and Spatial Data Mining (SDM). These tools provide urban planners with proper information regarding the type, spatial distribution, and clusters of reported problems. This system is implemented in a case study area in Tehran, Iran, and the challenges of making it applicable and its potential for real urban planning have been evaluated. It helps decision makers to better understand, plan, and allocate scarce resources for providing the most requested urban furniture.
Keywords: PPGIS, spatial information, urbanizing futures, urban planning
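One piece of the spatial data mining step described above, grouping citizen-reported urban-furniture problems into spatial clusters, can be sketched with scikit-learn's DBSCAN; the coordinates and eps radius below are illustrative placeholders, not data from the Tehran case study.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Illustrative reported-problem locations (projected x, y in meters).
reports = np.array([
    [100, 100], [105, 98], [99, 103], [102, 101],    # a dense cluster of reports
    [500, 480], [505, 475], [498, 482],              # another cluster
    [900, 50],                                       # isolated report (noise)
], dtype=float)

labels = DBSCAN(eps=15.0, min_samples=3).fit_predict(reports)
for cluster_id in sorted(set(labels)):
    members = reports[labels == cluster_id]
    tag = "noise" if cluster_id == -1 else f"cluster {cluster_id}"
    print(tag, "->", len(members), "reports, centroid", members.mean(axis=0))
```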
Procedia PDF Downloads 726
45065 Sentiment Analysis of Fake Health News Using Naive Bayes Classification Models
Authors: Danielle Shackley, Yetunde Folajimi
Abstract:
As more people turn to the internet seeking health-related information, there is more risk of finding false, inaccurate, or dangerous information. Sentiment analysis is a natural language processing technique that assigns polarity scores to text, ranging from positive through neutral to negative. In this research, we evaluate the weight of a sentiment analysis feature added to fake health news classification models. The dataset consists of existing, reliably labeled health article headlines that were supplemented with health information collected about COVID-19 from social media sources. We started with data preprocessing and tested out various vectorization methods such as Count and TF-IDF vectorization. We implemented three Naive Bayes classifier models, including Bernoulli, Multinomial, and Complement. To test the weight of the sentiment analysis feature on the dataset, we created benchmark Naive Bayes classification models without sentiment analysis; those same models were then reproduced with the feature added. We evaluated using the precision and accuracy scores. The initial Bernoulli model performed with 90% precision and 75.2% accuracy, while the model supplemented with sentiment labels performed with 90.4% precision and stayed constant at 75.2% accuracy. Our results show that the addition of sentiment analysis did not improve model precision by a wide margin; while there was no evidence of improvement in accuracy, we had a 1.9% improvement margin in the precision score with the Complement model. Future expansion of this work could include replicating the experiment process and substituting the Naive Bayes for a deep learning neural network model.
Keywords: sentiment analysis, Naive Bayes model, natural language processing, topic analysis, fake health news classification model
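A condensed sketch of the pipeline described above: TF-IDF features, a Complement Naive Bayes baseline, and the same model with an appended sentiment score, using scikit-learn. The tiny headline list and the lexicon-based sentiment stub are placeholders for the real dataset and sentiment analyzer, and metrics are computed on the training set purely for illustration.

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import ComplementNB
from sklearn.metrics import precision_score, accuracy_score

headlines = ["miracle cure reverses covid overnight",
             "vaccine trial reports mild side effects",
             "doctors hide this one weird trick",
             "study finds masks reduce transmission",
             "secret herb destroys virus instantly",
             "hospital updates visitor guidelines"]
labels = np.array([1, 0, 1, 0, 1, 0])            # 1 = fake, 0 = reliable

def sentiment(text):                             # placeholder polarity in [-1, 1]
    pos, neg = {"reduce", "updates"}, {"destroys", "hide", "secret", "miracle"}
    toks = text.split()
    return (sum(t in pos for t in toks) - sum(t in neg for t in toks)) / max(len(toks), 1)

vec = TfidfVectorizer()
X_text = vec.fit_transform(headlines)

# Baseline: text features only.
baseline = ComplementNB().fit(X_text, labels)

# Supplemented: append a (shifted, non-negative) sentiment column.
sent = np.array([[sentiment(h) + 1.0] for h in headlines])
X_full = hstack([X_text, csr_matrix(sent)])
supplemented = ComplementNB().fit(X_full, labels)

for name, model, X in [("baseline", baseline, X_text), ("with sentiment", supplemented, X_full)]:
    pred = model.predict(X)
    print(name, "precision:", precision_score(labels, pred), "accuracy:", accuracy_score(labels, pred))
```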
Procedia PDF Downloads 97
45064 Tracing the Direction of Media Activism: Public Perspective
Authors: G. Arockiasamy, B. Sujeevan Kumar, Surendheran
Abstract:
Human progress and development are highly influenced by the power of information access and technology. A global and multi-national transformation all over the world is possible due to digitalization. In the process of exchanging information, experience, and resources, there is a radical shift in who controls them. Mass media has turned the world into a global village by strengthening the communication network. As a result, a new digital culture has emerged as a social network commonly known as new media. Today, the advancement of technology is at everyone's doorstep, linking to anywhere. Traditional social restrictions are broken down by the new type of virtual communication modality that transcends people beyond boundaries. At the same time, the media empire has invaded every nook and corner of the world through great expansion. Media activism is growing stronger and stronger, but the truth and true meaning are lost in the process. This paper explores people's attitudes to media activism and traces its direction. The methodology employed is a random sampling survey and content analysis, measured both qualitatively and quantitatively. The findings tend to show that 60 percent indicate media activism as positive and the others indicate it as negative. In conclusion, media activism has danger within, but this depends on the nature of the development of human orientation.
Keywords: media activism, media industry, program, truth information, orientation and nature
Procedia PDF Downloads 210
45063 Analysis of OPG Gene Polymorphism T245G (rs3134069) in Slovak Postmenopausal Women
Authors: I. Boroňová, J. Bernasovská, J. Kľoc, Z. Tomková, E. Petrejčíková, S. Mačeková, J. Poráčová, M. M. Blaščáková
Abstract:
Osteoporosis is a common multifactorial disease with a strong genetic component, characterized by reduced bone mass and an increased risk of fractures. Genetic factors play an important role in the pathogenesis of osteoporosis. The aim of our study was to identify the genotype and allele distribution of the T245G polymorphism in the OPG gene in Slovak postmenopausal women. A total of 200 unrelated Slovak postmenopausal women with diagnosed osteoporosis and 200 normal controls were genotyped for the T245G (rs3134069) polymorphism of the OPG gene. Genotyping was performed using Custom Taqman®SNP Genotyping assays. Genotype and allele frequencies showed no significant differences (p=0.5551; p=0.6022). The results of the present study confirm the importance of the T245G polymorphism in the OPG gene in the pathogenesis of osteoporosis.
Keywords: OPG gene, T245G polymorphism, osteoporosis, real-time PCR
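The genotype/allele comparison reported above is a standard chi-squared test on contingency tables; a small scipy sketch with made-up genotype counts (not the study's data) is shown below.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical T245G genotype counts (TT, TG, GG) for cases and controls.
genotypes = np.array([[150, 45, 5],     # osteoporosis group
                      [155, 40, 5]])    # control group

chi2, p_geno, dof, _ = chi2_contingency(genotypes)
print(f"genotype test: chi2={chi2:.3f}, dof={dof}, p={p_geno:.4f}")

# Allele counts follow from the genotypes: T = 2*TT + TG, G = 2*GG + TG.
alleles = np.array([[2 * g[0] + g[1], 2 * g[2] + g[1]] for g in genotypes])
chi2_a, p_allele, _, _ = chi2_contingency(alleles)
print(f"allele test:   chi2={chi2_a:.3f}, p={p_allele:.4f}")
```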
Procedia PDF Downloads 409
45062 Astronomical Panels of Measuring and Dividing Time in Ancient Egypt
Authors: Mohamed Saeed Ahmed Salman
Abstract:
The ancient Egyptians used the stars to measure time or, in a more precise sense, as one of the astronomical means of measuring time. These methods differed throughout the historical ages. They began with simple observations of astronomical phenomena, such as watching the movements of the stars in the sky over the year in order to know the days and nights, together with other means used to help set the time when the sky was overcast. The researcher therefore tries, through archaeological evidence, to demonstrate the ancient Egyptians' knowledge of the stars of heaven and their movements from the earliest prehistory. The astronomical information possessed by the Egyptians is not believed to have been limited or simple; it reached an almost optimal level in terms of importance and of the goal the ancient Egyptian wanted to reach, and it also helped him to know the time and the passage of time, which finally led to the attempt to find a system of timing and calculation of time. It was noted that there were signs that the stellar creed was known and prosperous, especially since the pre-dynastic ages, and this is evident in the inscriptions that date back to that period. The Egyptian realized that some of the stars remain visible at night, and the ancient Egyptian was familiar with the daily journey of the stars. This is what was adopted in many paragraphs of the Pyramid Texts and their references to the ascent of the deceased king to the heavenly world among the stars of the eternal sky. It was noted that the ancient Egyptian linked this to the stellar doctrine. We find that the lunar calendar was known to the ancient Egyptians, as was the stellar-solar one, which was based on the appearance of the star Sirius; this was the first means used to measure time and to know the calendar.
Keywords: ancient Egyptian, astronomical panels, Egyptian, astronomical
Procedia PDF Downloads 23
45061 Corporate Societal Disclosure and Corporate Governance: A By-Contextual Analysis
Authors: Zineb Meniaoui, Fatma Zehri, Kamoussi Halioui
Abstract:
The amplified awareness of companies towards social and environmental concerns has nowadays become a challenge for firms around the globe. Our study investigates the effects of corporate governance mechanisms on voluntary social and environmental information disclosure in Canada and France. The study uses the content analysis approach, applied to a total of 245 year-observations for the Canadian sample and 245 year-observations for the French sample from 2005 to 2011. Our results show a significant correlation between board independence, the Corporate Social Responsibility (CSR) committee and expertise, and audit quality, on the one hand, and the extent of social and environmental disclosure, on the other. French firms are found to disclose more societal information than Canadian firms, which might be due to the stakeholders' pressure put on French companies to disclose such societal information.
Keywords: Canada, corporate governance, disclosure determinants, France, social and environmental disclosure
Procedia PDF Downloads 353
45060 Non-Parametric Regression over Its Parametric Counterparts with Large Sample Size
Authors: Jude Opara, Esemokumo Perewarebo Akpos
Abstract:
This paper compares non-parametric linear regression with its parametric counterparts for a large sample size. A data set on anthropometric measurements of primary school pupils was used for the analysis; 50 randomly selected pupils were included in the study. The data set was subjected to a normality test using the Anderson-Darling technique, and it was discovered that the residuals of the commonly used least squares regression method for fitting an equation to a set of (x,y) data points are not normally distributed (i.e., they do not follow a Gaussian distribution). The algorithms for the nonparametric Theil's regression are stated in this paper, as well as its parametric OLS counterpart. The R programming language was used for the analysis. The results showed that there exists a significant relationship between the response and the explanatory variable for both the parametric and the non-parametric regression. To compare the efficiency of one method over the other, the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC) are used, and it is found that the nonparametric regression performs better than its parametric counterpart due to its lower AIC and BIC values. The study recommends that future researchers examine the presence of outliers in the data set, remove them if detected, and re-analyze the data to compare results.
Keywords: Theil's regression, Bayesian information criterion, Akaike information criterion, OLS
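A brief illustration in Python (the study itself used R) of the comparison described above: an OLS fit versus the nonparametric Theil-Sen estimator, with AIC/BIC computed from the residual sums of squares. The simulated heavy-tailed data stand in for the anthropometric measurements.

```python
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(3)
n = 50
x = rng.uniform(110, 150, n)                      # e.g. height (cm)
y = 0.45 * x - 25 + rng.standard_t(df=2, size=n)  # heavy-tailed, non-Gaussian errors

# Ordinary least squares fit.
b_ols, a_ols = np.polyfit(x, y, 1)

# Theil-Sen: median of pairwise slopes (robust, nonparametric).
b_ts, a_ts, _, _ = theilslopes(y, x)

def aic_bic(y, y_hat, k=2):
    rss = np.sum((y - y_hat) ** 2)
    n = len(y)
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    return aic, bic

for name, (a, b) in [("OLS", (a_ols, b_ols)), ("Theil-Sen", (a_ts, b_ts))]:
    aic, bic = aic_bic(y, a + b * x)
    print(f"{name:9s} slope={b:.3f} intercept={a:.2f} AIC={aic:.1f} BIC={bic:.1f}")
```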
Procedia PDF Downloads 305
45059 From Binary Solutions to Real Bio-Oils: A Multi-Step Extraction Story of Phenolic Compounds with Ionic Liquid
Authors: L. Cesari, L. Canabady-Rochelle, F. Mutelet
Abstract:
The thermal conversion of lignin produces bio-oils that contain many compounds with high added value, such as phenolic compounds. In order to efficiently extract these compounds, the possible use of the choline bis(trifluoromethylsulfonyl)imide [Choline][NTf2] ionic liquid was explored. To this end, a multistep approach was implemented. First, binary (phenolic compound and solvent) and ternary (phenolic compound, solvent, and ionic liquid) solutions were investigated. Eight binary systems of phenolic compound and water were investigated at atmospheric pressure. These systems were quantified using the turbidity method and UV spectroscopy. Ternary systems (phenolic compound, water, and [Choline][NTf2]) were investigated at room temperature and atmospheric pressure. After stirring, the solutions were left to settle, and a sample of each phase was collected. The analysis of the phases was performed using gas chromatography with an internal standard. These results were used to quantify the values of the interaction parameters of thermodynamic models. Then, extractions were performed on synthetic solutions to determine the influence of several operating conditions (temperature, kinetics, amount of [Choline][NTf2]). With this knowledge, it has been possible to design and simulate an extraction process composed of one extraction column and one flash. Finally, the extraction efficiency of [Choline][NTf2] was quantified with real bio-oils from lignin pyrolysis. Qualitative and quantitative analyses were performed using gas chromatography coupled with mass spectrometry and a flame ionization detector. The experimental measurements show that the extraction of phenolic compounds is efficient at room temperature, is quick, and does not require a high amount of [Choline][NTf2]. Moreover, the simulations of the extraction process demonstrate that the [Choline][NTf2] process requires less energy than an organic one. Finally, the efficiency of [Choline][NTf2] was confirmed in real situations through the experiments on lignin pyrolysis bio-oils.
Keywords: bio-oils, extraction, lignin, phenolic compounds
Procedia PDF Downloads 110
45058 Thermal and Caloric Imperfections Effect on the Supersonic Flow Parameters with Application for Air in Nozzles
Authors: Merouane Salhi, Toufik Zebbiche, Omar Abada
Abstract:
When the stagnation pressure of a perfect gas increases, the specific heats and their ratio no longer remain constant and start to vary with this pressure. The gas does not remain perfect. Its equation of state changes, and it becomes a real gas. In this case, the effects of molecular size and intermolecular attraction forces intervene to correct the equation of state. The aim of this work is to show and discuss the effect of stagnation pressure on the supersonic thermodynamic, physical, and geometrical flow parameters in order to find a general case for a real gas. With the assumption that Berthelot's equation of state accounts for molecular size and intermolecular force effects, expressions are developed for analyzing the supersonic flow of a thermally and calorically imperfect gas below the molecular dissociation threshold. The design parameters for a supersonic nozzle, such as the thrust coefficient, depend directly on the stagnation parameters of the combustion chamber. The application is for air. An error computation is made in this case to give the limit of the perfect gas model compared to the real gas model.
Keywords: supersonic flow, real gas model, Berthelot's state equation, Simpson's method, condensation function, stagnation pressure
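To make the real-gas correction concrete, a small numerical sketch compares the ideal-gas specific volume with the one obtained from Berthelot's equation of state, P = RT/(v - b) - a/(T v²), for air. The critical constants are textbook values, the Berthelot coefficients are derived from the critical point in the usual way, and the stagnation conditions are illustrative rather than those of the paper's nozzle computations.

```python
import numpy as np
from scipy.optimize import brentq

R  = 287.06        # specific gas constant of air, J/(kg K)
Tc = 132.5         # critical temperature of air, K
Pc = 3.77e6        # critical pressure of air, Pa

# Berthelot constants from the critical point: a = 27 R^2 Tc^3 / (64 Pc), b = R Tc / (8 Pc)
a = 27.0 * R**2 * Tc**3 / (64.0 * Pc)
b = R * Tc / (8.0 * Pc)

def p_berthelot(T, v):
    return R * T / (v - b) - a / (T * v**2)

def v_berthelot(T, P):
    """Specific volume solving Berthelot's equation for a given (T, P)."""
    v_ideal = R * T / P
    return brentq(lambda v: p_berthelot(T, v) - P, b * 1.01, 10.0 * v_ideal)

# Illustrative stagnation states: the real-gas deviation grows with pressure.
T0 = 2000.0        # K
for P0 in (1e5, 1e6, 1e7, 5e7):
    v_id = R * T0 / P0
    v_re = v_berthelot(T0, P0)
    print(f"P0 = {P0:10.0f} Pa  ideal v = {v_id:.5f}  Berthelot v = {v_re:.5f}  "
          f"deviation = {100 * (v_re - v_id) / v_id:+.2f} %")
```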
Procedia PDF Downloads 525
45057 Attribute Index and Classification Method of Earthquake Damage Photographs of Engineering Structure
Authors: Ming Lu, Xiaojun Li, Bodi Lu, Juehui Xing
Abstract:
The damage phenomena of each large earthquake provide a comprehensive and profound real-world test of the dynamic performance and failure mechanisms of different engineering structures. Learning the characteristics of engineering structures from seismic damage phenomena is often far superior to expensive shaking table experiments. After an earthquake, people record a variety of different types of engineering damage photos. However, a large number of earthquake damage photographs lack sufficient information, which reduces their value for use. To improve the research value and the use efficiency of engineering seismic damage photographs, this paper aims to explore and record seismic damage background information, which includes the earthquake magnitude, the earthquake intensity, and the characteristics of the damaged structure. Based on research requirements in the earthquake engineering field, the authors use photographs of the 2008 Wenchuan M8.0 earthquake in China and provide four kinds of attribute indexes and classifications: seismic information, structure types, damaged parts, and disaster causation factors. The final objective is to set up an engineering structural seismic damage database based on these four attribute indicators and classifications, and eventually build a website providing seismic damage photographs.
Keywords: attribute index, classification method, earthquake damage picture, engineering structure
Procedia PDF Downloads 765
45056 A Conceptual Framework of Digital Twin for Homecare
Authors: Raja Omman Zafar, Yves Rybarczyk, Johan Borg
Abstract:
This article proposes a conceptual framework for the application of digital twin technology in home care. The main goal is to bridge the gap between advanced digital twin concepts and their practical implementation in home care. This study uses a literature review and thematic analysis approach to synthesize existing knowledge and proposes a structured framework suitable for homecare applications. The proposed framework integrates key components such as IoT sensors, data-driven models, cloud computing, and user interface design, highlighting the importance of personalized and predictive homecare solutions. This framework can significantly improve the efficiency, accuracy, and reliability of homecare services. It paves the way for the implementation of digital twins in home care, promoting real-time monitoring, early intervention, and better outcomes.
Keywords: digital twin, homecare, older adults, healthcare, IoT, artificial intelligence
Procedia PDF Downloads 71
45055 Effect of Outliers in Assessing Significant Wave Heights Through a Time-Dependent GEV Model
Authors: F. Calderón-Vega, A. D. García-Soto, C. Mösso
Abstract:
Recorded significant wave heights sometimes exhibit uncommonly large values (outliers) that can be associated with extreme phenomena such as hurricanes and cold fronts. In this study, some extremely large wave heights recorded by NOAA buoys (National Data Buoy Center, noaa.gov) are used to investigate their effect on the prediction of future wave heights associated with given return periods. Extreme waves are predicted through a time-dependent model based on the so-called generalized extreme value (GEV) distribution. It is found that the outliers do affect the estimated wave heights. It is concluded that a detailed inspection of outliers is warranted to determine whether they are real recorded values, since this will impact the definition of design wave heights for coastal protection purposes.
Keywords: GEV model, non-stationary, seasonality, outliers
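A sketch of the kind of sensitivity check described above: a GEV distribution is fitted to annual-maximum significant wave heights with and without an injected outlier, and the resulting 100-year return levels are compared using scipy. The series is simulated rather than NOAA buoy data, and the paper's actual model is time-dependent (seasonal) rather than this stationary simplification.

```python
import numpy as np
from scipy.stats import genextreme

# Simulated annual-maximum significant wave heights (m).
annual_max = genextreme.rvs(c=-0.1, loc=6.0, scale=1.2, size=30, random_state=7)
with_outlier = np.append(annual_max, 16.5)           # hurricane-like extreme value

def return_level(sample, T=100):
    c, loc, scale = genextreme.fit(sample)           # maximum-likelihood GEV fit
    return genextreme.isf(1.0 / T, c, loc=loc, scale=scale)

print("100-yr wave height, no outlier  :", round(return_level(annual_max), 2), "m")
print("100-yr wave height, with outlier:", round(return_level(with_outlier), 2), "m")
```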
Procedia PDF Downloads 195
45054 Relationship Between Pain Intensity at the Time of the Hamstring Muscle Injury and Hamstring Muscle Lesion Volume Measured by Magnetic Resonance Imaging
Authors: Grange Sylvain, Plancher Ronan, Reurink Guustav, Croisille Pierre, Edouard Pascal
Abstract:
The primary objective of this study was to analyze the potential correlation between the pain experienced at the time of a hamstring muscle injury and the volume of the lesion measured on MRI. The secondary objectives were to analyze the correlation between this pain and the lesion grade as well as the affected hamstring muscle. We performed a retrospective analysis of the data collected in a prospective, multicenter, non-interventional cohort study (HAMMER). Patients with suspected hamstring muscle injury had an MRI after the injury and at the same time were evaluated for the pain intensity experienced at the time of the injury with a Numerical Pain Rating Scale (NPRS) from 0 to 10. A total of 61 patients were included in the present analysis. MRIs were performed on average less than 8 days after the injury. There was a significant correlation between pain and the injury volume (r=0.287; p=0.025). There was no significant correlation between the pain and the lesion grade (p>0.05), nor between the pain and the affected hamstring muscle (p>0.05). Pain at the time of injury appeared to be correlated with the volume of muscle affected. These results confirm the value of a clinical approach in the initial evaluation of hamstring injuries to better select patients eligible for further imaging.
Keywords: hamstring muscle injury, MRI, lesion volume, pain
Procedia PDF Downloads 98
45053 ArcGIS as a Tool for Infrastructure Documentation and Asset Management: Establishing a GIS for Computer Network Documentation
Authors: John Segars
Abstract:
Built out of a real-world need for better, more detailed asset and infrastructure documentation, this project lays out the case for using the database functionality of ArcGIS as a tool to track and maintain infrastructure location, status, maintenance, and serviceability. Workflows and processes are presented and detailed which may be applied to an organization's infrastructure needs, allowing it to make use of the robust tools which surround the ArcGIS platform. The end result is a value-added information system framework with a geographic component (e.g., the spatial location of various IT assets) and a detailed set of records which not only documents location but also captures the maintenance history for assets, along with photographs and documentation of these various assets as attachments to the numerous feature class items. In addition to the asset location and documentation benefits, staff will be able to log into the devices and pull SNMP (Simple Network Management Protocol) based query information from within the user interface. The entire collection of information may be displayed in ArcGIS, via a JavaScript-based web application, or via queries to the back-end database. The project is applicable to all organizations which maintain an IT infrastructure but specifically targets post-secondary educational institutions, where access to ESRI resources is generally already available in house.
Keywords: ESRI, GIS, infrastructure, network documentation, PostgreSQL
Procedia PDF Downloads 181
45052 Analysis of Possible Causes of Fukushima Disaster
Authors: Abid Hossain Khan, Syam Hasan, M. A. R. Sarkar
Abstract:
The Fukushima disaster is one of the most publicly exposed accidents in a nuclear facility, and it has changed the outlook of people towards nuclear power. Some have used it as an example to establish nuclear energy as an unsafe source, while others have tried to find the real reasons behind this accident. Many papers have tried to shed light on the possible causes, some of which are purely based on assumptions while others rely on rigorous data analysis. To the best of our knowledge, none of the works can say with absolute certainty that there is a single prominent reason that paved the way to this unexpected incident. This paper attempts to compile all the apparent reasons behind the Fukushima disaster and tries to analyze and identify the most likely one.
Keywords: fuel meltdown, Fukushima disaster, man-made calamity, nuclear facility, tsunami
Procedia PDF Downloads 266
45051 Automatic Detection and Update of Region of Interest in Vehicular Traffic Surveillance Videos
Authors: Naydelis Brito Suárez, Deni Librado Torres Román, Fernando Hermosillo Reynoso
Abstract:
Automatic detection and generation of a dynamic ROI (Region of Interest) in vehicle traffic surveillance videos based on a static camera in Intelligent Transportation Systems is challenging for computer vision-based systems. The dynamic ROI, being a changing ROI, should capture any moving object located outside of a static ROI. In this work, the video is represented by a tensor model composed of a background and a foreground tensor, the latter containing all moving vehicles or objects. The values of each pixel over a time interval are represented by time series, and some pixel rows are selected. This paper proposes a pixel-entropy-based algorithm for automatic detection and generation of a dynamic ROI in traffic videos under the assumption of two types of theoretical pixel entropy behavior: (1) a pixel located on the road shows a high entropy value due to disturbances caused in this zone by vehicle traffic; (2) a pixel located outside the road shows a relatively low entropy value. To study the statistical behavior of the selected pixels, detecting the entropy changes and consequently the moving objects, Shannon, Tsallis, and Approximate entropies were employed. Although Tsallis entropy achieved very good results in real time, Approximate entropy showed slightly better results but at a greater computational cost.
Keywords: convex hull, dynamic ROI detection, pixel entropy, time series, moving objects
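A numpy sketch of the pixel-entropy criterion: Shannon entropy is computed for each pixel's intensity time series over a stack of frames, and pixels above a threshold are marked as belonging to the dynamic ROI. The synthetic frames below (noisy "road" columns versus a static background) replace real surveillance video, and the bin count and threshold are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
T, H, W = 200, 8, 8                          # frames, height, width

# Synthetic video: columns 2-5 behave like a road (passing 'vehicles'),
# the rest is near-constant background with slight sensor noise.
frames = np.full((T, H, W), 100.0) + rng.normal(0, 1, (T, H, W))
vehicles = rng.random((T, H, 4)) < 0.2       # 20% of slots occupied by a vehicle
frames[:, :, 2:6] += vehicles * rng.uniform(50, 150, (T, H, 4))

def shannon_entropy(series, bins=16):
    hist, _ = np.histogram(series, bins=bins, range=(0, 255))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

entropy_map = np.apply_along_axis(shannon_entropy, 0, frames)   # per-pixel entropy
roi_mask = entropy_map > 0.5 * entropy_map.max()                # simple threshold

print("mean entropy, road columns    :", entropy_map[:, 2:6].mean().round(2))
print("mean entropy, background cols :", entropy_map[:, [0, 1, 6, 7]].mean().round(2))
print("detected ROI columns:", sorted(set(np.where(roi_mask)[1])))
```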
Procedia PDF Downloads 74
45050 Applying Augmented Reality Technology for an E-Learning System
Authors: Fetoon K. Algarawi, Wejdan A. Alslamah, Ahlam A. Alhabib, Afnan S. Alfehaid, Dina M. Ibrahim
Abstract:
Over the past 20 years, technology has developed rapidly, and no one expected what would come next. Advancements in technology open new opportunities for immersive learning environments. There is a need to bring education to a level that makes it more effective for the student. Augmented reality is one of the most popular technologies these days. This paper reports an experience of applying Augmented Reality (AR) technology, using a marker-based approach, in an e-learning system to transmit virtual objects into real-world scenes. We present a marker-based approach for transmitting virtual objects into real-world scenes to explain information in a better way, for which we developed a mobile phone application. The mobile phone application was then tested on students to determine the extent to which it encouraged them to learn and understand the subjects. In this paper, we talk about the beginnings of AR, the fields using AR, how AR is effective in education, the spread of AR these days, and the architecture of our work. The aim of this paper is therefore to show how creating an interactive e-learning system using AR technology will encourage students to learn more.
Keywords: augmented reality, e-learning, marker-based, monitor-based
Procedia PDF Downloads 223
45049 Visualization of PM₂.₅ Time Series and Correlation Analysis of Cities in Bangladesh
Authors: Asif Zaman, Moinul Islam Zaber, Amin Ahsan Ali
Abstract:
In recent years of industrialization, the South Asian countries have been affected by air pollution due to a severe increase in fine particulate matter (PM₂.₅). Among them, Bangladesh is one of the most polluted countries. In this paper, statistical analyses were conducted on PM₂.₅ time series from various districts in Bangladesh, mostly around Dhaka city. Research has been conducted on the dynamic interactions and relationships between PM₂.₅ concentrations in different zones. The study is conducted toward understanding the characteristics of PM₂.₅, such as spatial-temporal characterization and correlation with other contributors to air pollution, such as human activities, driving factors, and environmental casualties. Clustering the data gave insight into district groups, based on their AQI frequency, as representative districts. Seasonality analysis at hourly and monthly frequencies found higher concentrations of fine particles at nighttime and in the winter season, respectively. Cross-correlation analysis discovered correlations among cities based on time-lagged series of air particle readings, and a visualization framework was developed for observing interactions in PM₂.₅ concentrations between cities. Significant time-lagged correlations were discovered between the PM₂.₅ time series in different city groups throughout the country by cross-correlation analysis. Additionally, seasonal heatmaps show that the pooled series correlations are less significant in warmer months and among cities separated by greater geographic distance, and that the time-lag magnitude and direction of the best-shifted correlated particulate matter time series among districts change seasonally. The geographic map visualization demonstrates the spatial behaviour of air pollution among districts around Dhaka city and the significant effect of wind direction as a vital actor on the correlated shifted time series. The visualization framework has multipurpose usage, from gathering insight into the general and seasonal air quality of Bangladesh to determining the pathways of regional transport of air pollution.
Keywords: air quality, particles, cross correlation, seasonality
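The core of the cross-correlation step can be sketched as follows: for two district-level PM₂.₅ series, Pearson correlations are computed over a range of time shifts and the lag with the strongest correlation is reported. The synthetic hourly series below merely mimic one district leading another by a few hours and are not data from the study.

```python
import numpy as np

rng = np.random.default_rng(5)
n, true_lag = 500, 6                      # hourly samples; district B lags A by 6 h

base = np.cumsum(rng.normal(0, 1, n + true_lag)) + 80       # smooth pollution signal
district_a = base[true_lag:] + rng.normal(0, 0.5, n)        # "leading" district
district_b = base[:n] + rng.normal(0, 0.5, n)               # delayed copy of A

def lagged_corr(x, y, max_lag=24):
    """Pearson correlation of x shifted by each lag in [-max_lag, max_lag] against y."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[lag:], y[:len(y) - lag]
        else:
            a, b = x[:len(x) + lag], y[-lag:]
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out

corrs = lagged_corr(district_b, district_a)   # positive best lag: district_a leads
best_lag = max(corrs, key=corrs.get)
print(f"district_a leads district_b by ~{best_lag} h (r = {corrs[best_lag]:.3f})")
```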
Procedia PDF Downloads 105
45048 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques
Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev
Abstract:
Rapidly evolving modern data analysis technologies in healthcare play a large role in understanding the operation of the system and its characteristics. Nowadays, one of the key tasks in urban healthcare is to optimize resource allocation. Thus, the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between the indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the demand of the medical facility. The hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes, and lithotripters), and physicians characterized the resource provision of medical institutions for the developed models. The data source for the research was an open database of the statistical service Eurostat. The choice of the source is due to the fact that the databases contain complete and open information necessary for research tasks in the field of public health. In addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study provides information on 28 European countries for the period from 2007 to 2016. For all countries included in the study with the most accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and the interpretation of the models was made by cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behavior of the variables throughout the time period under consideration in order to identify groups of similar countries and to construct separate regression models for them. Therefore, the original time series were used as the objects of clustering. The hierarchical agglomerative algorithm k-medoids was used. The sampled objects were used as the centers of the clusters obtained, since determining the centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: for example, in one of the clusters, the MAPE was only 0.82%, which makes it possible to conclude that this forecast is highly reliable in the short term. The obtained predicted values of the developed models have a relatively low level of error and can be used to make decisions on the resource provision of the hospital with medical personnel. The research displays strong dependencies between the demand for medical services and the modern medical equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Currently, data analysis has a huge potential, which allows health services to be significantly improved. Medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
Keywords: data analysis, demand modeling, healthcare, medical facilities
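A compact sketch of the clustering idea: each country's indicator series is a clustering object, the distance between two series is taken here as 1 minus their Pearson correlation (an assumption; the paper only states that similarity of joint behavior was assessed), and a small PAM-style k-medoids loop, written out rather than taken from a library, assigns countries to groups whose medoids are actual observed series. The data are simulated stand-ins for the Eurostat panel.

```python
import numpy as np

rng = np.random.default_rng(11)
years = 10

# Simulated hospital-discharge series for 12 "countries": two latent trends + noise.
trend_up, trend_down = np.linspace(0, 1, years), np.linspace(1, 0, years)
series = np.vstack([trend_up + rng.normal(0, 0.08, years) for _ in range(6)] +
                   [trend_down + rng.normal(0, 0.08, years) for _ in range(6)])

def corr_distance(a, b):
    return 1.0 - np.corrcoef(a, b)[0, 1]

D = np.array([[corr_distance(a, b) for b in series] for a in series])

def k_medoids(D, k=2, iters=20, seed=0):
    r = np.random.default_rng(seed)
    medoids = r.choice(len(D), size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(D[:, medoids], axis=1)
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members):                 # pick the member minimizing within-cluster cost
                costs = D[np.ix_(members, members)].sum(axis=1)
                new_medoids[c] = members[np.argmin(costs)]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids, np.argmin(D[:, medoids], axis=1)

medoids, labels = k_medoids(D, k=2)
print("medoid countries:", medoids, "cluster labels:", labels)
```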
Procedia PDF Downloads 144
45047 Validation of a Placebo Method with Potential for Blinding in Ultrasound-Guided Dry Needling
Authors: Johnson C. Y. Pang, Bo Peng, Kara K. L. Reeves, Allan C. L. Fud
Abstract:
Objective: Dry needling (DN) has long been used as a treatment method for various musculoskeletal pain conditions. However, the evidence level of the studies has been low due to limitations of the methodology. Lack of randomization and inappropriate blinding are potentially the main sources of bias. A method that can differentiate clinical results due to the targeted experimental procedure from its placebo effect is needed to enhance the validity of the trial. Therefore, this study aimed to validate a placebo ultrasound (US)-guided DN method for patients with knee osteoarthritis (KOA). Design: This is a randomized controlled trial (RCT). Ninety subjects (25 males and 65 females) aged between 51 and 80 (61.26 ± 5.57) with radiological KOA were recruited and randomly assigned into three groups with a computer program. Group 1 (G1) received real US-guided DN, Group 2 (G2) received placebo US-guided DN, and Group 3 (G3) was the control group. Both G1 and G2 subjects received the same US-guided DN procedure, except that the US monitor was turned off in G2, blinding the G2 subjects to the incorporation of faux US guidance. This arrangement created the placebo effect intended to permit comparison of their results to those who received actual US-guided DN. Outcome measures, including the visual analog scale (VAS) and the Knee injury and Osteoarthritis Outcome Score (KOOS) subscales of pain, symptoms, and quality of life (QOL), were analyzed by repeated measures analysis of covariance (ANCOVA) for time effects and group effects. The data regarding the perception of receiving real US-guided DN or placebo US-guided DN were analyzed by the chi-squared test. The missing data were to be analyzed with the intention-to-treat (ITT) approach if more than 5% of the data were missing. Results: The placebo US-guided DN (G2) subjects had the same perception as those receiving real US guidance in the advancement of DN (p=0.128). G1 had significantly higher pain reduction (VAS and KOOS-pain) than G2 and G3 at 8 weeks only (both p<0.05). There was no significant difference between G2 and G3 at 8 weeks (both p>0.05). Conclusion: The method with the US monitor turned off during the application of DN is credible for blinding the participants and allows researchers to incorporate faux US guidance. The validated placebo US-guided DN technique can aid investigations of the effects of US-guided DN, with short-term effects of pain reduction for patients with KOA. Acknowledgment: This work was supported by the Caritas Institute of Higher Education [grant number IDG200101].
Keywords: ultrasound-guided dry needling, dry needling, knee osteoarthritis, physiotherapy
Procedia PDF Downloads 120
45046 Algorithm Research on Traffic Sign Detection Based on Improved EfficientDet
Authors: Ma Lei-Lei, Zhou You
Abstract:
Aiming at the problem of the low detection accuracy of deep learning algorithms in traffic sign detection, this paper proposes an improved EfficientDet-based traffic sign detection algorithm. Multi-head self-attention is introduced in the minimum-resolution layer of the EfficientDet backbone to achieve effective aggregation of local and global depth information, and this study proposes an improved feature fusion pyramid with additional vertical cross-layer connections, which improves the performance of the model while introducing a small amount of complexity. The Balanced L1 Loss is introduced to replace the original regression loss function, Smooth L1 Loss, which solves the problem of balance in the loss function. Experimental results show that the algorithm proposed in this study is suitable for the task of traffic sign detection. Compared with other models, the improved EfficientDet has the best detection accuracy. Although the test speed is not completely dominant, it still meets the real-time requirement.
Keywords: convolutional neural network, transformer, feature pyramid networks, loss function
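The Balanced L1 regression loss mentioned above (introduced in Libra R-CNN) can be written in a few lines of PyTorch; the alpha/gamma/beta values below are typical defaults and are assumptions here, not values reported by this paper, and the toy box offsets are illustrative.

```python
import math
import torch

def balanced_l1_loss(pred, target, alpha=0.5, gamma=1.5, beta=1.0):
    """Balanced L1 loss (Libra R-CNN): promotes the gradient contribution of
    inliers relative to plain Smooth L1 while keeping outlier gradients bounded."""
    diff = torch.abs(pred - target)
    b = math.e ** (gamma / alpha) - 1.0            # chosen so the two branches meet at diff == beta
    loss = torch.where(
        diff < beta,
        alpha / b * (b * diff + 1.0) * torch.log(b * diff / beta + 1.0) - alpha * diff,
        gamma * diff + gamma / b - alpha * beta,
    )
    return loss.mean()

# Toy box-regression offsets: predicted vs. target (dx, dy, dw, dh).
pred = torch.tensor([[0.10, -0.30, 0.05, 0.40], [2.50, 0.00, -1.20, 0.10]])
target = torch.tensor([[0.00, -0.25, 0.00, 0.35], [0.50, 0.05, -1.00, 0.00]])
print("balanced L1:", balanced_l1_loss(pred, target).item())

smooth_l1 = torch.nn.functional.smooth_l1_loss(pred, target)
print("smooth L1  :", smooth_l1.item())
```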
Procedia PDF Downloads 98
45045 Developing A Third Degree Of Freedom For Opinion Dynamics Models Using Scales
Authors: Dino Carpentras, Alejandro Dinkelberg, Michael Quayle
Abstract:
Opinion dynamics models use an agent-based modeling approach to model people's opinions. A model's properties are usually explored by testing its two 'degrees of freedom': the interaction rule and the network topology. The latter defines the connections, and thus the possible interactions, among agents. The interaction rule, instead, determines how agents select each other and update their own opinions. Here we show the existence of a third degree of freedom. This can be used for turning one model into another or for changing the model's output by up to 100% of its initial value. Opinion dynamics models represent the evolution of real-world opinions parsimoniously. Thus, it is fundamental to know how a real-world opinion (e.g., supporting a candidate) can be turned into a number. Specifically, we want to know whether, by choosing a different opinion-to-number transformation, the model's dynamics would be preserved. This transformation is typically not addressed in the opinion dynamics literature. However, it has already been studied in psychometrics, a branch of psychology. In this field, real-world opinions are converted into numbers using abstract objects called 'scales.' These scales can be converted one into the other, in the same way as we convert meters to feet. Thus, in our work, we analyze how this scale transformation may affect opinion dynamics models. We perform our analysis both using mathematical modeling and validating it via agent-based simulations. To distinguish between scale transformation and measurement error, we first analyze the case of perfect scales (i.e., no error or noise). Here we show that a scale transformation may change the model's dynamics up to a qualitative level, meaning that a researcher may reach a totally different conclusion, even using the same dataset, just by slightly changing the way the data are pre-processed. Indeed, we quantify that this effect may alter the model's output by 100%. Using two models from the standard literature, we show that a scale transformation can transform one model into the other. This transformation is exact, and it holds for every result. Lastly, we also test the case of using real-world data (i.e., finite precision). We perform this test using a 7-point Likert scale, showing how even a small scale change may result in different predictions or a different number of opinion clusters. Because of this, we think that scale transformation should be considered as a third degree of freedom for opinion dynamics. Indeed, its properties have a strong impact both on theoretical models and on their application to real-world data.
Keywords: degrees of freedom, empirical validation, opinion scale, opinion dynamics
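A minimal sketch of the effect being described: a standard bounded-confidence (Deffuant-style) update is run twice on the same initial opinions, once on the raw scale and once after a monotone, logistic-like rescaling, and the number of resulting opinion clusters is compared. The specific transformation, confidence bound, and cluster-counting tolerance are illustrative choices, not those analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def deffuant(opinions, epsilon=0.3, mu=0.5, steps=50_000, seed=1):
    """Bounded-confidence model: random pairs average if they are within epsilon."""
    x = opinions.copy()
    r = np.random.default_rng(seed)
    for _ in range(steps):
        i, j = r.integers(0, len(x), size=2)
        if abs(x[i] - x[j]) < epsilon:
            x[i], x[j] = x[i] + mu * (x[j] - x[i]), x[j] + mu * (x[i] - x[j])
    return x

def n_clusters(x, tol=0.05):
    xs = np.sort(x)
    return 1 + int(np.sum(np.diff(xs) > tol))

initial = rng.uniform(0, 1, 200)

# Same agents, same interaction rule -- only the opinion-to-number scale differs.
raw = deffuant(initial)
rescaled = deffuant(1.0 / (1.0 + np.exp(-6.0 * (initial - 0.5))))   # monotone rescaling

print("clusters on the raw scale     :", n_clusters(raw))
print("clusters on the rescaled scale:", n_clusters(rescaled))
```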
Procedia PDF Downloads 155