Search results for: data center
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26223

25143 Localization of Geospatial Events and Hoax Prediction in the UFO Database

Authors: Harish Krishnamurthy, Anna Lafontant, Ren Yi

Abstract:

Unidentified Flying Objects (UFOs) are a topic of enduring interest, and people across the United States report sightings online at the National UFO Reporting Center (NUFORC). Some of these reports are hoaxes. Our task is not to establish that the remaining reports describe alien spacecraft; rather, we aim to predict whether a report is a hoax, as identified by the UFO database team under their existing curation criteria. Beyond that, the database provides a wealth of information that can be exploited for analyses such as social reporting and the identification of real-time spatial events. We localize these time-series geospatial events and correlate them with known real-world events. This paper does not claim any legitimacy for alien activity; it gathers information from likely legitimate UFO reports by studying the online submissions. These events cluster both geospatially and in time. We use cluster density measures and data visualization to search the space of candidate clusterings and select the most probable clusters, which indicate the proximity of such activity. A random forest classifier is then used to separate true events from hoaxes, using the best available features: region, week, time period, and duration. Finally, we report the scheme's performance across different days and correlate the results with real-time events; one UFO report correlates strongly with a missile test conducted in the United States.
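As an illustration of the classification step described above, the sketch below trains a random forest on the four report features named in the abstract (region, week, time period, duration). All feature encodings, data values, and the labelling rule are synthetic placeholders, not the NUFORC data or the authors' pipeline.

```python
# Hypothetical sketch of a hoax classifier over categorical report
# features; the data and the "hoax" rule below are invented.
from sklearn.ensemble import RandomForestClassifier
import numpy as np

rng = np.random.default_rng(0)

# Encode each report as (region_id, week_of_year, time_period_id, duration_min).
n = 200
X = np.column_stack([
    rng.integers(0, 5, n),    # region
    rng.integers(1, 53, n),   # week of year
    rng.integers(0, 4, n),    # time period (night/morning/afternoon/evening)
    rng.integers(1, 120, n),  # duration in minutes
])
# Synthetic rule: very short non-night sightings are labelled hoaxes.
y = ((X[:, 3] < 10) & (X[:, 2] != 0)).astype(int)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X[:150], y[:150])
accuracy = clf.score(X[150:], y[150:])
print(f"held-out accuracy: {accuracy:.2f}")
```

In practice the feature importances reported by such a model (`clf.feature_importances_`) can also indicate which report attributes are most predictive of a hoax label.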

Keywords: time-series clustering, feature extraction, hoax prediction, geospatial events

Procedia PDF Downloads 372
25142 Assessing Effectiveness of Schema Mode Therapy and Emotionally Focused Couples Therapy in Attachment Styles among Couples with Marital Conflict

Authors: Reza Johari Fard, Najmeh Cheraghi, Parvin Ehtesham Zadeh, Parviz Asgari

Abstract:

The aim of this study was to investigate and compare the effectiveness of schema mode therapy and emotionally focused couples therapy on attachment styles (secure, avoidant, and anxious) in couples with marital conflict, using a quasi-experimental pretest, posttest, and follow-up design with a control group. The statistical population included all couples with marital conflict who visited the Mehrana counseling center in Ahvaz, Iran, in 2019. Forty-five couples were selected by voluntary sampling and randomly divided into two experimental groups and one control group (15 couples per group). Participants completed the Adult Attachment Scale (Hazan and Shaver). The experimental groups underwent 12 sessions of schema mode therapy or emotionally focused couples therapy, while the control group received no intervention. Data were analyzed with repeated-measures analysis in SPSS-19. The results showed that both schema mode therapy and emotionally focused couples therapy are effective in increasing secure attachment and reducing avoidant and ambivalent attachment styles in couples with marital conflict, and there was no significant difference between the two therapy groups. Therefore, therapists and family counselors are encouraged to use these therapies, alongside other therapeutic interventions, to increase secure attachment and reduce marital conflict.

Keywords: schema mode therapy, emotional focused couple therapy, attachment styles, marital conflict

Procedia PDF Downloads 108
25141 Offshore Wind Assessment and Analysis for South Western Mediterranean Sea

Authors: Abdallah Touaibia, Nachida Kasbadji Merzouk, Mustapha Merzouk, Ryma Belarbi

Abstract:

Accurate assessment and a better understanding of wind resource distribution are essential for decision making before installing wind energy systems in a given region; hence our interest in the Algerian coastline and its Mediterranean waters. Despite its long Mediterranean coastline, Algeria still has no strategy encouraging the development of offshore wind farms in its waters. The present work estimates the offshore wind fields of the Algerian Mediterranean Sea using 24 years of wind measurements (1995 to 2018) from seven observation stations, recorded at time steps of 30, 60, or 180 minutes depending on the station: two stations in Spain, two in Italy, and three on the Algerian coast, at Annaba in the east, Algiers at the center, and Oran in the west. The idea is to use multiple measurement points to characterize the area's wind potential by interpolating average wind speeds between the available stations, approximating values at locations where no measurements exist because deep water makes mast installation difficult. The study is organized as follows: first, a brief description of the studied area and its climatic characteristics; then, a check of the statistical properties of the recorded data through wind histograms, direction roses, and average speeds computed with MatLab programs; finally, offshore wind maps established with ArcGIS and MapInfo to visualize the wind resource distribution and to identify windy sites for wind farm installation and power management.
The study found that Cap Carbonara is the windiest site, with an average wind speed of 7.26 m/s at 10 m and a power density of 902 W/m², followed by Cap Caccia with 4.88 m/s and 282 W/m². The site of Oran shows an average wind speed of 4.83 m/s, corresponding to a power density of 230 W/m². The results also indicate that the dominant wind direction at Cap Carbonara is west (34% of occurrences), with an average speed of 9.49 m/s and a power density of 1722 W/m². At Cap Caccia the prevailing direction is north-west (about 20%), with 5.82 m/s and a power density of 452 W/m². Oran comes third, with a dominant north direction (32%), an average speed of 4.59 m/s, and a power density of 189 W/m². The proposed method is thus both crucial for understanding wind resource distribution and revealing windy sites over a large area, and effective for wind turbine micro-siting.
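The power density figures above follow from the standard relation P/A = ½ρv³, averaged over the site's wind speed distribution. The sketch below is a minimal illustration of that relation (assuming standard sea-level air density); it also shows why a site's power density cannot be recovered from the mean speed alone, since the mean of v³ generally exceeds the cube of the mean speed.

```python
# Minimal wind power density sketch; RHO is a standard sea-level
# assumption, and the speed samples are illustrative.
RHO = 1.225  # air density at sea level, kg/m^3

def power_density(speeds):
    """Mean wind power density (W/m^2) from a series of speed samples."""
    cubes = [v ** 3 for v in speeds]
    return 0.5 * RHO * sum(cubes) / len(cubes)

# Two sites with the same mean speed (5.0 m/s) but different variability:
steady = [5.0, 5.0, 5.0, 5.0]
gusty = [2.0, 2.0, 8.0, 8.0]
print(power_density(steady))  # lower
print(power_density(gusty))   # higher, despite the equal mean speed
```

This is why site power densities are normally computed from the full measured speed distribution (often via a Weibull fit) rather than from the average speed.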

Keywords: wind resources, Mediterranean Sea, offshore, ArcGIS, MapInfo, wind maps, wind farms

Procedia PDF Downloads 140
25140 Proposal for a Monster Village in Namsan Mountain, Seoul: Significance from a Phenomenological Perspective

Authors: Hyuk-Jin Lee

Abstract:

Korea is a country with thousands of years of history, like its neighbors China and Japan. However, compared with China, famous for the ancient fantasy novel "Journey to the West", and Japan, famous for its monsters, Korea's "monster culture" is not actively used for tourism. The reason is that the culture closest to the present, from the 17th to the 20th century, was that of the Joseon Dynasty, when Neo-Confucianism, which suppressed monster culture, was strongest; this tendency intensified after Neo-Confucianism became dogmatic in the mid-17th century. Yet Korea, with its thousands of years of Taoist history, preserves a substantial body of literature on monsters that could serve as a tourism resource. The problem is that this material is buried in texts and unfamiliar even to Koreans. This study examines the possibility of developing it into attractive tourism resources, based on literary records from the 16th and early 17th centuries describing monsters densely located on Namsan Mountain in the center of Seoul. In particular, we introduce the surprising consistency with which a contemporary Korean newspaper described the area north of Namsan Mountain in terms of feng shui geography, an oriental philosophy. Finally, building a theoretical foundation on a phenomenological classification table of cultural heritage, we examine phenomenologically how important this visualization of imaginary or text-based entities is to changes in a society's perception of specific cultural resources. We also analyze related cases in depth, including Japan's ninja culture.

Keywords: monster culture, Namsan Mountain, Neo-Confucianism, phenomenology, tourism

Procedia PDF Downloads 23
25139 Importance of Remote Sensing and Information Communication Technology to Improve Climate Resilience in Low Land of Ethiopia

Authors: Hasen Keder Edris, Ryuji Matsunaga, Toshi Yamanaka

Abstract:

The issue of climate change and its impact is a major contemporary global concern. Ethiopia is among the countries experiencing adverse climate change impacts, including frequent extreme weather events that exacerbate drought and water scarcity. For this reason, the government of Ethiopia has developed a strategic document focused on a climate-resilient green economy. One major component of the strategic framework is designed to improve community adaptation capacity and drought mitigation. Effective implementation of the strategy requires identifying the relative vulnerability of regions to drought. There is a growing tendency to apply Geographic Information System (GIS) and remote sensing technologies to collect information on the duration and severity of drought, through direct measurement of topography and indirect measurement of land cover. This study demonstrates an application of remote sensing and GIS to develop a drought vulnerability index, taking the lowlands of Ethiopia as a case study. In addition, it assesses the Information Communication Technology (ICT) potential of the Ethiopian lowlands and proposes an integrated solution. Satellite data are used to detect the onset of drought. The severity of drought in risk-prone pastoral livestock-keeping areas is analyzed through the normalized difference vegetation index (NDVI) and ten years of rainfall data. Deviations from the existing and average SPOT NDVI and the vegetation condition index are used to identify drought onset and potential risks. Secondary data are used to analyze the geographical coverage of mobile and internet usage in the region. For decades, the government of Ethiopia has introduced technologies and approaches to overcome climate-change-related problems; however, lack of access to information and inadequate technical support for pastoral areas remain major challenges.
Under the conventional business-as-usual approach, the lowland pastoralists continue to face numerous challenges. The results indicate that 80% of the region faces frequent drought occurrence, and within this, 60% of the pastoral area faces high drought risk. On the other hand, mobile phone and internet coverage in the target area is growing rapidly; one identified ICT enabler is the telecom center, which covers 98% of the region. The NDVI remote sensing analysis made it possible to identify frequently affected areas and potential drought risk. We also found that ICT can play an important role in mitigating climate change challenges. Hence, implementation of climate change adaptation should be strengthened through integrated remote sensing, web-based information dissemination, and mobile alerts for extreme events.
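For reference, a minimal sketch of the NDVI and vegetation condition index (VCI) computations that such a drought analysis typically rests on; the band reflectances below are invented values, not SPOT data.

```python
# NDVI and VCI sketches with synthetic band values.
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel."""
    return (nir - red) / (nir + red)

def vci(current, history):
    """VCI in percent: where the current NDVI sits in the historical range.
    Values near 0 indicate drought stress, near 100 a healthy canopy."""
    lo, hi = min(history), max(history)
    return 100.0 * (current - lo) / (hi - lo)

# Historical NDVI values for one pixel from past (NIR, red) observations:
history = [ndvi(nir, red) for nir, red in [(0.5, 0.1), (0.4, 0.2), (0.3, 0.25)]]
current = ndvi(0.32, 0.24)
print(f"VCI: {vci(current, history):.1f}%")
```

A low VCI for a pixel, sustained over several compositing periods, is the kind of signal used to flag drought onset in the analysis above.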

Keywords: climate changes, ICT, pastoral, remote sensing

Procedia PDF Downloads 310
25138 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R

Authors: Pavel H. Llamocca, Victoria Lopez

Abstract:

The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is increasing accordingly. However, each government has its own publication procedures, which produces a variety of data set formats, since there are no international standards specifying formats for Open Data bases. Because of this variety, we must build a data integration process able to combine all kinds of formats. Some software tools have been developed to support the integration process, e.g. Data Tamer and Data Wrangler; their drawback is that they require a data scientist to take part in the final step of the integration. In our case we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to each government's data sources in order to achieve an automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, and wind speed. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time, and other governments (such as Andalucia and Bilbao) have published environmental Open Data sets as well. All of those data sets have different formats; our solution integrates all of them and, furthermore, allows the user to perform and visualize analyses over the real-time data.
Once the integration task is done, data from every government share the same format and the analysis process can start on a computationally sounder footing. The tool presented in this work therefore has two goals: (1) the integration process and (2) a graphic and analytic interface. As a first approach, the integration process was developed with Java and Oracle and the graphic and analytic interface with Java (JSP). In order to open our software tool, we also developed a second implementation in the R language, a mature open source technology. R is a powerful open source programming language that allows us to process and analyze huge amounts of data with high performance, and libraries such as Shiny support building a graphic interface. A performance comparison between the two implementations found no significant differences. In addition, our work provides an official real-time integrated data set of environmental data in Spain, so that developers can build their own applications.
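The per-government adapter idea can be sketched as follows; the source names, column names, and units are hypothetical, and the tool described above builds such adapters as Hadoop procedures rather than in-memory functions.

```python
# Hypothetical per-source adapters mapping heterogeneous open-data rows
# into one shared record format before analysis.
ADAPTERS = {
    "madrid": lambda row: {
        "city": "Madrid",
        "pollutant": row["contaminante"],
        "value": float(row["valor"]),
    },
    "bilbao": lambda row: {
        "city": "Bilbao",
        "pollutant": row["name"].upper(),
        "value": float(row["reading_ug_m3"]),
    },
}

def integrate(source, rows):
    """Apply the source-specific adapter to every raw row."""
    return [ADAPTERS[source](r) for r in rows]

unified = (integrate("madrid", [{"contaminante": "NO2", "valor": "41.5"}])
           + integrate("bilbao", [{"name": "no2", "reading_ug_m3": "38.0"}]))
print(unified)
```

Once every source passes through its adapter, downstream analysis code only ever sees the unified schema, which is the property the abstract's integration step aims for.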

Keywords: open data, R language, data integration, environmental data

Procedia PDF Downloads 311
25137 Analysis of Coloring Styles of Brazilian Urban Heritage

Authors: Natalia Naoumova

Abstract:

Facing the changes and continuous growth of contemporary cities, along with globalization effects that accelerate cultural dissolution, the maintenance of cultural authenticity, implicit in historical areas as a part of cultural diversity, can be considered one of the key elements of a sustainable society. This article focuses on the polychromy of buildings in a historical context as an important feature of urban settings. It analyses the coloring of Brazilian urban heritage through the study of historical districts in Pelotas and Piratini, located in the State of Rio Grande do Sul, Brazil. The objective is to reveal the coloring characteristics of different historical periods, determine the chromatic typologies of the corresponding building styles, and clarify the connection between historical chromatic aspects and contemporary urban identity. Architectural style data were collected with techniques such as stratigraphic prospection of buildings, surveys of historical records and descriptions, analysis of images, and study of colored-facade projects kept in historical archives. Three groups of characteristics were considered in the search for working criteria for the formation of chromatic model typologies: 1) the coloring palette; 2) the morphology of the facade; and 3) the relationship between them. The analysis shows that the urban chromatic image of the historical center forms through a continuous and dynamic process drawing on constant chromatic resources. It establishes that changes in the formal language of subsequent historical periods lead to changes in the chromatic schemes, providing a different reading of the facades both in formal interpretation and in symbolic meaning.

Keywords: building style, historic colors, urban heritage, urban polychromy

Procedia PDF Downloads 136
25136 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. 
The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.

Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making

Procedia PDF Downloads 71
25135 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule

Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu

Abstract:

Instance selection (IS) techniques are used to reduce data size and improve the performance of data mining methods. Recently, to process very large data sets, several methods have been proposed that divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets using the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload on the aggregation node; second, and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
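As background, the classic condensed nearest neighbor (CNN) rule that FCNN accelerates can be sketched as follows: keep only the prototypes needed for a 1-NN classifier to label the full training set correctly. This is the basic CNN rule on toy data, not the faster FCNN variant or its MapReduce formulation.

```python
# Classic condensed nearest neighbor rule on a toy 1-D data set.
def nearest(store, x):
    """Return the stored (features, label) pair closest to x."""
    return min(store, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))

def condense(points):
    """points: list of (features, label). Returns the condensed subset."""
    store = [points[0]]
    changed = True
    while changed:          # repeat until no instance is misclassified
        changed = False
        for x, y in points:
            if nearest(store, x)[1] != y:   # misclassified -> absorb it
                store.append((x, y))
                changed = True
    return store

data = [((0.0,), "a"), ((0.1,), "a"), ((0.2,), "a"),
        ((5.0,), "b"), ((5.1,), "b"), ((5.2,), "b")]
subset = condense(data)
print(len(subset), "of", len(data), "instances kept")
```

On these two well-separated clusters, one prototype per class suffices; the condensed subset still classifies every training instance correctly with 1-NN, which is the consistency property the paper's parallel method preserves.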

Keywords: instance selection, data reduction, MapReduce, kNN

Procedia PDF Downloads 250
25134 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking

Authors: Trevor Toy, Josef Langerman

Abstract:

Around a quarter of the world's data is generated by the financial sector, with an estimated 708.5 billion global non-cash transactions. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately, and consensually share the data required to enable it. Integration and sharing of anonymised transactional data still operate in silos, centralised among the large corporate entities in the ecosystem that have the resources to participate; smaller fintechs generating data and businesses looking to consume data are largely excluded. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework for a secure, decentralised marketplace in which (1) data providers list their transactional data, (2) data consumers find and access that data, and (3) data subjects (the individuals whose transactions generate the data) manage and sell the data that relates to themselves. The platform also integrates downstream transaction-related data from merchants, enriching the data product to build a comprehensive view of a data subject's spending habits. A robust and sustainable data market can be developed by giving data producers a more accessible mechanism to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers.
This core component is implemented as a decentralised blockchain contract, with a market layer managing the transaction, user, pricing, payment, tagging, contract, control, and lineage features of user interactions on the platform. A key feature is enabling individuals to participate in, and manage, the personal data being generated about them. The framework was developed as a proof of concept on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to their identifiable relationship with the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. The platform can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.

Keywords: big data markets, open banking, blockchain, personal data management

Procedia PDF Downloads 72
25133 Dimensions of Public Spaces in Indian Market Places: Feelings through Human Senses

Authors: Piyush Hajela

Abstract:

Public spaces in Indian market places are vibrant and colorful, and carry latent dimensions that make them attractive and popular gathering spaces. These markets satisfy the household needs of the people as well as their social, cultural, and traditional aspirations. Going to a market place for shopping in India is a great source of entertainment, and people will happily stay longer than the errand itself requires; it is this desire that generates public spaces. Many of these public spaces emerge as squares, plazas, and corners of varied shapes and sizes at different locations, and yet provide a conducive environment. Such spaces grow organically and are discovered by the people themselves. Indian markets serve people of different cultures, religions, castes, ages, and genders, which keeps them alive all year round. India is a diverse country, and this diversity is reflected clearly in its market places, which hold people together and promote harmony across cultures. Free access makes these market places magnets for social interaction. Public spaces are spread across a city and have more or less established their existence and prominence in the social set-up; while a few are deliberately created, the others are discovered by people in their constant search for desirable interactive public spaces. These are the most sought-after gathering spaces, with qualities such as promoting social interaction, providing free accessibility, and offering a desirable scale. The paper aims to identify these freely accessible public spaces and the dimensions that make them hold people for significant durations of time. The dimensions present are judged through the collective response of the human senses, in the form of safety, comfort, and so on, as expressed by the participants.
The aim, therefore, is to trace the freely accessible public spaces that have emerged in Indian markets and evaluate them for human response and behavior. The hierarchy of market places in the city of Bhopal is well established: city-center, sub-city-center, community, local, and convenience-level market places. While many city centers are still referred to as the old, traditional, or core area of the city, the others are part of the planned city. These different levels of market places are studied for emerged public spaces, which are then documented in detail, through photographs, visual observations, questionnaires, and participant responses, to unveil the dimensions they offer.

Keywords: human comfort, enclosure, safety, social interaction

Procedia PDF Downloads 415
25132 Experimental Evaluation of Succinct Ternary Tree

Authors: Dmitriy Kuptsov

Abstract:

Tree data structures, such as binary or, more generally, k-ary trees, are essential in computer science, with applications ranging from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large amounts of random access memory, limiting their applicability in certain solutions; in such cases, a more advanced representation is essential. In this paper we present the design of a compact version of the ternary tree data structure and report an experimental evaluation on the static dictionary problem, comparing the results against binary and regular ternary trees. The evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experiments) than a regular ternary tree, and in certain configurations shows performance comparable to regular ternary trees. We evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
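For reference, a naive (non-succinct) ternary search tree for the static dictionary problem is sketched below; the paper's contribution is a compact encoding of such a tree, which this pointer-based version deliberately omits.

```python
# Naive pointer-based ternary search tree for a static dictionary.
class Node:
    __slots__ = ("ch", "lo", "eq", "hi", "end")
    def __init__(self, ch):
        self.ch, self.lo, self.eq, self.hi, self.end = ch, None, None, None, False

def insert(node, word, i=0):
    ch = word[i]
    if node is None:
        node = Node(ch)
    if ch < node.ch:
        node.lo = insert(node.lo, word, i)        # smaller char: go left
    elif ch > node.ch:
        node.hi = insert(node.hi, word, i)        # larger char: go right
    elif i + 1 < len(word):
        node.eq = insert(node.eq, word, i + 1)    # match: advance in word
    else:
        node.end = True                           # word ends at this node
    return node

def contains(node, word, i=0):
    if node is None:
        return False
    ch = word[i]
    if ch < node.ch:
        return contains(node.lo, word, i)
    if ch > node.ch:
        return contains(node.hi, word, i)
    if i + 1 < len(word):
        return contains(node.eq, word, i + 1)
    return node.end

root = None
for w in ["cat", "cap", "car", "dog"]:
    root = insert(root, w)
print(contains(root, "cap"), contains(root, "ca"))
```

Each node here carries three pointers plus a character and a flag, which is exactly the per-node overhead a succinct encoding (e.g. a bit-level node representation) is designed to eliminate.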

Keywords: algorithms, data structures, succinct ternary tree, performance evaluation

Procedia PDF Downloads 158
25131 Diagnosis of Intermittent High Vibration Peaks in Industrial Gas Turbine Using Advanced Vibrations Analysis

Authors: Abubakar Rashid, Muhammad Saad, Faheem Ahmed

Abstract:

This paper provides a comprehensive study of the diagnosis of intermittent high vibrations on an industrial gas turbine using detailed vibration analysis, followed by their rectification. Engro Polymer & Chemicals Limited, a Chlor-Vinyl complex located in Pakistan, has a captive combined cycle power plant with two 28 MW gas turbines (make: Hitachi) and one 15 MW steam turbine. In 2018, the organization faced high vibrations on one of the gas turbines. The high vibration peaks appeared intermittently on both the compressor's drive end (DE) bearing and the turbine's non-drive end (NDE) bearing, with amplitudes of 150-170% of baseline on the DE bearing and 200-300% of baseline on the NDE bearing. In one of these episodes, the gas turbine tripped on the "High Vibrations Trip" logic, actuated at 155 µm. Limited instrumentation is available on the machine, which is monitored with a GE Bently Nevada 3300 system with two proximity probes each installed at the turbine NDE, compressor DE, and generator DE and NDE bearings. The machine's transient ramp-up and steady-state data were collected using ADRE SXP and a DSPI 408. Since only one keyphasor is installed, on the turbine's high-speed shaft, a derived keyphasor was configured in ADRE to obtain the low-speed shaft rpm required for data analysis. Analysis of the Bode plots, shaft centerline plot, polar plot, and orbit plots showed evident rubbing at the turbine NDE, along with increased clearance of the turbine's NDE radial bearing. The bearing was then inspected, and heavy deposits of carbonized coke were found on the labyrinth seals of the bearing housing, with clear rubbing marks on the shaft and housing covering 20-25 degrees of the inner radius of the labyrinth seals. The collected coke sample was tested in the laboratory and found to be residue of the lube oil in the bearing housing. After detailed inspection and cleaning of the shaft journal area and bearing housing, a new radial bearing was installed.
Before the bearing housing was reassembled, the bearing cooling and sealing air lines were also cleaned, as inadequate flow of cooling and sealing air can accelerate coke formation in the bearing housing. The machine was then brought back online, and data were collected again using ADRE SXP and the DSPI 408 for health analysis. The vibrations were within the acceptable zone per ISO 7919-3, and all other parameters were within the vendor-defined ranges. As a lesson from this case, a revised operating and maintenance regime has also been proposed to enhance the machine's reliability.
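The Bode and polar plots mentioned above are built from the amplitude and phase of the vibration signal at running speed (1X). The sketch below illustrates that extraction with an FFT on a synthetic probe signal; the sample rate, shaft speed, and amplitudes are illustrative values, not plant data.

```python
# Extract the 1X (running-speed) component from a synthetic probe signal.
import numpy as np

fs = 5120.0           # sample rate, Hz (illustrative)
rpm = 3000.0          # shaft speed, so 1X = 50 Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
# 40 um peak vibration at 1X plus a smaller 2X harmonic
signal = 40.0 * np.sin(2 * np.pi * 50.0 * t) + 5.0 * np.sin(2 * np.pi * 100.0 * t)

spec = np.fft.rfft(signal) * 2.0 / len(signal)   # single-sided amplitude spectrum
freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
idx = int(np.argmin(np.abs(freqs - rpm / 60.0)))  # bin nearest running speed
print(f"1X: {abs(spec[idx]):.1f} um at {freqs[idx]:.0f} Hz")
```

Tracking this 1X amplitude and phase against speed during a ramp-up is exactly what a Bode plot shows; plotting them against each other gives the polar plot.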

Keywords: ADRE, bearing, gas turbine, GE Bently Nevada, Hitachi, vibration

Procedia PDF Downloads 141
25130 Comparison of Microleakage of Composite Restorations Using Fifth and Seventh Generation of Bonding Agents

Authors: Karina Nabilla, Dedi Sumantri, Nurul T. Rizal, Siti H. Yavitha

Abstract:

Background: Composite resin is the most frequently used material for restoring teeth, but failure cases leading to microleakage are still seen. Microleakage can be attributed to various factors, one of which is the bonding agent, and successive generations of bonding agents have been introduced to overcome it. The aim of this study was to evaluate the microleakage of composite restorations made with fifth- and seventh-generation bonding agents. Methods: Class I cavities (3 × 2 × 2 mm) were prepared on the occlusal surfaces of 32 human upper premolars. The teeth were divided into two groups according to the bonding agent used (n = 16). Group I: fifth-generation bonding agent (Adper Single Bond 2). Group II: seventh-generation bonding agent (Single Bond Universal). All cavities were restored with Filtek Z250 XT composite resin and stored in sterile distilled water at 37°C for 24 h. The root apices were sealed with sticky wax, and all surfaces, except for 2 mm from the margins, were coated with nail varnish. The teeth were immersed in a 1% methylene blue dye solution for 24 h, then rinsed in running water, blot-dried, and sectioned longitudinally through the center of the restorations from the buccal to the palatal surface. The sections were blindly assessed for microleakage (dye penetration) using a stereomicroscope. Dye penetration along the margin was measured in µm, converted to a percentage, and classified on a scoring system from 1 to 3. Data were collected and statistically analyzed with the Chi-square test. Result: There was no significant difference (p > 0.05) between the two groups. Conclusion: The fifth-generation bonding agent showed less leakage than the seventh generation, although the difference was not statistically significant.
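The Chi-square comparison of score distributions between the two groups can be illustrated as below; the contingency counts are invented for illustration and are not the study's data (n = 16 per group, scores 1-3).

```python
# Illustrative Chi-square test on an invented 2x3 contingency table.
from scipy.stats import chi2_contingency

# rows: group (5th gen, 7th gen); columns: microleakage score 1, 2, 3
observed = [[9, 5, 2],
            [6, 6, 4]]
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```

A p-value above 0.05 on such a table would correspond to the study's "no significant difference" finding.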

Keywords: composite restoration, fifth generation of bonding agent, microleakage, seventh generation of bonding agent

Procedia PDF Downloads 267
25129 Prosperous Digital Image Watermarking Approach by Using DCT-DWT

Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar

Abstract:

Every day, tons of data are embedded in digital media or distributed over the internet. The data are distributed so widely that they can be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image, or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on implementing watermarks in images. The main consideration for any watermarking scheme is its robustness to various attacks.
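As a minimal illustration of the frequency-domain embedding that DCT-DWT schemes build on, the sketch below hides one bit by forcing the sign of a single mid-frequency DCT coefficient of an 8x8 block; the coefficient position and strength are arbitrary choices for illustration, not the paper's actual scheme:

```python
import numpy as np

def dct_matrix(n):
    # Orthonormal DCT-II basis: C @ C.T == identity, so the transform is exactly invertible.
    k = np.arange(n)[:, None]
    i = np.arange(n)[None, :]
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
    c[0, :] /= np.sqrt(2.0)
    return c

def embed_bit(block, bit, pos=(3, 4), strength=20.0):
    # Transform, force the sign of one mid-frequency coefficient, invert.
    c = dct_matrix(block.shape[0])
    coeffs = c @ block @ c.T
    coeffs[pos] = strength if bit else -strength
    return c.T @ coeffs @ c

def extract_bit(block, pos=(3, 4)):
    c = dct_matrix(block.shape[0])
    return int((c @ block @ c.T)[pos] > 0)

rng = np.random.default_rng(0)
block = rng.uniform(0, 255, size=(8, 8))
marked = embed_bit(block, 1)
```

A real scheme spreads many bits over DWT subbands and DCT coefficients for robustness; this shows only the embed/extract round trip.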

Keywords: watermarking, digital, DCT-DWT, security

Procedia PDF Downloads 419
25128 Machine Learning Data Architecture

Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap

Abstract:

Most companies see increasing adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and the vending of model output into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort in building the other components of the architecture, which we do not believe is the best use of scientists’ bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists’ work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
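A toy sketch of the stage separation the architecture argues for is below; the stage names and the `BatchPipeline` helper are illustrative stand-ins, not an actual AWS service API:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class BatchPipeline:
    """Each batch-pipeline component registers behind its own interface,
    so a scientist only needs to own the train_model stage."""
    steps: List[Callable[[Dict], Dict]] = field(default_factory=list)

    def step(self, fn):
        self.steps.append(fn)
        return fn

    def run(self, ctx):
        for fn in self.steps:
            ctx = fn(ctx)
        return ctx

pipeline = BatchPipeline()

@pipeline.step
def source_data(ctx):
    ctx["raw"] = [1.0, 2.0, 3.0, 4.0]          # stand-in for a data-store read
    return ctx

@pipeline.step
def engineer_features(ctx):
    ctx["features"] = [x * x for x in ctx["raw"]]
    return ctx

@pipeline.step
def train_model(ctx):
    # The only stage a scientist owns; here, a trivial "mean" model.
    ctx["model"] = sum(ctx["features"]) / len(ctx["features"])
    return ctx

@pipeline.step
def vend_output(ctx):
    ctx["output_store"] = {"prediction": ctx["model"]}  # stand-in for a store write
    return ctx

result = pipeline.run({})
```

In the proposed architecture, each of these stages would map to managed infrastructure declared as code, replicated per model.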

Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning

Procedia PDF Downloads 61
25127 Design for Filter and Transitions in Substrate Integrated Waveguide at Ka Band

Authors: Damou Mehdi, Nouri Keltouma, Fahem Mohammed

Abstract:

In this paper, the concept of substrate integrated waveguide (SIW) technology is used to design a filter for 30 GHz communication systems. The SIW is created in an RT/Duroid 5880 substrate with relative permittivity ε_r = 2.2 and loss tangent tan δ = 0.0009. Four vias are placed at the center of the filter. The SIW structures are modeled and optimized in HFSS (High Frequency Structure Simulator). A transition is designed for a Ka-band transceiver module with a 28.5 GHz center frequency, and the results are then verified using another simulator, CST Microwave Studio (Computer Simulation Technology). The return losses are less than -18 dB and -13 dB, respectively. The insertion losses are -1.2 dB and -1.4 dB, respectively.
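The initial SIW sizing behind such a design can be sketched with the classic first-order equivalent-width rule, w_eff = w - d²/(0.95·p); the dimensions below are illustrative, not the paper's:

```python
import math

C0 = 299_792_458.0  # speed of light, m/s

def siw_effective_width(w, d, p):
    """First-order SIW equivalent-width rule: w_eff = w - d^2 / (0.95 * p),
    with w the via-row spacing, d the via diameter, p the via pitch (metres)."""
    return w - d**2 / (0.95 * p)

def te10_cutoff(w_eff, eps_r):
    """Cutoff frequency of the TE10 mode of the equivalent dielectric-filled guide."""
    return C0 / (2.0 * w_eff * math.sqrt(eps_r))

# Illustrative dimensions targeting a cutoff around 20 GHz in RT/Duroid 5880 (eps_r = 2.2)
w, d, p, eps_r = 5.2e-3, 0.5e-3, 1.0e-3, 2.2
fc = te10_cutoff(siw_effective_width(w, d, p), eps_r)
```

Such hand estimates give a starting geometry, which is then refined in a full-wave solver like HFSS or CST as the abstract describes.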

Keywords: transition, microstrip, substrate integrated waveguide, filter, via

Procedia PDF Downloads 651
25126 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene, relative to the observer. Advancements in this field have led to experiments with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis to compare the effectiveness of the more common image data representations. The goal is to determine the effectiveness of these data representations at reducing the cost for the correct correspondence relative to other possible matches.
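A minimal local-matching sketch using one common cost, the sum of absolute differences (SAD), illustrates the role the cost function plays; the window size and disparity range are arbitrary choices for illustration:

```python
import numpy as np

def sad_disparity(left, right, x, y, half=2, max_disp=8):
    """Winner-takes-all disparity at (x, y) using a sum-of-absolute-differences
    cost over a (2*half+1)^2 window."""
    patch = left[y - half:y + half + 1, x - half:x + half + 1].astype(float)
    costs = []
    for d in range(max_disp + 1):
        cand = right[y - half:y + half + 1, x - d - half:x - d + half + 1].astype(float)
        costs.append(np.abs(patch - cand).sum())
    return int(np.argmin(costs))      # disparity whose cost is lowest

# Synthetic pair: the right image is the left image shifted 3 px leftwards,
# so the ground-truth disparity everywhere is 3.
rng = np.random.default_rng(1)
left = rng.integers(0, 256, size=(32, 32))
right = np.roll(left, -3, axis=1)
d = sad_disparity(left, right, x=16, y=16)
```

Changing the data representation (grayscale, individual colour channels, gradients) changes `left`/`right` and hence how sharply the cost minimum identifies the correct match, which is exactly what the paper evaluates.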

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 367
25125 Korean Smart Cities: Strategic Foci, Characteristics and Effects

Authors: Sang Ho Lee, Yountaik Leem

Abstract:

This paper reviews Korean smart city cases through an analysis framework of strategic foci, characteristics, and effects. Firstly, national strategies, including the c(cyber)-, e(electronic)-, u(ubiquitous)-, and s(smart)-Korea strategies, were considered from a strategic angle. Secondly, the characteristics of smart cities in Korea were examined through examples such as Seoul, Busan, Songdo, and Sejong, from the viewpoint of STIM (Service, Technology, Infrastructure, and Management) analysis. Finally, the effects of smart cities on socio-economies were investigated from an industrial perspective using the input-output model and structural path analysis. Korean smart city strategies revealed different strategic foci. The c-Korea strategy focused on building information and communications networks and on user IT literacy. The e-Korea strategy encouraged e-government and e-business by utilizing the high-speed information and communications network. The u-Korea strategy introduced ubiquitous services as well as integrated information and communication operations centers. The s-Korea strategy propels the fourth industrial platform. Smart cities in Korea showed their own features and trends, such as eco-intelligence, high-efficiency and low-cost oriented IoT, the citizen-sensed city, and the big data city. Smart city progress created new production chains fostering ICTs (Information and Communication Technologies) and knowledge-intermediate inputs to industries.

Keywords: Korean smart cities, Korean smart city strategies, STIM, smart service, infrastructure, technologies, management, effect of smart city

Procedia PDF Downloads 364
25124 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System

Authors: Karima Qayumi, Alex Norta

Abstract:

The rapid generation of high-volume and broadly varied data from the application of new technologies poses challenges for generating business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods to develop their business. The increasingly decentralized data management environment therefore relies on a distributed computing paradigm. While data are stored in highly distributed systems, implementing distributed data-mining techniques is a challenge. The aim of such techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.

Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)

Procedia PDF Downloads 427
25123 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design

Authors: Qing K. Zhu

Abstract:

Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore’s law. Designers encounter numerous report files during design iterations with timing and noise analysis tools. This paper presents our work using data mining techniques combined with HTML tables to extract and represent critical timing/noise data. When we apply this data-mining tool in real applications, running speed is important. The software employs table look-up techniques to achieve reasonable running speed, as confirmed by performance testing. We added several advanced features for application in one industrial chip design.
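The report-mining idea can be sketched as follows; the timing-report format and path names here are invented for illustration, not the actual tool's input:

```python
# Parse hypothetical timing-report lines, index them in a dictionary for fast
# table look-up, and render the worst slack per path as an HTML table.
report = """\
path: clk->regA  slack: -0.12
path: clk->regB  slack: 0.05
path: clk->regA  slack: -0.30
"""

def mine_worst_slack(text):
    worst = {}                              # look-up table keyed by path name
    for line in text.splitlines():
        _, path, _, slack = line.split()
        slack = float(slack)
        if path not in worst or slack < worst[path]:
            worst[path] = slack             # keep the most critical occurrence
    return worst

def to_html(worst):
    rows = "".join(f"<tr><td>{p}</td><td>{s:+.2f}</td></tr>"
                   for p, s in sorted(worst.items()))
    return f"<table><tr><th>path</th><th>worst slack (ns)</th></tr>{rows}</table>"

worst = mine_worst_slack(report)
html = to_html(worst)
```

The dictionary look-up keeps extraction linear in the report size, which matters when mining the very large reports the abstract describes.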

Keywords: VLSI design, data mining, big data, HTML forms, web, EDA, timing, noise

Procedia PDF Downloads 251
25122 From Bureaucracy to Organizational Learning Model: An Organizational Change Process Study

Authors: Vania Helena Tonussi Vidal, Ester Eliane Jeunon

Abstract:

This article aims to analyze management change processes related to the bureaucratic and learning-organization models. The theoretical framework was based on Beer and Nohria's (2001) model, identified as Theory E and Theory O. Based on this theory, the empirical research was conducted in connection with six key dimensions: goal, leadership, focus, process, reward systems, and consulting. We used a case study of an educational institution located in Barbacena, Minas Gerais. This traditional center of technical knowledge had long adopted the bureaucratic way of management. After many changes in the business model, such as the creation of graduate and undergraduate courses, the institution decided to make a deep change in the management model, which is our research focus. The data were collected through semi-structured interviews with the director, managers, and course supervisors. The analysis was processed using the Collective Subject Discourse (CSD) method, developed by Lefèvre & Lefèvre (2000). Results showed the incremental evolution of the management model toward a learning organization, with many visible impacts. Negative factors include people's resistance, poor information about the planning and implementation process, and old politics persisting inside the new model. Positive impacts include new procedures in human resources, mainly related to manager skills and empowerment; structure downsizing; an open discussion channel; and an integrated information system. The process is still under construction, and strong emphasis is now placed on manager and employee commitment to the process.

Keywords: bureaucracy, organizational learning, organizational change, E and O theory

Procedia PDF Downloads 432
25121 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations

Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe

Abstract:

In its simplest form, data quality can be defined as 'fitness for use'; it is a multi-dimensional concept. Emergency Departments (EDs) require information to treat patients and, at the same time, are the primary source of information regarding accidents, injuries, emergencies, etc. They are also the starting point of various patient registries, databases, and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an e-health solution. The study consisted of three components: a research study assessing the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility, and reliability; an intervention to develop and deploy an electronic emergency department information system (eEDIS); and a post-intervention assessment, which confirmed that all five dimensions of data quality had improved. The most significant improvements were noticed in the accuracy and timeliness dimensions.

Keywords: electronic health records, electronic emergency department information system, emergency department, data quality

Procedia PDF Downloads 271
25120 Data Presentation of Lane-Changing Event Trajectories Using the HighD Dataset

Authors: Basma Khelfa, Antoine Tordeux, Ibrahima Ba

Abstract:

We present a descriptive analysis of lane-changing event data on multi-lane roads. The data come from the Highway Drone Dataset (HighD), which contains microscopic vehicle trajectories recorded on highways. This paper describes and analyses the role of the different parameters and their significance. Using the HighD data, we aim to find the most frequent reasons that motivate drivers to change lanes. We used the programming language R to process these data. We analyze the involvement and relationship of the variables characterizing the ego vehicle and the four vehicles surrounding it, i.e., distance, speed difference, time gap, and acceleration. This was studied according to the class of the vehicle (car or truck) and the maneuver it undertook (overtaking or falling back).
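The surrounding-vehicle variables can be computed as below; this is sketched in Python rather than the authors' R, on invented toy trajectories (HighD-like units: metres and m/s):

```python
import numpy as np

def interaction_variables(x_ego, v_ego, x_lead, v_lead):
    """Distance, speed difference, and time gap to a leading vehicle,
    per time step of a longitudinal trajectory."""
    distance = x_lead - x_ego
    dv = v_lead - v_ego
    time_gap = np.where(v_ego > 0, distance / v_ego, np.inf)
    return distance, dv, time_gap

# Toy trajectories: ego at constant 25 m/s, a slower lead vehicle closing in.
x_ego = np.array([0.0, 25.0, 50.0])
v_ego = np.array([25.0, 25.0, 25.0])
x_lead = x_ego + np.array([50.0, 45.0, 40.0])
v_lead = np.array([23.0, 23.0, 23.0])
dist, dv, gap = interaction_variables(x_ego, v_ego, x_lead, v_lead)
```

The same quantities, computed for the leader and follower in both the current and target lanes, are the candidate predictors of a lane-change decision.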

Keywords: autonomous driving, physical traffic model, prediction model, statistical learning process

Procedia PDF Downloads 251
25119 Integrating Road Safety into Mainstreaming Education and Other Initiatives with Holistic Approach in the State: A Case Study of Madhya Pradesh, India

Authors: Yogesh Mahor, Subhash Nigam, Abhai Khare

Abstract:

Road safety education is a composite subject that should be viewed holistically, taking into account behavior change communication, safe road infrastructure, and law enforcement. Specific and customized road safety education is crucial for each type of road user and for learners in formal and informal teaching and in the various kinds of training programs directly sponsored by the state and central governments, as they are active contributors to shaping a community and responsible citizens. The aim of this discussion article is to explore a strategy to integrate road safety education into the formal curriculum of schools, higher education institutions, driving schools, skill development centers, and various government-funded urban and rural development training institutions, and into their work plans as a standing agenda. By applying the desktop research method, the article conceptualizes what the possible focus of road safety education and training should be. The article then explores international common practices in road safety education and training and considers the necessary synergy between education, road engineering, and law enforcement. The article uses secondary data collected from documents, which are then analysed in a sectoral way. A well-designed road safety strategy for mainstreaming education and government-sponsored training is urgently needed, facilitating partnerships across sectors to deliver such education to students and learners in multidisciplinary ways.

Keywords: road safety education, curriculum-based road safety education, behavior change communication, law enforcement, road engineering, safe system approach, infrastructure development consultants

Procedia PDF Downloads 123
25118 Effectiveness of ATMS (Advanced Transport Management Systems) in Asuncion, Paraguay

Authors: Sung Ho Oh

Abstract:

Advanced traffic lights, a system for traffic information collection and provision, CCTVs for traffic control, and a traffic information center were installed in Asuncion, the capital of Paraguay. A pre-post comparison of the installation found significant changes: even though traffic volumes increased, travel speeds were higher, so travel time from origin to destination decreased. The estimated savings in travel time, fuel cost, and environmental cost amount to about 47 million US dollars per year. Satisfaction survey results for the installation are presented with statistical significance analysis.

Keywords: advanced transport management systems, effectiveness, Paraguay, traffic lights

Procedia PDF Downloads 349
25117 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photon Beams in a Varian Linear Accelerator

Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain

Abstract:

Objective: The main purpose of this study is to compare the percent depth dose (PDD) and the in-plane and cross-plane profiles of Varian golden beam data to the measured data of 6 and 18 MV photons for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles, and dose-rate tables for open and wedged fields into the treatment planning system, enabling it to calculate MUs and dose distributions. Varian offers a generic set of beam data as reference data but does not recommend it for clinical use. In this study, we compared the generic beam data with the measured beam data to evaluate the reliability of the generic beam data for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian's algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a semiflex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate their accuracy for the commissioning of the Eclipse treatment planning system. Results: The deviation between measured and golden beam data was at most 2%. In PDDs, the deviation increases at deeper depths compared with shallower depths. Similarly, profiles show the same trend of increasing deviation at large field sizes and increasing depths. Conclusion: The study shows that the percentage deviation between measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use.
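The measured-versus-golden comparison implied above can be sketched as a simple percent-deviation check against the 2% tolerance; the depth-dose values below are illustrative, not the study's measurements:

```python
import numpy as np

def percent_deviation(measured, golden):
    """Point-by-point percent deviation of a measured curve from the reference."""
    measured = np.asarray(measured, dtype=float)
    golden = np.asarray(golden, dtype=float)
    return 100.0 * np.abs(measured - golden) / golden

# Illustrative PDD samples (% dose) at increasing depth
golden_pdd   = [100.0, 86.0, 65.0, 48.0]
measured_pdd = [100.0, 86.5, 64.2, 47.6]

dev = percent_deviation(measured_pdd, golden_pdd)
within_tolerance = bool((dev <= 2.0).all())
```

The same check, repeated over profiles, field sizes, and wedge configurations, is what establishes whether the generic data set is acceptable for commissioning.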

Keywords: percent depth dose, flatness, symmetry, golden beam data

Procedia PDF Downloads 485
25116 Variable-Fidelity Surrogate Modelling with Kriging

Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans

Abstract:

Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with varying computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy.
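The recursive coupling idea can be sketched without the Kriging machinery by writing the high-fidelity model as ρ·f_lo(x) + δ(x), with plain 1-D interpolation standing in for the Kriging surrogates; everything here is a simplified illustration, not the paper's CoKriging formulation:

```python
import numpy as np

# Toy fidelity pair: the expensive model is a scaled cheap model plus a trend.
f_lo = lambda x: 0.5 * np.sin(8 * x)                 # cheap, low-fidelity model
f_hi = lambda x: 2.0 * (0.5 * np.sin(8 * x)) + x     # expensive, high-fidelity model

x_lo = np.linspace(0.0, 1.0, 41)                     # many cheap samples
x_hi = np.linspace(0.0, 1.0, 9)                      # few expensive samples

# Low-fidelity surrogate (interpolation stands in for Kriging)
lo_surrogate = lambda x: np.interp(x, x_lo, f_lo(x_lo))

# Scaling factor rho by least squares on the high-fidelity points,
# then a discrepancy model delta(x) on the residual
rho = np.sum(lo_surrogate(x_hi) * f_hi(x_hi)) / np.sum(lo_surrogate(x_hi) ** 2)
delta = f_hi(x_hi) - rho * lo_surrogate(x_hi)
mf_surrogate = lambda x: rho * lo_surrogate(x) + np.interp(x, x_hi, delta)

x_test = np.linspace(0.05, 0.95, 50)
err = float(np.max(np.abs(mf_surrogate(x_test) - f_hi(x_test))))
```

Because the cheap samples capture the oscillation and the few expensive samples only need to correct scale and trend, the combined surrogate tracks the high-fidelity function far better than the expensive samples alone would allow.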

Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients

Procedia PDF Downloads 554
25115 Robust Barcode Detection with Synthetic-to-Real Data Augmentation

Authors: Xiaoyan Dai, Hsieh Yisan

Abstract:

Barcode processing of captured images is a huge challenge, as different shooting conditions can result in different barcode appearances. This paper proposes a deep learning-based barcode detection using synthetic-to-real data augmentation. We first augment barcodes themselves; we then augment images containing the barcodes to generate a large variety of data that is close to the actual shooting environments. Comparisons with previous works and evaluations with our original data show that this approach achieves state-of-the-art performance in various real images. In addition, the system uses hybrid resolution for barcode “scan” and is applicable to real-time applications.
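A toy version of the synthetic augmentation step might look like this; the barcode renderer and perturbation parameters are invented for illustration, not the paper's actual pipeline:

```python
import numpy as np

def render_barcode(bits, module=4, height=24):
    """Render an ideal 1-D barcode strip: black (0) for 1-bits, white (255) otherwise."""
    row = np.repeat(np.where(np.asarray(bits) == 1, 0, 255), module)
    return np.tile(row, (height, 1)).astype(float)

def augment(img, rng, blur=3, noise_sigma=8.0, gain=0.9, bias=10.0):
    """Perturb a clean barcode with horizontal blur, brightness change, and
    sensor-like noise to mimic varied shooting conditions."""
    kernel = np.ones(blur) / blur
    blurred = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    noisy = gain * blurred + bias + rng.normal(0.0, noise_sigma, img.shape)
    return np.clip(noisy, 0.0, 255.0)

rng = np.random.default_rng(42)
clean = render_barcode([1, 0, 1, 1, 0, 0, 1, 0])
augmented = augment(clean, rng)
```

Pairing each augmented image with the known barcode geometry yields labelled detector training data without manual annotation, which is the point of the synthetic-to-real approach.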

Keywords: barcode detection, data augmentation, deep learning, image-based processing

Procedia PDF Downloads 163
25114 Correlations between Pushing Skills and Pushing Perceptions, Second-Stage Labor Duration, Postpartum Fatigue, and Birth Satisfaction

Authors: Yu-Ching Huang

Abstract:

Background: Delivery bridges the antepartum and postpartum periods. The subsequent fatigue can affect indices including postpartum recovery and quality of life. Milk secretion, breastfeeding quality, and newborn participation may be compromised. Correspondingly, using proper pushing skills during the second stage of labor has the potential to effectively reduce postpartum fatigue and enhance birth satisfaction in new mothers. Purpose: To compare the effects of different pushing skills on maternal pushing perception, postpartum fatigue, and birth satisfaction. Methodology: The present study used a descriptive research approach and recruited 382 participants from a medical center in northern Taiwan. Data were collected using a structured questionnaire, which included a demographic and obstetrics information datasheet, the Labor Pushing Experience Scale, a fatigue scale, and a birth satisfaction scale. Research Results: Using pushing skills (including an upright position [t = 2.28, p < .05] and delayed pushing [t = -1.98, p < .05]) during the second stage of labor was shown to enhance birth satisfaction in participants. Additionally, open-glottis pushing (t = 5.46, p < .001) resulted in a mean second-stage labor duration 17.67 minutes shorter than that achieved with Valsalva pushing. Moreover, a better-perceived pushing experience was associated with lower perceived postpartum fatigue (r = .46, p < .05) and higher birth satisfaction (r = -.16, p < .05). Finally, postpartum fatigue perception was negatively associated with birth satisfaction (r = -.16, p < .05). Conclusion and Clinical Application: The findings suggest that midwives should advocate that women adopt upright positions, delayed pushing, and open-glottis pushing during the second stage of labor in order to enhance their birth satisfaction.
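The correlation analysis reported above amounts to Pearson's r between paired scores; a minimal sketch on invented data points, not the study's:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired samples."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical paired scores for seven participants
fatigue      = [3, 5, 4, 7, 6, 8, 2]
satisfaction = [9, 6, 7, 4, 5, 3, 9]
r = pearson_r(fatigue, satisfaction)
```

A negative r, as here, corresponds to the reported pattern of higher fatigue going with lower birth satisfaction.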

Keywords: second-stage labor duration, pushing skills, pushing experience perception, postpartum fatigue, birth satisfaction

Procedia PDF Downloads 265