Search results for: ground truth data
25035 Customer Satisfaction and Effective HRM Policies: Customer and Employee Satisfaction
Authors: S. Anastasiou, C. Nathanailides
Abstract:
The purpose of this study is to examine the possible link between employee and customer satisfaction. The service provided by employees helps to build a good relationship with customers and can increase their loyalty. Published data on job satisfaction and indicators of customer service were gathered from relevant published works covering five different countries. The reviewed data indicate a significant correlation between indicators of customer and employee satisfaction in the banking sector (Pearson correlation, R² = 0.52, P < 0.05), providing practical evidence of a link between these two parameters.
Keywords: job satisfaction, job performance, customer service, banks, human resources management
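As an illustration of the statistic reported above, a Pearson correlation between paired employee- and customer-satisfaction indicators can be computed as follows; the five country-level scores are hypothetical placeholders, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical country-level indices (not the study's data): employee
# satisfaction vs. customer satisfaction for five countries.
employee = [62.0, 70.0, 55.0, 80.0, 68.0]
customer = [60.0, 72.0, 58.0, 78.0, 65.0]

r = pearson_r(employee, customer)
r_squared = r ** 2
```

A significance test (the reported P < 0.05) would additionally require the sample size and a t- or permutation test, which is omitted here for brevity.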
Procedia PDF Downloads 321
25034 Study the Influence of the Type of Cast Iron Chips on the Quality of Briquettes Obtained with Controlled Impact
Authors: Dimitar N. Karastoianov, Stanislav D. Gyoshev, Todor N. Penchev
Abstract:
Producing briquettes of metal chips with good density and quality is of great importance for the efficiency of this process. This paper presents the results of impact briquetting of grey cast iron chips of rectangular shape and dimensions 15 × 25 × 1 mm. The density and quality of briquettes made from these chips are compared with those obtained in earlier work by the authors using cast iron chips of smaller size. It was found that rectangular chips of large size produce briquettes with very low density and poor quality. Photographs taken by X-ray tomography show that the reason is the orientation of the chips in the peripheral wall of the briquette, which does not allow air to escape from it. It was concluded that in order to obtain briquettes from large cast iron chips, the chips must first be ground, for example in a small ball mill.
Keywords: briquetting, chips, impact, rocket engine
Procedia PDF Downloads 525
25033 4-Channel CWDM Optical Transceiver Applying Silicon Photonics Ge-Photodiode and MZ-Modulator
Authors: Do-Won Kim, Andy Eu Jin Lim, Raja Muthusamy Kumarasamy, Vishal Vinayak, Jacky Wang Yu-Shun, Jason Liow Tsung Yang, Patrick Lo Guo Qiang
Abstract:
In this study, we demonstrate a 4-channel coarse wavelength division multiplexing (CWDM) optical transceiver based on silicon photonic integrated circuits (PICs) comprising a waveguide Ge-photodiode (Ge-PD) and a Mach-Zehnder (MZ) modulator. The 4-channel arrayed PICs of Ge-PD and MZ-modulator are verified to operate at 25 Gbps/ch, achieving a total data rate of 4 × 25 Gbps. Four bare dies of single-channel commercial electronic ICs (EICs), trans-impedance amplifiers (TIAs) for the Ge-PD and driver ICs for the MZ-modulator, are packaged with the PIC on a printed circuit board (PCB) in a chip-on-board (COB) manner. Each single-channel EIC is electrically connected to one channel of the 4-channel PICs by wire bonds to trace. The PICs have a 4-channel multiplexer for the MZ-modulator and a 4-channel demultiplexer for the Ge-PD. The multiplexer/demultiplexer use echelle gratings for the 4 CWDM optical signals, whose center wavelengths are 1511, 1531, 1553, and 1573 nm. The insertion loss is around 4 dB with over 15 dB of extinction ratio. The dimension of the 4-channel Ge-PD is 3.6 × 1.4 × 0.3 mm; its responsivity is 1 A/W with a dark current of less than 20 nA, and its measured 3 dB bandwidth is around 20 GHz. The dimension of the 4-channel MZ-modulator is 3.6 × 4.8 × 0.3 mm, and its 3 dB bandwidth is around 11 GHz at -2 V reverse bias. Its modulation efficiency (VπL) is 2.4 V·cm, i.e., a Vπ of 6 V for a π phase shift in the 4 mm-long modulator. A 5 × 5 µm inverse-tapered mode-size converter with less than 2 dB of coupling loss is used for coupling to a lensed fiber with a 5 µm mode-field diameter. The PCB for COB packaging and signal transmission is designed with 6 layers in a hybrid layer structure. 0.25 mm-thick Rogers Duroid RT5880 is used as the first core dielectric layer for high-speed performance over 25 Gbps. It has 0.017 mm-thick copper layers; its dielectric constant is 2.2 and its dissipation factor is 0.0009 at 10 GHz.
The dimensions of both the single-ended and differential microstrip transmission lines are calculated using the full-wave electromagnetic (EM) field simulator HFSS, which is widely used in the RF industry. S-parameter measurements with a network analyzer showed a 3 dB bandwidth at around 15 GHz. Wire bonds for the transmission line and ground connections from the EIC are kept shorter than 300 µm to minimize parasitic effects on the system. Single-layer capacitors (SLCs) of 100 pF and 1000 pF are connected as close as possible to the EICs to stabilize the DC bias voltage by decoupling. The signal transmission performance is under measurement at 25 Gbps per channel, targeting 100 Gbps as 4 ch × 25 Gbps. This work can be applied to active optical cables (AOC) and quad small form-factor pluggable (QSFP) modules for high-speed optical interconnections, for which demand is large in data centers targeting 100 Gbps, 400 Gbps, and 1 Tbps. As the demand for high-speed AOC and QSFP modules for intra/inter data center applications increases, this silicon photonics based high-speed 4-channel CWDM scheme offers advantages not only in data throughput but also in cost effectiveness, since WDM reduces fiber cost dramatically.
Keywords: active optical cable (AOC), 4-channel coarse wavelength division multiplexing (CWDM), communication system, data center, Ge-photodiode, Mach Zehnder (MZ) modulator, optical interconnections, optical transceiver, photonics integrated circuits (PIC), quad small form-factor pluggable (QSFP), silicon photonics
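The closed-form microstrip synthesis formulas behind such transmission-line sizing can be sketched as follows. This is a rough first-pass estimate only; the abstract's HFSS simulation remains the authoritative tool, and the w/h ratio here is a hypothetical example paired with the RT5880 permittivity of 2.2 quoted above:

```python
import math

def microstrip_z0(w_over_h, er):
    """Approximate characteristic impedance (ohms) of a microstrip line using
    the classic closed-form formulas (separate narrow and wide trace cases)."""
    u = w_over_h
    if u < 1.0:
        # Narrow-trace effective permittivity and impedance
        eeff = (er + 1) / 2 + (er - 1) / 2 * (1 / math.sqrt(1 + 12 / u) + 0.04 * (1 - u) ** 2)
        return 60 / math.sqrt(eeff) * math.log(8 / u + u / 4)
    # Wide-trace case
    eeff = (er + 1) / 2 + (er - 1) / 2 / math.sqrt(1 + 12 / u)
    return 120 * math.pi / (math.sqrt(eeff) * (u + 1.393 + 0.667 * math.log(u + 1.444)))

# RT5880-like substrate from the abstract: er = 2.2. The w/h ratio of 3.0
# is a hypothetical geometry, chosen to land near a 50-ohm line.
z0 = microstrip_z0(3.0, 2.2)
```

Narrower traces give higher impedance, which the sketch reproduces; for the differential pair, coupled-line formulas or the field solver would be needed.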
Procedia PDF Downloads 418
25032 Evaluation of Australian Open Banking Regulation: Balancing Customer Data Privacy and Innovation
Authors: Suman Podder
Abstract:
As Australian ‘Open Banking’ allows customers to share their financial data with accredited Third-Party Providers (‘TPPs’), it is necessary to evaluate whether the regulators have achieved a balance between protecting customer data privacy and promoting data-related innovation. Recognising the need to increase customers’ influence over their own data, and the benefits of data-related innovation, the Australian Government introduced the ‘Consumer Data Right’ (‘CDR’) to the banking sector through Open Banking regulation. Under Open Banking, TPPs can access customers’ banking data, allowing them to tailor their products and services to customer needs at a more competitive price. This facilitated access and use of customer data will promote innovation by providing opportunities for new products and business models to emerge and grow. However, the success of Open Banking depends on the willingness of customers to share their data, so the regulators have augmented data protection with new privacy safeguards to instill confidence and trust in the system. The dilemma in policymaking is that, on the one hand, lenient data privacy laws help the flow of information but risk individuals’ loss of privacy; on the other hand, stringent laws that adequately protect privacy may dissuade innovation. Using theoretical and doctrinal methods, this paper examines whether the privacy safeguards under Open Banking will add to the compliance burden of the participating financial institutions, with the undesirable effect of stifling other policy objectives such as innovation. The contribution of this research is three-fold. In the emerging field of customer data sharing, this research is one of the few academic studies on the objectives and impact of Open Banking in the Australian context.
Additionally, Open Banking is still in the early stages of implementation, so this research traces the evolution of Open Banking through policy debates regarding the desirability of customer data-sharing. Finally, the research not only focuses on customers’ data privacy, juxtaposing it with the equally important objective of promoting innovation, but also highlights the critical issues facing the data-sharing regime. This paper argues that while it is challenging to develop a regulatory framework that protects data privacy without impeding innovation and jeopardising yet unknown opportunities, data privacy and innovation promote different aspects of customer welfare. This paper concludes that if the regulation is appropriately designed and implemented, the benefits of data-sharing will outweigh the cost of compliance with the CDR.
Keywords: consumer data right, innovation, open banking, privacy safeguards
Procedia PDF Downloads 141
25031 Generation of Automated Alarms for Plantwide Process Monitoring
Authors: Hyun-Woo Cho
Abstract:
Early detection of incipient abnormal operations in plant-wide process management is necessary to improve product quality and process safety, and generating warning signals or alarms for operating personnel plays an important role in process automation and intelligent plant health monitoring. Various methodologies have been developed and utilized in this area, such as expert systems, mathematical model-based approaches, and multivariate statistical approaches. This work presents a nonlinear empirical monitoring methodology based on the real-time analysis of massive process data. Such big data includes measurement noise and unwanted variations unrelated to true process behavior, so these unnecessary patterns are eliminated in the data-processing step to enhance detection speed and accuracy. The performance of the methodology was demonstrated using simulated process data. The case study showed that the detection speed and performance improved significantly irrespective of the size and location of abnormal events.
Keywords: detection, monitoring, process data, noise
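The noise-elimination-then-detection idea can be illustrated with a minimal sketch. This is a generic stand-in (moving-average smoothing plus a k-sigma threshold), not the paper's nonlinear empirical method, and the process values and baseline statistics are hypothetical:

```python
def moving_average(signal, window):
    """Simple noise suppression: trailing moving average over `window` samples."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        chunk = signal[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def generate_alarms(signal, baseline_mean, baseline_std, k=3.0, window=5):
    """Flag samples whose smoothed value deviates more than k sigma from the
    normal-operation baseline -- a generic stand-in for an empirical monitor."""
    smoothed = moving_average(signal, window)
    return [abs(v - baseline_mean) > k * baseline_std for v in smoothed]

# Hypothetical process variable: steady around 10.0, then a step fault.
data = [10.0, 10.1, 9.9, 10.05, 9.95, 10.0, 12.5, 12.6, 12.4, 12.5]
alarms = generate_alarms(data, baseline_mean=10.0, baseline_std=0.1)
```

Smoothing trades a little detection delay for fewer false alarms, which is the same speed/accuracy trade-off the abstract's data-processing step addresses.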
Procedia PDF Downloads 252
25030 Meanings and Concepts of Standardization in Systems Medicine
Authors: Imme Petersen, Wiebke Sick, Regine Kollek
Abstract:
In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expression, metabolic pathways, and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites, has posed new challenges for data handling and processing. Tools based on bioinformatics have been developed to resolve the resulting problems of systematizing, standardizing, and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been paid to data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and incorporated forms and structures. However, this data-centred concept of standardization in systems medicine is contrary to the debate in science and technology studies (STS) on standardization, which rather emphasizes the dynamics, contexts, and negotiations of standard operating procedures. Based on empirical work on research consortia in Germany that explore the molecular profile of diseases to establish systems medical approaches in the clinic, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures, and what consequences for knowledge production (e.g., modeling) arise from them. Hence, different concepts and meanings of standardization are explored to gain a deeper insight into standard operating procedures not only in systems medicine but also beyond.
Keywords: data, science and technology studies (STS), standardization, systems medicine
Procedia PDF Downloads 341
25029 Structural-Geotechnical Effects of the Foundation of a Medium-Height Structure
Authors: Valentina Rodas, Luis Almache
Abstract:
The interaction effects between the existing soil and the substructure of a 5-story building with one underground story were evaluated, validating the structural-geotechnical concepts through the method of impedance factors with a finite element program. The continuous wall-type foundation had a constant thickness and followed inclined and orthogonal directions, while the ground had homogeneous, medium-type characteristics. The soil considered was type C according to the Ecuadorian Construction Standard (NEC), and the corresponding foundation comprised a depth of 4.00 meters and a basement wall thickness of 40 centimeters. This project is part of a mid-rise building in the city of Azogues (Ecuador). The hypotheses raised responded to the objectives in such a way that the model implemented with springs showed a variation with respect to the embedded base, yielding conservative results.
Keywords: interaction, soil, substructure, springs, effects, modeling, embedment
Procedia PDF Downloads 230
25028 Integrated On-Board Diagnostic-II and Direct Controller Area Network Access for Vehicle Monitoring System
Authors: Kavian Khosravinia, Mohd Khair Hassan, Ribhan Zafira Abdul Rahman, Syed Abdul Rahman Al-Haddad
Abstract:
The CAN (controller area network) bus is a multi-master, message-broadcast system. The messages sent on the CAN bus communicate state information, referred to as signals, between different ECUs, which provides data consistency in every node of the system. OBD-II dongles based on a request-response method are the widespread solution among researchers for extracting sensor data from cars. Unfortunately, most past research does not consider the resolution and quantity of the input data extracted through OBD-II technology: the maximum feasible scan rate is only about 9 queries per second, which provides 8 data points per second using the well-known ELM327 OBD-II dongle. This study aims to develop and design a programmable, latency-sensitive vehicle data acquisition system that improves modularity and flexibility to extract exact, trustworthy, and fresh car sensor data at higher rates. Furthermore, the researcher must break apart, thoroughly inspect, and observe the internal network of the vehicle, which may cause severe damage to the expensive ECUs of the vehicle due to intrinsic vulnerabilities of the CAN bus during initial research. Desired sensor data were collected from various vehicles utilizing a Raspberry Pi 3 as the computing and processing unit, using the OBD (request-response) and direct CAN methods at the same time. Two types of data were collected for this study: first, CAN bus frame data, collected for each line of hex data sent from an ECU; and second, OBD data, the more limited data that can be requested from an ECU under standard conditions. The proposed system is a reconfigurable, human-readable, multi-task telematics device that can be fitted into any vehicle with minimum effort and minimum time lag in the data extraction process.
A standard operating procedure and an experimental vehicle network test bench were developed and can be used for future vehicle network testing experiments.
Keywords: CAN bus, OBD-II, vehicle data acquisition, connected cars, telemetry, Raspberry Pi 3
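The OBD (request-response) data mentioned above are decoded with published SAE J1979 scaling formulas; a minimal decoder for a few standard Mode 01 PIDs might look as follows (the example frame bytes are hypothetical):

```python
def decode_obd_pid(pid, data_bytes):
    """Decode a small subset of standard OBD-II Mode 01 PIDs (SAE J1979 scaling)."""
    if pid == 0x0C:                        # engine RPM
        a, b = data_bytes[0], data_bytes[1]
        return (256 * a + b) / 4.0         # rpm
    if pid == 0x0D:                        # vehicle speed
        return float(data_bytes[0])        # km/h
    if pid == 0x05:                        # engine coolant temperature
        return data_bytes[0] - 40.0        # deg C
    raise ValueError("PID not handled in this sketch: 0x%02X" % pid)

# Hypothetical response payload for RPM: mode 0x41, PID 0x0C,
# data bytes 0x1A 0xF8 -> (256*0x1A + 0xF8) / 4 rpm.
rpm = decode_obd_pid(0x0C, [0x1A, 0xF8])
```

Direct CAN frames would additionally require knowing the manufacturer-specific signal layout of each arbitration ID, which is not standardized the way these OBD PIDs are.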
Procedia PDF Downloads 205
25027 Big Data in Construction Project Management: The Colombian Northeast Case
Authors: Sergio Zabala-Vargas, Miguel Jiménez-Barrera, Luz Vargas-Sánchez
Abstract:
In recent years, information related to project management in organizations has been increasing exponentially. Performance data, management statistics, and indicator results have made collection, analysis, traceability, and dissemination essential tasks for project managers. In this sense, there are current trends toward facilitating efficient decision-making using emerging technologies such as Machine Learning, Data Analytics, Data Mining, and Big Data; the latter is the focus of this project. This research is part of the thematic line Construction Methods and Project Management. Many authors note the relevance that emerging technologies such as Big Data have gained in recent years in project management in the construction sector, the main focus being the optimization of time, scope, and budget, and in general the mitigation of risks. This research was developed in the northeastern region of Colombia, South America. The first phase was aimed at diagnosing the use of emerging technologies (Big Data) in the construction sector. In Colombia, the construction sector represents more than 50% of the productive system, and more than 2 million people participate in this economic segment. A quantitative approach was used: a survey was applied to a sample of 91 companies in the construction sector. Preliminary results indicate that the use of Big Data and other emerging technologies is very low, but also that there is interest in modernizing project management. There is evidence of a correlation between interest in using new data management technologies and the incorporation of Building Information Modeling (BIM). The next phase of the research will generate guidelines and strategies for the incorporation of technological tools in the construction sector in Colombia.
Keywords: big data, building information modeling, technology, project management
Procedia PDF Downloads 128
25026 The Financial Impact of Covid 19 on the Hospitality Industry in New Zealand
Authors: Kay Fielden, Eelin Tan, Lan Nguyen
Abstract:
In this research project, data was gathered at a Covid 19 conference held in June 2021 from industry leaders who discussed the impact of the global pandemic on the status of the New Zealand hospitality industry. Panel discussions on financials, human resources, health and safety, and recovery were conducted. The themes explored for the finance panel were customer demographics, hospitality sectors, financial practices, government impact, and cost of compliance. The aim was to see how the hospitality industry has responded to the global pandemic and the steps that have been taken for the industry to recover or sustain its business. The main research question for this qualitative study is: What are the factors that have impacted finance for the hospitality industry in New Zealand due to Covid 19? For financials, literature has been gathered on global effects, and this is compared with the data gathered from the discussion panel through the lens of resilience theory. Resilience theory applied to the hospitality industry suggests that the challenges imposed by Covid 19 have been the catalyst for government initiatives, technical innovation, engaging local communities, and boosting confidence. Transformations arising from these ground shifts have been a move towards sustainability, wellbeing, more awareness of climate change, and community engagement. Initial findings suggest that there has been a shift in customer base that has prompted regional accommodation providers to realign offers and to become more flexible to attract and maintain this realigned customer base. Dynamic pricing structures have been required to meet changing customer demographics. Flexible staffing arrangements include sharing staff between different accommodation providers and owners with multiple properties adopting different staffing arrangements; other measures include maintaining a good working relationship with the bank and conserving cash.
Uncertain times necessitate changing revenue strategies to cope with external factors. Financial support offered by the government has cushioned the financial downturn for many in the hospitality industry, and managed isolation and quarantine (MIQ) arrangements have offered immediate financial relief for the hotels involved; however, there is concern over the long-term effects. Compliance with mandated health and safety requirements has meant that the hospitality industry has streamlined its approach to meeting those requirements and has invested in customer relations to keep paying customers informed of the health measures in place. Initial findings from this study lie within the resilience theory framework and are consistent with findings from the literature.
Keywords: global pandemic, hospitality industry, New Zealand, resilience
Procedia PDF Downloads 101
25025 Surface Deformation Studies in South of Johor Using the Integration of InSAR and Resistivity Methods
Authors: Sirajo Abubakar, Ismail Ahmad Abir, Muhammad Sabiu Bala, Muhammad Mustapha Adejo, Aravind Shanmugaveloo
Abstract:
Over the years, land subsidence has been a serious threat, mostly to urban areas. Land subsidence is the sudden sinking or gradual downward settling of the ground surface with little or no horizontal motion. In most areas, land subsidence is a slow process covering a large area and therefore sometimes goes unnoticed. The south of Johor is the area of interest for this project because it is undergoing rapid urbanization. The objective of this research is to evaluate and identify potential deformation in the south of Johor using integrated remote sensing and 2D resistivity methods. Interferometric synthetic aperture radar (InSAR), a remote sensing technique, has the potential to map coherent displacements at centimeter to millimeter resolutions. The persistent scatterer interferometry (PSI) stacking technique was applied to Sentinel-1 data to detect earth deformation in the study area. Dipole-dipole resistivity profiling was conducted in three areas to determine the subsurface features. The interpreted subsurface features were then correlated with the remote sensing results to identify the possible causes of subsidence and uplift in the south of Johor. Based on the results obtained, West Johor Bahru (0.63 mm/year) and Ulu Tiram (1.61 mm/year) are undergoing uplift, attributed to possible geological uplift. On the other hand, East Johor Bahru (-0.26 mm/year) and Senai (-1.16 mm/year) are undergoing subsidence, attributed to possible fracturing and loading by granitic boulders. Land subsidence must be taken seriously, as it can cause serious damage to infrastructure and human life; it must be monitored and preventive actions taken to avert disasters.
Keywords: interferometric synthetic aperture radar, persistent scatterer, minimum spanning tree, resistivity, subsidence
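The displacement rates above derive from interferometric phase; the standard phase-to-displacement conversion for line-of-sight motion can be sketched as follows, assuming the Sentinel-1 C-band wavelength of about 5.55 cm:

```python
import math

SENTINEL1_WAVELENGTH_M = 0.0555  # C-band, approx. 5.55 cm

def los_displacement(delta_phase_rad, wavelength_m=SENTINEL1_WAVELENGTH_M):
    """Line-of-sight displacement from unwrapped interferometric phase.
    A full 2*pi phase cycle corresponds to half a wavelength of motion
    (the radar path is two-way)."""
    return -(wavelength_m / (4 * math.pi)) * delta_phase_rad

# One full fringe (2*pi of phase) corresponds to lambda/2 of LOS motion,
# i.e. about 27.75 mm for Sentinel-1.
d = los_displacement(2 * math.pi)
```

PSI then fits a velocity (the mm/year figures quoted above) to such displacements over a time series of acquisitions, after removing atmospheric and orbital contributions.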
Procedia PDF Downloads 147
25024 Structural Analysis of Archaeoseismic Records Linked to the 5 July 408 - 410 AD Utica Strong Earthquake (NE Tunisia)
Authors: Noureddine Ben Ayed, Abdelkader Soumaya, Saïd Maouche, Ali Kadri, Mongi Gueddiche, Hayet Khayati-Ammar, Ahmed Braham
Abstract:
The archaeological site of Utica, located in north-eastern Tunisia, was founded in the 8th century BC by the Phoenicians as a port on the trade route connecting Phoenicia and the Straits of Gibraltar in the Mediterranean Sea. The flourishing of this city as an important settlement during the Roman period was followed by sudden abandonment, disuse, and progressive oblivion in the first half of the fifth century AD. This decline can be attributed to the destructive earthquake of 5 July 408 - 410 AD that affected this historic city, as documented in 1906 by the seismologist Fernand de Montessus de Ballore. The magnitude of the Utica earthquake was estimated at 6.8 by the Tunisian National Institute of Meteorology (INM). In order to highlight the damage caused by this earthquake, a field survey was carried out at the Utica ruins to detect and analyse earthquake archaeological effects (EAEs) using structural geology methods. This approach allowed us to highlight several types of structural damage, including: (1) folded mortar pavements, (2) cracks affecting the mosaic and walls of a water basin in the "House of the Grand Oecus", (3) displaced columns, (4) block extrusion in masonry walls, (5) undulations in mosaic pavements, and (6) tilted walls. The structural analysis of these EAEs and the measured data reveal a seismic cause for all the evidence of deformation at the Utica monument. The maximum horizontal strain of the ground (SHmax) inferred from the oriented building damage at Utica shows a NNW-SSE direction under a compressive tectonic regime. For the seismogenic source of this earthquake, we propose the active E-W to NE-SW trending Utique - Ghar El Melh reverse fault, passing through the Utica monument and extending towards the Ghar El Melh Lake. The active fault trace is well supported by instrumental seismicity, geophysical data (e.g., gravity, seismic profiles), and geomorphological analyses.
In summary, we find that the archaeoseismic records detected at Utica are similar to those observed at many other archaeological sites affected by destructive ancient earthquakes around the world. Furthermore, the calculated orientation of the average maximum horizontal stress (SHmax) closely matches the present-day stress field, as highlighted by earthquake focal mechanisms in this region.
Keywords: Tunisia, Utica, seismogenic fault, archaeological earthquake effects
Procedia PDF Downloads 45
25023 Environmental and Toxicological Impacts of Glyphosate with Its Formulating Adjuvant
Authors: I. Székács, Á. Fejes, S. Klátyik, E. Takács, D. Patkó, J. Pomóthy, M. Mörtl, R. Horváth, E. Madarász, B. Darvas, A. Székács
Abstract:
The environmental and toxicological characteristics of formulated pesticides may substantially differ from those of their active ingredients or other components alone. This phenomenon is demonstrated here for the herbicide active ingredient glyphosate. Due to its extensive application, this active ingredient was found in surface and ground water samples collected in Békés County, Hungary, in the concentration range of 0.54–0.98 ng/ml. The occurrence of glyphosate appeared to be somewhat higher in areas under intensive agriculture, industrial activity, and public road services, but the compound was also detected in areas under organic (ecological) farming or natural grasslands, indicating environmental mobility. Increased toxicity of the formulated herbicide product Roundup, compared to that of glyphosate, was observed on the indicator aquatic organism Daphnia magna Straus. Acute LC50 values of Roundup and its formulating adjuvant polyethoxylated tallowamine (POEA) exceeded 20 and 3.1 mg/ml, respectively, while that of glyphosate (as the isopropylamine salt) was substantially higher (690–900 mg/ml, i.e., glyphosate alone was substantially less toxic), showing good agreement with literature data. Cytotoxicity of Roundup, POEA, and glyphosate was determined on the neuroectodermal cell line NE-4C, measured both by cell viability test and by holographic microscopy. Acute toxicity (LC50) of Roundup, POEA, and glyphosate on NE-4C cells was found to be 0.013±0.002%, 0.017±0.009%, and 6.46±2.25%, respectively (in equivalents of diluted Roundup solution), corresponding to 0.022±0.003 and 53.1±18.5 mg/ml for POEA and glyphosate, respectively, indicating no statistical difference between Roundup and POEA and a difference of 2.5 orders of magnitude between these and glyphosate. The same order of cellular toxicity was seen in average cell area under quantitative cell visualization.
The results indicate that the toxicity of the formulated herbicide is caused by the formulating agent, although for some parameters toxicological synergy occurs between POEA and glyphosate.
Keywords: glyphosate, polyethoxylated tallowamine, Roundup, combined aquatic and cellular toxicity, synergy
Procedia PDF Downloads 319
25022 A Case Study on Re-Assessment Study of an Earthfill Dam at Latamber, Pakistan
Authors: Afnan Ahmad, Shahid Ali, Mujahid Khan
Abstract:
This research presents a parametric study of an existing earth-fill dam located at Latamber, Karak city, Pakistan. The study consists of carrying out seepage, slope stability, and earthquake analyses of the dam for the existing geometry and for a modified geometry. Dams are massive as well as expensive hydraulic structures and therefore need proper attention. Additionally, this dam falls in zone 2B of Pakistan, an earthquake-prone area where peak ground accelerations range from 0.16 g to 0.24 g, so it must be dealt with great care, as the failure of any dam can cause irreparable losses. Similarly, seepage as well as slope failure can cause damage that can lead to failure of the dam. Therefore, keeping in view the importance of dam construction and the associated costs, our main focus is a parametric study of the newly constructed dam. GeoStudio software is used for this analysis, in which SEEP/W is used for seepage analysis, SLOPE/W for slope stability analysis, and QUAKE/W for earthquake analysis. Based on the geometrical, hydrological, and geotechnical data, seepage and slope stability analyses of different proposed geometries of the dam are carried out along with the seismic analysis. A rigorous 2-D limit equilibrium analysis was carried out using finite elements. The seismic study began with the static analysis, continuing with the dynamic response analysis. The seismic analyses permitted evaluation of the overall patterns of the Latamber dam behavior in terms of displacement, stress, strain, and acceleration fields. Similarly, the seepage analysis evaluates seepage through the foundation and embankment of the dam, while the slope stability analysis estimates the factor of safety of the upstream and downstream slopes.
The results of the analysis demonstrate that, among the multiple geometries, the Latamber dam is secure against seepage piping failure and slope stability failure (upstream and downstream). Moreover, the dam is safe under dynamic loading, and no liquefaction was observed while changing its geometry within permissible limits.
Keywords: earth-fill dam, finite element, liquefaction, seepage analysis
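As a much simpler companion to the limit-equilibrium analyses described above, the textbook infinite-slope factor of safety illustrates how slope stability is quantified; the soil parameters below are hypothetical, not the Latamber dam values:

```python
import math

def infinite_slope_fs(c, gamma, z, beta_deg, phi_deg):
    """Textbook infinite-slope factor of safety for a dry slope (effective stress):
    FS = (c' + gamma*z*cos^2(beta)*tan(phi')) / (gamma*z*sin(beta)*cos(beta))."""
    beta = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c + gamma * z * math.cos(beta) ** 2 * math.tan(phi)
    driving = gamma * z * math.sin(beta) * math.cos(beta)
    return resisting / driving

# Hypothetical embankment slope: c' = 10 kPa, gamma = 18 kN/m^3,
# failure-surface depth z = 5 m, slope angle 20 deg, phi' = 30 deg.
fs = infinite_slope_fs(c=10.0, gamma=18.0, z=5.0, beta_deg=20.0, phi_deg=30.0)
```

Tools such as SLOPE/W generalize this idea to circular and composite slip surfaces with pore pressures and seismic coefficients, which is why the full analysis is far more involved than this one-line estimate.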
Procedia PDF Downloads 164
25021 Minimum Data of a Speech Signal as Special Indicators of Identification in Phonoscopy
Authors: Nazaket Gazieva
Abstract:
Voice biometric data, associated with physiological, psychological, and other factors, are widely used in forensic phonoscopy. There are various methods for identifying and verifying a person by voice. This article explores minimum speech signal data as individual parameters of a speech signal. Monozygotic twins are believed to be genetically identical; using the minimum data of the speech signal, we came to the conclusion that the voice imprint of monozygotic twins is nevertheless individual. The experiment indicates that these minimum indicators of the speech signal are more stable and reliable for phonoscopic examinations.
Keywords: phonogram, speech signal, temporal characteristics, fundamental frequency, biometric fingerprints
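One of the minimum speech-signal parameters named above, fundamental frequency, can be estimated with a minimal autocorrelation sketch; this is a generic illustration on a synthetic tone, not the authors' forensic procedure:

```python
import math

def estimate_f0(samples, sample_rate, f_min=50.0, f_max=500.0):
    """Estimate fundamental frequency as the lag with the strongest
    autocorrelation within the plausible voice pitch range [f_min, f_max]."""
    lag_min = int(sample_rate / f_max)
    lag_max = int(sample_rate / f_min)
    best_lag, best_corr = lag_min, float("-inf")
    for lag in range(lag_min, min(lag_max, len(samples) - 1) + 1):
        corr = sum(samples[i] * samples[i - lag] for i in range(lag, len(samples)))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag

# Synthetic "voiced" signal: a 200 Hz sine sampled at 8 kHz for 0.25 s.
sr = 8000
tone = [math.sin(2 * math.pi * 200 * n / sr) for n in range(sr // 4)]
f0 = estimate_f0(tone, sr)
```

Real forensic pitch tracking additionally handles octave errors, unvoiced segments, and frame-by-frame variation, which this single-window sketch ignores.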
Procedia PDF Downloads 144
25020 Heat Transfer Correlations for Exhaust Gas Flow
Authors: Fatih Kantas
Abstract:
Exhaust systems are a key heat source in ground vehicles, and understanding heat transfer in them requires identifying the parameters that effectively govern it. In this paper, over 20 Nusselt number correlations are investigated. The study shows the advantages and disadvantages of the various Nusselt correlations over different ranges of Re, Pr, and pulsating flow amplitude and frequency. Convective augmentation factors (CAFs) are also defined to correct the standard Nusselt number for the geometry and location of the exhaust system. Finally, an optimum Nusselt number and CAFs are recommended according to Re, Pr, pulsating flow amplitude and frequency, and the geometry and location of the exhaust system.
Keywords: exhaust gas flow, heat transfer correlation, Nusselt, Prandtl, pulsating flow
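A representative member of the correlation family surveyed above is the classical Dittus-Boelter relation; a minimal sketch (with hypothetical exhaust-like Re, Pr, and gas properties) might look as follows:

```python
def dittus_boelter_nu(re, pr, heating=True):
    """Dittus-Boelter correlation for fully developed turbulent pipe flow:
    Nu = 0.023 * Re^0.8 * Pr^n, n = 0.4 (fluid heated) or 0.3 (fluid cooled).
    Nominal validity: Re > 1e4, 0.7 <= Pr <= 160."""
    n = 0.4 if heating else 0.3
    return 0.023 * re ** 0.8 * pr ** n

def h_from_nu(nu, k_fluid, d_hydraulic):
    """Convective heat transfer coefficient h = Nu * k / D."""
    return nu * k_fluid / d_hydraulic

# Hypothetical exhaust-like conditions: Re = 2e4, Pr = 0.7,
# hot gas cooled by the pipe wall (so n = 0.3).
nu = dittus_boelter_nu(2.0e4, 0.7, heating=False)
h = h_from_nu(nu, k_fluid=0.05, d_hydraulic=0.05)  # W/m^2.K; gas k and pipe D are assumptions
```

Steady correlations like this one are exactly what the paper's convective augmentation factors then correct for pulsation amplitude/frequency and for the geometry and location of the exhaust component.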
Procedia PDF Downloads 355
25019 Improvement of Soft Clay Soil with Biopolymer
Authors: Majid Bagherinia
Abstract:
Lime and cement are frequently used as binders in the Deep Mixing Method (DMM) to improve soft clay soils. The most significant disadvantages of these materials are carbon dioxide emissions and the consumption of natural resources. In this study, three different biopolymers, guar gum, locust bean gum, and sodium alginate, were investigated for the improvement of soft clay using DMM. In the experimental study, the effects of the additive ratio and curing time on the Unconfined Compressive Strength (UCS) of stabilized specimens were investigated. According to the results, the UCS values of the specimens increased as the additive ratio and curing time increased. The most effective additive was sodium alginate, and the highest strength was obtained after 28 days.
Keywords: deep mixing method, soft clays, ground improvement, biopolymers, unconfined compressive strength
Procedia PDF Downloads 80
25018 Indo-US Strategic Collaboration in Space Capabilities and its Effect on the Stability of South Asian Region
Authors: Shahab Khan, Damiya Saghir
Abstract:
With the advent of space technology, a new era began in which space, considered the new ‘high ground,’ is used for a variety of commercial applications (communications, weather and navigational information, Earth resources monitoring and imagery) and military applications (surveillance, tracking, reconnaissance and espionage of adversaries). In the ever-evolving geopolitical environment, in which the US now sees India as a counterbalance to China’s economic and military rise, significant growth in strategic collaboration between the US and India has been witnessed, particularly in the space domain. This is creating a strategic imbalance in South Asia, with implications for all regional countries. This research explores the present and future of Indo-US strategic collaboration in the space domain, with its envisaged effects and challenges for countries in the South Asian region.
Keywords: space, satellites, Indo-US strategic agreements in space domain, balance of power in South Asian region
Procedia PDF Downloads 129
25017 A Non-parametric Clustering Approach for Multivariate Geostatistical Data
Authors: Francky Fouedjio
Abstract:
Multivariate geostatistical data have become omnipresent in the geosciences and pose substantial analysis challenges. One of them is the grouping of data locations into spatially contiguous clusters, so that data locations within the same cluster are more similar to each other while clusters differ from one another, in some sense. Spatially contiguous clusters can significantly improve interpretation by turning the resulting clusters into meaningful geographical subregions. In this paper, we develop an agglomerative hierarchical clustering approach that takes into account the spatial dependency between observations. It relies on a dissimilarity matrix built from a non-parametric kernel estimator of the spatial dependence structure of the data. It integrates existing methods to find the optimal cluster number and to evaluate the contribution of variables to the clustering. The capability of the proposed approach to provide spatially compact, connected and meaningful clusters is assessed using a bivariate synthetic dataset and a multivariate geochemical dataset. The proposed clustering method gives satisfactory results compared to other similar geostatistical clustering methods.
Keywords: clustering, geostatistics, multivariate data, non-parametric
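The core idea of spatially aware agglomerative clustering can be sketched compactly. The paper builds its dissimilarity matrix from a non-parametric kernel estimator of the spatial dependence structure; the simplified stand-in below merely blends normalised attribute and spatial distances with a weight `alpha`, then runs average-linkage merging (all data and the blending scheme are hypothetical illustrations, not the authors' method):

```python
def blended_dissimilarity(coords, values, alpha=0.5):
    """Pairwise dissimilarity blending attribute distance with spatial
    distance; a simplified stand-in for a kernel-based estimator."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    n = len(coords)
    pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
    d_attr = {p: dist(values[p[0]], values[p[1]]) for p in pairs}
    d_geo = {p: dist(coords[p[0]], coords[p[1]]) for p in pairs}
    ma, mg = max(d_attr.values()), max(d_geo.values())
    return {p: alpha * d_attr[p] / ma + (1 - alpha) * d_geo[p] / mg
            for p in pairs}

def agglomerate(diss, n, k):
    """Average-linkage agglomerative clustering down to k clusters."""
    clusters = [{i} for i in range(n)]
    d = lambda i, j: diss[(i, j) if i < j else (j, i)]
    while len(clusters) > k:
        # find the pair of clusters with the smallest average dissimilarity
        a, b = min(
            ((a, b) for a in range(len(clusters))
             for b in range(a + 1, len(clusters))),
            key=lambda ab: sum(d(i, j) for i in clusters[ab[0]]
                               for j in clusters[ab[1]])
            / (len(clusters[ab[0]]) * len(clusters[ab[1]])),
        )
        clusters[a] |= clusters.pop(b)
    return clusters
```

With `alpha` close to 0 the spatial term dominates and clusters become geographically compact, which is the behaviour the paper's spatially contiguous clusters aim for.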
Procedia PDF Downloads 477
25016 Big Data in Telecom Industry: Effective Predictive Techniques on Call Detail Records
Authors: Sara ElElimy, Samir Moustafa
Abstract:
Mobile network operators face many challenges in the digital era, especially with high demands from customers. Since mobile network operators are a major source of big data, traditional techniques are not effective in the new era of big data, the Internet of Things (IoT) and 5G; as a result, handling heterogeneous big datasets effectively becomes a vital task for operators given the continuous growth of data and the move from Long Term Evolution (LTE) to 5G. There is therefore an urgent need for effective big data analytics to predict future demand, traffic, and network performance to fulfill the requirements of the fifth generation of mobile network technology. In this paper, we introduce data science techniques using machine learning and deep learning algorithms: the autoregressive integrated moving average (ARIMA), Bayesian-based curve fitting, and a recurrent neural network (RNN) are employed in a data-driven application for mobile network operators. The main framework for each model comprises parameter identification, estimation, prediction, and a final data-driven application of the prediction to business and network performance. These models are applied to the call detail record (CDR) datasets of the Telecom Italia Big Data Challenge. Evaluation against well-known criteria shows that ARIMA (a machine learning-based model) is a more accurate predictive model on this dataset than the RNN (a deep learning model).
Keywords: big data analytics, machine learning, CDRs, 5G
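As a minimal illustration of the ARIMA family used here, an AR(1) model can be fit to a traffic series by least squares and then rolled forward to forecast. This is a simplified stand-in for the paper's full ARIMA pipeline; the synthetic "call volume" series and all parameters are hypothetical:

```python
import random

def fit_ar1(series):
    """Least-squares fit of x[t] = c + phi * x[t-1], the simplest
    member of the ARIMA family."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    phi = (sum((a - mx) * (b - my) for a, b in zip(x, y))
           / sum((a - mx) ** 2 for a in x))
    c = my - phi * mx
    return c, phi

def forecast(c, phi, last, steps):
    """Iterate the fitted recurrence to produce multi-step forecasts."""
    out = []
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return out

# synthetic hourly call-volume series: x[t] = 10 + 0.8*x[t-1] + noise
random.seed(1)
series = [50.0]
for _ in range(500):
    series.append(10 + 0.8 * series[-1] + random.gauss(0, 1))
c, phi = fit_ar1(series)
```

The recovered coefficients should sit near the generating values (c ≈ 10, phi ≈ 0.8), and `forecast` then supplies the demand predictions that the business and network-performance applications consume.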
Procedia PDF Downloads 139
25015 A Data Mining Approach for Analysing and Predicting the Bank's Asset Liability Management Based on Basel III Norms
Authors: Nidhin Dani Abraham, T. K. Sri Shilpa
Abstract:
Asset liability management is an important aspect of the banking business. Moreover, today’s banking is governed by Basel III, which strictly regulates counterparty default. This paper focuses on the prediction and analysis of counterparty default risk, the risk that arises when customers fail to repay the amount owed to the lender (a bank or other financial institution). The paper proposes an approach to reduce the counterparty risk occurring in financial institutions using an appropriate data mining technique, and thus predicts the occurrence of non-performing assets (NPAs). It also helps in asset building and in improving restructuring quality. Liability management is essential to carrying out the banking business. To analyze the depth of a bank’s liabilities, a suitable technique is required; here, a data mining technique is used to predict the dormant behaviour of various deposit customers. Various models are implemented, and the results for savings deposit customers are analyzed. All data are cleaned using a data cleansing approach applied to the bank’s data warehouse.
Keywords: data mining, asset liability management, BASEL III, banking
Procedia PDF Downloads 553
25014 Parallel Coordinates on a Spiral Surface for Visualizing High-Dimensional Data
Authors: Chris Suma, Yingcai Xiao
Abstract:
This paper presents Parallel Coordinates on a Spiral Surface (PCoSS), a parallel-coordinate-based interactive visualization method for high-dimensional data, and a test implementation of the method. Plots generated by the test system are compared with those generated by XDAT, a software package implementing traditional parallel coordinates. Traditional parallel coordinate plots can become cluttered when the number of data points is large or when the dimensionality of the data is high. PCoSS plots display multivariate data on a 3D spiral surface and allow users to see the whole picture of high-dimensional data with less clutter. Taking advantage of the 3D display environment in PCoSS, users can further reduce clutter by zooming into an axis of interest for a closer view, by moving vantage points, and by reorienting the viewing angle to obtain a desired view of the plots.
Keywords: human computer interaction, parallel coordinates, spiral surface, visualization
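The geometric core of such a layout can be sketched briefly: place each coordinate axis at successive angles along a spiral and map a normalised data value to height along that axis. The Archimedean spiral, the step angle, and the normalisation below are assumptions for illustration; the actual PCoSS layout may differ:

```python
import math

def axis_base(i, a=1.0, b=0.15, step=math.pi / 6):
    """Base (x, y) of axis i on an Archimedean spiral r = a + b*theta."""
    theta = i * step
    r = a + b * theta
    return (r * math.cos(theta), r * math.sin(theta))

def to_3d(point, axis_height=1.0):
    """Map one record (values normalised to [0, 1], one per axis) to the
    3D polyline vertices drawn on the spiral surface."""
    return [(*axis_base(i), v * axis_height) for i, v in enumerate(point)]
```

Connecting the returned vertices per record yields the spiral-surface analogue of a traditional parallel-coordinate polyline.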
Procedia PDF Downloads 12
25013 A Dynamic Ensemble Learning Approach for Online Anomaly Detection in Alibaba Datacenters
Authors: Wanyi Zhu, Xia Ming, Huafeng Wang, Junda Chen, Lu Liu, Jiangwei Jiang, Guohua Liu
Abstract:
Anomaly detection is a first and imperative step in responding to unexpected problems and in assuring high performance and security in large data center management. This paper presents an online anomaly detection system built on an innovative combination of ensemble machine learning and adaptive differentiation algorithms, applied to performance data collected from a continuous monitoring system for multi-tier web applications running in Alibaba data centers. We evaluate the effectiveness and efficiency of this algorithm on production traffic data and compare it with traditional anomaly detection approaches such as static thresholds and other deviation-based detection techniques. The experimental results show that our algorithm correctly identifies unexpected performance variances of any running application, with an acceptable false positive rate. The proposed approach has already been deployed in real-time production environments to enhance the efficiency and stability of daily data center operations.
Keywords: Alibaba data centers, anomaly detection, big data computation, dynamic ensemble learning
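The contrast between a single static detector and an ensemble vote can be sketched as follows. This is a hypothetical toy, not the Alibaba system: it combines a z-score detector with an EWMA-residual detector under fixed weights, whereas the paper's approach adapts its ensemble dynamically:

```python
def zscore_detector(window, x, k=3.0):
    """Flag x if it lies more than k standard deviations from the
    window mean (the classic static-threshold baseline)."""
    n = len(window)
    mean = sum(window) / n
    std = (sum((v - mean) ** 2 for v in window) / n) ** 0.5 or 1e-9
    return abs(x - mean) / std > k

def ewma_detector(window, x, alpha=0.3, k=3.0):
    """Flag x if its deviation from an exponentially weighted moving
    average exceeds k times the mean absolute residual."""
    ewma = window[0]
    for v in window[1:]:
        ewma = alpha * v + (1 - alpha) * ewma
    mad = sum(abs(v - ewma) for v in window) / len(window) or 1e-9
    return abs(x - ewma) / mad > k

def ensemble_is_anomaly(window, x, weights=(0.5, 0.5), threshold=0.5):
    """Weighted vote over the member detectors."""
    votes = (zscore_detector(window, x), ewma_detector(window, x))
    score = sum(w for w, v in zip(weights, votes) if v)
    return score >= threshold
```

In a dynamic ensemble, the fixed `weights` would instead be updated online from each detector's recent accuracy.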
Procedia PDF Downloads 201
25012 Unsupervised Text Mining Approach to Early Warning System
Authors: Ichihan Tai, Bill Olson, Paul Blessner
Abstract:
Traditional early warning systems that alarm against crises are generally based on structured or numerical data; therefore, a system that can make predictions based on unstructured textual data, an uncorrelated data source, is a valuable complement to them. The Chicago Board Options Exchange (CBOE) Volatility Index (VIX), commonly referred to as the fear index, measures the cost of insurance against a market crash and spikes in the event of crisis. In this study, news data are consumed to predict whether a market-wide crisis will occur, by predicting the movement of the fear index, and historical references to similar events are presented in an unsupervised manner. Topic modeling-based prediction and representation are built on daily news data between 1990 and 2015 from The Wall Street Journal, evaluated against VIX index data from CBOE.
Keywords: early warning system, knowledge management, market prediction, topic modeling
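The "historical references to similar events" component can be sketched with a much simpler representation than a topic model: rank past news days by bag-of-words cosine similarity to today's headlines. This stand-in and its tiny corpus are purely illustrative; the study uses topic modeling over 25 years of Wall Street Journal text:

```python
import math
from collections import Counter

def bow(text):
    """Bag-of-words term counts for a document."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two term-count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def similar_days(today, history):
    """history: list of (date, text); return dates ranked by similarity
    of their news text to today's news."""
    q = bow(today)
    ranked = sorted(history, key=lambda d: cosine(q, bow(d[1])),
                    reverse=True)
    return [d[0] for d in ranked]
```

Swapping the bag-of-words vectors for per-day topic distributions recovers the unsupervised retrieval behaviour the abstract describes.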
Procedia PDF Downloads 338
25011 The Role of Synthetic Data in Aerial Object Detection
Authors: Ava Dodd, Jonathan Adams
Abstract:
The purpose of this study is to explore the characteristics of developing a machine learning application using synthetic data. The study is structured around developing the application for the purpose of deploying a computer vision model. The findings discuss the realities of attempting to develop a computer vision model for a practical purpose, and detail the processes, tools, and techniques that were used to meet the accuracy requirements. The research reveals that synthetic data represent another variable that can be adjusted to improve the performance of a computer vision model. Further, a suite of tools and tuning recommendations is provided.
Keywords: computer vision, machine learning, synthetic data, YOLOv4
Procedia PDF Downloads 225
25010 Perception-Oriented Model Driven Development for Designing Data Acquisition Process in Wireless Sensor Networks
Authors: K. Indra Gandhi
Abstract:
Wireless Sensor Networks (WSNs) have always been characterized by application-specific sensing, relaying, and collection of information for further analysis. However, software development has not been considered a separate entity in this data collection process, which has posed severe limitations on software development for WSNs. Software development for WSNs is a complex process, since the components involved are data-driven, network-driven, and application-driven in nature. This implies a tremendous need for separation of concerns from the software development perspective. A layered approach for developing the data acquisition design based on Model Driven Development (MDD) is proposed, as the sensed data collection process itself varies depending upon the application under consideration. This work focuses on a layered view of the data acquisition process so as to ease software development. A metamodel is proposed that enables reusability and the realization of the software as an adaptable component for WSN systems. Further, a study of user perception indicates that the proposed model helps improve programmer productivity by realizing the collaborative system involved.
Keywords: data acquisition, model-driven development, separation of concern, wireless sensor networks
Procedia PDF Downloads 434
25009 Comparative Analysis of Data Gathering Protocols with Multiple Mobile Elements for Wireless Sensor Network
Authors: Bhat Geetalaxmi Jairam, D. V. Ashoka
Abstract:
Wireless sensor networks are used in many applications to collect sensed data from different sources. Sensed data have to be delivered through the sensors' wireless interface using multi-hop communication towards the sink. Data collection in wireless sensor networks consumes energy, and energy consumption is the major constraint in WSNs; reducing energy consumption while increasing the amount of generated data is a great challenge. In this paper, we implement two data gathering protocols with multiple mobile sinks/elements to collect data from sensor nodes. The first is Energy-Efficient Data Gathering with Tour Length-Constrained Mobile Elements in Wireless Sensor Networks (EEDG), in which the mobile sinks use a vehicle routing protocol to collect data. The second is An Intelligent Agent-based Routing Structure for Mobile Sinks in WSNs (IAR), in which the mobile sinks use Prim's algorithm to collect data. We implemented the concepts common to both protocols, such as the deployment of mobile sinks, the generation of visiting schedules, and the collection of data from cluster members. We compared the performance of both protocols using statistics on parameters including delay, packet drop, packet delivery ratio, available energy, and control overhead. We conclude that EEDG is more efficient than IAR, albeit with limitations that include unaddressed issues such as redundancy removal, idle listening, and the mobile sink's pause/wait state at a node. In future work, we plan to address these limitations to obtain a new energy-efficient protocol that will help improve the lifetime of the WSN.
Keywords: aggregation, consumption, data gathering, efficiency
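Since the IAR protocol is described as using Prim's algorithm to build the mobile sink's routing structure, a minimal sketch of Prim's minimum spanning tree may be useful; the sensor-field graph below is a hypothetical adjacency map with link costs, not data from the paper:

```python
import heapq

def prim_mst(graph, start):
    """Prim's algorithm over graph = {node: {neighbour: weight}};
    returns the MST as a list of (parent, child, weight) edges."""
    visited = {start}
    frontier = [(w, start, v) for v, w in graph[start].items()]
    heapq.heapify(frontier)
    mst = []
    while frontier and len(visited) < len(graph):
        w, u, v = heapq.heappop(frontier)
        if v in visited:
            continue  # stale edge into an already-spanned node
        visited.add(v)
        mst.append((u, v, w))
        for nxt, nw in graph[v].items():
            if nxt not in visited:
                heapq.heappush(frontier, (nw, v, nxt))
    return mst
```

Rooting the tree at the sink's current position yields low-cost paths along which cluster-member data can be gathered.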
Procedia PDF Downloads 497
25008 The Effectiveness of Tehran Municipality's Transformation of a Metro Station into Pedestrian-Friendly Public Spaces
Authors: Homa Hedayat
Abstract:
Public spaces have been a central concern of urban planners for centuries, yet they were neglected for a long time. In modernist planning, the focus was on the requirements of cars rather than the needs and expectations of pedestrians, and cities therefore lost many of their qualities. Urban public space is space within the city that is accessible to all people and serves as the ground for their activity. People's public life occurs in urban public spaces in a complex set of forms and functions. These spaces must facilitate diverse behaviours, uses, and activities such as shopping, walking, conversation, entertainment, relaxation, or simply passing the time during festivities and events. One such public space is the area surrounding public transportation stations. Subway stations, although they potentially bring together many different groups of people, accommodate few social interactions. Making the areas surrounding subway stations pedestrian-oriented potentially increases their capacity for socialization. The Sadeghieh Subway Station can be considered the most important subway station in Tehran: on the one hand, it is the rail port of Tehran's western entrance; on the other, it is the port for railway journeys inside the city. The main concern of this study is to assess the success or failure of the interventions made by the municipality to change the area surrounding the Sadeghieh Subway Station into a pedestrian-oriented space, and to examine how far the area has improved into a desirable space. The method used in this study is a survey, in which data were collected using a questionnaire and interviews. The study's population is all people who use the Sadeghieh Subway, and the sample size was 140 subjects. Using a parametric one-sample t-test, we found improvement in factors such as transportation, security, pedestrian infrastructure, vitality, and climate comfort. However, there was no improvement in mixed use, recreational activity, or readability.
Keywords: public space, public transportation stations, pedestrian-oriented space, socialization
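The one-sample t-test used in this study can be sketched in a few lines: compare the mean of a sample of ratings against a hypothesised midpoint. The rating sample and the 5-point-scale midpoint below are hypothetical illustrations, not the study's survey data:

```python
import math

def one_sample_t(sample, mu0):
    """Return the one-sample t statistic and degrees of freedom for
    testing whether the sample mean differs from mu0."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)  # sample variance
    t = (mean - mu0) / math.sqrt(var / n)
    return t, n - 1

# hypothetical 5-point-scale ratings tested against the neutral midpoint 3
ratings = [4, 3, 4, 5, 4, 3, 4, 4, 5, 3]
t, df = one_sample_t(ratings, 3.0)
```

A t statistic well above the critical value for the given degrees of freedom indicates a significant improvement over the neutral midpoint, which is how factors such as security or vitality would be judged.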
Procedia PDF Downloads 208
25007 Status and Results from EXO-200
Authors: Ryan Maclellan
Abstract:
EXO-200 has provided one of the most sensitive searches for neutrinoless double-beta decay utilizing 175 kg of enriched liquid xenon in an ultra-low background time projection chamber. This detector has demonstrated excellent energy resolution and background rejection capabilities. Using the first two years of data, EXO-200 has set a limit of 1.1x10^25 years at 90% C.L. on the neutrinoless double-beta decay half-life of Xe-136. The experiment has experienced a brief hiatus in data taking during a temporary shutdown of its host facility: the Waste Isolation Pilot Plant. EXO-200 expects to resume data taking in earnest this fall with upgraded detector electronics. Results from the analysis of EXO-200 data and an update on the current status of EXO-200 will be presented.
Keywords: double-beta, Majorana, neutrino, neutrinoless
Procedia PDF Downloads 414
25006 Examining the Contemporary Relevance of Mahatma Gandhi’s Thought: A Bulwark against Terrorism
Authors: Jayita Mukhopadhyay
Abstract:
Even though more than six decades have passed since the death of India's iconic thinker and mass leader Mahatma Gandhi, a world besieged by terrorism may still take a leaf out of his philosophical discourse on non-violence and attempt to turn his theory into praxis to save mankind. The greatest soul the world has ever produced, a man of divine fire, an apostle of peace and non-violence, a revolutionary, a visionary, a social reformer and deliverer of the downtrodden, Father of the Nation: these and numerous other epithets have been used by eminent personalities and scholars to describe Mahatma Gandhi. Gandhi was a relentless fighter and mass mobiliser who awakened a sleeping giant, the common men and women of India, shook them out of their docile, fatalistic mould, invigorated them with his doctrine of ahimsa and satyagraha (non-violence and strict adherence to truth), instilled in them nationalist zeal and patriotic fervour, and turned them into determined, steadfast freedom fighters. Under his leadership, the national liberation movement gained new life and ultimately succeeded in ending the era of foreign domination. And he did all this while resisting a natural tendency of his people to respond violently to the unspeakable violence and atrocities unleashed by a colonial British administration desperate to keep India in its empire. In this paper, an attempt will be made to unravel Gandhi's elucidation of the concepts of non-violent resistance, non-cooperation, and civil disobedience, and their actual application through political practices that succeeded in capturing the imagination of not only India's teeming millions but the entire world.
An analytical methodology will be used: Gandhi's own writings and those of noted scholars on Gandhi will be examined extensively to establish the contemporary relevance of his thought and his invaluable guidelines on how to cope with poverty, inequality, exploitation, repression and the marginalization of some sections of society, and the resultant radicalization of some disturbed members of the human race, the very conditions which spawn terrorism in today's world.
Keywords: India, non cooperation, non violence, terrorism
Procedia PDF Downloads 324