Search results for: spatio-temporal data

22538 A Survey of Domain Name System Tunneling Attacks: Detection and Prevention

Authors: Lawrence Williams

Abstract:

As the mechanism that converts domain names to Internet Protocol (IP) addresses, the Domain Name System (DNS) is an essential part of Internet usage. It was not designed with security in mind and can be subject to attacks. DNS attacks have become more frequent and sophisticated, and detecting and preventing them is increasingly important for the modern network. DNS tunneling attacks are one type of attack, used primarily for distributed denial-of-service (DDoS) attacks and data exfiltration. This paper discusses different techniques to detect and prevent DNS tunneling attacks, covering the methods, models, experiments, and data behind each technique, assesses their feasibility, and proposes directions for future research on these topics.

Keywords: DNS, tunneling, exfiltration, botnet

Procedia PDF Downloads 56
22537 Comparison of Number of Waves Surfed and Duration Using Global Positioning System and Inertial Sensors

Authors: João Madureira, Ricardo Lagido, Inês Sousa, Fraunhofer Portugal

Abstract:

Surfing is an increasingly popular sport whose performance evaluation is often qualitative. This work aims at using a smartphone to collect and analyze GPS and inertial sensor data in order to obtain quantitative metrics of surfing performance. Two approaches are compared for the detection of wave rides: computing the number of waves ridden in a surfing session, the starting time of each wave and its duration. The first approach is based on computing the velocity from the Global Positioning System (GPS) signal and finding the velocity thresholds that identify the start and end of each wave ride. The second approach adds information from the Inertial Measurement Unit (IMU) of the smartphone to the velocity thresholds obtained from the GPS unit to determine the start and end of each wave ride. The two methods were evaluated using GPS and IMU data from two surfing sessions and validated against similar metrics extracted from video data collected from the beach. The second method, combining GPS and IMU data, was found to be more accurate in determining the number of waves, their start times and durations. This paper shows that it is feasible to use smartphones for the quantification of performance metrics during surfing; in particular, the waves ridden and their durations can be accurately determined using the smartphone GPS and IMU.
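
The GPS-only approach lends itself to a compact illustration. Below is a minimal sketch of threshold-based wave-ride segmentation; the threshold values and the 1 Hz sampling rate are illustrative assumptions, not the paper's calibrated parameters.

```python
import numpy as np

def detect_wave_rides(speed_ms, fs=1.0, start_thresh=2.5, end_thresh=1.0):
    """Segment wave rides from a GPS speed trace (m/s).

    A ride starts when speed rises above start_thresh and ends when it
    falls below end_thresh (hysteresis keeps brief lulls from splitting
    one ride in two). Returns (start_time_s, duration_s) tuples.
    """
    rides, riding, start = [], False, 0
    for i, v in enumerate(speed_ms):
        if not riding and v >= start_thresh:
            riding, start = True, i
        elif riding and v < end_thresh:
            riding = False
            rides.append((start / fs, (i - start) / fs))
    if riding:  # session ended mid-ride
        rides.append((start / fs, (len(speed_ms) - start) / fs))
    return rides

# Example: a 60 s trace with two bursts of wave-riding speed
t = np.arange(60)
speed = 0.5 + 3.0 * ((t > 10) & (t < 18)) + 3.5 * ((t > 35) & (t < 47))
print(detect_wave_rides(speed))  # [(11.0, 7.0), (36.0, 11.0)]
```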

Keywords: inertial measurement unit (IMU), global positioning system (GPS), smartphone, surfing performance

Procedia PDF Downloads 386
22536 The Arts of Walisanga's Mosques in Java: Structure/Architecture Studies and Its Meaning in Anthropological Perspective

Authors: Slamet Subiyantoro, Mulyanto

Abstract:

Revealing the structure and symbolic meaning of the Walisanga mosque arts in Java is very important for explaining their philosophical and religious foundation, which is a manifestation of the norms, value system and behavior of the Javanese Islamic society that supports the culture. This research also aims to find the structural pattern of the Walisanga mosques and its symbolic meaning in the context of Javanese Islamic society. To achieve these objectives, the research was carried out in several Walisanga mosques in Java using an anthropological approach focused on interpretation and semiotic analysis. The data were collected through interviews with key informants who were well informed about the shape and symbolism of the Walisanga mosques in Java. The observation technique consisted of visiting the mosques to examine their structure and architecture directly. To complete the information, documents, archives and other sources were also analyzed to deepen the discussion in answering the research problems. The analysis follows an interactive model through the stages of data collection, data reduction, data presentation and verification, carried out continuously in a cyclical system to draw valid conclusions. The results indicate that the structure and architecture of the Walisanga mosques in Java are built up vertically as well as horizontally, and that their elements are correlated to each other and carry a sacred meaning, representing mystical beliefs such as sangkan paraning dumadi and manunggaling kawula gusti.

Keywords: Walisanga’s mosques, Java, structure and architecture, meaning

Procedia PDF Downloads 345
22535 A Distribution Free Test for Censored Matched Pairs

Authors: Ayman Baklizi

Abstract:

This paper discusses the problem of testing hypotheses about the lifetime distributions of a matched pair based on censored data. A distribution-free test based on a runs statistic is proposed, and its null distribution and power function are found in a simple, convenient form. Some properties of the test statistic and its power function are studied.
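
For illustration, here is a minimal sketch of a sign-based runs statistic for matched pairs with a normal approximation to its null distribution. This is the generic Wald-Wolfowitz-style construction, not necessarily the paper's exact statistic, and censoring is handled by simply dropping pairs whose ordering is indeterminate, an assumption made only for the sketch.

```python
import math

def runs_test(signs):
    """Wald-Wolfowitz runs test on a sequence of +1/-1 signs.

    Returns (number_of_runs, z_statistic) using the normal
    approximation to the null distribution of the run count.
    """
    n1 = sum(1 for s in signs if s > 0)
    n2 = sum(1 for s in signs if s < 0)
    n = n1 + n2
    runs = 1 + sum(1 for a, b in zip(signs, signs[1:]) if a != b)
    mean = 1 + 2 * n1 * n2 / n
    var = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1))
    return runs, (runs - mean) / math.sqrt(var)

def pair_sign(x, y, cx, cy):
    """Sign of a pair comparison; 0 when censoring makes it ambiguous."""
    if not cx and not cy:
        return 1 if x > y else -1
    if cx and not cy and x >= y:
        return 1      # x censored at or above the observed y: x outlived y
    if cy and not cx and y >= x:
        return -1
    return 0          # ordering indeterminate -> drop the pair

pairs = [(5.1, 3.2, False, False), (2.0, 4.5, False, False),
         (6.0, 4.0, True, False), (3.0, 3.5, True, True)]
signs = [s for p in pairs if (s := pair_sign(*p)) != 0]
print(runs_test(signs))
```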

Keywords: censored data, distribution free, matched pair, runs statistics

Procedia PDF Downloads 266
22534 Hybrid Approach for Country’s Performance Evaluation

Authors: C. Slim

Abstract:

This paper presents an integrated model, hybridizing data envelopment analysis (DEA) and the support vector machine (SVM), to classify countries according to their efficiency and performance. The model takes into account aspects of multi-dimensional indicators, decision-making hierarchy and relativity of measurement. Starting from a set of performance indicators as exhaustive as possible, a process of successive aggregations has been developed to attain an overall evaluation of a country's competitiveness.
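
As a rough sketch of the DEA-SVM hybrid, the fragment below scores each country with an input-oriented CCR DEA model (multiplier form, solved as a linear program) and then trains an SVM to separate efficient from inefficient units. The indicator data, the 0.999 efficiency cutoff, and all parameter choices are illustrative assumptions, not the paper's specification.

```python
import numpy as np
from scipy.optimize import linprog
from sklearn.svm import SVC

def ccr_efficiency(X, Y):
    """Input-oriented CCR efficiency (multiplier form) per DMU.

    X: (n_dmu, n_inputs), Y: (n_dmu, n_outputs). For DMU o:
    max u.y_o  s.t.  v.x_o = 1,  u.Y_j - v.X_j <= 0 for all j,  u, v >= 0.
    """
    n, m = X.shape
    _, s = Y.shape
    scores = []
    for o in range(n):
        c = np.concatenate([-Y[o], np.zeros(m)])          # maximize u.y_o
        A_ub = np.hstack([Y, -X])                         # u.Y_j - v.X_j <= 0
        A_eq = np.concatenate([np.zeros(s), X[o]])[None]  # v.x_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n), A_eq=A_eq, b_eq=[1.0],
                      bounds=[(0, None)] * (s + m))
        scores.append(-res.fun)
    return np.array(scores)

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, (30, 3))          # 30 countries, 3 input indicators
Y = rng.uniform(1, 10, (30, 2))          # 2 output indicators
eff = ccr_efficiency(X, Y)
labels = (eff >= 0.999).astype(int)      # 1 = on the efficient frontier
# Train the SVM classifier on the raw indicators (assumes both classes occur).
svm = SVC(kernel="rbf").fit(np.hstack([X, Y]), labels)
print(eff.round(3), svm.predict(np.hstack([X, Y]))[:5])
```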

Keywords: artificial neural networks (ANN), support vector machine (SVM), data envelopment analysis (DEA), aggregations, performance indicators

Procedia PDF Downloads 317
22533 Visco-Acoustic Full Wave Inversion in the Frequency Domain with Mixed Grids

Authors: Sheryl Avendaño, Miguel Ospina, Hebert Montegranario

Abstract:

Full Wave Inversion (FWI) is a variant of seismic tomography for obtaining velocity profiles by an optimization process that combines forward modelling (the solution of the wave equation) with the misfit between synthetic and observed data. In this research we model wave propagation in a visco-acoustic medium in the frequency domain. We apply finite differences for the numerical solution of the wave equation on a mix of standard and rotated grids, where density depends on velocity and a damping function is associated with a linear dissipative medium. The velocity profiles are obtained from an initial guess, and the data have been modeled for the frequency range 0-120 Hz. By an iterative procedure we obtain an estimated velocity profile that captures the salient features of the profile from which the synthetic data were generated, showing promising results for our method.
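
A frequency-domain visco-acoustic forward model can be illustrated in one dimension, where attenuation enters through a complex velocity. The sketch below solves the 1D Helmholtz equation on a standard grid (the paper's 2D mixed standard/rotated grids and its inversion loop are beyond a few lines); the quality factor Q, the grid and the source are assumptions for illustration only.

```python
import numpy as np

def helmholtz_1d(c, Q, freq, src_idx, dx):
    """Solve (omega^2 / c_v^2) u + u_xx = -s in 1D for one frequency.

    Visco-acoustic damping is modeled with the complex velocity
    c_v = c * (1 - i/(2Q)), a standard constant-Q approximation.
    """
    n = len(c)
    omega = 2 * np.pi * freq
    c_v = c * (1 - 1j / (2 * Q))
    A = np.zeros((n, n), dtype=complex)
    for i in range(1, n - 1):                    # second-order stencil
        A[i, i - 1] = A[i, i + 1] = 1 / dx**2
        A[i, i] = -2 / dx**2 + omega**2 / c_v[i]**2
    A[0, 0] = A[-1, -1] = 1.0                    # crude Dirichlet edges
    s = np.zeros(n, dtype=complex)
    s[src_idx] = 1.0
    return np.linalg.solve(A, -s)

# Two-layer velocity model; wavefields for a few frequencies up to 120 Hz.
c = np.where(np.arange(200) < 100, 1500.0, 2500.0)   # m/s
fields = {f: helmholtz_1d(c, Q=50.0, freq=f, src_idx=5, dx=5.0)
          for f in (10.0, 60.0, 120.0)}
print({f: round(float(np.abs(u).max()), 4) for f, u in fields.items()})
```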

Keywords: seismic inversion, full wave inversion, visco-acoustic wave equation, finite difference methods

Procedia PDF Downloads 443
22532 Performance Comparison of Situation-Aware Models for Activating Robot Vacuum Cleaner in a Smart Home

Authors: Seongcheol Kwon, Jeongmin Kim, Kwang Ryel Ryu

Abstract:

We assume an IoT-based smart-home environment where the on-off status of each of the electrical appliances, including the room lights, can be recognized in real time by monitoring and analyzing the smart meter data. At any moment in such an environment, we can recognize what the household or the user is doing by referring to the status data of the appliances. In this paper, we focus on a smart-home service that activates a robot vacuum cleaner at the right time by recognizing the user situation, which requires a situation-aware model that can distinguish the situations that allow vacuum cleaning (Yes) from those that do not (No). As candidate models we learn a few classifiers, such as naïve Bayes, decision tree, and logistic regression, that map the appliance-status data into Yes and No situations. Our training and test data are obtained from simulations of user behaviors, in which a sequence of user situations such as cooking, eating, dish washing, and so on is generated, with the status of the relevant appliances changed in accordance with the situation changes. During the simulation, both the situation transitions and the resulting appliance status are determined stochastically. To compare the performances of the aforementioned classifiers, we obtain their learning curves for different types of users through simulations. The result of our empirical study reveals that naïve Bayes achieves a slightly better classification accuracy than the other compared classifiers.
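
A minimal sketch of the comparison described above, using scikit-learn on simulated appliance-status vectors; the appliance set, the labeling rule, and the noise level are invented stand-ins for the authors' behavior simulator.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)

# Columns: TV, oven, dishwasher, washing machine, room light (on/off).
X = rng.integers(0, 2, size=(2000, 5))
# Toy rule standing in for the behavior simulator: vacuuming is allowed
# (label 1) when the TV and the oven are both off.
y = ((X[:, 0] == 0) & (X[:, 1] == 0)).astype(int)
# 10% label noise mimics the stochastic situation transitions.
flip = rng.random(len(y)) < 0.10
y[flip] = 1 - y[flip]

for clf in (BernoulliNB(), DecisionTreeClassifier(max_depth=4),
            LogisticRegression()):
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{clf.__class__.__name__:>22s}: {acc:.3f}")
```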

Keywords: situation-awareness, smart home, IoT, machine learning, classifier

Procedia PDF Downloads 403
22531 Efficient Recommendation System for Frequent and High Utility Itemsets over Incremental Datasets

Authors: J. K. Kavitha, D. Manjula, U. Kanimozhi

Abstract:

Mining frequent and high utility itemsets has gained much significance in recent years. When data arrive sporadically, incremental and interactive rule mining and utility mining approaches can be adopted to handle users' dynamic environmental needs and avoid redundancies by reusing previous data structures and mining results. The dependence on recommendation systems has risen exponentially since the advent of search engines. This paper proposes a model for building a recommendation system that suggests frequent and high utility itemsets over dynamic datasets for a cluster-based location prediction strategy to predict users' trajectories, using the Efficient Incremental Rule Mining (EIRM) algorithm and the Fast Update Utility Pattern Tree (FUUP) algorithm. Comprehensive experimental evaluations show that this scheme delivers excellent performance.
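
The incremental idea can be sketched generically: keep itemset supports from earlier batches and update them as new transactions arrive, instead of re-mining from scratch. The fragment below maintains exact counts for small itemsets; it stands in for, and is far simpler than, the EIRM and FUUP algorithms named above.

```python
from collections import Counter
from itertools import combinations

class IncrementalItemsets:
    """Maintain supports of itemsets up to max_len over a growing dataset."""

    def __init__(self, max_len=2):
        self.max_len = max_len
        self.counts = Counter()
        self.n = 0

    def add_batch(self, transactions):
        # Only the new transactions are scanned; old counts are reused.
        for t in transactions:
            self.n += 1
            items = sorted(set(t))
            for k in range(1, self.max_len + 1):
                for combo in combinations(items, k):
                    self.counts[combo] += 1

    def frequent(self, min_support=0.3):
        return {s: c / self.n for s, c in self.counts.items()
                if c / self.n >= min_support}

inc = IncrementalItemsets(max_len=2)
inc.add_batch([["milk", "bread"], ["milk", "eggs"], ["bread", "eggs"]])
inc.add_batch([["milk", "bread", "eggs"]])          # data arriving later
print(inc.frequent(0.5))
```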

Keywords: data sets, recommendation system, utility item sets, frequent item sets mining

Procedia PDF Downloads 277
22530 The Development of the Website Learning the Local Wisdom in Phra Nakhon Si Ayutthaya Province

Authors: Bunthida Chunngam, Thanyanan Worasesthaphong

Abstract:

The objectives of this research were to develop a website for learning the local wisdom of Phra Nakhon Si Ayutthaya province and to study the satisfaction of the system's users. The sample was a multistage sample answering 100 questionnaires, and the reliability of the instrument was assessed with Cronbach's alpha coefficient (α = 0.82). The system was evaluated on three aspects: system usability, system features, and system accuracy. Descriptive statistics were used to characterize the sample: frequency, percentage, mean, and standard deviation. The analysis found that satisfaction with system usability was at a good level (mean 4.44), satisfaction with the system's features was at a good level (mean 4.11), and satisfaction with system accuracy was at a good level (mean 3.74).
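
The reliability figure quoted above comes from Cronbach's alpha, which is simple to reproduce; below is a minimal sketch on made-up Likert-scale responses (the actual questionnaire data are not public).

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: (n_respondents, k_items).

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Five made-up respondents rating four items on a 1-5 scale.
ratings = [[4, 5, 4, 4], [3, 4, 3, 3], [5, 5, 4, 5], [2, 3, 2, 3], [4, 4, 5, 4]]
print(round(cronbach_alpha(ratings), 2))
```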

Keywords: website, learning, local wisdom, Phra Nakhon Si Ayutthaya province

Procedia PDF Downloads 102
22529 Brain-Computer Interfaces That Use Electroencephalography

Authors: Arda Ozkurt, Ozlem Bozkurt

Abstract:

Brain-computer interfaces (BCIs) are devices that output commands by interpreting the data collected from the brain. Electroencephalography (EEG) is a non-invasive method of measuring the brain's electrical activity. Since it was invented by Hans Berger in 1929, it has led to many neurological discoveries and has become one of the essential non-invasive measuring methods. Although it has a low spatial resolution (it can only detect when a group of neurons fires at the same time), it is non-invasive, making it easy to use without posing any risks. In EEG, electrodes are placed on the scalp, and the voltage difference between a minimum of two electrodes is recorded, which is then used to accomplish the intended task. The recordings of EEGs include, but are not limited to, the currents along dendrites from synapses to the soma, the action potentials along the axons connecting neurons, and the currents through the synaptic clefts connecting axons with dendrites. However, as a non-invasive method it is exposed to several sources of noise that may affect the reliability of the EEG signals. For instance, noise from the EEG equipment and the leads, as well as signals originating from the subject, such as heart activity or muscle movements, affect the signals detected by the electrodes; new techniques have been developed to separate those signals from the intended ones. Furthermore, an EEG device alone is not enough to analyze the data from the brain for BCI applications. Because the EEG signal is very complex, artificial intelligence algorithms are required to analyze it. These algorithms convert complex data into meaningful and useful information that neuroscientists can use to design BCI devices. Even though neurological diseases that require highly precise data call for invasive BCIs, non-invasive BCIs such as EEG are used in many cases to improve the lives of disabled people or to ease everyday life by helping with basic tasks. For example, EEG is used to detect an impending seizure in epilepsy patients, so that a BCI device can then help prevent the seizure. Overall, EEG is a commonly used non-invasive BCI technique that has helped develop BCIs and will continue to be used to collect data that eases people's lives as more BCI techniques are developed in the future.

Keywords: BCI, EEG, non-invasive, spatial resolution

Procedia PDF Downloads 51
22528 Unravelling the Knot: Towards a Definition of ‘Digital Labor’

Authors: Marta D'Onofrio

Abstract:

The debate on the digitalization of the economy has raised questions about how both labor and the regulation of work processes are changing due to the introduction of digital technologies into the productive system. Within the literature, the term 'digital labor' is commonly used to identify the impact of digitalization on labor. Despite the wide use of this term, an unambiguous definition of it is still not available, which creates confusion in the use of terminology and in attempts at classification. The purpose of this paper is therefore to provide a definition and to propose a classification of 'digital labor', resorting to the theoretical approach of organizational studies.

Keywords: digital labor, digitalization, data-driven algorithms, big data, organizational studies

Procedia PDF Downloads 133
22527 Efficient Antenna Array Beamforming with Robustness against Random Steering Mismatch

Authors: Ju-Hong Lee, Ching-Wei Liao, Kun-Che Lee

Abstract:

This paper deals with the problem of using antenna sensors for adaptive beamforming in the presence of random steering mismatch. We present an efficient adaptive array beamformer that is robust to the considered mismatch; its robustness comes from the efficient designation of the steering vector. Using the received array data vector, we construct an appropriate correlation matrix associated with the received array data vector and a correlation matrix associated with the signal sources. Then, the eigenvector associated with the largest eigenvalue of the constructed signal correlation matrix is designated as an appropriate estimate of the steering vector. Finally, the adaptive weight vector required for adaptive beamforming is obtained by using the estimated steering vector and the constructed correlation matrix of the array data vector. Simulation results confirm the effectiveness of the proposed method.
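
A compact numerical sketch of the idea: estimate the steering vector as the principal eigenvector of a signal correlation matrix, then form minimum-variance (MVDR-style) weights from it. The array geometry, the SNR, and the way the 'signal' correlation matrix is constructed here are simplifying assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(1)
M, N, theta = 8, 500, 20.0                       # sensors, snapshots, DOA (deg)

# True steering vector of a half-wavelength uniform linear array.
a_true = np.exp(1j * np.pi * np.arange(M) * np.sin(np.radians(theta)))
s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
noise = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
x = np.outer(a_true, s) + 0.3 * noise            # received array data

R = x @ x.conj().T / N                           # data correlation matrix
# Signal correlation matrix: here simply R minus an estimated noise floor
# (a stand-in for the construction used in the paper).
sigma2 = np.sort(np.linalg.eigvalsh(R))[: M // 2].mean()
Rs = R - sigma2 * np.eye(M)

# Steering estimate = principal eigenvector of the signal correlation matrix.
eigvals, eigvecs = np.linalg.eigh(Rs)
a_hat = eigvecs[:, -1] * np.sqrt(M)              # scale so |a_hat|^2 = M

Rinv_a = np.linalg.solve(R, a_hat)
w = Rinv_a / (a_hat.conj() @ Rinv_a)             # MVDR-style weight vector
print("|w^H a_true| =", round(abs(w.conj() @ a_true), 3))   # ~1 if matched
```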

Keywords: adaptive beamforming, antenna array, linearly constrained minimum variance, robustness, steering vector

Procedia PDF Downloads 183
22526 Development of a Serial Signal Monitoring Program for Educational Purposes

Authors: Jungho Moon, Lae-Jeong Park

Abstract:

This paper introduces a signal monitoring program developed with a view to helping electrical engineering students get familiar with sensors with digital output. Because the output of digital sensors cannot simply be monitored with a measuring instrument such as an oscilloscope, students tend to have a hard time dealing with them. The monitoring program runs on a PC and communicates with an MCU that reads the output of digital sensors via an asynchronous communication interface. Receiving the sensor data from the MCU, the monitoring program shows time- and/or frequency-domain plots of the data in real time. In addition, the monitoring program provides a serial terminal that enables the user to exchange text information with the MCU while the received data are plotted. The user can easily observe the output of digital sensors and configure them in real time, which helps students who do not have much experience with digital sensors. Though the monitoring program was written in the MATLAB programming language, it runs without MATLAB since it was compiled as a standalone executable.
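
A Python analogue of such a monitor is easy to sketch with pyserial and matplotlib (the original tool is a compiled MATLAB program; the port name, baud rate, and one-value-per-line data format below are assumptions).

```python
import serial                      # pyserial
from collections import deque
import matplotlib.pyplot as plt

PORT, BAUD, WINDOW = "/dev/ttyUSB0", 115200, 500   # assumed settings

buf = deque(maxlen=WINDOW)         # rolling window of sensor samples
port = serial.Serial(PORT, BAUD, timeout=1)

plt.ion()
fig, ax = plt.subplots()
(line,) = ax.plot([], [])
ax.set_xlabel("sample")
ax.set_ylabel("sensor value")

while plt.fignum_exists(fig.number):
    raw = port.readline().strip()          # MCU sends one value per line
    if raw:
        try:
            buf.append(float(raw))
        except ValueError:                 # non-numeric lines = terminal text
            print("MCU:", raw.decode(errors="replace"))
    line.set_data(range(len(buf)), buf)
    ax.relim()
    ax.autoscale_view()
    plt.pause(0.01)                        # refresh the time-domain plot
```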

Keywords: digital sensor, MATLAB, MCU, signal monitoring program

Procedia PDF Downloads 476
22525 A Similar Image Retrieval System for Auroral All-Sky Images Based on Local Features and Color Filtering

Authors: Takanori Tanaka, Daisuke Kitao, Daisuke Ikeda

Abstract:

The aurora is an attractive phenomenon, but it is difficult to understand its whole mechanism. A data-intensive approach might be effective for elucidating such a difficult phenomenon. To pursue it, we need labeled data showing when auroras have appeared and of what type. In this paper, we propose an image retrieval system for auroral all-sky images, some of which include discrete or diffuse aurora while others contain no aurora at all. The proposed system retrieves images similar to a query image using a popular image recognition method. Using 300 all-sky images obtained at Tromsø, Norway, we evaluate two image recognition methods with and without our original color filtering method. The best performance is achieved when SIFT is combined with the color filtering, with an accuracy of 81.7% for discrete auroras and 86.7% for diffuse auroras.
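
The retrieval step can be sketched with OpenCV: SIFT descriptors plus a ratio-test match count serve as the similarity score. The green-channel emphasis used here as a "color filter" is only a guess at the spirit of the authors' filtering (the common auroral emission at 557.7 nm is green), not their actual method; the file names are placeholders.

```python
import cv2

def aurora_similarity(img_a, img_b):
    """Number of ratio-test SIFT matches between two all-sky images."""
    def prep(img):
        # Crude color filter: keep the green channel, where the 557.7 nm
        # auroral emission dominates (an assumption, see above).
        green = img[:, :, 1]
        return cv2.SIFT_create().detectAndCompute(green, None)

    _, des_a = prep(img_a)
    _, des_b = prep(img_b)
    if des_a is None or des_b is None:
        return 0
    matches = cv2.BFMatcher().knnMatch(des_a, des_b, k=2)
    # Lowe's ratio test rejects ambiguous correspondences.
    return sum(1 for pair in matches
               if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance)

# Rank a small database against a query image (placeholder file names).
query = cv2.imread("query.png")
db = {name: cv2.imread(name) for name in ("sky1.png", "sky2.png")}
ranked = sorted(db, key=lambda k: aurora_similarity(query, db[k]), reverse=True)
print(ranked)
```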

Keywords: data-intensive science, image classification, content-based image retrieval, aurora

Procedia PDF Downloads 429
22524 Effect of Diamagnetic Additives on Defects Level of Soft LiTiZn Ferrite Ceramics

Authors: Andrey V. Malyshev, Anna B. Petrova, Anatoly P. Surzhikov

Abstract:

This article presents the results of the influence of diamagnetic additives on the defect level of ferrite ceramics. For this purpose, we use a previously developed method based on the mathematical analysis of experimental temperature dependences of the initial permeability. A phenomenological expression for describing such dependences was suggested and an interpretation of its main parameters was given. It was shown that the main criterion of the integral defect level of ferrite ceramics is the ratio of two parameters correlating with the elastic stress in the material. Model samples containing a controlled number of intergranular phase inclusions served to prove the validity of the proposed method and to assess its sensitivity in comparison with traditional X-ray diffraction (XRD) analysis, using the broadening of the diffraction reflections of the model samples for the comparison. The defect level data obtained by the proposed method are in good agreement with the X-ray data, and the method showed high sensitivity. Therefore, the legitimacy of selecting the ratio of the β/α parameters of the phenomenological expression as a characteristic of the elastic state of ferrite ceramics is confirmed. In addition, the obtained data can be used in the detection of non-magnetic phases and in testing the optimal sintering technology for the production of soft magnetic ferrites.

Keywords: Curie point, initial permeability, integral defect level, homogeneity

Procedia PDF Downloads 123
22523 TAXAPRO, A Streamlined Pipeline to Analyze Shotgun Metagenomes

Authors: Sofia Sehli, Zainab El Ouafi, Casey Eddington, Soumaya Jbara, Kasambula Arthur Shem, Islam El Jaddaoui, Ayorinde Afolayan, Olaitan I. Awe, Allissa Dillman, Hassan Ghazal

Abstract:

The ability to promptly sequence whole genomes at a relatively low cost has revolutionized the way we study the microbiome. Microbiologists are no longer limited to studying what can be grown in a laboratory and instead have the opportunity to rapidly identify the makeup of microbial communities in a wide variety of environments. Analyzing whole genome sequencing (WGS) data is a complex process that involves multiple moving parts and can be rather unintuitive for scientists who don't typically work with this type of data. Thus, to help lower the barrier for less computationally inclined individuals, TAXAPRO was developed at the first Omics Codeathon, held virtually by the African Society for Bioinformatics and Computational Biology (ASBCB) in June 2021. TAXAPRO is an advanced metagenomics pipeline that accurately assembles organelle genomes from whole-genome sequencing data. It seamlessly combines WGS analysis tools into a pipeline that automatically processes raw WGS data and presents organism abundance information in both a tabular and a graphical format. TAXAPRO was evaluated using COVID-19 patient gut microbiome data. The analysis performed by TAXAPRO demonstrated a high abundance of Clostridia and Bacteroidia and a low abundance of Proteobacteria relative to others in the gut microbiome of patients hospitalized with COVID-19, consistent with the original findings derived using a different analysis methodology. This provides crucial evidence that the TAXAPRO workflow delivers reliable organism abundance information overnight without the hassle of performing the analysis manually.

Keywords: metagenomics, shotgun metagenomic sequence analysis, COVID-19, pipeline, bioinformatics

Procedia PDF Downloads 187
22522 Investigation of External Pressure Coefficients on Large Antenna Parabolic Reflector Using Computational Fluid Dynamics

Authors: Varun K, Pramod B. Balareddy

Abstract:

Estimation of wind forces plays a significant role in the design of large antenna parabolic reflectors, since the gain of the antenna system at higher frequencies is very sensitive to reflector surface accuracy. Accurate estimation of wind forces is therefore important, as it is a primary input for the design and analysis of the reflector system. In the present work, numerical simulation of wind flow using Computational Fluid Dynamics (CFD) software is used to investigate the external pressure coefficients. An extensive comparative study has been made between the CFD results and published wind tunnel data for different wind angles of attack (α) acting over the concave and convex surfaces respectively. Flow simulations using CFD are carried out to estimate the coefficients of drag, lift and moment for the parabolic reflector. Pressure coefficients (Cp) are extracted over the front and rear faces of the reflector to study the net pressure variations, and these resultant variations are compared with the published wind tunnel data for different angles of attack. It was observed from the CFD simulations that both the convex and the concave face of the reflector experience a band of pressure variations for positive and negative angles of attack respectively, whereas in the published wind tunnel data the pressure variations over convex surfaces are assumed to be uniform, and vice versa. Chordwise and spanwise pressure variations were calculated and compared with the published experimental data. In the present work, the maximum pressure coefficients were lower for α ranging from +30° to -90° and for α = +90°, and higher for α ranging from +45° to +75°, compared to the wind tunnel data. This variation is due to the non-uniform pressure distribution observed over the front and back faces of the reflector. The variations of Cd, Cl and Cm from α = +90° to α = -90° closely resemble the experimental data.

Keywords: angle of attack, drag coefficient, lift coefficient, pressure coefficient

Procedia PDF Downloads 233
22521 Beliefs on Reproduction of Women in Fish Port Community: An Explorative Study on the Beliefs on Conception, Childbirth, and Maternal Care of Women in Navotas Fish Port Community

Authors: Marie Kristel A. Gabawa

Abstract:

The accessibility of health programs, specifically family planning and maternal and child health care (FP/MCH) programs, is generally low in urban poor communities. Moreover, most FP/MCH programs are framed in medical terms that are usually not part of how urban poor dwellers conceive of the body. This study aims to explore beliefs on reproduction, encompassing, but not limited to, beliefs on conception, pregnancy, and maternal and child health care. The study sites are the two barangays of North Bay Boulevard South 1 (NBBS1) and North Bay Boulevard South 2 (NBBS2), the residential communities nearest the Navotas Fish Port Complex (NFPC). Data gathered will be analyzed using the grounded-theory method, with the theories of cultural materialism and equity feminism as foundations. Survey questionnaires, key informant interviews, and focus group discussions will be utilized in gathering data. The findings will be recommended to health program initiators as a tool for customizing FP/MCH programs to the perceptions and beliefs of women residing in NBBS1 and NBBS2, and for correcting misinformation about FP/MCH techniques.

Keywords: beliefs on reproduction, fish port community, family planning, maternal and child health care, Navotas

Procedia PDF Downloads 242
22520 Estimation of the Upper Tail Dependence Coefficient for Insurance Loss Data Using an Empirical Copula-Based Approach

Authors: Adrian O'Hagan, Robert McLoughlin

Abstract:

Considerable focus in the world of insurance risk quantification is placed on modeling loss values from lines of business (LOBs) that possess upper tail dependence. Copulas such as the Joe, Gumbel and Student-t copula may be used for this purpose. The copula structure imparts a desired level of tail dependence on the joint distribution of claims from the different LOBs. Alternatively, practitioners may possess historical or simulated data that already exhibit upper tail dependence, through the impact of catastrophe events such as hurricanes or earthquakes. In these circumstances, it is not desirable to induce additional upper tail dependence when modeling the joint distribution of the loss values from the individual LOBs. Instead, it is of interest to accurately assess the degree of tail dependence already present in the data. The empirical copula and its associated upper tail dependence coefficient are presented in this paper as robust, efficient means of achieving this goal.
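
A minimal sketch of the empirical estimator for two loss series: transform each margin to pseudo-observations by ranks, evaluate the empirical copula near the corner, and read off the upper tail dependence coefficient. The cutoff level u = 0.95 and the simulated losses are illustrative choices (the coefficient is defined as the limit as u → 1).

```python
import numpy as np

def upper_tail_dependence(x, y, u=0.95):
    """Empirical-copula estimate of lambda_U = lim_{u->1} (1-2u+C(u,u))/(1-u)."""
    n = len(x)
    # Pseudo-observations: ranks scaled into (0, 1).
    ux = (np.argsort(np.argsort(x)) + 1) / (n + 1)
    uy = (np.argsort(np.argsort(y)) + 1) / (n + 1)
    C_uu = np.mean((ux <= u) & (uy <= u))        # empirical copula at (u, u)
    return (1 - 2 * u + C_uu) / (1 - u)

# Simulated LOB losses sharing a common shock, so large values co-occur.
rng = np.random.default_rng(7)
shock = rng.pareto(2.0, 10_000)
lob1 = shock + rng.pareto(2.0, 10_000)
lob2 = shock + rng.pareto(2.0, 10_000)
print(round(upper_tail_dependence(lob1, lob2), 3))   # > 0: upper tail dependence
```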

Keywords: empirical copula, extreme events, insurance loss reserving, upper tail dependence coefficient

Procedia PDF Downloads 269
22519 Blockchain in Saudi E-Government: A Systematic Literature Review

Authors: Haitham Assiri, Priyadarsi Nanda

Abstract:

The world is gradually entering the fourth industrial revolution, and e-Government services are scaling government operations across the globe. However, as promising as an e-Government system may be, it is also susceptible to malicious attacks if not properly secured. This study found that, in Saudi Arabia, the e-Government portal Yesser is vulnerable to external attacks, which can obviously lead to a breach of data integrity and privacy. In this paper, a systematic literature review was conducted to explore possible ways the Kingdom of Saudi Arabia can take the necessary measures to strengthen its e-Government system using blockchain. Blockchain is one of the emerging technologies shaping the world through its applications in finance, elections, healthcare, etc.; it secures systems and brings more transparency. A total of 28 papers were selected for this SLR, of which 19 significantly showed that blockchain could enhance the security and privacy of Saudi Arabia's e-Government system. Other papers also concluded that blockchain is effective, albeit with the integration of other technologies like IoT, AI and big data. These papers have been analysed to sieve out the findings and set the stage for future research into the subject.

Keywords: blockchain, data integrity, e-government, security threats

Procedia PDF Downloads 227
22518 Geospatial Information for Smart City Development

Authors: Simangele Dlamini

Abstract:

Smart city development is seen as a way of facing the challenges brought about by the growing urban population the world over. Research indicates that cities have a role to play in combating urban challenges like crime, waste disposal and greenhouse gas emissions, and in improving resource efficiency. Such solutions should not make city management less sustainable; they should be solutions-driven, cost- and resource-efficient, and smart. This study explores how the City of Johannesburg, South Africa, can use Geographic Information Systems (GIS), big data and the Internet of Things (IoT) to identify opportune areas for smart city initiatives such as smart safety, smart utilities, smart mobility, and smart infrastructure in an integrated manner. The study combines big data, using real-time data sources, to identify hotspot areas that will benefit from ICT interventions. The GIS intervention will assist the city in avoiding a silo approach in its smart city development initiatives, an approach that has led to the failure of smart city development in other countries.

Keywords: smart cities, internet of things, geographic information systems, Johannesburg

Procedia PDF Downloads 116
22517 Language Errors Used in “The Space between Us” Movie and Their Effects on Translation Quality: Translation Study toward Discourse Analysis Approach

Authors: Mochamad Nuruz Zaman, Mangatur Rudolf Nababan, M. A. Djatmika

Abstract:

Both society and education teach good communication as a way of building up interpersonal skills. Everyone has the capacity to understand something new, whether with good comprehension or poor understanding, and poor understanding produces language errors when people interact for the first time without knowing one another because of the distance between them. The movie "The Space between Us" tells a love-adventure story between a boy from Mars and a girl from Earth, with many missed understandings in conversation because of their different climates and environments. Moviegoers must also focus on the subtitles in order to enjoy the movie fully, and the Indonesian subtitles and the English dialogue still show overlapping understanding in the translation. Translation here consists of the source language (SL, the English dialogue) and the target language (TL, the Indonesian subtitles). This research gap is formulated in the research question of how language errors occur in the movie and how they affect translation quality, analyzed through translation studies with a discourse analysis approach. The research goal is to describe the language errors and their translation quality in order to create a good atmosphere in movie media. The study uses an embedded qualitative research design. The research locations consist of setting, participant and event as a focused, determined boundary. The data sources are "The Space between Us" movie and informants (translation quality raters). The sampling is criterion-based (purposive) sampling. Data collection techniques are content analysis and questionnaires. Data validation applies data source and method triangulation. Data analysis involves domain, taxonomy, componential and cultural theme analysis. The findings show that the language errors occurring in the movie are referential, register, societal, textual, receptive, expressive, individual, group, analogical, transfer, local and global errors. Their effects on translation quality are discussed through the translation techniques found in the data: amplification, borrowing, description, discursive creation, established equivalent, generalization, literal translation, modulation, particularization, reduction, substitution and transposition.

Keywords: discourse analysis, language errors, The Space between Us movie, translation techniques, translation quality instruments

Procedia PDF Downloads 199
22516 A Coupling Study of Public Service Facilities and Land Price Based on Big Data Perspective in Wuxi City

Authors: Sisi Xia, Dezhuan Tao, Junyan Yang, Weiting Xiong

Abstract:

Against the background of Chinese urbanization shifting from incremental development to stock development, the completeness of urban public service facilities is essential to urban spatial quality. As public service facilities form a huge and complicated system, clarifying the internal rules that associate the various facility types with land market prices is key to optimizing spatial layout. This paper takes Wuxi City as a representative sample location and establishes a digital analysis platform using urban land price data and several high-precision big data acquisition methods. On this basis, it analyzes the coupling relationship between different public service categories and land price, summarizing the coupling patterns between the distribution of urban public facilities and fluctuations in urban land price. Finally, the internal mechanism within each of the two elements is explored, providing a reference for the optimal layout of urban planning and public service facilities.

Keywords: public service facilities, land price, urban spatial morphology, big data

Procedia PDF Downloads 184
22515 Structural Damage Detection Using Modal Data Employing Teaching Learning Based Optimization

Authors: Subhajit Das, Nirjhar Dhang

Abstract:

Structural damage detection is a challenging task in the field of structural health monitoring (SHM). Damage detection methods mainly focus on determining the location and severity of damage. Model updating is a well-known method to locate and quantify damage. In this method, an error function is defined in terms of the difference between the signal measured in an 'experiment' and the signal obtained from the undamaged finite element model. This error function is minimised with a proper algorithm, and the finite element model is updated accordingly to match the measured response; the damage location and severity can then be identified from the updated model. In this paper, the error function is defined in terms of modal data, viz. frequencies and the modal assurance criterion (MAC), which is derived from eigenvectors. The error function is minimized by the teaching-learning-based optimization (TLBO) algorithm, and the finite element model is updated accordingly to locate and quantify the damage. Damage is introduced into the model by reducing the stiffness of a structural member. The 'experimental' data are simulated by finite element modelling, with measurement error introduced into the synthetic data by adding Gaussian random noise. The efficiency and robustness of the method are demonstrated through three examples: a truss, a beam and a frame problem. The results show that the TLBO algorithm can efficiently detect the damage location as well as the severity of damage using modal data.
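
TLBO itself is short enough to sketch. The version below minimizes a generic misfit over stiffness-reduction parameters; the two-phase update (teacher, then pairwise learner) follows the standard TLBO scheme, while the toy objective stands in for the frequency-and-MAC error function built from the finite element model.

```python
import numpy as np

def tlbo(objective, bounds, pop=20, iters=100, seed=0):
    """Teaching-learning-based optimization (minimization)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    X = rng.uniform(lo, hi, (pop, len(lo)))
    f = np.apply_along_axis(objective, 1, X)
    for _ in range(iters):
        # Teacher phase: move everyone toward the best, away from the mean.
        teacher, mean = X[f.argmin()], X.mean(axis=0)
        Tf = rng.integers(1, 3)                     # teaching factor, 1 or 2
        X_new = np.clip(X + rng.random(X.shape) * (teacher - Tf * mean), lo, hi)
        f_new = np.apply_along_axis(objective, 1, X_new)
        better = f_new < f
        X[better], f[better] = X_new[better], f_new[better]
        # Learner phase: each learner moves relative to a random peer.
        for i in range(pop):
            j = rng.integers(pop)
            if j == i:
                continue
            step = (X[i] - X[j]) if f[i] < f[j] else (X[j] - X[i])
            cand = np.clip(X[i] + rng.random(len(lo)) * step, lo, hi)
            fc = objective(cand)
            if fc < f[i]:
                X[i], f[i] = cand, fc
    return X[f.argmin()], f.min()

# Toy misfit standing in for the frequency + MAC error of the FE model:
# the "true" damage vector reduces stiffness by 30% in element 2 only.
true_damage = np.array([0.0, 0.3, 0.0, 0.0])
misfit = lambda d: np.sum((d - true_damage) ** 2)
best, err = tlbo(misfit, bounds=[(0.0, 0.5)] * 4)
print(best.round(3), round(err, 6))
```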

Keywords: damage detection, finite element model updating, modal assurance criteria, structural health monitoring, teaching learning based optimization

Procedia PDF Downloads 198
22514 Deployed Confidence: The Testing in Production

Authors: Shreya Asthana

Abstract:

Testers know that a feature they tested on staging is working perfectly in production only after the release has gone live. Sometimes something breaks in production, and testers only learn of it through a bug raised by an end user. Panic sets in when staging test results do not reflect current production behavior, and testers start doubting their skills when a user finally reports a bug to them. Testers can deploy their confidence on release day by testing in production. Once testing in production starts, test result accuracy improves because tests run on real-time data, and execution is somewhat faster than on staging due to the elimination of bad data. Feature flagging, canary releases, and data cleanup can help to achieve this technique of testing. This paper makes it easier to understand the steps needed to achieve production testing before making a feature live and to modify an IT company's testing procedure, so that testers can provide a bug-free experience to end users. This study is beneficial because too many people think that testing should be done on staging but not in production, and it is high time to pull people out of that old mindset of testing and into a new testing world. At the end of the day, all that matters is whether the features work in production or not.
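
The feature-flagging idea mentioned above can be illustrated in a few lines: a deterministic hash of the user ID buckets each user, so a feature can be live in production for, say, 5% of traffic, or only for internal test accounts, while tests run against real data. This is a generic sketch, not any specific flagging product.

```python
import hashlib

def flag_enabled(feature: str, user_id: str, rollout_pct: float,
                 allow_list: frozenset = frozenset()) -> bool:
    """Deterministic percentage rollout with an allow-list override."""
    if user_id in allow_list:          # e.g. the testers' own accounts
        return True
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF      # stable value in [0, 1]
    return bucket < rollout_pct / 100

testers = frozenset({"qa-alice", "qa-bob"})
for uid in ("qa-alice", "user-1", "user-2", "user-3"):
    print(uid, flag_enabled("new-checkout", uid, rollout_pct=5,
                            allow_list=testers))
```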

Keywords: bug free production, new testing mindset, testing strategy, testing approach

Procedia PDF Downloads 49
22513 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling

Authors: Vibha Devi, Shabina Khanam

Abstract:

Hemp (Cannabis sativa) possesses a rich content of ω-6 linoleic and ω-3 α-linolenic essential fatty acids in the ratio of 3:1, a rare and highly desired ratio that enhances the quality of hemp oil. These components are beneficial for cell and body growth, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and serve as a remedy for arthritis and various other disorders. The present study applies supercritical fluid extraction (SFE) to hemp seed at various conditions of temperature (40-80) °C, pressure (200-350) bar, flow rate (5-15) g/min, particle size (0.430-1.015) mm and amount of co-solvent (0-10) % of solvent flow rate, arranged through a central composite design (CCD). The CCD suggested 32 sets of experiments, which were carried out. As the SFE process involves a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information about error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques; they create a large number of datasets by resampling from the original dataset and analyze them to check the validity of the obtained data. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement; here each resample has size 31 (one observation eliminated), repeated 32 times. Bootstrap is the frequently used statistical approach of estimating the sampling distribution of an estimator by resampling with replacement from the original sample; here each resample has size 32, repeated 100 times. The estimands for these resampling techniques are the mean, standard deviation, variation coefficient and standard error of the mean. For the ω-6 linoleic acid concentration, the mean value was approximately 58.5 for both resampling methods, which is the average (central value) of the sample means over all resamples. Similarly, for the ω-3 α-linolenic acid concentration, the mean was observed as 22.5 through both resamplings. Variance measures the spread of the data about the mean; a greater variance reflects a larger range of output data, which is 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66%) and 6 for ω-3 α-linolenic acid (ranging from 16.71 to 26.2%). Further, the low standard deviation (approximately 1%), low standard error of the mean (< 0.8) and low variation coefficient (< 0.2) reflect the accuracy of the sample for prediction. All the estimated variation coefficients, standard deviations and standard errors of the mean are found within the 95% confidence interval.
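
The two resampling schemes described above are straightforward to reproduce; here is a minimal sketch on a placeholder sample (the 32 experimental concentration values themselves are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(3)
# Placeholder stand-in for the 32 measured omega-6 linoleic acid
# concentrations (%); the real CCD results are in the paper.
sample = rng.normal(58.5, 1.0, 32)

def summarize(estimates):
    m, sd = np.mean(estimates), np.std(estimates, ddof=1)
    return {"mean": m, "sd": sd, "cv": sd / m,
            "sem": sd / np.sqrt(len(estimates))}

# Jackknife: N re-samples, each leaving one observation out (size N-1).
jack = [np.mean(np.delete(sample, i)) for i in range(len(sample))]

# Bootstrap: resample with replacement, here 100 replicates of size N.
boot = [np.mean(rng.choice(sample, size=len(sample), replace=True))
        for _ in range(100)]

for name, est in (("jackknife", jack), ("bootstrap", boot)):
    print(name, {k: round(v, 4) for k, v in summarize(est).items()})
```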

Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation

Procedia PDF Downloads 125
22512 Evaluation of Hydrocarbon Prospects of 'ADE' Field, Niger Delta

Authors: Oluseun A. Sanuade, Sanlinn I. Kaka, Adesoji O. Akanji, Olukole A. Akinbiyi

Abstract:

Prospect evaluation of the 'ADE' field was done using 3D seismic data and well log data. The field is located in the offshore Niger Delta, where the water depth ranges from 450 to 800 m. The objectives of this study are to explore deeper prospects and to ascertain the kinds of traps that are favorable for the accumulation of hydrocarbon in the field. Six horizons with major and minor faults were identified and mapped in the field. Time structure maps of these horizons were generated, and using the available check-shot data the maps were converted to top structure maps, which were used to calculate the hydrocarbon volume. The results show that regional structural highs trending northeast-southwest (NE-SW) characterize a large portion of the field. These highs were observed across all horizons, revealing a regional post-depositional deformation. Three prospects were identified and evaluated to understand the different opportunities in the field, including stratigraphic pinch-outs and bi-directional downlaps. The results of this study show that the field has potential for new opportunities that could be explored in further studies.

Keywords: hydrocarbon, play, prospect, stratigraphy

Procedia PDF Downloads 243
22511 D3Advert: Data-Driven Decision Making for Ad Personalization through Personality Analysis Using BiLSTM Network

Authors: Sandesh Achar

Abstract:

Personalized advertising holds greater potential for higher conversion rates compared to generic advertisements. However, its widespread application in the retail industry faces challenges due to complex implementation processes. These complexities impede the swift adoption of personalized advertisement on a large scale. Personalized advertisement, being a data-driven approach, necessitates consumer-related data, adding to its complexity. This paper introduces an innovative data-driven decision-making framework, D3Advert, which personalizes advertisements by analyzing personalities using a BiLSTM network. The framework utilizes the Myers–Briggs Type Indicator (MBTI) dataset for development. The employed BiLSTM network, specifically designed and optimized for D3Advert, classifies user personalities into one of the sixteen MBTI categories based on their social media posts. The classification accuracy is 86.42%, with precision, recall, and F1-Score values of 85.11%, 84.14%, and 83.89%, respectively. The D3Advert framework personalizes advertisements based on these personality classifications. Experimental implementation and performance analysis of D3Advert demonstrate a 40% improvement in impressions. D3Advert’s innovative and straightforward approach has the potential to transform personalized advertising and foster widespread personalized advertisement adoption in marketing.
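
The classifier architecture can be sketched in a few lines of Keras: embedded token sequences feed a bidirectional LSTM whose final state is mapped to the sixteen MBTI classes. Vocabulary size, sequence length and layer widths below are illustrative assumptions; the paper's tuned D3Advert network will differ.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

VOCAB, MAXLEN, CLASSES = 20_000, 200, 16   # assumed sizes; 16 MBTI types

model = tf.keras.Sequential([
    layers.Embedding(VOCAB, 128),
    layers.Bidirectional(layers.LSTM(64)),   # reads posts in both directions
    layers.Dropout(0.3),
    layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Random stand-ins for tokenized social media posts and MBTI labels.
X = np.random.randint(1, VOCAB, size=(256, MAXLEN))
y = np.random.randint(0, CLASSES, size=(256,))
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(X[:2]).argmax(axis=1))   # predicted MBTI class indices
```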

Keywords: personalized advertisement, deep learning, MBTI dataset, BiLSTM network, NLP

Procedia PDF Downloads 22
22510 Communication Infrastructure Required for a Driver Behaviour Monitoring System, ‘SiaMOTO’ IT Platform

Authors: Dogaru-Ulieru Valentin, Sălișteanu Ioan Corneliu, Ardeleanu Mihăiță Nicolae, Broscăreanu Ștefan, Sălișteanu Bogdan, Mihai Mihail

Abstract:

The SiaMOTO system is a communications and data processing platform for vehicle traffic data. The human factor is the most important factor in the generation of these data, as the driver is the one who dictates the trajectory of the vehicle. Like any trajectory, it is described by parameters for position, speed and acceleration, and constant knowledge of these parameters allows complex analyses. Roadways allow many vehicles to travel through their confined space, and the overlapping trajectories of several vehicles increase the likelihood of collision events, known as road accidents. Any such event has causes that lead to its occurrence, so the conditions of its occurrence can be known. The human factor is predominant in deciding the trajectory parameters of the vehicle on the road, so monitoring it through the events reported by the DiaMOTO device over time will generate a guide to target potentially high-risk driving behavior and to reward those who control the driving phenomenon well. In this paper, we have focused on detailing the communication infrastructure of the DiaMOTO device with the traffic data collection server, the infrastructure through which the database to be used for complex AI/DLM analysis is built. The central element of this description is the data string in Codec-8 format sent by the DiaMOTO device to the SiaMOTO collection server database. The data presented are specific to a functional infrastructure implemented as an experimental model, with DiaMOTO devices with unique codes, integrating ADAS and GPS functions, installed on 50 vehicles whose trajectories can be monitored 24 hours a day.
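
To make the data string concrete, here is a minimal sketch that unpacks the fixed header and the GPS element of the first AVL record from a Codec-8 frame, following the publicly documented field layout (4-byte zero preamble, 4-byte data length, 1-byte codec ID 0x08, record count, then per-record timestamp, priority and GPS fields). Offsets for the IO elements and the trailing CRC are omitted; treat the layout as an assumption to be checked against the protocol specification.

```python
import struct

def parse_codec8_first_record(frame: bytes):
    """Unpack the header and first AVL record's GPS element (assumed layout)."""
    preamble, data_len = struct.unpack_from(">II", frame, 0)
    assert preamble == 0, "Codec-8 frames start with four zero bytes"
    codec_id, n_records = struct.unpack_from(">BB", frame, 8)
    assert codec_id == 0x08
    # First AVL record: 8-byte ms timestamp, 1-byte priority, then the GPS
    # element: lon/lat as 1e-7-degree ints, altitude, angle, satellites, speed.
    ts_ms, priority, lon, lat, alt, angle, sats, speed = \
        struct.unpack_from(">QBiiHHBH", frame, 10)
    return {"records": n_records, "timestamp_ms": ts_ms, "priority": priority,
            "lon_deg": lon / 1e7, "lat_deg": lat / 1e7,
            "altitude_m": alt, "angle_deg": angle,
            "satellites": sats, "speed_kmh": speed}

# A hand-built toy frame with one record (IO payload, trailing record count
# and CRC are left out for brevity; the length field is left as 0).
frame = struct.pack(">IIBB", 0, 0, 0x08, 1) + struct.pack(
    ">QBiiHHBH", 1_700_000_000_000, 1,
    int(25.5e7), int(44.9e7), 120, 90, 11, 54)
print(parse_codec8_first_record(frame))
```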

Keywords: DiaMOTO, Codec-8, ADAS, GPS, driver monitoring

Procedia PDF Downloads 52
22509 Validity and Reliability of Competency Assessment Implementation (CAI) Instrument Using Rasch Model

Authors: Nurfirdawati Muhamad Hanafi, Azmanirah Ab Rahman, Marina Ibrahim Mukhtar, Jamil Ahmad, Sarebah Warman

Abstract:

This study was conducted to generate empirical evidence on the validity and reliability of the items of the Competency Assessment Implementation (CAI) instrument, using the Rasch model for polytomous data aided by Winsteps software version 3.68. Construct validity was examined by analyzing the point-measure correlation index (PTMEA) and the infit and outfit MNSQ values, while reliability was examined by analyzing the item reliability index. A survey technique was used as the major method, administering the CAI instrument to 156 teachers from vocational schools. The results show that the reliability of the CAI instrument items was between 0.80 and 0.98. The PTMEA correlations are positive, indicating that the items are able to distinguish between respondents of differing ability. The statistical data obtained show that, out of 154 items, 12 items are suggested to be omitted from the instrument. It is hoped that this study brings a new direction to the process of data analysis in educational research.

Keywords: competency assessment, reliability, validity, item analysis

Procedia PDF Downloads 423