Search results for: data processing strategies
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 30031

29551 A Study on the Different Components of a Typical Back-Scattered Chipless RFID Tag Reflection

Authors: Fatemeh Babaeian, Nemai Chandra Karmakar

Abstract:

A chipless RFID system is a wireless tracking and identification system that uses passive tags for encoding data. The advantage of a chipless RFID tag is its planar structure, which is printable on different low-cost materials such as paper and plastic. The printed tag can be attached to different items at the labelling level. Since the price of a chipless RFID tag can be as low as a fraction of a cent, this technology has the potential to compete with conventional optical barcode labels. However, due to the passive structure of the tag, data processing of the reflection signal is a crucial challenge. The captured reflected signal from a tag attached to an item consists of different components: the reflection from the reader antenna, the reflection from the item, the tag structural mode RCS component, and the antenna mode RCS of the tag. All these components are summed in both the time and frequency domains. The reflection from the item and the structural mode RCS component can distort or saturate the frequency domain signal and cause difficulties in extracting the desired component, which is the antenna mode RCS. Therefore, it is necessary to study the reflection of the tag in both time and frequency domains to better understand the nature of the captured chipless RFID signal. Other benefits of this study are finding an optimised encoding technique at the tag design level and finding the best processing algorithm for the chipless RFID signal at the decoding level. In this paper, the reflection from a typical backscattered chipless RFID tag with six resonances is analysed, and the different components of the signal are separated in both time and frequency domains. Moreover, the time domain signal corresponding to each resonator of the tag is studied. The data for this processing were captured from simulation in CST Microwave Studio 2017. The outcome of this study is an understanding of the different components of a measured signal in a chipless RFID system and the identification of a research gap: the need for an optimum detection algorithm for tag ID extraction.

Keywords: antenna mode RCS, chipless RFID tag, resonance, structural mode RCS

Procedia PDF Downloads 180
29550 Susanne Bier, Lone Scherfig: Transnationalization Strategies

Authors: Ebru Thwaites Diken

Abstract:

This article analyzes the works of two directors in Danish cinema, Susanne Bier and Lone Scherfig, in the context of the transnationalisation of Danish cinema. It looks at how the films' narratives negotiate and reconstruct the local / national / regional and the global. Scholars such as Nestingen & Elkington (2005), Hjort (2010), Higbee and Lim (2010), and Bondebjerg and Redvall (2011) address the transnationalism of Danish cinema in terms of production and distribution processes and how film making transcends national boundaries. This paper employs a particular understanding of transnationalism - in terms of how ideas and characters travel - to analyze how storytelling and style have evolved to connect the national, the regional and the global on the basis of the works of these two directors. Strategies such as Hollywoodization - i.e., a focus on stardom and classical narration - adherence to conventional European genre formulas, and the production of Danish films in the English language have been identifiable strategies in Danish cinema in the period after the 2000s. Susanne Bier and Lone Scherfig are significant for employing some of these strategies simultaneously. For this reason, this article looks at how these two directors have employed these strategies and negotiated cultural boundaries and exchanges.

Keywords: transnational cinema, danish cinema, susanne bier, lone scherfig

Procedia PDF Downloads 63
29549 High Resolution Sandstone Connectivity Modelling: Implications for Outcrop Geology and Its Analog Studies

Authors: Numair Ahmed Siddiqui, Abdul Hadi bin Abd Rahman, Chow Weng Sum, Wan Ismail Wan Yousif, Asif Zameer, Joel Ben-Awal

Abstract:

Advances in data capturing from outcrop studies have made possible the acquisition of high-resolution digital data, offering improved and economical reservoir modelling methods. Terrestrial laser scanning utilizing LiDAR (light detection and ranging) provides a new method to build outcrop-based reservoir models, which provide crucial information for understanding heterogeneities in sandstone facies through high-resolution images and data sets. This study presents the detailed application of an outcrop-based sandstone facies connectivity model, combining information gathered from traditional fieldwork with detailed digital point-cloud data from LiDAR, to develop an intermediate small-scale reservoir sandstone facies model of the Miocene Sandakan Formation, Sabah, East Malaysia. The software RiScan Pro (v1.8.0) was used for digital data collection and post-processing, with an accuracy of 0.01 m and a point acquisition rate of up to 10,000 points per second. We provide an accurate and descriptive workflow to triangulate point clouds of different sets of sandstone facies with well-marked top and bottom boundaries in conjunction with field sedimentology. This provides a highly accurate qualitative sandstone facies connectivity model, which is challenging to obtain from subsurface datasets (i.e., seismic and well data). Finally, by applying this workflow, we can build an outcrop-based static connectivity model, which can serve as an analogue for subsurface reservoir studies.

Keywords: LiDAR, outcrop, high resolution, sandstone facies, connectivity model

Procedia PDF Downloads 207
29548 Temporal Progression of Episodic Memory as a Function of Encoding Condition and Age: Further Investigation of Action Memory in School-Aged Children

Authors: Farzaneh Badinlou, Reza Kormi-Nouri, Monika Knopf

Abstract:

Studies of adults' episodic memory have found that enacted encoding not only improves recall performance but also leads to faster retrieval during the recall period. The current study focused on exploring the temporal progression of different encoding conditions in younger and older school children. A total of 204 students from two age groups, 8 and 14 years old, participated in this study. During the study phase, action encoding was examined in two forms: participants performed the phrases by themselves (SPT) or observed the performance of the experimenter (EPT); these were compared with verbal encoding, in which participants listened to verbal action phrases (VT). At the test phase, we used immediate and delayed free recall tests. We observed significant differences in memory performance as a function of age group and encoding condition in both immediate and delayed free recall tests. Moreover, the temporal progression of recall was faster in older children compared with younger ones. The interaction of age group and encoding condition was significant only in delayed recall, showing that younger children performed better in EPT whereas older children performed better in SPT. It is proposed that the enactment effect in the form of SPT enhances item-specific processing, whereas EPT improves relational information processing, and that these differential processes are responsible for the results observed in younger and older children. The role of memory strategies and information processing methods in younger and older children was considered in this study. Moreover, the temporal progression of recall was faster for action encoding in the form of SPT and EPT compared with verbal encoding in both immediate and delayed free recall, and the size of the enactment effect increased steadily throughout the recall period. The results of the present study provide further evidence that action memory is explained with an emphasis on the notions of information processing and strategic views. These results also reveal the temporal progression of recall as a new dimension of episodic memory in children.

Keywords: action memory, enactment effect, episodic memory, school-aged children, temporal progression

Procedia PDF Downloads 262
29547 Development of Fake News Model Using Machine Learning through Natural Language Processing

Authors: Sajjad Ahmed, Knut Hinkelmann, Flavio Corradini

Abstract:

Fake news detection research is still at an early stage, as this is a relatively new phenomenon that has only recently attracted society's interest. Machine learning helps to solve complex problems and to build AI systems, especially in cases where the knowledge involved is tacit or not explicitly available. For the identification of fake news, we applied three machine learning classifiers: Passive Aggressive, Naïve Bayes, and Support Vector Machine. Simple classification alone is not sufficient for fake news detection because generic classification methods are not specialized for fake news. With the integration of machine learning and text-based processing, we can detect fake news and build classifiers that can classify the news data. Text classification mainly focuses on extracting various features of text and then incorporating those features into the classification. The big challenge in this area is the lack of an efficient way to differentiate between fake and non-fake news due to the unavailability of corpora. We applied the three machine learning classifiers on two publicly available datasets. Experimental analysis of these datasets indicates very encouraging and improved performance.
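
As a rough illustration of the three-classifier comparison described above, the following minimal sketch trains Passive Aggressive, Naïve Bayes, and SVM classifiers on TF-IDF features; the dataset file name and its column names ("text", "label") are illustrative assumptions, not the datasets used by the authors.

```python
# Minimal sketch of the three-classifier comparison on TF-IDF features.
# Dataset path and column names ("text", "label") are illustrative assumptions.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

df = pd.read_csv("news_dataset.csv")          # columns: text, label (fake / real)
X_train, X_test, y_train, y_test = train_test_split(
    df["text"], df["label"], test_size=0.2, random_state=42)

vectorizer = TfidfVectorizer(stop_words="english", max_df=0.7)
X_train_tfidf = vectorizer.fit_transform(X_train)
X_test_tfidf = vectorizer.transform(X_test)

classifiers = {
    "Passive Aggressive": PassiveAggressiveClassifier(max_iter=1000),
    "Naive Bayes": MultinomialNB(),
    "Support Vector Machine": LinearSVC(),
}
for name, clf in classifiers.items():
    clf.fit(X_train_tfidf, y_train)          # train on the TF-IDF representation
    pred = clf.predict(X_test_tfidf)
    print(f"{name}: accuracy = {accuracy_score(y_test, pred):.3f}")
```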

Keywords: fake news detection, natural language processing, machine learning, classification techniques

Procedia PDF Downloads 151
29546 Advanced Magnetic Field Mapping Utilizing Vertically Integrated Deployment Platforms

Authors: John E. Foley, Martin Miele, Raul Fonda, Jon Jacobson

Abstract:

This paper presents the development and implementation of new and innovative data collection and analysis methodologies based on the deployment of total field magnetometer arrays. Our research has focused on the development of a vertically integrated suite of platforms all utilizing common data acquisition, data processing and analysis tools. These survey platforms include low-altitude helicopters and ground-based vehicles, including robots, for terrestrial mapping applications. For marine settings, the sensor arrays are deployed from either a hydrodynamic bottom-following wing towed from a surface vessel or a towed floating platform for shallow-water settings. Additionally, sensor arrays are deployed from tethered remotely operated vehicles (ROVs) for underwater settings where high maneuverability is required. While the primary application of these systems is the detection and mapping of unexploded ordnance (UXO), these systems are also used for various infrastructure mapping and geologic investigations. For each application, success is driven by the integration of magnetometer arrays, accurate geo-positioning, system noise mitigation, and stable deployment of the system in appropriate proximity to expected targets or features. Each of the systems collects geo-registered data compatible with a web-enabled data management system providing immediate access to data and metadata for remote processing, analysis and delivery of results. This approach allows highly sophisticated magnetic processing methods, including classification based on dipole modeling and remanent magnetization, to be efficiently applied to many projects. This paper also briefly describes the initial development of magnetometer-based detection systems deployed from low-altitude helicopter platforms and the subsequent successful transition of this technology to the marine environment. Additionally, we present examples from a range of terrestrial and marine settings as well as ongoing research efforts related to sensor miniaturization for unmanned aerial vehicle (UAV) magnetic field mapping applications.
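
For readers unfamiliar with the dipole modeling mentioned above, the sketch below evaluates the standard point-dipole field equation and the resulting total-field anomaly along a survey line; the ambient field, dipole moment, and geometry are illustrative values and not taken from the paper.

```python
# Sketch of the point-dipole field model that underlies dipole-based
# classification of magnetic anomalies (SI units; positions in metres,
# moment in A*m^2). All values below are illustrative.
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (T*m/A)

def dipole_field(r_obs, r_dip, m):
    """Magnetic flux density (tesla) at r_obs from a point dipole at r_dip."""
    r = np.asarray(r_obs, float) - np.asarray(r_dip, float)
    d = np.linalg.norm(r)
    r_hat = r / d
    return MU0 / (4 * np.pi * d**3) * (3 * np.dot(m, r_hat) * r_hat - np.asarray(m, float))

# Total-field anomaly seen by a scalar magnetometer along a survey line
earth_field = np.array([0.0, 18000e-9, -45000e-9])   # ambient field (T), illustrative
moment = np.array([0.5, 0.0, 1.0])                    # buried dipole moment (A*m^2)
for x in np.linspace(-5, 5, 11):
    b = dipole_field([x, 0.0, 1.0], [0.0, 0.0, -1.0], moment)
    anomaly = np.linalg.norm(earth_field + b) - np.linalg.norm(earth_field)
    print(f"x = {x:5.1f} m  anomaly = {anomaly * 1e9:8.2f} nT")
```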

Keywords: dipole modeling, magnetometer mapping systems, sub-surface infrastructure mapping, unexploded ordnance detection

Procedia PDF Downloads 456
29545 Data Clustering Algorithm Based on Multi-Objective Periodic Bacterial Foraging Optimization with Two Learning Archives

Authors: Chen Guo, Heng Tang, Ben Niu

Abstract:

Clustering splits objects into groups based on similarity, so that objects within the same group have higher similarity and objects in different groups have lower similarity. Thus, clustering can be treated as an optimization problem to maximize the intra-cluster similarity or inter-cluster dissimilarity. In real-world applications, datasets often have complex characteristics such as sparsity, overlap, and high dimensionality. When facing these datasets, simultaneously optimizing two or more objectives can obtain better clustering results than optimizing a single objective. However, apart from objective-weighting methods, traditional clustering approaches have difficulty solving multi-objective data clustering problems. For this reason, evolutionary multi-objective optimization algorithms have been investigated by researchers to optimize multiple clustering objectives. In this paper, the Data Clustering algorithm based on Multi-objective Periodic Bacterial Foraging Optimization with two Learning Archives (DC-MPBFOLA) is proposed. Specifically, first, to reduce the high computational complexity of the original BFO, periodic BFO is employed as the basic algorithmic framework and is then extended to a multi-objective form. Second, two learning strategies are proposed based on the two learning archives to guide the bacterial swarm to move in a better direction. On the one hand, the global best is selected from the global learning archive according to the convergence index and diversity index. On the other hand, the personal best is selected from the personal learning archive according to the sum of weighted objectives. Based on these learning strategies, a chemotaxis operation is designed. Third, an elite learning strategy is designed to provide fresh power to the objects in the two learning archives. When the objects in these two archives do not change for two consecutive iterations, randomly reinitializing one dimension of the objects prevents the proposed algorithm from falling into local optima. Fourth, to validate the performance of the proposed algorithm, DC-MPBFOLA is compared with four state-of-the-art evolutionary multi-objective optimization algorithms and one classical clustering algorithm on the evaluation indexes of several datasets. To further verify the effectiveness and feasibility of the designed strategies in DC-MPBFOLA, variants of DC-MPBFOLA are also proposed. Experimental results demonstrate that DC-MPBFOLA outperforms its competitors regarding all evaluation indexes and clustering partitions. These results also indicate that the designed strategies positively influence the performance improvement of the original BFO.
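
As a point of reference for the multi-objective formulation described above, the sketch below scores one candidate solution (a set of cluster centres) on two common clustering objectives, compactness to minimise and separation to maximise; the exact objective functions and indices used in DC-MPBFOLA are not specified here, so these are generic stand-ins.

```python
# Sketch of two clustering objectives a candidate solution (a set of cluster
# centres) could be scored on: intra-cluster compactness (minimise) and
# inter-cluster separation (maximise). Generic stand-ins, not the paper's
# exact objective functions.
import numpy as np

def evaluate(data, centres):
    # assign each point to its nearest centre
    dists = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    # objective 1: mean distance of points to their own centre (compactness)
    compactness = dists[np.arange(len(data)), labels].mean()
    # objective 2: minimum distance between any two centres (separation)
    k = len(centres)
    separation = min(np.linalg.norm(centres[i] - centres[j])
                     for i in range(k) for j in range(i + 1, k))
    return compactness, separation   # minimise the first, maximise the second

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 2))
centres = rng.normal(size=(3, 2))    # one candidate in the bacterial swarm
print(evaluate(data, centres))
```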

Keywords: data clustering, multi-objective optimization, bacterial foraging optimization, learning archives

Procedia PDF Downloads 125
29544 Collective Strategies Dominate in Spatial Iterated Prisoner's Dilemma

Authors: Jiawei Li

Abstract:

How cooperation emerges and persists in a population of selfish agents is a fundamental question in evolutionary game theory. Our research shows that Collective Strategies with a Master-Slave Mechanism (CSMSM) defeat Tit-for-Tat and other well-known strategies in the spatial iterated prisoner's dilemma. A CSMSM identifies kin members by means of a handshaking mechanism. If the opponent is identified as non-kin, a CSMSM will always defect. Once two CSMSMs meet, they play master and slave roles. A master defects and a slave cooperates in order to maximize the master's payoff. CSMSM outperforms non-collective strategies in spatial IPD even if there is only a small cluster of CSMSMs in the population. The existence and performance of CSMSM in the spatial iterated prisoner's dilemma suggest that cooperation first appears and persists in a group of collective agents.
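
To make the handshake and master-slave behaviour concrete, the sketch below plays two such agents against each other; the abstract does not give the actual handshake sequence or payoff values, so the three-move handshake and the standard prisoner's dilemma payoffs used here are assumptions.

```python
# Sketch of a collective strategy with a handshake, as described above.
# The 3-move handshake sequence and the PD payoff table are assumptions.
C, D = "C", "D"
HANDSHAKE = [C, D, C]                 # hypothetical identification sequence
PAYOFF = {(C, C): (3, 3), (C, D): (0, 5), (D, C): (5, 0), (D, D): (1, 1)}

class CSMSM:
    def __init__(self, is_master):
        self.is_master = is_master
        self.opponent_is_kin = True   # assume kin until the handshake fails

    def move(self, round_no, opp_history):
        # verify the opponent's previous move against the expected handshake
        if 0 < round_no <= len(HANDSHAKE):
            if opp_history[-1] != HANDSHAKE[round_no - 1]:
                self.opponent_is_kin = False
        if round_no < len(HANDSHAKE):                 # still emitting the handshake
            return HANDSHAKE[round_no]
        if not self.opponent_is_kin:                  # non-kin identified: always defect
            return D
        return D if self.is_master else C             # master exploits, slave feeds it

def play(a, b, rounds=20):
    ha, hb, score_a, score_b = [], [], 0, 0
    for r in range(rounds):
        ma, mb = a.move(r, hb), b.move(r, ha)
        pa, pb = PAYOFF[(ma, mb)]
        ha.append(ma); hb.append(mb); score_a += pa; score_b += pb
    return score_a, score_b

# after the handshake, the master collects the maximum payoff every round
print(play(CSMSM(is_master=True), CSMSM(is_master=False)))
```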

Keywords: evolutionary game theory, spatial prisoner's dilemma, collective strategy, master-slave mechanism

Procedia PDF Downloads 138
29543 A Model Architecture Transformation with Approach by Modeling: From UML to Multidimensional Schemas of Data Warehouses

Authors: Ouzayr Rabhi, Ibtissam Arrassen

Abstract:

To provide a complete analysis of the organization and to help decision-making, leaders need to have relevant data; Data Warehouses (DW) are designed to meet such needs. However, designing a DW is not trivial, and there is no formal method to derive a multidimensional schema from heterogeneous databases. In this article, we present a model-driven approach to the design of data warehouses. We describe a multidimensional meta-model and also specify a set of transformations starting from a Unified Modeling Language (UML) metamodel. In this approach, the UML metamodel and the multidimensional one are both considered platform-independent models (PIM). The first meta-model is mapped into the second one through transformation rules carried out by the Query/View/Transformation (QVT) language. This proposal is validated through the application of our approach to generating a multidimensional schema of a Balanced Scorecard (BSC) DW. We are interested in the BSC perspectives, which are highly linked to the vision and the strategies of an organization.

Keywords: data warehouse, meta-model, model-driven architecture, transformation, UML

Procedia PDF Downloads 145
29542 The Lived Experiences of Paramedical Students Engaged in Virtual Hands-on Learning

Authors: Zyra Cheska Hidalgo, Joehiza Mae Renon, Kzarina Buen, Girlie Mitrado

Abstract:

The global coronavirus disease (COVID-19) pandemic has dramatically impacted many lives, as well as education and the economy. It presents a massive challenge for medical education, as instructors are mandated to deliver their lectures virtually to ensure the continuity of the medical education process and the safety of students. The purpose of this research paper is to determine the lived experiences of paramedical students engaged in virtual hands-on learning and the different coping strategies they used to deal with it. The researchers used the survey method of descriptive research design to determine the lived experiences and coping strategies of twenty (20) paramedical students from Lorma Colleges (particularly the College of Medicine Department). The data were collected through online questionnaires, particularly Google Forms. This study shows that technical issues, difficulty in adapting learning styles, distractions and time-management issues, mental and physical health issues, and lack of interest and motivation are the most common problems and challenges experienced by paramedical students. On the other hand, the coping strategies used by paramedical students to deal with those challenges include time management, engagement in leisure activities, acceptance of responsibilities, studying, and adapting. With the data gathered, the researchers concluded that virtual hands-on learning effectively increases the knowledge of paramedical students. However, barriers to teaching and learning must be considered to implement virtual hands-on learning successfully.

Keywords: virtual hands-on learning, E-learning, paramedical students, medical education

Procedia PDF Downloads 118
29541 On the Analysis of Strategies of Buechi Games

Authors: Ahmad Termimi Ab Ghani, Kojiro Higuchi

Abstract:

In this paper, we present some results on simultaneous infinite games. We mainly work with generalized reachability games and Buechi games. These games are two-player concurrent games where the players choose their moves simultaneously at each step. Our goal is to give simple expressions for the values of each game. Moreover, we are interested in the question of what type of optimal (ε-optimal) strategy exists for both players depending on the type of game. We first show the determinacy (optimal value) and optimal (ε-optimal) strategies in generalized reachability games. We provide a simple expression for the value of this game and prove the existence of a memoryless randomized ε-optimal strategy for Player I in any generalized reachability game. We then consider games with more complex objectives, namely games with Buechi objectives. We present how to compute ε-optimal strategies and approximate the value of the game. Specifically, the results on generalized reachability games are used to show that the value of Buechi games can be approximated by the values of certain generalized reachability games.

Keywords: optimal strategies, generalized reachability games, Buechi games

Procedia PDF Downloads 583
29540 Vibroacoustic Modulation with Chirp Signal

Authors: Dong Liu

Abstract:

By sending a high-frequency probe wave and a low-frequency pump wave to a specimen, the vibroacoustic modulation (VAM) method evaluates the severity of a defect according to the modulation index of the received signal. Many studies have experimentally demonstrated the significant sensitivity of the modulation index to tiny contact-type defects. However, it has also been found that the modulation index is highly affected by the frequencies of the probe and pump waves. Therefore, the chirp signal has been introduced to the VAM method, since it can assess multiple frequencies within a relatively short time and thereby enhance the robustness of the method. Consequently, the signal processing method needs to be modified accordingly. Various studies have utilized different algorithms, or combinations of algorithms, for processing the VAM signal under chirp excitation. These signal processing methods were compared and used to process VAM signals acquired from steel samples.
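
For orientation, the sketch below estimates a modulation index from a received signal as the ratio of the sideband amplitudes at f_probe ± f_pump to the carrier amplitude at f_probe, on a synthetic fixed-frequency example; with a chirped probe this computation would be repeated over many probe frequencies. The frequencies and the synthetic signal are illustrative, not the authors' data or exact processing chain.

```python
# Sketch of a modulation-index estimate from a received VAM signal: sideband
# amplitudes at f_probe +/- f_pump compared with the carrier at f_probe.
# Synthetic, illustrative signal only.
import numpy as np

fs, T = 1_000_000, 0.1                     # sample rate (Hz), duration (s)
f_probe, f_pump = 200_000.0, 1_000.0       # probe and pump frequencies (Hz)
t = np.arange(0, T, 1 / fs)

# synthetic received signal: weakly amplitude-modulated probe plus the pump
x = (1 + 0.02 * np.sin(2 * np.pi * f_pump * t)) * np.sin(2 * np.pi * f_probe * t) \
    + 0.5 * np.sin(2 * np.pi * f_pump * t)

spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
freqs = np.fft.rfftfreq(len(x), 1 / fs)

def amplitude(f):
    return spec[np.argmin(np.abs(freqs - f))]   # nearest FFT bin amplitude

carrier = amplitude(f_probe)
sidebands = amplitude(f_probe - f_pump) + amplitude(f_probe + f_pump)
print("modulation index ~", sidebands / carrier)
```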

Keywords: vibroacoustic modulation, nonlinear acoustic modulation, nonlinear acoustic NDT&E, signal processing, structural health monitoring

Procedia PDF Downloads 88
29539 Genodata: The Human Genome Variation Using BigData

Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta

Abstract:

Since the accomplishment of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. This project has been a major leap in the field of medical research, especially in genomics. The project won accolades by using a concept called Bigdata, which was earlier used extensively to gain value in business. Bigdata makes use of data sets that are generally in the form of files of terabyte, petabyte, or exabyte size; such data sets were traditionally used and managed using spreadsheets and RDBMS. The voluminous data made the process tedious and time-consuming, and hence a stronger framework called Hadoop was introduced in the field of genetic sciences to make data processing faster and more efficient. This paper focuses on using SPARK, which is gaining momentum with the advancement of BigData technologies. Cloud storage is an effective medium for storing the large data sets generated from genetic research and the resultant sets produced by SPARK analysis.
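
As a minimal sketch of this kind of pipeline, the code below loads a tab-separated VCF-like variant file with Spark from cloud storage and aggregates variant counts per chromosome; the bucket path and the assumption that the first two columns are CHROM and POS are illustrative, not taken from the paper.

```python
# Minimal Spark sketch: read a VCF-like file from cloud storage and count
# variants per chromosome. The path and column layout are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("genodata-demo").getOrCreate()

raw = (spark.read
       .option("sep", "\t")
       .option("comment", "#")                       # skip VCF header lines
       .csv("gs://example-bucket/sample.vcf"))       # hypothetical path

# first two VCF body columns are CHROM and POS
variants = raw.select(F.col("_c0").alias("CHROM"), F.col("_c1").alias("POS"))
counts = variants.groupBy("CHROM").count()           # variants per chromosome
counts.orderBy("CHROM").show()

spark.stop()
```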

Keywords: human genome project, Bigdata, genomic data, SPARK, cloud storage, Hadoop

Procedia PDF Downloads 243
29538 Increasing Added-Value of Salak Fruit by Freezing Frying to Improve the Welfare of Farmers: Case Study of Sleman Regency, Yogyakarta-Indonesia

Authors: Sucihatiningsih Dian Wisika Prajanti, Himawan Arif Susanto

Abstract:

Fruits are perishable products and have relatively low prices, especially at harvest time. Generally, farmers only sell the products shortly after harvest without any processing. Farmers also act only as price takers, leaving them little power to set prices. Sometimes, farmers are manipulated by middlemen, especially during abundant harvests. Therefore, effort is required to process fruits and create innovations that make them more durable and give them higher economic value. The purpose of this research is to examine how to increase the added value of fruits that have high economic value. The research involved 60 Salak fruit farmers as the sample, and descriptive analysis was used to analyze the data. The results showed that the selling price of Salak fruit is very low. Hence, to increase the added value of the fruits, fruit processing is carried out by freezing frying, which makes the fruits last longer. In addition to increasing the added value, the products can be held for further processing without the farmers worrying about their crops rotting or going unsold.

Keywords: fruits processing, Salak fruit, freezing frying, farmer’s welfare, Sleman, Yogyakarta

Procedia PDF Downloads 336
29537 Relationships between Actors within Business Ecosystems That Adopt Circular Strategies: A Systematic Literature Review

Authors: Sophia Barquete, Adriana H. Trevisan, Janaina Mascarenhas

Abstract:

The circular economy (CE) aims at the cycling of resources through restorative and regenerative strategies. To achieve circularity, coordination of several actors who have different responsibilities is necessary. The interaction among multiple actors allows the connection between the CE and business ecosystem research fields. Although fundamental, the relationships between actors within an ecosystem to foster circularity are not deeply explored in the literature. The objective of this study was to identify the possibilities of cooperation, competition, or even coopetition among the members of business ecosystems that adopt circular strategies. In particular, the motivations that make these actors interact to achieve a circular economy were investigated. A systematic literature review was adopted to select business ecosystem cases that adopt circular strategies. As a result, several motivations were identified for actors to engage in relationships within ecosystems, such as sharing knowledge and infrastructure, developing products with a circular design, promoting reverse logistics, among others. The results suggest that partnerships between actors are, in fact, important for the implementation of circular strategies. In order to achieve a complete and circular solution, actors must be able to clearly understand their roles and relationships within the network so that they can establish new partnerships or reframe those already established.

Keywords: business ecosystem, circular economy, cooperation, coopetition, competition

Procedia PDF Downloads 211
29536 Transnationalization Strategies of Danish Cinema: Susanne Bier, Lone Scherfig

Authors: Ebru Thwaites Diken

Abstract:

This article analyzes the works of two directors in Danish cinema, Susanne Bier and Lone Scherfig, in the context of the transnationalisation of Danish cinema. It looks at how the films' narratives negotiate and reconstruct the local / national / regional and the global. Scholars such as Nestingen & Elkington (2005), Hjort (2010), Higbee and Lim (2010), and Bondebjerg and Redvall (2011) address the transnationalism of Danish cinema in terms of production and distribution processes and how film making transcends national boundaries. This paper employs a particular understanding of transnationalism - in terms of how ideas and characters travel - to analyze how storytelling and style have evolved to connect the national, the regional and the global on the basis of the works of these two directors. Strategies such as Hollywoodization - i.e., a focus on stardom and classical narration - adherence to conventional European genre formulas, and the production of Danish films in the English language have been identifiable strategies in Danish cinema in the period after the 2000s. Susanne Bier and Lone Scherfig are significant for employing some of these strategies simultaneously. For this reason, this article looks at how these two directors have employed these strategies and negotiated cultural boundaries and exchanges.

Keywords: danish cinema, transnational cinema, susanne bier, lone scherfig, national cinema

Procedia PDF Downloads 59
29535 A Visual Inspection System for Automotive Sheet Metal Chassis Parts Produced with Cold-Forming Method

Authors: İmren Öztürk Yılmaz, Abdullah Yasin Bilici, Yasin Atalay Candemir

Abstract:

The system consists of four main elements: a motion system, an image acquisition system, image processing software, and a control interface. The parts coming off the production line enter the image processing system via the conveyor belt at the end of the line. The 3D scanning of the produced part is performed with the laser scanning system integrated at the system's entry side. With the 3D scanning method, the position and angle at which the parts enter the system are determined; according to the data obtained, parameters such as the part origin and conveyor speed are calculated with the designed software, and the robot is informed of the position where it will pick up the part. The robot then takes the produced part from the belt conveyor and presents it to high-resolution cameras for quality control. Measurement processes are carried out with a maximum error of 20 microns, as determined by the experiments.

Keywords: quality control, industry 4.0, image processing, automated fault detection, digital visual inspection

Procedia PDF Downloads 99
29534 Higher Education Leadership and Creating Sites of Institutional Belonging: A Global Case Study

Authors: Lisa M. Coleman

Abstract:

Disability, LGBTQ+, and internationalization have certainly been the subject of much research and programmatic work across higher education. Many universities have entered into global partnerships with varying success and challenges across the various areas, including laws and policies. Attentiveness to the specific nuances of global inclusion, diversity, equity, belonging, and access (GIDBEA), and the leadership to support these efforts, is crucial to the development of longstanding success across the programs. There have been a number of shifts related to diversification across student and alumni bodies. These shifts include, but are not limited to, how people identify gender, race, and sexuality (and the intersections across such identities), as well as trends across emerging and diverse disability communities. NYU is the most international campus in the United States, with more campuses and sites outside of its country of origin and more international students and exchange programs than any other university. As a result, the ongoing work related to GIDBEA is at the center of much of the leadership, administrative, and research efforts. Climate assessment work across NYU's diverse global campus landscape serves as the foundation to exemplify best practices related to data collection and dissemination, community and stakeholder engagement, and effective implementation of innovative strategies to close identified gap areas. The data (quantitative and qualitative) and related research findings represent data collected from close to 22,000 stakeholders across the NYU campuses. The case study centers on specific methodological considerations, data integrity, stakeholder engagement from across student, faculty, staff, and alumni constituencies, and tactics to advance specific GIDBEA initiatives related to navigating shifting landscapes. Design thinking, incubation, and co-creation strategies have been employed to expand, leverage, actualize, and implement GIDBEA strategies that are concrete, measurable, differentiated, and specific to global sites, regions, and emerging trends.

Keywords: disability, LGBTQ+, DEI, research, case studies

Procedia PDF Downloads 95
29533 Testing the Impact of Formal Interpreting Training on Working Memory Capacity: Evidence from Turkish-English Student-Interpreters

Authors: Elena Antonova Unlu, Cigdem Sagin Simsek

Abstract:

This research presents two studies examining the impact of formal interpreting training (FIT) on the Working Memory Capacity (WMC) of student-interpreters. In Study 1, the storage and processing capacities of the working memory (WM) of last-year student-interpreters were compared with those of last-year Foreign Language Education (FLE) students. In Study 2, the impact of FIT on the WMC of student-interpreters was examined by comparing their results on WM tasks at the beginning and the end of their FIT. In both studies, the Digit Span Task (DST) and the Reading Span Task (RST) were utilized to test the storage and processing capacities of WM. The results of Study 1 revealed that the last-year student-interpreters outperformed the control groups on the RST but not on the DST. The findings of Study 2 were consistent with Study 1, showing that after FIT, the student-interpreters performed better on the RST but not on the DST. Our findings can be considered evidence supporting the view that FIT has a beneficial effect not only on the interpreting skills of student-interpreters but also on the central executive and processing capacity of their WM.

Keywords: working memory capacity, formal interpreting training, student-interpreters, cross-sectional and longitudinal data

Procedia PDF Downloads 198
29532 Dynamic Risk Identification Using Fuzzy Failure Mode Effect Analysis in Fabric Process Industries: A Research Article as Management Perspective

Authors: A. Sivakumar, S. S. Darun Prakash, P. Navaneethakrishnan

Abstract:

In and around Erode District, it is estimated that more than 1250 chemical and allied textile fabric processing industries are affected, partially closed, or shut down for various reasons such as poor management, poor supplier performance, lack of planning for productivity, fluctuation of output, poor investment, waste analysis, labor problems, capital/labor ratio, accumulation of stocks, poor maintenance of resources, deficiencies in fabric quality, low capacity utilization, age of plant and equipment, high investment and input but low throughput, poor research and development, lack of energy, workers' fear of loss of jobs, workforce mix, and work ethic. The main objective of this work is to analyze the existing conditions in the textile fabric sector, validate the break-even of Total Productivity (TP), and analyze, design, and implement fuzzy sets and mathematical programming for the improvement of productivity and quality dimensions in the fabric processing industry. The approach needs to be compatible with the reality of textile and fabric processing industries. The high-risk events in the productivity and quality dimensions were identified using fuzzy systems, and the results are summarized for the textile fabric processing industry.

Keywords: break even point, fuzzy crisp data, fuzzy sets, productivity, productivity cycle, total productive maintenance

Procedia PDF Downloads 322
29531 Modeling Atmospheric Correction for Global Navigation Satellite System Signals to Improve Urban Cadastre 3D Positional Accuracy: Case of TANA and ADIS IGS Stations

Authors: Asmamaw Yehun

Abstract:

"TANA" is the name of an International GNSS Service (IGS) station located at the Institute of Land Administration of Bahir Dar University. The station name is taken from one of the big lakes in Africa, Lake Tana. The Institute of Land Administration (ILA) is part of Bahir Dar University, located in the capital of the Amhara National Regional State, Bahir Dar. The institute is the first of its kind in East Africa. The station was installed through the cooperation of ILA and the Swedish International Development Agency (SIDA), with fund support from the latter. A Continuously Operating Reference Station (CORS) network provides global navigation satellite system data to support three-dimensional positioning, meteorology, space weather, and geophysical applications throughout the globe. The TANA station has operated as a CORS since 2013; such sites are independently owned and operated by governments, research and education facilities, and others. The data collected by the reference station are downloadable through the Internet for post-processing purposes by interested parties who carry out GNSS measurements and want to achieve higher accuracy. We made a first observation on the TANA monitoring station on May 29th, 2013. We used Leica 1200 receivers and AX1202GG antennas and made observations from 11:30 until 15:20, for about 3 h 50 min. Data processing was done in the automatic post-processing service CSRS-PPP by Natural Resources Canada (NRCan). Post-processing was done on June 27th, 2013, so the precise ephemeris was used 30 days after the observation. We found Latitude (ITRF08): 11 34 08.6573 (dms) / 0.008 (m), Longitude (ITRF08): 37 19 44.7811 (dms) / 0.018 (m), and Ellipsoidal Height (ITRF08): 1850.958 (m) / 0.037 (m). We compared this result with GAMIT/GLOBK-processed data, and it was very close and accurate. TANA has been the second IGS station in Ethiopia since 2015. It provides data for civilian users, researchers, and governmental and non-governmental users. The TANA station is equipped with an advanced choke ring antenna and a Leica GR25 receiver, and the site offers very good satellite visibility. To test the hydrostatic and wet zenith delays for positional data quality, we used GAMIT/GLOBK and found that TANA is the most accurate IGS station in East Africa. Due to lower tropospheric zenith and ionospheric delays, the TANA and ADIS IGS stations have 3D positional accuracies of 2 and 1.9 meters, respectively.

Keywords: atmosphere, GNSS, neutral atmosphere, precipitable water vapour

Procedia PDF Downloads 57
29530 Motion Estimator Architecture with Optimized Number of Processing Elements for High Efficiency Video Coding

Authors: Seongsoo Lee

Abstract:

Motion estimation occupies the heaviest computation in HEVC (high efficiency video coding). Many fast algorithms, such as TZS (test zone search), have been proposed to reduce this computation. Still, the huge computation of motion estimation is a critical issue in the implementation of an HEVC video codec. In this paper, a motion estimator architecture with an optimized number of PEs (processing elements) is presented, exploiting early termination. It also reduces hardware size by exploiting parallel processing. The presented motion estimator architecture has 8 PEs, and it can efficiently perform TZS with very high utilization of the PEs.
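
To illustrate the early-termination idea that the PE-count optimization exploits, the software sketch below aborts the SAD accumulation for a candidate motion vector as soon as it exceeds the best cost found so far; it uses a plain full search over a small window rather than TZS proper, and does not model the hardware architecture itself.

```python
# Software sketch of early termination in block matching: SAD accumulation is
# abandoned once it exceeds the best SAD so far. Full search, not TZS.
import numpy as np

def sad_early_terminate(cur, ref, best_so_far):
    """Row-wise SAD with early abort; cur and ref are equal-sized blocks."""
    total = 0
    for row_c, row_r in zip(cur, ref):
        total += int(np.abs(row_c.astype(int) - row_r.astype(int)).sum())
        if total >= best_so_far:          # early termination: no need to finish
            return None
    return total

def search(cur_block, ref_frame, x, y, radius=4):
    best, best_mv = float("inf"), (0, 0)
    h, w = cur_block.shape
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ref_block = ref_frame[y + dy:y + dy + h, x + dx:x + dx + w]
            if ref_block.shape != cur_block.shape:
                continue                   # candidate falls outside the frame
            cost = sad_early_terminate(cur_block, ref_block, best)
            if cost is not None and cost < best:
                best, best_mv = cost, (dx, dy)
    return best_mv, best

rng = np.random.default_rng(1)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
cur = np.roll(ref, shift=(2, -1), axis=(0, 1))       # frame shifted by (2, -1)
print(search(cur[16:24, 16:24], ref, 16, 16))        # recovers mv (1, -2), SAD 0
```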

Keywords: motion estimation, test zone search, high efficiency video coding, processing element, optimization

Procedia PDF Downloads 349
29529 Analysing Techniques for Fusing Multimodal Data in Predictive Scenarios Using Convolutional Neural Networks

Authors: Philipp Ruf, Massiwa Chabbi, Christoph Reich, Djaffar Ould-Abdeslam

Abstract:

In recent years, convolutional neural networks (CNNs) have demonstrated high performance in image analysis, but oftentimes only structured data is available for a specific problem. By interpreting structured data as images, CNNs can effectively learn and extract valuable insights from tabular data, leading to improved predictive accuracy and uncovering hidden patterns that may not be apparent in traditional structured data analysis. By applying a single neural network to analyze multimodal data, e.g., both structured and unstructured information, significant advantages in terms of time complexity and energy efficiency can be achieved. Converting structured data into images and merging them with existing visual material offers a promising solution for applying CNNs to multimodal datasets, such as those that often occur in a medical context. By employing suitable preprocessing techniques, structured data is transformed into image representations, where the respective features are expressed as different formations of colors and shapes. In an additional step, these representations are fused with existing images to incorporate both types of information. The resulting image is then analyzed using a CNN.
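
As a minimal sketch of the tabular-to-image step described above, the code below min-max scales each feature and reshapes the padded feature vector onto a square grayscale grid in the (N, channels, H, W) layout a CNN expects; the grid size and scaling choice are illustrative, and more elaborate layouts (e.g., placing correlated features next to each other) are possible.

```python
# Sketch of turning tabular rows into small grayscale "images" for a CNN:
# per-feature min-max scaling, zero padding to a square, then reshaping.
import numpy as np

def tabular_to_images(X):
    X = np.asarray(X, dtype=float)
    # per-feature min-max scaling to [0, 1]
    mins, maxs = X.min(axis=0), X.max(axis=0)
    scaled = (X - mins) / np.where(maxs > mins, maxs - mins, 1.0)
    # pad the feature vector to the next square and reshape to side x side
    side = int(np.ceil(np.sqrt(X.shape[1])))
    padded = np.zeros((X.shape[0], side * side))
    padded[:, :X.shape[1]] = scaled
    return padded.reshape(-1, 1, side, side)     # (N, channels, H, W) for a CNN

rows = np.random.default_rng(0).normal(size=(8, 10))   # 8 samples, 10 features
images = tabular_to_images(rows)
print(images.shape)        # (8, 1, 4, 4)
```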

Keywords: CNN, image processing, tabular data, mixed dataset, data transformation, multimodal fusion

Procedia PDF Downloads 105
29528 Improving Grade Control Turnaround Times with In-Pit Hyperspectral Assaying

Authors: Gary Pattemore, Michael Edgar, Andrew Job, Marina Auad, Kathryn Job

Abstract:

As critical commodities become more scarce, significant time and resources have been used to better understand complicated ore bodies and extract their full potential. These challenging ore bodies present several pain points for geologists and engineers to overcome; poor handling of these issues flows downstream to the processing plant, affecting throughput rates and recovery. Many open cut mines utilise blast hole drilling to extract additional information to feed back into the modelling process. This method requires samples to be collected during or after blast hole drilling. Samples are then sent for assay, with turnaround times varying from 1 to 12 days. This method is time-consuming, costly, requires human exposure on the bench, and collects elemental data only. To address this challenge, research has been undertaken to utilise hyperspectral imaging across a broad spectrum to scan samples and collars or take downhole measurements for mineral and moisture content and grade abundances. Automation of this process using unmanned vehicles and on-board processing reduces human in-pit exposure to ensure ongoing safety. On-board processing allows data to be integrated into modelling workflows with immediacy. The preliminary results demonstrate numerous direct and indirect benefits from this new technology, including rapid and accurate estimates of grade, moisture content, and mineralogy. These benefits allow for faster geological model updates, better-informed mine scheduling, and improved downstream blending and processing practices. The paper presents recommendations for implementation of the technology in open cut mining environments.

Keywords: grade control, hyperspectral scanning, artificial intelligence, autonomous mining, machine learning

Procedia PDF Downloads 98
29527 Effectiveness of Metacognitive Skills in Comprehension Instruction for Elementary Students

Authors: Mahdi Taheri Asl

Abstract:

Using a variety of strategies to read text plays an important role in making students independent, strategic, and metacognitive readers. Given the importance of comprehension instruction (CI), it is essential to support the fostering of comprehension skills in elementary-age students, particularly those who struggle with or dislike reading. One of the main components of CI is activating metacognitive skills, which serve a dual function for elementary students. Thus, it is important to evaluate implemented comprehension interventions to inform reading specialists and teachers. There has been limited review research in the area of CI, so conducting review research is required. The purpose of this review is to examine the effectiveness of metacognitive reading strategies in a regular classroom environment with elementary-aged students. We developed five inclusion criteria to identify research relevant to our question. First, the article had to be published in a peer-reviewed journal from 2000 to 2023. Second, the study had to include participants in elementary school and could include special education students. Third, the intervention needed to involve metacognitive strategies. Fourth, the articles had to use an experimental or quasi-experimental design. Fifth, the study needed to include a measurement of reading performance before and after the intervention. We used computer databases such as ERIC, PsycINFO, and Google Scholar to search for articles that met these criteria, with the following search terms: comprehension instruction, metacognitive strategies, and elementary school. The next step was an ancestral search, reviewing the relevant studies cited in the articles found in the database search. We identified 30 studies in the initial searches. After coding agreement, we synthesized 13 with respect to the participants, setting, research design, dependent variables, measures, the intervention used by instructors, and general outcomes. The findings show that metacognitive strategies were effective in strengthening students' comprehension skills. They also show that linguistic instruction is more effective when combined with metacognitive strategies. The research provides a useful view into reading intervention. Despite the positive effect of metacognitive instruction on students' comprehension skills, it is not widely used in classrooms.

Keywords: comprehension instruction, metacognition, metacognitive skills, reading intervention

Procedia PDF Downloads 65
29526 The Need for Automation in the Domestic Food Processing Sector and Its Impact

Authors: Shantam Gupta

Abstract:

The objective of this study is to address the critical need for automation in the domestic food processing sector and to study its impact. Food is one of the most basic physiological needs essential for the survival of a living being. Some organisms have the capacity to prepare their own food (like most plants) and hence are designated as primary food producers; those who depend on these primary food producers for food form the primary consumers' class (herbivores). Some of the organisms relying on the primary consumers are the secondary food consumers (carnivores). There is a third class of consumers, called tertiary or apex food consumers, that feed on both the primary and secondary food consumers. Humans form an essential part of the apex predators and are generally at the top of the food chain. However, a closer examination of the food habits of the modern human, i.e., Homo sapiens, reveals that humans depend on other individuals for preparing their food. The old notion of eating raw food is long gone, and food processing has become deeply entrenched in the lives of modern humans. This has led to an increase in dependence on other individuals for 'processing' the food before it can actually be consumed, and thus to a further shift of humans within the consumer classification of the food chain. The effects of this shift are systematically investigated in this paper. The processing of food has a direct impact on the economy of the individual (consumer). Also, most individuals depend on other processing individuals for the preparation of food. This dependency establishes a vital link in the food web which, when altered, can adversely affect the food web and have dire consequences for the health of the individual. This study investigates the challenges arising from this dependency and the impact of food processing on the economy of the individual. A comparison of industrial food processing and processing at domestic platforms (households and restaurants) has been made to provide an idea of the present state of automation in the food processing sector. A lot of time and energy is also consumed while processing food at home for consumption, and the high frequency of meals (greater than 2 times a day) makes it even more laborious. Through this study, a pressing need for the development of an automatic cooking machine is proposed, with a mission to reduce the interdependency and human effort required for the preparation of food (by automating the food preparation process) and to make individuals more self-reliant. The impact of the development of this product is also discussed in depth. Assumption used: the individuals who process food also consume the food that they produce (they are also termed 'independent' or 'self-reliant' modern human beings).

Keywords: automation, food processing, impact on economy, processing individual

Procedia PDF Downloads 462
29525 An Automated Approach to Consolidate Galileo System Availability

Authors: Marie Bieber, Fabrice Cosson, Olivier Schmitt

Abstract:

Europe's Global Navigation Satellite System, Galileo, provides worldwide positioning and navigation services. The satellites in space are only one part of the Galileo system. An extensive ground infrastructure is essential to oversee the satellites and ensure accurate navigation signals. High reliability and availability of the entire Galileo system are crucial to continuously provide positioning information of high quality to users. Outages are tracked, and operational availability is regularly assessed. A highly flexible and adaptive tool has been developed to automate the Galileo system availability analysis. Not only does it enable a quick availability consolidation, but it also provides first steps towards improving the data quality of the maintenance tickets used for the analysis. This includes data import and data preparation, with a focus on processing strings used for classification and identifying faulty data. Furthermore, the tool can handle a small amount of data, which is a major constraint when the aim is to provide accurate statistics.
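
To make the consolidation step concrete, the sketch below normalises a free-text classification string, drops an obviously faulty ticket, and computes availability as the fraction of the reporting period not covered by service-affecting outages; the ticket fields, category names, and dates are illustrative assumptions, not the actual Galileo maintenance data.

```python
# Sketch of availability consolidation from maintenance tickets: string
# cleaning for classification, faulty-data detection, and an availability
# figure for the reporting period. All ticket contents are illustrative.
from datetime import datetime

PERIOD_START = datetime(2023, 1, 1)
PERIOD_END = datetime(2023, 2, 1)

tickets = [
    {"start": "2023-01-03 04:00", "end": "2023-01-03 06:30", "class": " Service outage "},
    {"start": "2023-01-10 12:00", "end": "2023-01-10 11:00", "class": "service outage"},  # faulty: ends before start
    {"start": "2023-01-20 22:00", "end": "2023-01-21 01:00", "class": "maintenance (no impact)"},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M")

downtime = 0.0
for t in tickets:
    category = t["class"].strip().lower()            # string cleaning for classification
    start, end = parse(t["start"]), parse(t["end"])
    if end <= start:                                  # identify faulty data
        continue
    if "outage" not in category:                      # keep only service-affecting events
        continue
    overlap = (min(end, PERIOD_END) - max(start, PERIOD_START)).total_seconds()
    downtime += max(overlap, 0.0)                     # count only time inside the period

period = (PERIOD_END - PERIOD_START).total_seconds()
print(f"operational availability: {100 * (1 - downtime / period):.3f} %")
```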

Keywords: availability, data quality, system performance, Galileo, aerospace

Procedia PDF Downloads 147
29524 An Enhanced MEIT Approach for Itemset Mining Using Levelwise Pruning

Authors: Tanvi P. Patel, Warish D. Patel

Abstract:

Association rule mining forms the core of data mining and is termed one of its well-known methodologies. The objective of mining is to find interesting correlations, frequent patterns, associations, or causal structures among sets of items in transaction databases or other data repositories. Hence, association rule mining is used to mine patterns and then generate rules from the obtained patterns. For efficient targeted query processing, frequent pattern discovery, and itemset mining, an efficient itemset tree structure named the Memory Efficient Itemset Tree (MEIT) has been proposed. The memory efficient IT is efficient for storing itemsets but takes more time compared to the traditional IT. The proposed strategy generates maximal frequent itemsets from the memory efficient itemset tree by using levelwise pruning. First, pre-pruning of items based on the minimum support count is carried out, followed by itemset tree reconstruction. By keeping only maximal frequent itemsets, fewer patterns are generated and the tree size is also reduced compared to MEIT. Therefore, the enhanced approach to the memory efficient IT proposed here helps to optimize main memory overhead as well as reduce processing time.
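
The sketch below illustrates the level-wise idea on a toy dataset: items below the minimum support count are pruned first, and frequent itemsets are then grown one level at a time, Apriori-style; the actual MEIT tree construction and maximal-itemset extraction are not reproduced here.

```python
# Sketch of level-wise mining with pre-pruning: infrequent items are removed
# first, then frequent itemsets are grown one size at a time (Apriori-style).
from itertools import combinations
from collections import Counter

transactions = [{"a", "b", "c"}, {"a", "c"}, {"a", "d"}, {"b", "c", "e"}, {"a", "b", "c"}]
MIN_SUPPORT = 2

# pre-pruning: drop items that cannot be part of any frequent itemset
item_counts = Counter(i for t in transactions for i in t)
frequent_items = {i for i, c in item_counts.items() if c >= MIN_SUPPORT}
pruned = [t & frequent_items for t in transactions]

def support(itemset):
    return sum(1 for t in pruned if itemset <= t)

# level-wise growth of frequent itemsets (naive candidate generation)
level = [frozenset([i]) for i in frequent_items]
frequent = []
while level:
    level = [s for s in level if support(s) >= MIN_SUPPORT]
    frequent.extend(level)
    items = sorted({i for s in level for i in s})
    next_size = len(next(iter(level))) + 1 if level else 0
    level = [frozenset(c) for c in combinations(items, next_size)] if level else []

print(sorted(frequent, key=lambda s: (len(s), sorted(s))))
```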

Keywords: association rule mining, itemset mining, itemset tree, MEIT, maximal frequent pattern

Procedia PDF Downloads 357
29523 Development of a Tesla Music Coil from Signal Processing

Authors: Samaniego Campoverde José Enrique, Rosero Muñoz Jorge Enrique, Luzcando Narea Lorena Elizabeth

Abstract:

This paper presents a practical and theoretical model for the operation of the Tesla coil using digital signal processing. The research is based on the analysis of ten scientific papers exploring the development and operation of the Tesla coil. Starting from the basic Tesla coil, several modifications were carried out with the aim of amplifying the digital signal by making use of digital signal processing. To achieve this, a transistor amplifier and digital filters provided by MATLAB software were used, chosen according to the characteristics of the signals in question.
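
As a rough illustration of the digital filtering step, the sketch below band-pass filters an audio signal before it would be used to modulate the coil driver; it uses SciPy in place of the MATLAB filters mentioned by the authors, and the cutoff frequencies, sample rate, and toy signal are illustrative assumptions.

```python
# Sketch of a digital filtering step for a musical Tesla coil driver, using
# SciPy instead of the MATLAB filters mentioned in the abstract. Illustrative
# cutoffs and signal only.
import numpy as np
from scipy.signal import butter, lfilter

fs = 44_100                                   # audio sample rate (Hz)
t = np.arange(0, 0.5, 1 / fs)
audio = np.sin(2 * np.pi * 440 * t) + 0.5 * np.sin(2 * np.pi * 8000 * t)  # toy signal

# 4th-order Butterworth band-pass keeping roughly the vocal/instrument range
b, a = butter(4, [300, 3000], btype="band", fs=fs)
filtered = lfilter(b, a, audio)

print("input RMS :", np.sqrt(np.mean(audio**2)))
print("output RMS:", np.sqrt(np.mean(filtered**2)))
```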

Keywords: tesla coil, digital signal processing, equalizer, graphical environment

Procedia PDF Downloads 101
29522 Impact of Climatic Hazards on the Jamuna River Fisheries and Coping and Adaptation Strategies

Authors: Farah Islam, Md. Monirul Islam, Mosammat Salma Akter, Goutam Kumar Kundu

Abstract:

The continuous variability of climate and the risk associated with it have a significant impact on fisheries, leading to global concern for about half a billion fishery-based livelihoods. Though, in the context of Bangladesh, mounting evidence on the impacts of climate change on fishery-based livelihoods and their socioeconomic conditions is present, the country's inland fisheries sector remains neglected compared to the coastal areas, which receive most of the attention due to their higher vulnerability to climatic hazards. The available research on inland fisheries, particularly river fisheries, has focussed mainly on fish production, pollution, fishing gear, fish biodiversity and the livelihoods of the fishers. This study assesses the impacts of climate variability and change on the fishing communities of the Jamuna River (a transboundary river called the Brahmaputra in India) and their coping and adaptation strategies. This study has used primary data collected from the Kalitola Ghat and Debdanga fishing communities of the Jamuna River during May, August and December 2015 using semi-structured interviews, oral history interviews, key informant interviews, focus group discussions and an impact matrix, as well as secondary data. This study has found that both communities are exposed to storms, floods and land erosion, which impact fishery-based livelihood assets, strategies, and outcomes. The impact matrix shows that human and physical capitals are more affected by climatic hazards, which in turn affects financial capital. Both communities have been responding to these exposures through multiple coping and adaptation strategies. The coping strategies include making dams with soil, putting jute sacks in the yard, taking shelter on boats or embankments, making raised platforms or 'Kheua', and taking up temporary jobs. Adaptation strategies include permanent migration, changing livelihood activities and strategies, changing fishing practices and building more robust houses. The study shows that migration is the most common adaptation strategy for the fishers, which has resulted in mostly positive outcomes for the migrants. However, this migration has negatively impacted the livelihoods of the fishers remaining in the communities. In sum, the Jamuna River fishing communities have been impacted by several climatic hazards, and they have traditionally coped with or adapted to the impacts, but these responses are not sufficient to maintain sustainable livelihoods and fisheries. In the coming decades, this situation may become worse, as predicted by the latest scientific research, and an enhanced level of response will be needed.

Keywords: climatic hazards, impacts and adaptation, fisherfolk, the Jamuna River

Procedia PDF Downloads 301