Search results for: analog signal processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5170

4270 The Structural Pattern: An Event-Related Potential Study on Tang Poetry

Authors: ShuHui Yang, ChingChing Lu

Abstract:

Measuring event-related potentials (ERPs) has been fundamental to our understanding of how people process language. One specific ERP component, the P600, has been hypothesized to be associated with syntactic reanalysis processes. We, however, propose that the P600 is not restricted to reanalysis processes but is an index of structural pattern processing. To investigate structural pattern processing, we utilized the effects of stimulus degradation in structural priming. Specifically, we predicted no P600 effect if the structure of the prime was the same as that of the target, and a P600 effect if the structures of the prime and the target differed. In the experiment, twenty-two participants were presented with sets of four sentences of Tang poetry. The first two sentences, serving as primes, were constructed as SVO+VP. The last two sentences, serving as targets, were divided into three types: type one was SVO+VP, type two was SVO+VPVP, and type three was VP+VP. The results showed that both the SVO+VPVP and VP+VP targets elicited a positive-going brainwave, a P600 effect, in the 600-900 ms time window. Furthermore, the P600 component was larger for the VP+VP targets than for the SVO+VPVP targets; that is, the more dissimilar the structure, the larger the P600 effect. These results indicate that the P600 indexes structural pattern processing and that its amplitude varies with the degree of structural heterogeneity.
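
For illustration only, a minimal numpy sketch of how the mean amplitude in the 600-900 ms window could be compared across target conditions (synthetic epochs and an assumed sampling rate; not the authors' recording or analysis pipeline):

```python
import numpy as np

# Synthetic epoched EEG: trials x time samples, assumed 500 Hz sampling rate,
# epoch from -200 ms to 1000 ms relative to target onset (illustrative only).
fs = 500
times = np.arange(-0.2, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
epochs_svo_vpvp = rng.normal(0.0, 5.0, (40, times.size))   # target type SVO+VPVP
epochs_vp_vp = rng.normal(1.0, 5.0, (40, times.size))      # target type VP+VP

def p600_mean_amplitude(epochs, times, t_min=0.6, t_max=0.9):
    """Mean amplitude in the 600-900 ms window, averaged over trials."""
    window = (times >= t_min) & (times <= t_max)
    return epochs[:, window].mean()

print("SVO+VPVP:", p600_mean_amplitude(epochs_svo_vpvp, times))
print("VP+VP:   ", p600_mean_amplitude(epochs_vp_vp, times))
```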

Keywords: ERPs, P600, structural pattern, structural priming, Tang poetry

Procedia PDF Downloads 140
4269 Effect of Traffic Composition on Delay and Saturation Flow at Signal Controlled Intersections

Authors: Arpita Saha, Apoorv Jain, Satish Chandra, Indrajit Ghosh

Abstract:

Level of service at a signal-controlled intersection is directly measured from the delay. Similarly, the saturation flow rate is a fundamental parameter for measuring intersection capacity. The present study calculates the vehicle arrival rate, departure rate, and queue length for every five-second interval in each cycle. Based on the queue lengths, the total delay of the cycle has been calculated using Simpson's one-third rule. Saturation flow has been estimated in terms of vehicles per hour of green per lane for every five-second interval of the green period until at least three vehicles are left to cross the stop line. Vehicle composition has a pronounced effect on total delay and saturation flow rate. An increase in the two-wheeler proportion increases the saturation flow rate and significantly reduces the total delay per vehicle. Conversely, an increase in the heavy vehicle proportion reduces the saturation flow rate and increases the total delay for each vehicle.
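
As a rough illustration of the delay computation described above, the following Python sketch integrates hypothetical queue lengths (sampled every five seconds; not the study data) with Simpson's one-third rule:

```python
import numpy as np
from scipy.integrate import simpson

# Hypothetical queue lengths (vehicles) observed every 5 s over one cycle.
queue = np.array([0, 4, 9, 14, 18, 15, 10, 6, 3, 1, 0], dtype=float)
t = np.arange(queue.size) * 5.0                     # time stamps in seconds

# Total delay for the cycle (veh-s): area under the queue-length curve,
# integrated with Simpson's 1/3 rule.
total_delay = simpson(queue, x=t)

vehicles_served = 25                                # hypothetical departures per cycle
print(f"Total delay: {total_delay:.1f} veh-s")
print(f"Average delay per vehicle: {total_delay / vehicles_served:.1f} s")
```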

Keywords: delay, saturation flow, signalised intersection, vehicle composition

Procedia PDF Downloads 464
4268 Roughness Discrimination Using Bioinspired Tactile Sensors

Authors: Zhengkun Yi

Abstract:

Surface texture discrimination using artificial tactile sensors has attracted increasing attention in the past decade, as it can endow technical and robotic systems with a key missing ability. However, roughness, a major component of texture, has rarely been explored. This paper presents an approach for tactile surface roughness discrimination, which includes two parts: (1) design and fabrication of a bioinspired artificial fingertip, and (2) tactile signal processing for tactile surface roughness discrimination. The bioinspired fingertip comprises two polydimethylsiloxane (PDMS) layers, a polymethyl methacrylate (PMMA) bar, and two perpendicular polyvinylidene difluoride (PVDF) film sensors. This artificial fingertip mimics human fingertips in three aspects: (1) the elastic properties of the epidermis and dermis in human skin are replicated by the two PDMS layers with different stiffness, (2) the PMMA bar serves a role analogous to that of a bone, and (3) the PVDF film sensors emulate Meissner's corpuscles in terms of both location and response to vibratory stimuli. Various extracted features and classification algorithms, including support vector machines (SVM) and k-nearest neighbors (kNN), are examined for tactile surface roughness discrimination. Eight standard rough surfaces with roughness values (Ra) of 50 μm, 25 μm, 12.5 μm, 6.3 μm, 3.2 μm, 1.6 μm, 0.8 μm, and 0.4 μm are explored. The highest classification accuracy of (82.6 ± 10.8)% is achieved using only one PVDF film sensor with a kNN (k = 9) classifier and the standard deviation feature.
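
As a hedged sketch of the final classification step (synthetic PVDF traces and labels, not the authors' recordings), the standard deviation feature and a kNN classifier with k = 9 can be combined as follows:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic data set: 8 roughness classes, 20 sliding trials each,
# each trial a 1-D voltage trace (2000 samples) from a single PVDF sensor.
n_classes, n_trials, n_samples = 8, 20, 2000
signals = np.stack([rng.normal(0.0, 0.1 * (c + 1), (n_trials, n_samples))
                    for c in range(n_classes)])
labels = np.repeat(np.arange(n_classes), n_trials)

# Feature: standard deviation of each trace.
features = signals.reshape(-1, n_samples).std(axis=1, keepdims=True)

# kNN classifier with k = 9, evaluated with 5-fold cross-validation.
knn = KNeighborsClassifier(n_neighbors=9)
scores = cross_val_score(knn, features, labels, cv=5)
print(f"Accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```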

Keywords: bioinspired fingertip, classifier, feature extraction, roughness discrimination

Procedia PDF Downloads 312
4267 Memory Retrieval and Implicit Prosody during Reading: Anaphora Resolution by L1 and L2 Speakers of English

Authors: Duong Thuy Nguyen, Giulia Bencini

Abstract:

The present study examined structural and prosodic factors in the computation of antecedent-reflexive relationships and sentence comprehension in native English speakers (L1) and Vietnamese-English bilinguals (L2). Participants read sentences presented on a computer screen in one of three presentation formats aimed at manipulating prosodic parsing: word-by-word (RSVP), phrase-segment (self-paced), or whole-sentence (self-paced), then completed a grammaticality rating and a comprehension task (following Pratt & Fernandez, 2016). The design crossed three factors: syntactic structure (simple; complex), grammaticality (target-match; target-mismatch), and presentation format. An example item is provided in (1): (1) The actress that (Mary/John) interviewed at the awards ceremony (about two years ago/organized outside the theater) described (herself/himself) as an extreme workaholic. Results showed that overall, both L1 and L2 speakers made use of a good-enough processing strategy at the expense of more detailed syntactic analyses. L1 and L2 speakers' comprehension and grammaticality judgements were negatively affected by the most prosodically disrupting condition (word-by-word). However, the two groups demonstrated differences in their performance in the other two reading conditions. For L1 speakers, the whole-sentence and phrase-segment formats were both facilitative in the grammaticality rating and comprehension tasks; for L2 speakers, compared with the whole-sentence condition, the phrase-segment paradigm did not significantly improve accuracy or comprehension. These findings are consistent with those of Pratt & Fernandez (2016), who found a similar pattern of results in the processing of subject-verb agreement relations using the same experimental paradigm and prosodic manipulation with L1 English and L2 English-Spanish speakers. The results provide further support for a Good-Enough cue model of sentence processing that integrates cue-based retrieval and implicit prosodic parsing (Pratt & Fernandez, 2016) and highlight similarities and differences between L1 and L2 sentence processing and comprehension.

Keywords: anaphora resolution, bilingualism, implicit prosody, sentence processing

Procedia PDF Downloads 152
4266 Neural Correlates of Arabic Digits Naming

Authors: Fernando Ojedo, Alejandro Alvarez, Pedro Macizo

Abstract:

In the present study, we explored electrophysiological correlates of Arabic digits naming to determine semantic processing of numbers. Participants named Arabic digits grouped by category or intermixed with exemplars of other semantic categories while the N400 event-related potential was examined. Around 350-450 ms after the presentation of Arabic digits, brain waves were more positive in anterior regions and more negative in posterior regions when stimuli were grouped by category relative to the mixed condition. Contrary to what was found in other studies, electrophysiological results suggested that the production of numerals involved semantic mediation.

Keywords: Arabic digit naming, event-related potentials, semantic processing, number production

Procedia PDF Downloads 582
4265 Elevated Temperature Shot Peening for M50 Steel

Authors: Xinxin Ma, Guangze Tang, Shuxin Yang, Jinguang He, Fan Zhang, Peiling Sun, Ming Liu, Minyu Sun, Liqin Wang

Abstract:

As a traditional surface hardening technique, shot peening is widely used in industry. Shot peening produces a residual compressive stress in the surface, which is beneficial for improving the fatigue life of metal materials; at the same time, very fine grains and high-density defects are generated in the surface layer, which also enhances the surface hardness. However, most of these processes are carried out at room temperature, and for high-strength steels such as M50 the thickness of the strengthened layer is limited. In order to obtain a thicker strengthened surface layer, elevated temperature shot peening was carried out in this work using Φ1 mm cast iron balls at a speed of 80 m/s. Considering that the tempering temperature of M50 steel is about 550 °C, the processing temperature ranged from 300 to 500 °C. The effect of shot peening temperature and time on the distribution of residual stress and surface hardness was investigated. The working temperature of M50 steel can be as high as 315 °C. Because the defects formed by shot peening become unstable at higher working temperatures, it is worth understanding what happens during the shot peening process and what happens when the strengthened samples are held at a given temperature. In our work, the shot peening time was varied from 2 to 10 min, and after the strengthening process the samples were annealed at temperatures from 200 to 500 °C for up to 60 h. The results show that the maximum residual compressive stress is nearly 900 MPa. Compared with room temperature shot peening, the strengthening depth of the 500 °C shot-peened sample is about twice as deep. The surface hardness increases with the processing temperature, and the saturation peening time decreases. After annealing, the residual compressive stress decreases; however, for the 500 °C peened sample, even after annealing at 500 °C for 20 h, the residual compressive stress is still over 600 MPa. Moreover, SEM observation shows that the grain size of the surface layer remains very small.

Keywords: shot peening, M50 steel, residual compressive stress, elevated temperature

Procedia PDF Downloads 456
4264 Comparing Image Processing and AI Techniques for Disease Detection in Plants

Authors: Luiz Daniel Garay Trindade, Antonio De Freitas Valle Neto, Fabio Paulo Basso, Elder De Macedo Rodrigues, Maicon Bernardino, Daniel Welfer, Daniel Muller

Abstract:

Agriculture plays an important role in society since it is one of the main sources of food in the world. To help the production and yield of crops, precision agriculture makes use of technologies aimed at improving the productivity and quality of agricultural commodities. One of the problems hampering the quality of agricultural production is disease affecting crops. Failure to detect diseases within a short period of time can result in small or large damage to production, causing financial losses to farmers. In order to provide a map of the contributions devoted to the early detection of plant diseases and a comparison of the accuracy of the selected studies, a systematic literature review was performed, covering digital image processing and neural network techniques. We found 35 relevant tool-support alternatives for detecting disease in 19 plants. Our comparison of these studies resulted in an overall average accuracy of 87.45%, with two studies coming very close to 100%.

Keywords: pattern recognition, image processing, deep learning, precision agriculture, smart farming, agricultural automation

Procedia PDF Downloads 379
4263 Computer Aided Analysis of Breast Based Diagnostic Problems from Mammograms Using Image Processing and Deep Learning Methods

Authors: Ali Berkan Ural

Abstract:

This paper presents the analysis, evaluation, and pre-diagnosis of early-stage breast-based diagnostic problems (breast cancer, nodules, or lumps) by a Computer-Aided Diagnosis (CAD) system from mammogram radiological images. According to the statistics, the time factor is crucial for discovering the disease in the patient (especially in women) as early and as quickly as possible. In this study, a new algorithm is developed using advanced image processing and deep learning methods to detect and classify the problem at an early stage with higher accuracy. The system first applies image processing methods (image acquisition, noise removal, region growing segmentation, morphological operations, breast border extraction, advanced segmentation, obtaining regions of interest (ROIs), etc.) to segment the area of interest of the breast, and then analyzes the obtained areas for cancer/lump detection in order to diagnose the disease. After segmentation, using spectrogram images, five different deep learning methods (the Convolutional Neural Network (CNN) based AlexNet, ResNet50, VGG16, DenseNet, and Xception) are applied to classify the breast-based problems.
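
A minimal OpenCV sketch of the pre-processing and ROI-extraction stage described above (hypothetical file name and filter parameters; the authors' full pipeline, including region growing and the deep learning classifiers, is not reproduced here):

```python
import cv2
import numpy as np

# Hypothetical mammogram file, read as grayscale.
img = cv2.imread("mammogram.png", cv2.IMREAD_GRAYSCALE)

# Noise removal with a median filter.
denoised = cv2.medianBlur(img, 5)

# Coarse breast-region segmentation by Otsu thresholding.
_, mask = cv2.threshold(denoised, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Morphological opening/closing to clean the mask, then breast border extraction.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
breast = max(contours, key=cv2.contourArea)

# Region of interest (ROI) cropped from the bounding box of the breast region.
x, y, w, h = cv2.boundingRect(breast)
roi = denoised[y:y + h, x:x + w]
print("ROI shape:", roi.shape)
```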

Keywords: computer aided diagnosis, breast cancer, region growing, segmentation, deep learning

Procedia PDF Downloads 95
4262 Implementing a Database from a Requirement Specification

Authors: M. Omer, D. Wilson

Abstract:

Creating a database schema is essentially a manual process. From a requirement specification, the information contained within has to be analyzed and reduced into a set of tables, attributes, and relationships. This is a time-consuming process that has to go through several stages before an acceptable database schema is achieved. The purpose of this paper is to implement a Natural Language Processing (NLP) based tool to produce a database schema from a requirement specification. Stanford CoreNLP version 3.3.1 and the Java programming language were used to implement the proposed model. The outcome of this study indicates that the first draft of a relational database schema can be extracted from a requirement specification by using NLP tools and techniques with minimum user intervention. Therefore, this method is a step forward in finding a solution that requires little or no user intervention.
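
The authors use Stanford CoreNLP and Java; purely to illustrate the general idea (candidate tables from noun phrases, candidate relationships from subject-verb-object patterns), a short sketch with the spaCy library is shown below. This is not the authors' implementation, and the example requirement text is invented.

```python
import spacy

# Illustrative only: spaCy instead of the authors' Stanford CoreNLP/Java tool.
# Requires: python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

requirement = ("A customer places one or more orders. "
               "Each order contains several products.")
doc = nlp(requirement)

# Candidate entities (tables) from noun chunks.
tables = {chunk.root.lemma_ for chunk in doc.noun_chunks}

# Candidate relationships from verbs linking a subject and an object.
relations = []
for token in doc:
    if token.pos_ == "VERB":
        subj = [w.lemma_ for w in token.lefts if w.dep_ in ("nsubj", "nsubjpass")]
        obj = [w.lemma_ for w in token.rights if w.dep_ in ("dobj", "obj")]
        if subj and obj:
            relations.append((subj[0], token.lemma_, obj[0]))

print("Candidate tables:", tables)
print("Candidate relationships:", relations)
```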

Keywords: information extraction, natural language processing, relation extraction

Procedia PDF Downloads 261
4261 Legal Issues of Collecting and Processing Big Health Data in the Light of European Regulation 679/2016

Authors: Ioannis Iglezakis, Theodoros D. Trokanas, Panagiota Kiortsi

Abstract:

This paper aims to explore major legal issues arising from the collection and processing of Health Big Data in the light of the new European secondary legislation for the protection of personal data of natural persons, placing emphasis on the General Data Protection Regulation 679/2016. Whether Big Health Data can be characterised as ‘personal data’ or not is really the crux of the matter. The legal ambiguity is compounded by the fact that, even though the processing of Big Health Data is premised on the de-identification of the data subject, the possibility of a combination of Big Health Data with other data circulating freely on the web or from other data files cannot be excluded. Another key point is that the application of some provisions of the GDPR to Big Health Data may both absolve the data controller of his legal obligations and deprive the data subject of his rights (e.g., the right to be informed), ultimately undermining the fundamental right to the protection of personal data of natural persons. Moreover, data subjects’ rights (e.g., the right not to be subject to a decision based solely on automated processing) are heavily impacted by the use of AI, algorithms, and technologies that reclaim health data for further use, resulting in sometimes ambiguous results that have a substantial impact on individuals. On the other hand, as the COVID-19 pandemic has revealed, Big Data analytics can offer crucial sources of information. In this respect, this paper identifies and systematises the legal provisions concerned, offering interpretative solutions that tackle dangers concerning data subjects’ rights while embracing the opportunities that Big Health Data has to offer. In addition, particular attention is paid to the scope of ‘consent’ as a legal basis in the collection and processing of Big Health Data, as the application of data analytics to Big Health Data signals the construction of new data and subject profiles. Finally, the paper addresses the knotty problem of role assignment (i.e., distinguishing between controller and processor/joint controllers and joint processors) in an era of extensive Big Health Data sharing. The findings are the fruit of a current research project conducted by a three-member research team at the Faculty of Law of the Aristotle University of Thessaloniki and funded by the Greek Ministry of Education and Religious Affairs.

Keywords: big health data, data subject rights, GDPR, pandemic

Procedia PDF Downloads 129
4260 Performance Analysis of Precise Point Positioning Online Data Processing Services and Their Use for Monitoring Plate Tectonics of Thailand

Authors: Nateepat Srivarom, Weng Jingnong, Serm Chinnarat

Abstract:

The Precise Point Positioning (PPP) technique is used to improve accuracy by using precise satellite orbit and clock correction data, but it involves complicated methods and high costs. Currently, there are several online processing service providers which offer simplified calculation. In the first part of this research, we compare the efficiency and precision of four software packages: three popular online processing services (the Australian Online GPS Processing Service (AUSPOS), CSRS Precise Point Positioning, and CenterPoint RTX post-processing by Trimble) and one offline software package, RTKLIB. Data were collected from 10 International GNSS Service (IGS) stations for 10 days. The results indicated that AUSPOS has the smallest distance root mean square (DRMS) value, 0.0029, which is good enough for calculating the movement of tectonic plates. In the second part, we use AUSPOS to process the data of the geodetic network of Thailand. On December 26, 2004, a Mw 9.3 earthquake occurred north of Sumatra, strongly affecting all nearby countries, including Thailand. The earthquake has led to errors in the coordinate system of Thailand. The Royal Thai Survey Department (RTSD) is primarily responsible for monitoring the crustal movement of the country. The movements differ across the geodetic network and are relatively large, so continued surveying is needed to improve the GPS coordinate system every year. Therefore, in this research we chose AUSPOS to calculate the magnitude and direction of movement and to improve the coordinate adjustment of the geodetic network, consisting of 19 pins in Thailand, during October 2013 to November 2017. Finally, the results are displayed on a simulation map using the ArcMap program with the Inverse Distance Weighting (IDW) method. The pin with the maximum movement is pin no. 3239 (Tak) in the northern part of Thailand; this pin moved in the south-western direction by 11.04 cm. Meanwhile, the directional movement of the other pins in the south gradually changed from south-west to south-east, i.e., in the direction observed before the earthquake. The magnitude of the movement is in the range of 4-7 cm, implying a small impact of the earthquake. However, the GPS network should be continuously surveyed in order to secure the accuracy of the geodetic network of Thailand.
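
A minimal numpy sketch of the Inverse Distance Weighting step used for the simulation map (hypothetical pin coordinates and displacements, not the RTSD survey data):

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0):
    """Inverse Distance Weighting interpolation of displacements."""
    d = np.linalg.norm(xy_known[None, :, :] - xy_query[:, None, :], axis=2)
    d = np.where(d == 0, 1e-12, d)          # avoid division by zero at data points
    w = 1.0 / d**power
    return (w * values[None, :]).sum(axis=1) / w.sum(axis=1)

# Hypothetical pin locations (easting, northing in km) and movements (cm).
pins = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 60.0], [80.0, 80.0]])
movement_cm = np.array([11.04, 6.2, 5.1, 4.3])

# Interpolate the movement at a query point between the pins.
query = np.array([[40.0, 40.0]])
print("Interpolated movement (cm):", idw(pins, movement_cm, query))
```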

Keywords: precise point positioning, online processing service, geodetic network, inverse distance weighting

Procedia PDF Downloads 189
4259 An Advanced Automated Brain Tumor Diagnostics Approach

Authors: Berkan Ural, Arif Eser, Sinan Apaydin

Abstract:

Medical image processing has become a challenging task nowadays; indeed, the processing of brain MRI images is one of the more difficult parts of this area. This study proposes a well-defined hybrid approach consisting of tumor detection, extraction, and analysis steps. The approach is built around a computer-aided diagnostics system for identifying and detecting tumor formation in any region of the brain, and it is used for early prediction of brain tumors using advanced image processing and probabilistic neural network methods. Advanced noise removal functions and image processing methods such as automatic segmentation and morphological operations are used to detect the brain tumor boundaries and to obtain the important feature parameters of the tumor region. All stages of the approach are implemented in MATLAB. First, the tumor is detected and the tumor area is contoured with a colored circle by the computer-aided diagnostics program. Then, the tumor is segmented and some morphological processes are applied to increase the visibility of the tumor area; in parallel, the tumor area and important shape-based features are calculated. Finally, using the probabilistic neural network method and some advanced classification steps, the tumor area and the type of the tumor are obtained. A future aim of this study is to detect the severity of lesions across classes of brain tumor through advanced multi-class classification and neural network stages, and to create a user-friendly environment using a GUI in MATLAB. In the experimental part of the study, 100 images are used to train the diagnostics system, and 100 out-of-sample images are used to test and check the overall results. The preliminary results demonstrate high classification accuracy for the neural network structure. These results also motivate us to extend this framework to detect and localize tumors in other organs.

Keywords: image processing algorithms, magnetic resonance imaging, neural network, pattern recognition

Procedia PDF Downloads 418
4258 A Novel RLS Based Adaptive Filtering Method for Speech Enhancement

Authors: Pogula Rakesh, T. Kishore Kumar

Abstract:

Speech enhancement is a long-standing problem with numerous applications such as teleconferencing, VoIP, hearing aids, and speech recognition. The motivation behind this research work is to obtain a clean speech signal of higher quality by applying the optimal noise cancellation technique. Real-time adaptive filtering algorithms seem to be the best candidates among all categories of speech enhancement methods. In this paper, we propose a speech enhancement method based on a Recursive Least Squares (RLS) adaptive filter for speech signals. Experiments were performed on noisy data prepared by adding AWGN, babble, and pink noise to clean speech samples at -5 dB, 0 dB, 5 dB, and 10 dB SNR levels. We then compare the noise cancellation performance of the proposed RLS algorithm with the existing NLMS algorithm in terms of Mean Squared Error (MSE), Signal-to-Noise Ratio (SNR), and SNR loss. Based on the performance evaluation, the proposed RLS algorithm was found to be the better noise cancellation technique for speech signals.
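
A compact numpy sketch of an RLS adaptive noise canceller (synthetic signals and illustrative filter parameters; not the authors' exact configuration or speech corpus):

```python
import numpy as np

rng = np.random.default_rng(0)
N, order, lam, delta = 8000, 8, 0.999, 0.01

clean = np.sin(2 * np.pi * 0.01 * np.arange(N))        # stand-in for clean speech
noise = rng.normal(0, 1, N)                            # noise reference signal
primary = clean + np.convolve(noise, [0.6, -0.3, 0.1], mode="same")  # noisy speech

w = np.zeros(order)                                    # filter weights
P = np.eye(order) / delta                              # inverse correlation matrix
enhanced = np.zeros(N)

for n in range(order, N):
    x = noise[n - order:n][::-1]          # reference input vector
    k = P @ x / (lam + x @ P @ x)         # gain vector
    e = primary[n] - w @ x                # error = enhanced speech sample
    w = w + k * e                         # weight update
    P = (P - np.outer(k, x @ P)) / lam    # inverse correlation matrix update
    enhanced[n] = e

mse = np.mean((enhanced[order:] - clean[order:]) ** 2)
print(f"MSE after RLS noise cancellation: {mse:.4f}")
```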

Keywords: adaptive filter, adaptive noise canceller, mean squared error, noise reduction, NLMS, RLS, SNR, SNR loss

Procedia PDF Downloads 481
4257 The Non-Linear Analysis of Brain Response to Visual Stimuli

Authors: H. Namazi, H. T. N. Kuan

Abstract:

Brain activity can be measured by acquiring and analyzing EEG signals from an individual; in fact, the human brain response to external and internal stimuli is mapped in the EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli, but each of these methods has weak points. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain response to visual stimuli by extracting information in the form of various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain response to visual stimuli but also provide us with very good recommendations for clinical purposes.
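
As one concrete example of the measures mentioned above, a short numpy sketch estimates the Hurst exponent by rescaled-range (R/S) analysis on a synthetic signal (not the authors' software or EEG data):

```python
import numpy as np

def hurst_rs(x, window_sizes=(16, 32, 64, 128, 256)):
    """Estimate the Hurst exponent with rescaled-range (R/S) analysis."""
    rs = []
    for w in window_sizes:
        segments = [x[i:i + w] for i in range(0, len(x) - w + 1, w)]
        ratios = []
        for seg in segments:
            dev = np.cumsum(seg - seg.mean())      # cumulative deviation from mean
            r = dev.max() - dev.min()              # range
            s = seg.std()                          # standard deviation
            if s > 0:
                ratios.append(r / s)
        rs.append(np.mean(ratios))
    # Hurst exponent = slope of log(R/S) versus log(window size).
    slope, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
signal = rng.normal(size=4096)                     # stand-in for an EEG trace
print("Hurst exponent (white noise ~0.5):", round(hurst_rs(signal), 2))
```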

Keywords: visual stimuli, brain response, EEG signal, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 561
4256 Analysis of Control by Flattening of the Welded Tubes

Authors: Hannachi Med Tahar, H. Djebaili, B. Daheche

Abstract:

In this work, we describe the flattening test for welded tubes and its experimental application. The test was carried out at the national company for the processing and production of flat products and tubes. The final products (tubes) normally undergo a series of non-destructive inspections of the weld, online and offline, as well as destructive mechanical tests (bending, flattening, flaring, etc.). To implement the flattening test, which transforms round tubes into other shapes, four sections of welded tube were taken: draft tubes (before hot drawing) and finished tubes (after hot drawing and annealing). The soundness requirement is that the flattened tubes must show neither cracks nor tears; the test is considered failed if it reveals a lack of ductility in the metal.

Keywords: flattening, destructive testing, tube drafts, finished tube, Castem 2001

Procedia PDF Downloads 446
4255 Determination of Processing Parameters of Decaffeinated Black Tea by Using Pilot-Scale Supercritical CO₂ Extraction

Authors: Saziye Ilgaz, Atilla Polat

Abstract:

There is a need to develop new processing techniques that ensure the safety and quality of the final product, decaffeinated black tea, while minimizing the adverse impact of extraction solvents on the environment and the residue levels of these solvents in the final product. In this study, pilot-scale supercritical carbon dioxide (SCCO₂) extraction was used to produce decaffeinated black tea in place of solvent extraction. Pressure (250, 375, 500 bar), extraction time (60, 180, 300 min), temperature (55, 62.5, 70 °C), CO₂ flow rate (1, 2, 3 LPM), and co-solvent quantity (0, 2.5, 5 %mol) were selected as extraction parameters. A five-factor Box-Behnken experimental design with three center points was used to generate 46 different processing conditions for caffeine removal from black tea samples. As a result of these 46 experiments, the caffeine content of the black tea samples was reduced from 2.16% to 0-1.81%. The experiments showed that extraction time, pressure, CO₂ flow rate, and co-solvent quantity had a great impact on decaffeination yield. Response surface methodology (RSM) was used to optimize the parameters of the supercritical carbon dioxide extraction. The optimum extraction parameters for decaffeinated black tea were as follows: extraction temperature of 62.5 °C, extraction pressure of 375 bar, CO₂ flow rate of 3 LPM, extraction time of 176.5 min, and co-solvent quantity of 5 %mol.
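
A hedged sketch of the response surface step (a small set of invented design points and yields, not the experimental data): a second-order model is fitted and evaluated near the reported optimum.

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Invented subset of Box-Behnken runs (a real RSM fit would use all 46 runs):
# columns = pressure (bar), temperature (C), time (min), CO2 flow (LPM), co-solvent (%mol)
X = np.array([
    [250, 55.0,  60, 1, 0.0],
    [500, 55.0, 300, 3, 5.0],
    [375, 62.5, 180, 2, 2.5],
    [375, 70.0, 300, 3, 5.0],
    [250, 70.0,  60, 1, 2.5],
    [500, 62.5, 180, 2, 0.0],
])
y = np.array([35.0, 97.0, 78.0, 95.0, 42.0, 70.0])   # decaffeination yield (%), invented

# Second-order (quadratic) response surface model, as used in RSM.
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
rsm.fit(X, y)

candidate = np.array([[375, 62.5, 176.5, 3, 5.0]])   # near the reported optimum
print("Predicted yield at candidate point:", rsm.predict(candidate)[0])
```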

Keywords: supercritical carbon dioxide, decaffeination, black tea, extraction

Procedia PDF Downloads 364
4254 Difficulties in the Emotional Processing of Intimate Partner Violence Perpetrators

Authors: Javier Comes Fayos, Isabel Rodríguez Moreno, Sara Bressanutti, Marisol Lila, Angel Romero Martínez, Luis Moya Albiol

Abstract:

Given the great impact of gender-based violence, a comprehensive approach to it seems essential. Consequently, research has focused on risk factors for violent behaviour, linking various psychosocial variables, as well as cognitive and neuropsychological deficits, to the aggressors. However, studies on affective processing are scarce, so the present study investigates possible emotional alterations in men convicted of gender violence. The participants were 51 aggressors, who attended the CONTEXTO program with sentences of less than two years, and 47 men with no history of violence. The two groups did not differ in age, socioeconomic level, education, or alcohol and other substance consumption. Anger, alexithymia, and facial recognition of other people's emotions were assessed through the State-Trait Anger Expression Inventory (STAXI-2), the Toronto Alexithymia Scale (TAS-20), and Reading the Mind in the Eyes (REM), respectively. Men convicted of gender-based violence showed higher scores on the anger trait and temperament dimensions, as well as on the anger expression index. They also scored higher on alexithymia and on the identification and emotional expression subscales. In addition, they showed greater difficulty in the facial recognition of emotions, obtaining a lower score on the REM. These results seem to show difficulties in different affective areas in men condemned for gender violence. The deficits are reflected in greater difficulty in identifying and expressing emotions, in processing anger, and in recognizing the emotions of others. All these difficulties have been related to the use of violent behavior. Consequently, it is essential and necessary to include emotional regulation in intervention programs for men who have been convicted of gender-based violence.

Keywords: alexithymia, anger, emotional processing, emotional recognition, empathy, intimate partner violence

Procedia PDF Downloads 201
4253 Achieving Flow at Work: An Experience Sampling Study to Comprehend How Cognitive Task Characteristics and Work Environments Predict Flow Experiences

Authors: Jonas De Kerf, Rein De Cooman, Sara De Gieter

Abstract:

For many decades, scholars have aimed to understand how work can become more meaningful by both maximizing potential and enhancing feelings of satisfaction. One of the largest contributions towards such positive psychology was made with the introduction of the concept of ‘flow,’ which refers to a condition in which people feel intense engagement and effortless action. Since then, valuable research on work-related flow has indicated that this state of mind is related to positive outcomes for both organizations (e.g., social, supportive climates) and workers (e.g., job satisfaction). Yet, scholars still do not fully comprehend how such deep involvement at work is obtained, given the notion that flow is considered a short-term, complex, and dynamic experience. Most research neglects that people who experience flow ought to be optimally challenged so that intense concentration is required. Because attention is at the core of this enjoyable state of mind, this study aims to comprehend how elements that affect workers’ cognitive functioning impact flow at work. Research on cognitive performance suggests that working on mentally demanding tasks (e.g., information processing tasks) requires workers to concentrate deeply, as a result leading to flow experiences. Based on social facilitation theory, working on such tasks in an isolated environment eases concentration. Prior research has indicated that working at home (instead of working at the office) or in a closed office (rather than in an open-plan office) impacts employees’ overall functioning in terms of concentration and productivity. Consequently, we advance such knowledge and propose an interaction by combining cognitive task characteristics and work environments among part-time teleworkers. Hence, we not only aim to shed light on the relation between cognitive tasks and flow but also provide empirical evidence that workers performing such tasks achieve the highest states of flow while working either at home or in closed offices. In July 2022, an experience-sampling study will be conducted that uses a semi-random signal schedule to understand how task and environment predictors together impact part-time teleworkers’ flow. More precisely, about 150 knowledge workers will fill in multiple surveys a day for two consecutive workweeks to report their flow experiences, cognitive tasks, and work environments. Preliminary results from a pilot study indicate that at the between-person level, tasks high in information processing go along with high self-reported fluent productivity (i.e., making progress). As expected, evidence was found for higher fluency in productivity for workers performing information processing tasks both at home and in a closed office, compared to those performing the same tasks at the office or in open-plan offices. This study expands the current knowledge on work-related flow by looking at task and environmental predictors that enable workers to obtain such a peak state. While doing so, our findings suggest that practitioners should strive for ideal alignments between tasks and work locations to work with both deep involvement and gratification.

Keywords: cognitive work, office lay-out, work location, work-related flow

Procedia PDF Downloads 101
4252 Embedded HW-SW Reconfigurable Techniques for Wireless Sensor Network Applications

Authors: B. Kirubakaran, C. Rajasekaran

Abstract:

Reconfigurable techniques are used in many engineering and industrial applications for efficient data transmission through wireless sensor networks. Nowadays, most industrial applications try to minimize size and cost. At runtime, the reconfigurable technique avoids unwanted hangs and delays in system performance. The Field Programmable Gate Array (FPGA) is currently one of the most efficient reconfigurable devices and is widely used for hardware and software reconfiguration applications. The main objective of this paper is that any changes made to the hardware and software at runtime should not affect the currently running process; our changes are applied in parallel, while at the same time addressing cost and power problems during data transmission and reception. An analog temperature sensor serves as input to the controller (PIC), which in turn controls the FPGA digital sensors in a generalized manner.

Keywords: field programmable gate array, peripheral interrupt controller, runtime reconfigurable techniques, wireless sensor networks

Procedia PDF Downloads 407
4251 Cooperative Diversity Scheme Based on MIMO-OFDM in Small Cell Network

Authors: Dong-Hyun Ha, Young-Min Ko, Chang-Bin Ha, Hyoung-Kyu Song

Abstract:

A heterogeneous network (HetNet) can provide high quality of service in a wireless communication system through the composition of small cell networks. The composition of small cell networks improves cell coverage and capacity for mobile users. Recently, various techniques using small cell networks have been researched in wireless communication systems. In this paper, a cooperative scheme that obtains high reliability is proposed for small cell networks. The proposed scheme suggests a cooperative small cell system and a new signal transmission technique in the proposed system model. The new signal transmission technique applies a cyclic delay diversity (CDD) scheme based on the multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) system to obtain improved performance. The improved performance of the proposed scheme is confirmed by simulation results.
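
A brief numpy sketch of the cyclic delay diversity idea at the transmitter side (illustrative FFT size, cyclic prefix length, and delays; not the exact system model of the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
n_fft, cp_len, n_tx = 64, 16, 2
cyclic_delays = [0, 2]                   # samples of cyclic delay per transmit antenna

# Random QPSK symbols on all subcarriers.
bits = rng.integers(0, 2, (2, n_fft))
qpsk = ((2 * bits[0] - 1) + 1j * (2 * bits[1] - 1)) / np.sqrt(2)

# OFDM modulation (IFFT), then a different cyclic shift per antenna,
# then the cyclic prefix is prepended.
time_sig = np.fft.ifft(qpsk, n_fft)
tx = []
for delay in cyclic_delays:
    shifted = np.roll(time_sig, delay)                   # cyclic delay
    with_cp = np.concatenate([shifted[-cp_len:], shifted])
    tx.append(with_cp)

tx = np.stack(tx)                        # shape: (n_tx, n_fft + cp_len)
print("Transmit signal shape per OFDM symbol:", tx.shape)
```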

Keywords: adaptive transmission, cooperative communication, diversity gain, OFDM

Procedia PDF Downloads 502
4250 Influence of Some Technological Parameters on the Content of Voids in Composite during On-Line Consolidation with Filament Winding Technology

Authors: M. Stefanovska, B. Samakoski, S. Risteska, G. Maneski

Abstract:

In this study, in situ consolidation of a polypropylene matrix/glass-reinforced roving was performed by combining heating systems and roll pressing. During hoop winding, the commingled roving was wound on a cylindrical mandrel. The work also presents the advances made in processing these materials into composites by the conventional filament winding technique. Experimental studies were performed with varying parameters: temperature, pressure, and speed. Finally, the work describes the investigation of the optimal processing conditions that maximize the mechanical properties of the composites. These properties are good enough for the composites to be used as engineering materials in many structural applications.

Keywords: commingled fiber, consolidation heat, filament winding, voids

Procedia PDF Downloads 266
4249 New Concept for Real Time Selective Harmonics Elimination Based on Lagrange Interpolation Polynomials

Authors: B. Makhlouf, O. Bouchhida, M. Nibouche, K. Laidi

Abstract:

A variety of methods for selective harmonic elimination pulse width modulation have been developed; those most frequently used for real-time implementation are based on look-up tables. To address real-time requirements, an approach based on a modified carrier signal is proposed in the present work, with a general formulation for real-time harmonic control/elimination in switched inverters. The proposed method is first demonstrated for a single value of the modulation index. In reality, however, this parameter is variable as a consequence of the voltage (amplitude) variability. In this context, a simple interpolation method for calculating the modified sine carrier signal is proposed. The method allows a continuous adjustment in both amplitude and frequency of the fundamental. To assess the performance of the proposed method, software simulations and hardware experiments have been carried out for a single-phase inverter. The obtained results are very satisfactory.
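
A small sketch of the interpolation idea (invented tabulated values, not the authors' data): samples of the modified carrier pre-computed for a few modulation indices are interpolated with a Lagrange polynomial at the current operating point.

```python
import numpy as np
from scipy.interpolate import lagrange

# Hypothetical look-up values: one stored carrier sample (or switching angle, rad)
# pre-computed offline for a few modulation indices.
modulation_index = np.array([0.2, 0.4, 0.6, 0.8, 1.0])
stored_sample = np.array([1.30, 1.12, 0.95, 0.78, 0.60])

# Lagrange interpolation polynomial built from the tabulated points.
poly = lagrange(modulation_index, stored_sample)

# At run time, evaluate the polynomial at the current (variable) modulation index.
m_now = 0.73
print("Interpolated carrier sample at m = 0.73:", float(poly(m_now)))
```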

Keywords: harmonic elimination, Particle Swarm Optimisation (PSO), polynomial interpolation, pulse width modulation, real-time harmonics control, voltage inverter

Procedia PDF Downloads 503
4248 Electrochemiluminescent Detection of DNA Damage Induced by Tetrachloro-1,4-Benzoquinone Using DNA Sensor

Authors: Tian-Fang Kang, Xue Sun

Abstract:

DNA damage induced by tetrachloro-1,4-benzoquinone (TCBQ), a reactive metabolite of pentachlorophenol (PCP), was investigated in this work using a glassy carbon electrode (GCE) modified with calf thymus double-stranded DNA (ds-DNA). DNA-modified films were constructed by layer-by-layer adsorption of polycationic poly(diallyldimethylammonium chloride) (PDDA) and negatively charged ds-DNA on the surface of a glassy carbon electrode. The DNA intercalator [Ru(bpy)2(dppz)]2+ (bpy = 2,2′-bipyridine, dppz = dipyrido[3,2-a:2′,3′-c]phenazine) was chosen as an electrochemical probe to detect DNA damage. After the sensor was incubated in 0.1 M pH 7.3 phosphate buffer solution (PBS) for 30 min, the intact PDDA/DNA film produced a sensitive electrochemiluminescent (ECL) signal. However, after the sensor was incubated in 100 μM TCBQ or a mixed solution of 100 μM TCBQ and 2 mM H2O2, the ECL signal decreased significantly. During the incubation of DNA in the TCBQ or TCBQ-H2O2 solution, the double helix of the DNA was damaged, which resulted in a decrease of Ru-dppz bound to DNA. Additionally, the results were verified independently by fluorescence experiments. This paper provides a sensitive method to directly screen DNA damage induced by chemicals in the environment.

Keywords: DNA damage, detection, electrochemiluminescence, sensor

Procedia PDF Downloads 410
4247 The Analysis of Brain Response to Auditory Stimuli through EEG Signals’ Non-Linear Analysis

Authors: H. Namazi, H. T. N. Kuan

Abstract:

Brain activity can be measured by acquiring and analyzing EEG signals from an individual; in fact, the human brain response to external and internal stimuli is mapped in the EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli, but each of these methods has weak points. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain response to auditory stimuli by extracting information in the form of various measures from EEG signals using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain response to auditory stimuli but also provide us with very good recommendations for clinical purposes.

Keywords: auditory stimuli, brain response, EEG signal, fractal dimension, hurst exponent, Jeffrey’s measure

Procedia PDF Downloads 534
4246 Food Waste and Sustainable Management

Authors: Farhana Nosheen, Moeez Ahmad

Abstract:

Throughout the food chain, from initial agricultural production to final household consumption, food waste has become a serious concern for global sustainability because of its adverse impacts on food security, natural resources, the environment, and human health. About a third of the tomatoes (Lycopersicon esculentum L.) delivered to processing plants end up as processing waste, and the amount of such waste material is estimated to have increased with the emergence of mechanical harvesting. Experiments were conducted to determine the nutritional profile and antioxidant activity of tomato processing waste and to explore the bioactive compound in tomato waste, i.e., lycopene. The tomato variety 'SAHARA F1' was used to produce the tomato waste. The tomatoes were cleaned and unwanted impurities were removed, after which the tomatoes were blanched at 90 ℃ for 5 minutes. The skin of the tomatoes was then removed, and the remaining part passed through an electric pulper; the pulp and seeds were collected separately. The seeds and skin of the tomatoes were mixed and stored in a sterilized jar. The samples of tomato waste were found to contain 89.11±0.006 g/100g moisture, 10.13±0.115 g/100g protein, 2.066±0.57 g/100g fat, 4.81±0.10 g/100g crude fiber, 4.06±0.057 g/100g ash, and 78.92±0.066 g/100g NFE. The results confirmed that tomato waste contains a considerable amount of lycopene (51.0667±0.00577 mg/100g) and exhibits good antioxidant properties. Total phenolics showed average contents of 122.9600±0.01000 mg GAE/100g, of which flavonoids accounted for 41.5367±0.00577 mg QE/100g. The antioxidant activity of the tomato processing waste was found to be 0.6833±0.00577 mmol Trolox/100g. Unsaturated fatty acids represent the major portion of total fatty acids, linoleic acid being the main one. The mineral analysis of the tomato waste showed good amounts of potassium (3030.1767 mg/100g) and calcium (131.80 mg/100g). These findings suggest that tomato processing waste is rich in nutrients, antioxidants, fatty acids, and minerals. We recommend that this waste be sun-dried for use in combination with animal feed; it can also be used in making other products, such as lycopene tea or other health-beneficial products.

Keywords: food waste, tomato, bioactive compound, sustainable management

Procedia PDF Downloads 109
4245 Low-Cost Image Processing System for Evaluating Pavement Surface Distress

Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa

Abstract:

Most asphalt pavement condition evaluations use rating frameworks in which asphalt pavement distress is estimated by type, extent, and severity. Rating is carried out using the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost technique for pavement distress image analysis that permits the identification of potholes and cracks. The paper explores the application of image processing tools for the detection of potholes and cracks. Longitudinal cracking and potholes are detected using Fuzzy C-Means (FCM) clustering followed by the spectral theory algorithm. The framework comprises three phases: image acquisition, processing, and feature extraction. A digital camera (GoPro) with a holder is used to capture pavement distress images from a moving vehicle. The FCM classifier and spectral theory algorithms are used to compute features and classify the longitudinal cracking and potholes. The MATLAB R2016a image processing toolkit is used for performance analysis to assess the viability of pavement distress detection on selected urban stretches of Bengaluru city, India. The image evaluation using the semi-automated image processing framework identified the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images were validated against the actual dimensions, and the dimensional variability is about 0.46. The linear regression model y = 1.171x - 0.155 is obtained using the existing and experimental/image-processed areas. The R² value obtained from the best-fit line is 0.807, which is considered a 'large positive linear association' in the linear regression model.
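
A short numpy sketch of the validation step (hypothetical paired area measurements, not the study data): a straight line is fitted between image-processed and actual distress areas, and the coefficient of determination is computed.

```python
import numpy as np

# Hypothetical paired measurements (m^2): actual field dimensions vs. image-processed areas.
actual = np.array([0.8, 1.5, 2.1, 2.9, 3.6, 4.4])
from_image = np.array([0.9, 1.6, 2.0, 3.0, 3.5, 4.6])

# Least-squares line y = a*x + b (the study reports y = 1.171x - 0.155).
a, b = np.polyfit(from_image, actual, 1)
predicted = a * from_image + b

# Coefficient of determination R^2.
ss_res = np.sum((actual - predicted) ** 2)
ss_tot = np.sum((actual - actual.mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"y = {a:.3f}x + {b:.3f}, R^2 = {r2:.3f}")
```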

Keywords: crack detection, pothole detection, spectral clustering, fuzzy-c-means

Procedia PDF Downloads 181
4244 Case Study of High-Resolution Marine Seismic Survey in Shallow Water, Arabian Gulf, Saudi Arabia

Authors: Almalki M., Alajmi M., Qadrouh Y., Alzahrani E., Sulaiman A., Aleid M., Albaiji A., Alfaifi H., Alhadadi A., Almotairy H., Alrasheed R., Alhafedh Y.

Abstract:

High-resolution marine seismic surveying is a well-established technique that is commonly used to characterize near-surface sediments and geological structures in shallow water. We conducted a single-channel seismic survey to provide high-quality seismic images of near-surface sediments up to 100 m depth in the Jubal coastal area, Arabian Gulf. An eight-hydrophone streamer was used to collect stacked seismic traces along a 5 km seismic line. To reach the required depth, we used a spark source system that discharges energies above 5000 J, with an expected frequency output spanning the range from 200 to 2000 Hz. A suitable processing flow was implemented to enhance the signal-to-noise ratio of the seismic profile. We found that the shallow sedimentary layers at the study site have a complex pattern of reflectivity, which decays significantly due to the amount of source energy used as well as the multiples associated with the seafloor. In fact, the results reveal that single-channel marine seismic surveying in shallow water is a cost-effective technique that can be easily repeated to observe any possible changes in the physical properties of the near-surface layers.
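
A minimal scipy sketch of the band-pass stage of such a processing flow (synthetic trace, illustrative sampling rate, and corner frequencies matching the 200-2000 Hz source band; not the survey data):

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 8000                                   # assumed sampling rate (Hz)
t = np.arange(0, 0.5, 1 / fs)
rng = np.random.default_rng(0)

clean = np.sin(2 * np.pi * 800 * t)         # stand-in for a reflection arrival
trace = clean + 0.8 * rng.normal(size=t.size)

# Zero-phase Butterworth band-pass filter over the 200-2000 Hz source band.
b, a = butter(4, [200, 2000], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, trace)

def snr_db(sig):
    """SNR relative to the known synthetic signal, in dB."""
    return 10 * np.log10(np.var(clean) / np.var(sig - clean))

print(f"SNR before: {snr_db(trace):.1f} dB, after: {snr_db(filtered):.1f} dB")
```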

Keywords: shallow marine single-channel data, high resolution, frequency filtering, shallow water

Procedia PDF Downloads 72
4243 A Study of Using Different Printed Circuit Board Design Methods on Ethernet Signals

Authors: Bahattin Kanal, Nursel Akçam

Abstract:

Data transmission size and frequency requirements are increasing rapidly in electronic communication protocols, and increasing data transmission speeds have made the design of printed circuit boards much more important. It is important to carefully examine the requirements and perform analyses before and after the design of a digital electronic circuit board. This paper delves into impedance matching techniques, signal trace routing considerations, and the impact of layer stacking on signal performance. It extensively explores techniques for minimizing crosstalk and interference, presenting a holistic perspective on design strategies to optimize the quality of high-speed signals. Through a comprehensive review of these design methodologies, this study aims to provide insights into achieving reliable and high-performance printed circuit board layouts for these signals. In this study, the effect of different design methods on Ethernet signals was examined in terms of S-parameters. The Siemens HyperLynx software tool was used for the analyses.

Keywords: HyperLynx, printed circuit board, s parameters, ethernet

Procedia PDF Downloads 34
4242 The Role of Structured Input in PI in the Acquisition of English Relative Clauses by L1 Saudi Arabic Speakers

Authors: Faraj Alhamami

Abstract:

The effects of classroom input through structured input activities have been addressed along two main lines of inquiry: (1) measuring the effects of structured input activities as a possible causative factor of PI, and (2) comparing structured input practice with other types of instruction or no-training controls. In this line of research, the main purpose of this classroom-based study was to establish which type of activity is the most effective in processing instruction: the explicit information component with referential activities only, the explicit information component with affective activities only, or a combination of the two. The instruments were: a) a grammaticality judgment task, b) a picture-cued task, and c) a translation task, administered as pre-tests, post-tests, and delayed post-tests seven weeks after the intervention. While testing is ongoing, preliminary results show that all five groups - the processing instruction group with both activities (RA), the traditional group (TI), the referential group (R), the affective group (A), and the control group - performed at a comparable chance or baseline level on the pre-test across the three outcome measures. However, at the post-test stage, the RA, TI, R, and A groups demonstrated significant improvement compared to the control group in all tasks. Furthermore, a significant difference was observed among the PI groups (RA, R, and A) at post-test and delayed post-test on some of the tasks when compared to the traditional group. Therefore, the findings suggest that the sole application and/or the combination of structured input activities has succeeded in helping Saudi learners of English make initial form-meaning connections and acquire RRCs in both the short and the long term.

Keywords: input processing, processing instruction, MOGUL, structure input activities

Procedia PDF Downloads 79
4241 Comparing the Durability of Saudi Silica Sands for Use in Foundry Processing

Authors: Mahdi Alsagour, Sam Ramrattan

Abstract:

This paper investigates two types of sand from the Kingdom of Saudi Arabia (KSA) for potential use in the global metal casting industry. Four types of sand were selected for study; two of the sand systems investigated are natural sands from the KSA, the third is a heat-processed synthetic sand, and the last is a commercially available US silica sand used as a control in the study. The purpose of this study is to define the durability of the four sand systems selected for foundry use. Additionally, chemical analysis of the sand systems is presented before and after elevated-temperature exposure. The results show that the Saudi silica sands are durable and can be used in foundry processing.

Keywords: alternative molding media, foundry sand, reclamation, silica sand, specialty sand

Procedia PDF Downloads 138