Search results for: time series feature extraction
19506 Quantum Cum Synaptic-Neuronal Paradigm and Schema for Human Speech Output and Autism
Authors: Gobinathan Devathasan, Kezia Devathasan
Abstract:
Objective: To improve the current modified Broca-Wernicke-Lichtheim-Kussmaul speech schema and provide insight into autism. Methods: We reviewed the pertinent literature. Current findings, involving Brodmann areas 22, 46, 9, 44, 45, 6, and 4, are based on neuropathology and functional MRI studies. However, in primary autism, there is no lucid explanation, and the changes described, whether in neuropathology or functional MRI, appear consequential. Findings: We forward an enhanced model which may explain the enigma related to autism. Vowel output is subcortical and does not need cortical representation, whereas consonant speech is cortical in origin. Left lateralization is needed to commence the circuitry spin, as life has evolved with L-amino acids and the left spin of electrons. A fundamental species difference is that we are capable of three syllable-consonants and bi-syllable expression, whereas cetaceans and songbirds are confined to single or dual consonants. The four key sites for speech are the superior auditory cortex, Broca's two areas, and the supplementary motor cortex. Using the Argand diagram and the Riemann projection, we theorize that the Euclidean three-dimensional synaptic neuronal circuits of speech are quantized to coherent waves, and then decoherence takes place at area 6 (spherical representation). In this quantum state complex, 3-consonant languages are instantaneously integrated and multiple languages can be learned, verbalized and differentiated. Conclusion: We postulate that evolutionary human speech is elevated to quantum interaction, unlike cetaceans and birds, to achieve the three-consonant/bi-syllable speech. In classical primary autism, the sudden speech switches off and on noted in several cases could now be explained not by any anatomical lesion but by a failure of coherence. Area 6 projects directly into the prefrontal saccadic area (8), and this further explains the second primary feature in autism: lack of eye contact. The third feature, repetitive finger gestures, located adjacent to the speech/motor areas, represents actual attempts to communicate with the autistic child, akin to sign language for the deaf. Keywords: quantum neuronal paradigm, cetaceans and human speech, autism and rapid magnetic stimulation, coherence and decoherence of speech
Procedia PDF Downloads 193
19505 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition
Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can
Abstract:
To effectively combat climate change, many countries around the world have committed to the decarbonisation of their electricity sectors, along with promoting a large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity to effectively combat climate change, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the necessary network infrastructure to supply the projected demand in a cost-efficient way, considering the evolution of the new generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty in the power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must be considered within the TNEP as well, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially for their application in realistic-sized power system models. To meet these challenges, there is an increasing need for developing efficient algorithms capable of solving the TNEP problem with reasonable computational time and resources. In this regard, a promising research area is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems, such as the TNEP. In particular, the use of AI along with mathematical optimization strategies based on decomposition has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem. The algorithm combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method. One of the challenges of using Column Generation for solving the TNEP problem is that the subproblems are of mixed-integer nature, and therefore solving them requires significant amounts of time and resources. Hence, in this proposal we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the value of the binary variables, based on the results obtained from the linearized version. A key feature of the proposal is that we integrate the binary classifier into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% for estimating the value of the binary variables. Since the linearly relaxed version of the subproblems can be solved in significantly less time than the integer programming counterpart, the integration of the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required for solving the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier technique and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as in other power system models. Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning
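The classifier step described in this abstract can be pictured with a short sketch. The abstract does not give implementation details, so the feature set, the model choice (a random forest), and the synthetic training data below are assumptions rather than the authors' method; the only fixed idea is that the classifier maps results of the linearly relaxed subproblem to values of the binary variables, while the surrounding Column Generation master still certifies optimality.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic rows standing in for solved, linearly relaxed subproblems:
# each row describes one candidate line in one scenario/year.
n_samples = 5000
relaxed_value = rng.uniform(0.0, 1.0, n_samples)      # fractional build variable from the LP relaxation
line_utilisation = rng.uniform(0.0, 1.2, n_samples)   # flow on the candidate line / its capacity
reduced_cost = rng.normal(0.0, 1.0, n_samples)        # reduced cost reported by the relaxed subproblem
X = np.column_stack([relaxed_value, line_utilisation, reduced_cost])

# Hypothetical labels: what the mixed-integer subproblem would have decided.
y = (relaxed_value + 0.3 * line_utilisation - 0.2 * reduced_cost
     + rng.normal(0.0, 0.2, n_samples) > 0.7).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("hold-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))

def fix_binaries(relaxed_features):
    """Map LP-relaxed subproblem features to tentative 0/1 build decisions.

    In a Column Generation loop these predictions would only seed a new column;
    optimality is still certified by the master problem, as the abstract notes.
    """
    return clf.predict(relaxed_features)
```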
Procedia PDF Downloads 83
19504 Reducing Component Stress during Encapsulation of Electronics: A Simulative Examination of Thermoplastic Foam Injection Molding
Authors: Constantin Ott, Dietmar Drummer
Abstract:
The direct encapsulation of electronic components is an effective way of protecting components against external influences. In addition to achieving a sufficient protective effect, there are two other big challenges for satisfying the increasing demand for encapsulated circuit boards. The encapsulation process should be both suitable for mass production and offer a low component load. Injection molding is a method with good suitability for large series production but also with typically high component stress. In this article, two aims were pursued: first, the development of a calculation model that allows an estimation of the occurring forces based on process variables and material parameters; second, the evaluation of a new approach for stress reduction by means of thermoplastic foam injection molding. For this purpose, simulation-based process data was generated with the Moldflow simulation tool. Based on this, component stresses were calculated with the calculation model. In this way, the paper provides a model for estimating the forces occurring during overmolding and derives a solution method for reducing these forces. The suitability of this approach was clearly demonstrated, and a significant reduction in shear forces during overmolding was achieved. It was possible to demonstrate a process development that makes it possible to meet the two main requirements of direct encapsulation in addition to a high protective effect. Keywords: encapsulation, stress reduction, foam-injection-molding, simulation
Procedia PDF Downloads 124
19503 Ankle Fracture Management: A Unique Cross Departmental Quality Improvement Project
Authors: Langhit Kurar, Loren Charles
Abstract:
Introduction: In light of the recently published BOAST 12 guidance (August 2016) on the management of ankle fractures, the project aimed to highlight key discrepancies throughout the care trajectory from admission to point of discharge at a district general hospital. A wide breadth of data covering three key domains (accident and emergency, radiology, and orthopaedic surgery) was subsequently stratified, and recommendations on note documentation and outpatient follow-up were made. Methods: A retrospective twelve-month audit was conducted reviewing the results of ankle fracture management in 37 patients. The inclusion criterion was all patients seen at the Darent Valley Hospital (DVH) emergency department with radiographic evidence of an ankle fracture. The exclusion criteria were patients managed solely by nursing staff or those who had sustained a purely ligamentous injury. Medical notes, including discharge summaries, and the PACS online radiographic tool were used for data extraction. Results: Cross-examination of the A & E domain revealed limited awareness of the recent BOAST 12 publication, including the requirements to document skin integrity and neurovascular assessment. This had direct implications, as this would have changed the surgical plan for acutely compromised patients. The majority of results obtained from the radiographic domain were satisfactory, with appropriate X-rays taken in over 95% of cases. However, due to time pressures within A & E, patients were often left in a backslab without a post-manipulation X-ray. Poorly reduced fractures were subsequently left for a long period, resulting in swollen ankles and a time-dependent lag to surgical intervention. This had knock-on implications for prolonged inpatient stay, resulting in hospital-acquired co-morbidity including pressure sores. Discussion: The audit has highlighted several areas for improvement throughout the disease trajectory, from review in the emergency department to follow-up as an outpatient. This has prompted the creation of an algorithm to ensure patients with significant fractures presenting to the emergency department are seen promptly and treatment is expedited as per recent guidance. This includes the timing of X-rays taken in A & E. Re-audit has shown significant improvement in both documentation at the time of presentation and appropriate follow-up strategies. Within the orthopaedic domain, we are in the process of creating an ankle fracture pathway to ensure imaging and weight-bearing status are made clear to the consulting clinicians in an outpatient setting. Significance/Clinical Relevance: As a result of the ankle fracture algorithm, we have adapted the BOAST 12 guidance to shape an intrinsic pathway to not only improve patient management within the emergency department but also create a standardised format for follow-up. Keywords: ankle, fracture, BOAST, radiology
Procedia PDF Downloads 178
19502 An Architecture of Ingenuity and Empowerment
Authors: Timothy Gray
Abstract:
This paper will present work and discuss lessons learned during a semester-long travel study based in Southeast Asia, which was run in the Spring Semester of 2019 and again in the summer of 2023. The first travel group consisted of fifteen students, and the second group consisted of twelve students ranging from second-year to graduate level, student participants majoring in either architecture or planning. Students worked in interdisciplinary teams, each team beginning their travel study, living together in a separate small town for over a month in (relatively) remote conditions in rural Thailand. Students became intimately familiar with these towns, forged strong personal relationships, and built reservoirs of knowledge one conversation at a time. Rather than impose external ideas and solutions, students were asked to learn from and be open to lessons from the people and the place. The following design statement was used as a point of departure for their investigations: It is our shared premise that architecture exists in small villages and towns of Southeast Asia in the ingenuity of the people, that architecture exists in a shared language of making, modifying, and reusing. It is a modest but vibrant architecture, an architecture that is alive and evolving, an architecture that is small in scale, accessible, and one that emerges from the people. It is an architecture that can exist in a modified bicycle, a woven bamboo bridge, or a self-built community. Students were challenged to engage in existing conditions as design professionals, both empowering and lending coherence to the energies that already existed in the place. As one of the student teams noted in their design narrative: “During our field study, we had the unique opportunity to tour a number of informal settlements and meet and talk to residents through interpreters. We found that many of the residents work in nearby factories for dollars a day. Others find employment in self-generated informal economies such as hand carving and textiles. Despite extreme poverty, we found these places to be vibrant and full of life as people navigate these challenging conditions to live lives with purpose and dignity.” Students worked together with local community members and colleagues to develop a series of varied proposals that emerged from their interrogations of place and partnered with community members and professional colleagues in the development of these proposals. Project partners included faculty and student colleagues Yangon University, the mayor's Office, Planning Department Officials and religious leaders in Sawankhalok, Thailand, and community leaders in Natonchan, Thailand, to name a few. This paper will present a series of student community-based design projects that emerged from these conditions. The paper will also discuss this model of travel study as a way of building an architecture which uses social and cultural issues as a catalyst for design. The paper will discuss lessons relative to sustainable development that the Western students learned through their travels in Southeast Asia.Keywords: travel study, CAPasia, architecture of empowerment, modular housing
Procedia PDF Downloads 47
19501 Numerical Experiments for the Purpose of Studying Space-Time Evolution of Various Forms of Pulse Signals in the Collisional Cold Plasma
Authors: N. Kh. Gomidze, I. N. Jabnidze, K. A. Makharadze
Abstract:
The influence of plasma inhomogeneities and statistical characteristics on signal propagation is highly relevant to wireless communication systems. While propagating in the medium, the signal is deformed and evolves in time and space, so a deformed signal arrives at the receiver. The present article is dedicated to studying the space-time evolution of rectangular, sinusoidal, exponential and bi-exponential impulses via numerical experiment in the collisional, cold plasma. The presented method is not based on the Fourier representation of the signal. Analytically, we have obtained a general expression depicting the space-time evolution of the radio impulse amplitude, which gives an opportunity to analyze concrete results in the case of the primary impulse. Keywords: collisional, cold plasma, rectangular pulse signal, impulse envelope
Procedia PDF Downloads 382
19500 Multi-Temporal Cloud Detection and Removal in Satellite Imagery for Land Resources Investigation
Authors: Feng Yin
Abstract:
Clouds are inevitable contaminants in optical satellite imagery and prevent satellite imaging systems from acquiring a clear view of the Earth's surface. The presence of clouds in satellite imagery has a negative influence on remote sensing for land resources investigation. As a consequence, detecting the locations of clouds in satellite imagery is an essential preprocessing step, and further removing the existing clouds is crucial for the application of the imagery. In this paper, a multi-temporal satellite imagery cloud detection and removal method is proposed, which will be used for large-scale land resource investigation. The proposed method is mainly composed of four steps. First, cloud masks are generated for cloud-contaminated images by single-temporal cloud detection based on multiple spectral features. Then, a cloud-free reference image of the target areas is synthesized by weighted averaging of time-series images in which cloud pixels are ignored. Third, refined cloud detection results are acquired by multi-temporal analysis based on the reference image. Finally, detected clouds are removed via multi-temporal linear regression. The results of a case application in Hubei province indicate that the proposed multi-temporal cloud detection and removal method is effective and promising for large-scale land resource investigation. Keywords: cloud detection, cloud removal, multi-temporal imagery, land resources investigation
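A minimal sketch of the reference-image synthesis and regression-based removal steps is given below. The cloud masks are taken as given (the paper derives them from multiple spectral features), and the equal weighting and simple per-scene linear fit are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def cloud_free_reference(images, cloud_masks, weights=None):
    """Per-pixel weighted average over a time series, ignoring cloudy pixels.

    images: (T, H, W) float array; cloud_masks: (T, H, W) bool, True = cloud.
    """
    images = np.asarray(images, dtype=float)
    valid = ~np.asarray(cloud_masks, dtype=bool)
    if weights is None:
        weights = np.ones(images.shape[0])
    w = np.asarray(weights)[:, None, None] * valid
    num = np.sum(w * np.where(valid, images, 0.0), axis=0)
    den = np.sum(w, axis=0)
    return np.where(den > 0, num / np.maximum(den, 1e-12), np.nan)

def fill_clouds_by_regression(target, cloud_mask, reference):
    """Replace cloudy pixels using a linear fit target ~ a*reference + b,
    estimated from the clear pixels of the same scene."""
    clear = (~cloud_mask) & np.isfinite(reference)
    a, b = np.polyfit(reference[clear], target[clear], deg=1)
    filled = target.astype(float).copy()
    fill = cloud_mask & np.isfinite(reference)
    filled[fill] = a * reference[fill] + b
    return filled

# Minimal usage on synthetic data (6 scenes, ~20% cloud cover each)
rng = np.random.default_rng(1)
truth = rng.uniform(0.0, 1.0, (64, 64))
series = truth[None] + rng.normal(0.0, 0.02, (6, 64, 64))
masks = rng.uniform(size=(6, 64, 64)) < 0.2
reference = cloud_free_reference(series, masks)
restored = fill_clouds_by_regression(series[0], masks[0], reference)
```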
Procedia PDF Downloads 276
19499 The Perception and Integration of Lexical Tone and Vowel in Mandarin-speaking Children with Autism: An Event-Related Potential Study
Authors: Rui Wang, Luodi Yu, Dan Huang, Hsuan-Chih Chen, Yang Zhang, Suiping Wang
Abstract:
Enhanced discrimination of pure tones but diminished discrimination of speech pitch (i.e., lexical tone) were found in children with autism who speak a tonal language (Mandarin), suggesting a speech-specific impairment of pitch perception in these children. However, in tonal languages, both lexical tone and vowel are phonemic cues and integrally dependent on each other. Therefore, it is unclear whether the presence of phonemic vowel dimension contributes to the observed lexical tone deficits in Mandarin-speaking children with autism. The current study employed a multi-feature oddball paradigm to examine how vowel and tone dimensions contribute to the neural responses for syllable change detection and involuntary attentional orienting in school-age Mandarin-speaking children with autism. In the oddball sequence, syllable /da1/ served as the standard stimulus. There were three deviant stimulus conditions, representing tone-only change (TO, /da4/), vowel-only change (VO, /du1/), and change of tone and vowel simultaneously (TV, /du4/). EEG data were collected from 25 children with autism and 20 age-matched normal controls during passive listening to the stimulation. For each deviant condition, difference waveform measuring mismatch negativity (MMN) was derived from subtracting the ERP waveform to the standard sound from that to the deviant sound for each participant. Additionally, the linear summation of TO and VO difference waveforms was compared to the TV difference waveform, to examine whether neural sensitivity for TV change detection reflects simple summation or nonlinear integration of the two individual dimensions. The MMN results showed that the autism group had smaller amplitude compared with the control group in the TO and VO conditions, suggesting impaired discriminative sensitivity for both dimensions. In the control group, amplitude of the TV difference waveform approximated the linear summation of the TO and VO waveforms only in the early time window but not in the late window, suggesting a time course from dimensional summation to nonlinear integration. In the autism group, however, the nonlinear TV integration was already present in the early window. These findings suggest that speech perception atypicality in children with autism rests not only in the processing of single phonemic dimensions, but also in the dimensional integration process.Keywords: autism, event-related potentials , mismatch negativity, speech perception
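The difference-wave arithmetic described here, MMN as deviant minus standard and the comparison of the TV difference wave against the linear sum of TO and VO, can be sketched as follows; the epoch arrays, time windows, and synthetic data are placeholders, not the authors' preprocessing pipeline.

```python
import numpy as np

def erp(epochs):
    """Average single-trial epochs of shape (n_trials, n_times) into an ERP."""
    return np.mean(epochs, axis=0)

def mismatch_negativity(deviant_epochs, standard_epochs):
    """MMN difference wave: ERP(deviant) minus ERP(standard)."""
    return erp(deviant_epochs) - erp(standard_epochs)

def mean_amplitude(wave, times, t_start, t_end):
    win = (times >= t_start) & (times < t_end)
    return wave[win].mean()

# Synthetic epochs: 100 trials x 200 samples spanning -100..400 ms
times = np.linspace(-0.1, 0.4, 200)
rng = np.random.default_rng(2)
standard, to_dev, vo_dev, tv_dev = (rng.normal(0.0, 1e-6, (100, 200)) for _ in range(4))

mmn_to = mismatch_negativity(to_dev, standard)
mmn_vo = mismatch_negativity(vo_dev, standard)
mmn_tv = mismatch_negativity(tv_dev, standard)
linear_sum = mmn_to + mmn_vo   # the TO + VO summation compared against TV

# Early vs. late windows, as in the summation-versus-integration analysis
for t0, t1 in [(0.10, 0.20), (0.20, 0.30)]:
    print(f"{t0:.2f}-{t1:.2f} s  TV: {mean_amplitude(mmn_tv, times, t0, t1):+.2e}"
          f"  TO+VO: {mean_amplitude(linear_sum, times, t0, t1):+.2e}")
```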
Procedia PDF Downloads 216
19498 On Flexible Preferences for Standard Taxis, Electric Taxis, and Peer-to-Peer Ridesharing
Authors: Ricardo Daziano
Abstract:
In the analysis and planning of the mobility ecosystem, preferences for ride-hailing over incumbent street-hailing services need better understanding. In this paper, a seminonparametric discrete choice model that allows for flexible preference heterogeneity is fitted with data from a discrete choice experiment among adult commuters in Montreal, Canada (N=760). Participants chose among Uber, Teo (a local electric ride-hailing service that was in operation when data was collected in 2018), and a standard taxi when presented with information about cost, time (on-trip, waiting, walking), powertrain of the car (gasoline/hybrid) for Uber and taxi, and whether the available electric Teo was a Tesla (which was one of the actual features of the Teo fleet). The fitted flexible model offers several behavioral insights. Waiting time for ride-hailing services is associated with a statistically significant but low marginal disutility. For other time components, including on-ride, and street-hailing waiting and walking the estimates of the value of time show an interesting pattern: whereas in a conditional logit on-ride time reductions are valued higher, in the flexible LML specification means of the value of time follow the expected pattern of waiting and walking creating a higher disutility. At the same time, the LML estimates show the presence of important, multimodal unobserved preference heterogeneity.Keywords: discrete choice, electric taxis, ridehailing, semiparametrics
Procedia PDF Downloads 161
19497 Analyses and Optimization of Physical and Mechanical Properties of Direct Recycled Aluminium Alloy (AA6061) Wastes by ANOVA Approach
Authors: Mohammed H. Rady, Mohd Sukri Mustapa, S Shamsudin, M. A. Lajis, A. Wagiman
Abstract:
The present study is aimed at investigating the microhardness and density of aluminium alloy chips when subjected to various settings of preheating temperature and preheating time. Three values of preheating temperature were taken: 450 °C, 500 °C, and 550 °C. Similarly, three values of preheating time were chosen: 1, 2, and 3 hours. The influences of the process parameters (preheating temperature and time) were analyzed using the Design of Experiments (DOE) approach, whereby a full factorial design with center-point analysis was adopted. The total number of runs was 11, comprising a two-factor full factorial design with 3 center points. The responses were microhardness and density. The results showed that the density and microhardness increased with decreasing preheating temperature. The results also showed that the preheating temperature is more important to control than the preheating time for the microhardness analysis, while both the preheating temperature and preheating time are important in the density analysis. It can be concluded that setting the temperature at 450 °C for 1 hour resulted in the optimum responses. Keywords: AA6061, density, DOE, hot extrusion, microhardness
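The design-and-ANOVA workflow described above can be sketched as follows, assuming a 3 × 3 grid over the stated temperature and time levels plus replicated centre runs to reach 11 runs in total; the response values are placeholders consistent only with the reported trend, not the measured data.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

temps = [450, 500, 550]
times = [1, 2, 3]
runs = [(T, t) for T in temps for t in times]      # 3 x 3 grid = 9 runs
runs += [(500, 2), (500, 2)]                       # assumed extra centre replicates -> 11 runs

df = pd.DataFrame(runs, columns=["temp", "time"])
# Placeholder response, consistent only with the reported trend that
# microhardness rises as preheating temperature falls.
rng = np.random.default_rng(0)
df["microhardness"] = (120.0 - 0.05 * (df["temp"] - 450) - 1.0 * (df["time"] - 1)
                       + rng.normal(0.0, 0.5, len(df)))

model = smf.ols("microhardness ~ temp * time", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))             # main effects + interaction
```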
Procedia PDF Downloads 348
19496 Morphology Feature of Nanostructure Bainitic Steel after Tempering Treatment
Authors: Chih Yuan Chen, Chien Chon Chen, Jin-Shyong Lin
Abstract:
The microstructure characterization of tempered nanocrystalline bainitic steel is investigated in the present study. It is found that two types of plastic relaxation, dislocation debris and nanotwins, occur in the displacive transformation due to the relatively low transformation temperature and high carbon content. Because most carbon atoms are trapped in the dislocations, a high dislocation density can be sustained during the tempering process. More carbides can only be found at higher tempering temperatures due to intense recovery progression. Keywords: nanostructure bainitic steel, tempered, TEM, nano-twin, dislocation debris, accommodation
Procedia PDF Downloads 533
19495 Driver Take-Over Time When Resuming Control from Highly Automated Driving in Truck Platooning Scenarios
Authors: Bo Zhang, Ellen S. Wilschut, Dehlia M. C. Willemsen, Marieke H. Martens
Abstract:
With the rapid development of intelligent transportation systems, automated platooning of trucks is drawing increasing interest for its beneficial effects on safety, energy consumption and traffic flow efficiency. Nevertheless, one major challenge lies in the safe transition of control from the automated system back to the human drivers, especially when they have been inattentive after a long period of highly automated driving. In this study, we investigated driver take-over time after a system-initiated request to leave the platooning system Virtual Tow Bar in a non-critical scenario. 22 professional truck drivers participated in the truck driving simulator experiment, and each was instructed to drive under three experimental conditions before the presentation of the take-over request (TOR): driver ready (drivers were instructed to monitor the road constantly), driver not-ready (drivers were provided with a tablet) and eye-shut. The results showed significantly longer take-over times in both the driver not-ready and eye-shut conditions compared with the driver ready condition. Further analysis revealed hand movement time as the main factor causing the long response time in the driver not-ready condition, while in the eye-shut condition, gaze reaction time also largely influenced the total take-over time. In addition to the differences in means, large individual differences were found, especially in the two conditions in which drivers were not attentive. The importance of a personalized driver readiness predictor for a safe transition is concluded. Keywords: driving simulation, highly automated driving, take-over time, transition of control, truck platooning
Procedia PDF Downloads 251
19494 A Pragmatic Approach of Memes Created in Relation to the COVID-19 Pandemic
Authors: Alexandra-Monica Toma
Abstract:
Internet memes are an element of computer mediated communication and an important part of online culture that combines text and image in order to generate meaning. This term, coined by Richard Dawkins, refers to more than a mere way to briefly communicate ideas or emotions, naming instead a complex and intensely perpetuated phenomenon in the virtual environment. This paper approaches memes as a cultural artefact and a virtual trope that mirrors societal concerns and issues, and analyses the pragmatics of their use. Memes have to be analysed in series, usually relating to some image macros, which is proof of the interplay between imitation and creativity in the memes' writing process. We believe that their potential to become viral relates to three key elements: adaptation to context, reference to a successful meme series, and humour (jokes, irony, sarcasm), with various pragmatic functions. The study also uses the concept of multimodality and stresses how the memes' text interacts with the image, discussing three types of relations: symmetry, amplification, and contradiction. Moreover, the paper proves that memes could be employed as speech acts with illocutionary force, when the interaction between text and image is enriched through the connection to a specific situation. The features mentioned above are analysed in a corpus that consists of memes related to the COVID-19 pandemic. This corpus shows them to be highly adaptable to context, which helps build the feeling of connection and belonging in an otherwise tremendously fragmented world. Some of them are created based on well-known image macros, and their humour results from an intricate dialogue between texts and contexts. Memes created in relation to the COVID-19 pandemic can be considered speech acts and are often used as such, as proven in the paper. Consequently, this paper tackles the key features of memes, makes a thorough analysis of the memes' sociocultural, linguistic, and situational context, and emphasizes their intertextuality, with special emphasis on their illocutionary potential. Keywords: context, memes, multimodality, speech acts
Procedia PDF Downloads 197
19493 Improving Temporal Correlations in Empirical Orthogonal Function Expansions for Data Interpolating Empirical Orthogonal Function Algorithm
Authors: Ping Bo, Meng Yunshan
Abstract:
Satellite-derived sea surface temperature (SST) is a key parameter for many operational and scientific applications. However, the disadvantage of SST data is a high percentage of missing data which is mainly caused by cloud coverage. Data Interpolating Empirical Orthogonal Function (DINEOF) algorithm is an EOF-based technique for reconstructing the missing data and has been widely used in oceanographic field. The reconstruction of SST images within a long time series using DINEOF can cause large discontinuities and one solution for this problem is to filter the temporal covariance matrix to reduce the spurious variability. Based on the previous researches, an algorithm is presented in this paper to improve the temporal correlations in EOF expansion. Similar with the previous researches, a filter, such as Laplacian filter, is implemented on the temporal covariance matrix, but the temporal relationship between two consecutive images which is used in the filter is considered in the presented algorithm, for example, two images in the same season are more likely correlated than those in the different seasons, hence the latter one is less weighted in the filter. The presented approach is tested for the monthly nighttime 4-km Advanced Very High Resolution Radiometer (AVHRR) Pathfinder SST for the long-term period spanning from 1989 to 2006. The results obtained from the presented algorithm are compared to those from the original DINEOF algorithm without filtering and from the DINEOF algorithm with filtering but without taking temporal relationship into account.Keywords: data interpolating empirical orthogonal function, image reconstruction, sea surface temperature, temporal filter
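One way to picture the season-aware filtering idea is the following sketch, in which a Laplacian-type smoothing of the temporal covariance matrix down-weights the coupling between consecutive images that fall in different seasons. The weighting scheme, filter strength, and iteration count are illustrative assumptions and not the authors' exact formulation.

```python
import numpy as np

def season(month):
    """Crude meteorological season index (0..3); Dec/Jan/Feb map to 0."""
    return (np.asarray(month) % 12) // 3

def seasonal_weights(months, w_same=1.0, w_diff=0.3):
    """Coupling weight for each pair of consecutive images (length T-1)."""
    months = np.asarray(months)
    same = season(months[:-1]) == season(months[1:])
    return np.where(same, w_same, w_diff)

def filter_temporal_covariance(cov, months, alpha=0.1, n_iter=10):
    """Weighted Laplacian-type smoothing along the time dimension of a T x T covariance."""
    cov = np.array(cov, dtype=float)
    w = seasonal_weights(months)
    for _ in range(n_iter):
        lap = np.zeros_like(cov)
        lap[1:, :] += w[:, None] * (cov[:-1, :] - cov[1:, :])   # neighbour above
        lap[:-1, :] += w[:, None] * (cov[1:, :] - cov[:-1, :])  # neighbour below
        cov = cov + alpha * lap
        cov = 0.5 * (cov + cov.T)   # keep the matrix symmetric
    return cov

# Minimal usage on a synthetic monthly series
T = 36
months = np.arange(T) % 12 + 1
rng = np.random.default_rng(3)
X = rng.normal(size=(T, 500))       # T images flattened to 500 "pixels"
cov = X @ X.T / X.shape[1]
cov_filtered = filter_temporal_covariance(cov, months)
```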
Procedia PDF Downloads 323
19492 Project Time and Quality Management during Construction
Authors: Nahed Al-Hajeri
Abstract:
Time and cost are integral parts of every construction plan and can affect each party's contractual obligations. The performance of both time and cost is usually important to the client and contractor during the project. Almost all construction projects experience time overruns. These time overruns are always expensive for both client and contractor. Construction of any project inside the gathering centers involves complex management skills related to workforce, materials, plant, machinery, new technologies, etc. It also involves many interdependent agencies, such as the vendors and the structural and functional designers, including various types of specialized engineers, and it includes the support of contractors and specialized contractors. This paper mainly highlights the types of construction delays due to which projects suffer time and cost overruns. This paper also discusses the delay causes and factors that contribute to construction sequence delays for oil and gas projects. Construction delay is considered to be one of the recurring problems in construction projects, and it has an adverse effect on project success in terms of time, cost and quality. Some effective methods are identified to minimize delays in construction projects, such as: 1. Site management and supervision, 2. Effective strategic planning, 3. Clear information and communication channels. Our research paper studies the types of delay with some real examples and statistical results and suggests solutions to overcome this problem. Keywords: non-compensable delay, delays caused by force majeure, compensable delay, delays caused by the owner or the owner’s representative, non-excusable delay, delay caused by the contractor or the contractor’s representative, concurrent delay, delays resulting from two separate causes at the same time
Procedia PDF Downloads 239
19491 The Chemical Transport Mechanism of Emitter Micro-Particles in Tungsten Electrode: A Metallurgical Study
Authors: G. Singh, H.Schuster, U. Füssel
Abstract:
The stability of the electric arc and the durability of the electrode tip used in Tungsten Inert Gas (TIG) welding demand a metallurgical study of the chemical transport mechanism of emitter oxide particles in the tungsten electrode under real welding conditions. Tungsten electrodes doped with emitter oxides of rare earth elements, such as La₂O₃, Th₂O₃, Y₂O₃, CeO₂ and ZrO₂, feature a comparatively lower work function than tungsten and thus have superior emission characteristics due to the lower surface temperature of the cathode. The local change in concentration of these emitter particles in the tungsten electrode due to high-temperature diffusion (chemical transport) can change its functional properties, such as electrode temperature, work function, electron emission, and the stability of the electrode tip shape. The resulting increase in tip surface temperature results in electrode material loss. It was also observed that the tungsten recrystallizes into large grains at high temperature. When the grain boundaries are granular in shape, the intergranular diffusion of oxide emitter particles takes more time to reach the electrode surface. In the experimental work, the microstructure of the used electrode's tip surface will be studied by scanning electron microscopy and a reflective X-ray technique in order to gauge the extent of the diffusion and chemical reaction of the emitter particles. Besides, a simulation model is proposed to explain the effect of oxide particle diffusion on the electrode's microstructure, electron emission characteristics, and electrode tip erosion. This model suggests metallurgical modifications to the tungsten electrode to enhance its erosion resistance. Keywords: rare-earth emitter particles, temperature-dependent diffusion, TIG welding, Tungsten electrode
Procedia PDF Downloads 184
19490 Coupled Space and Time Homogenization of Viscoelastic-Viscoplastic Composites
Authors: Sarra Haouala, Issam Doghri
Abstract:
In this work, a multiscale computational strategy is proposed for the analysis of structures, which are described at a refined level both in space and in time. The proposal is applied to two-phase viscoelastic-viscoplastic (VE-VP) reinforced thermoplastics subjected to large numbers of cycles. The main aim is to predict the effective long time response while reducing the computational cost considerably. The proposed computational framework is a combination of the mean-field space homogenization based on the generalized incrementally affine formulation for VE-VP composites, and the asymptotic time homogenization approach for coupled isotropic VE-VP homogeneous solids under large numbers of cycles. The time homogenization method is based on the definition of micro and macro-chronological time scales, and on asymptotic expansions of the unknown variables. First, the original anisotropic VE-VP initial-boundary value problem of the composite material is decomposed into coupled micro-chronological (fast time scale) and macro-chronological (slow time-scale) problems. The former is purely VE, and solved once for each macro time step, whereas the latter problem is nonlinear and solved iteratively using fully implicit time integration. Second, mean-field space homogenization is used for both micro and macro-chronological problems to determine the micro and macro-chronological effective behavior of the composite material. The response of the matrix material is VE-VP with J2 flow theory assuming small strains. The formulation exploits the return-mapping algorithm for the J2 model, with its two steps: viscoelastic predictor and plastic corrections. The proposal is implemented for an extended Mori-Tanaka scheme, and verified against finite element simulations of representative volume elements, for a number of polymer composite materials subjected to large numbers of cycles.Keywords: asymptotic expansions, cyclic loadings, inclusion-reinforced thermoplastics, mean-field homogenization, time homogenization
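The return-mapping step mentioned for the J2 model can be illustrated with a minimal small-strain sketch. The abstract's scheme uses a viscoelastic predictor inside an incrementally affine VE-VP formulation; the version below uses a purely elastic predictor with linear isotropic hardening, so it is a simplification for illustration only, with assumed material constants.

```python
import numpy as np

def radial_return(eps_new, eps_p_old, alpha_old,
                  E=3.0e9, nu=0.35, sigma_y=50.0e6, H=0.5e9):
    """One small-strain J2 return-mapping step with linear isotropic hardening.

    eps_new, eps_p_old: 3x3 total and plastic strain tensors; alpha_old: scalar
    accumulated plastic strain. Returns (stress, eps_p, alpha)."""
    mu = E / (2.0 * (1.0 + nu))              # shear modulus
    kappa = E / (3.0 * (1.0 - 2.0 * nu))     # bulk modulus
    eps_e_trial = eps_new - eps_p_old
    vol = np.trace(eps_e_trial) / 3.0
    dev = eps_e_trial - vol * np.eye(3)
    s_trial = 2.0 * mu * dev                              # trial deviatoric stress
    q_trial = np.sqrt(1.5 * np.tensordot(s_trial, s_trial))
    f_trial = q_trial - (sigma_y + H * alpha_old)         # trial yield function
    if f_trial <= 0.0:                                    # elastic (predictor) step
        return s_trial + 3.0 * kappa * vol * np.eye(3), eps_p_old, alpha_old
    dgamma = f_trial / (3.0 * mu + H)                     # plastic corrector
    n = s_trial / q_trial                                 # radial return direction
    s = s_trial - 3.0 * mu * dgamma * n
    eps_p = eps_p_old + 1.5 * dgamma * n
    alpha = alpha_old + dgamma
    return s + 3.0 * kappa * vol * np.eye(3), eps_p, alpha

# Minimal usage: a single uniaxial-like strain increment from a virgin state
eps = np.diag([0.03, -0.01, -0.01])
sigma, eps_p, alpha = radial_return(eps, np.zeros((3, 3)), 0.0)
print(alpha > 0.0, np.trace(eps_p))    # plastic flow occurred; the flow is deviatoric
```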
Procedia PDF Downloads 368
19489 The Role Collagen VI Plays in Heart Failure: A Tale Untold
Authors: Summer Hassan, David Crossman
Abstract:
Myocardial fibrosis (MF) has been loosely defined as the process occurring in the pathological remodeling of the myocardium due to excessive production and deposition of extracellular matrix (ECM) proteins, including collagen. This reduces tissue compliance and accelerates progression to heart failure, as well as affecting the electrical properties of the myocytes resulting in arrhythmias. Microscopic interrogation of MF is key to understanding the molecular orchestrators of disease. It is well-established that recruitment and stimulation of myofibroblasts result in Collagen deposition and the resulting expansion in the ECM. Many types of Collagens have been identified and implicated in scarring of tissue. In a series of experiments conducted at our lab, we aim to elucidate the role collagen VI plays in the development of myocardial fibrosis and its direct impact on myocardial function. This was investigated through an animal experiment in Rats with Collagen VI knockout diseased and healthy animals as well as Collagen VI wild diseased and healthy rats. Echocardiogram assessments of these rats ensued at four-time points, followed by microscopic interrogation of the myocardium aiming to correlate the role collagen VI plays in myocardial function. Our results demonstrate a deterioration in cardiac function as represented by the ejection fraction in the knockout healthy and diseased rats. This elucidates a potential protective role that collagen-VI plays following a myocardial insult. Current work is dedicated to the microscopic characterisation of the fibrotic process in all rat groups, with the results to follow.Keywords: heart failure, myocardial fibrosis, collagen, echocardiogram, confocal microscopy
Procedia PDF Downloads 81
19488 Towards Update a Road Map Solution: Use of Information Obtained by the Extraction of Road Network and Its Nodes from a Satellite Image
Authors: Z. Nougrara, J. Meunier
Abstract:
In this paper, we present a new approach for extracting roads, the road network and its nodes from satellite images representing regions in Algeria. Our approach is related to our previous research work. It is founded on information theory and mathematical morphology. We therefore have to define objects as sets of pixels and to study the shape of these objects and the relations that exist between them. The main interest of this study is to solve the problem of automatic mapping from satellite images. This study is thus applied so that the geographical representation of the images is as near as possible to reality. Keywords: nodes, road network, satellite image, updating a road map
Procedia PDF Downloads 423
19487 Planning Quality and Maintenance Activities in a Closed-Loop Serial Multi-Stage Manufacturing System under Constant Degradation
Authors: Amauri Josafat Gomez Aguilar, Jean Pierre Kenné
Abstract:
This research presents the development of a self-sustainable manufacturing system from a circular economy perspective, structured by a multi-stage serial production system consisting of a series of machines under deterioration in charge of producing a single product and a reverse remanufacturing system constituted by the same productive systems of the first scheme and different tooling, fed by-products collected at the end of their life cycle, and non-conforming elements of the first productive scheme. Since the advanced production manufacturing system is unable to satisfy the customer's quality expectations completely, we propose the development of a mixed integer linear mathematical model focused on the optimal search and assignment of quality stations and preventive maintenance operation to the machines over a time horizon, intending to segregate the correct number of non-conforming parts for reuse in the remanufacturing system and thereby minimizing production, quality, maintenance, and customer non-conformance penalties. Numerical experiments are performed to analyze the solutions found by the model under different scenarios. The results showed that the correct implementation of a closed manufacturing system and allocation of quality inspection and preventive maintenance operations generate better levels of customer satisfaction and an efficient manufacturing system.Keywords: closed loop, mixed integer linear programming, preventive maintenance, quality inspection
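A toy version of the kind of mixed integer linear program described above can be sketched with PuLP: per machine and period, binary variables choose whether to place a quality inspection station and whether to schedule preventive maintenance, traded off against a penalty on a linearised defect level that grows with constant degradation. All coefficients, the degradation model, and the constraints are illustrative assumptions, not the authors' formulation.

```python
import pulp

machines = range(4)        # serial stages
periods = range(6)         # planning horizon

insp_cost, maint_cost, nc_penalty = 5.0, 12.0, 3.0
degradation_rate = 0.08    # constant degradation per period, reset by maintenance

prob = pulp.LpProblem("quality_and_maintenance_planning", pulp.LpMinimize)
x = pulp.LpVariable.dicts("inspect", (machines, periods), cat="Binary")
m = pulp.LpVariable.dicts("maintain", (machines, periods), cat="Binary")
# linearised "expected non-conforming fraction escaping stage i at period t"
d = pulp.LpVariable.dicts("defect_level", (machines, periods), lowBound=0)

# objective: inspection + maintenance costs + non-conformance penalty
prob += pulp.lpSum(insp_cost * x[i][t] + maint_cost * m[i][t] + nc_penalty * d[i][t]
                   for i in machines for t in periods)

for i in machines:
    for t in periods:
        prev = d[i][t - 1] if t > 0 else 0
        # degradation accumulates unless preventive maintenance resets it (big-M style)
        prob += d[i][t] >= prev + degradation_rate - 1.0 * m[i][t]
        # an inspection station caps the defect level allowed to propagate downstream
        prob += d[i][t] <= 1.0 - 0.5 * x[i][t]

# at most one maintenance crew available per period
for t in periods:
    prob += pulp.lpSum(m[i][t] for i in machines) <= 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.LpStatus[prob.status], pulp.value(prob.objective))
```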
Procedia PDF Downloads 80
19486 Agile Project Management: A Real Application in a Multi-Project Research and Development Center
Authors: Aysegul Sarac
Abstract:
The aim of this study is to analyze the impacts of integrating agile development principles and practices, in particular to reduce project lead time in a multi-project environment. We analyze Arçelik Washing Machine R&D Center in which multiple projects are conducted by shared resources. In the first part of the study, we illustrate the current waterfall model system by using a value stream map. We define all activities starting from the first idea of the project to the customer and measure process time and lead time of projects. In the second part of the study we estimate potential improvements and select a set of these improvements to integrate agile principles. We aim to develop a future state map and analyze the impacts of integrating lean principles on project lead time. The main contribution of this study is that we analyze and integrate agile product development principles in a real multi-project system.Keywords: agile project management, multi project system, project lead time, product development
Procedia PDF Downloads 303
19485 Cross-Cultural Competence Development through 'Learning by Reflection': A Case Study of Chinese International Students Learning through Taking Part-Time Jobs in the UK
Authors: Xin Zhao
Abstract:
The project aims to expand the notion of narrative learning and address the importance of learning by reflection in our learning and teaching context at a British university. Drawing on the key concepts such as development ZPD, transition and reflection-in and –on-action, this project analyses the learning experiences of a small sample of Chinese postgraduate students in a British University, who use part-time job experience to develop cross-cultural communication skills. The project adopts a mixed methods approach. Questionnaires and focus group interviews are used to examine the way in which students adapt (or not adapt) to the culture of learning in a British university and develop a renewed sense of self in transitions from one culture to the other. The project also looks at how the students appropriate opportunities for learning not just from classrooms but outside classrooms from everyday encounters. The project aims to address the implication of learning by reflection as development in transition. Time in and for learning, or duration, is taken for granted in theorising narrative learning. The project shall explore this very issue of time in relation to learning by reflection in considering time in/of/for learning as duration.Keywords: cross-cultural competence, learning by refection, international student transition, part-time work experience
Procedia PDF Downloads 181
19484 Evaluation of Hand Grip Strength and EMG Signal on Visual Reaction
Authors: Sung-Wook Shin, Sung-Taek Chung
Abstract:
Hand grip strength has been utilized as an indicator to evaluate the motor ability of the hands, which are responsible for performing multiple body functions. It is, however, difficult to evaluate other factors (other than hand muscular strength) utilizing the hand grip strength only. In this study, we analyzed the motor ability of the hands using EMG and the hand grip strength simultaneously, in order to evaluate concentration, muscular strength reaction time, instantaneous muscular strength change, and agility in response to visual reaction. In the results, the average muscular strength reaction times (and their standard deviations) for the EMG signal and the hand grip strength were found to be 209.6 ± 56.2 ms and 354.3 ± 54.6 ms, respectively. In addition, the onset time, which represents the acceleration time to reach 90% of maximum hand grip strength, was 382.9 ± 129.9 ms. Keywords: hand grip strength, EMG, visual reaction, endurance
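The timing measures reported above can be sketched with a short script: the reaction time as the first threshold crossing after the visual cue, and the onset time as the interval needed to climb from that point to 90% of the maximum grip force. The sampling rate, thresholds, and synthetic force trace are assumptions.

```python
import numpy as np

def reaction_and_onset_time(force, fs, cue_index, start_frac=0.05, target_frac=0.9):
    """Return (reaction_ms, onset_ms) from a sampled grip-force trace."""
    force = np.asarray(force, dtype=float)
    peak = force.max()
    after = force[cue_index:]
    start = cue_index + np.argmax(after >= start_frac * peak)    # first detectable response
    target = cue_index + np.argmax(after >= target_frac * peak)  # reaches 90% of maximum
    reaction_ms = 1000.0 * (start - cue_index) / fs
    onset_ms = 1000.0 * (target - start) / fs
    return reaction_ms, onset_ms

# Synthetic example: 1 kHz sampling, visual cue at 0.5 s, sigmoid grip build-up
fs, cue = 1000, 500
t = np.arange(0.0, 2.0, 1.0 / fs)
force = 300.0 / (1.0 + np.exp(-(t - 0.85) * 15.0))   # newtons, illustrative
print(reaction_and_onset_time(force, fs, cue))
```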
Procedia PDF Downloads 460
19483 Number of Parametrization of Discrete-Time Systems without Unit-Delay Element: Single-Input Single-Output Case
Authors: Kazuyoshi Mori
Abstract:
In this paper, we consider the parametrization of discrete-time systems without the unit-delay element within the framework of the factorization approach. In the parametrization, we investigate the number of required parameters. We consider single-input single-output systems in this paper. Through this investigation, we find three cases for discrete-time systems without the unit-delay element: (1) there exist plants which require only one parameter, (2) there exist plants which require two parameters, and (3) the number of required parameters is at most three. Keywords: factorization approach, discrete-time system, parameterization of stabilizing controllers, system without unit-delay
Procedia PDF Downloads 238
19482 Time Travel Testing: A Mechanism for Improving Renewal Experience
Authors: Aritra Majumdar
Abstract:
While organizations strive to expand their new customer base, retaining existing relationships is a key aspect of improving overall profitability and also showcasing how successful an organization is in holding on to its customers. It is an experimentally proven fact that the lion’s share of profit always comes from existing customers. Hence seamless management of renewal journeys across different channels goes a long way in improving trust in the brand. From a quality assurance standpoint, time travel testing provides an approach to both business and technology teams to enhance the customer experience when they look to extend their partnership with the organization for a defined phase of time. This whitepaper will focus on key pillars of time travel testing: time travel planning, time travel data preparation, and enterprise automation. Along with that, it will call out some of the best practices and common accelerator implementation ideas which are generic across verticals like healthcare, insurance, etc. In this abstract document, a high-level snapshot of these pillars will be provided. Time Travel Planning: The first step of setting up a time travel testing roadmap is appropriate planning. Planning will include identifying the impacted systems that need to be time traveled backward or forward depending on the business requirement, aligning time travel with other releases, frequency of time travel testing, preparedness for handling renewal issues in production after time travel testing is done and most importantly planning for test automation testing during time travel testing. Time Travel Data Preparation: One of the most complex areas in time travel testing is test data coverage. Aligning test data to cover required customer segments and narrowing it down to multiple offer sequencing based on defined parameters are keys for successful time travel testing. Another aspect is the availability of sufficient data for similar combinations to support activities like defect retesting, regression testing, post-production testing (if required), etc. This section will talk about the necessary steps for suitable data coverage and sufficient data availability from a time travel testing perspective. Enterprise Automation: Time travel testing is never restricted to a single application. The workflow needs to be validated in the downstream applications to ensure consistency across the board. Along with that, the correctness of offers across different digital channels needs to be checked in order to ensure a smooth customer experience. This section will talk about the focus areas of enterprise automation and how automation testing can be leveraged to improve the overall quality without compromising on the project schedule. Along with the above-mentioned items, the white paper will elaborate on the best practices that need to be followed during time travel testing and some ideas pertaining to accelerator implementation. To sum it up, this paper will be written based on the real-time experience author had on time travel testing. While actual customer names and program-related details will not be disclosed, the paper will highlight the key learnings which will help other teams to implement time travel testing successfully.Keywords: time travel planning, time travel data preparation, enterprise automation, best practices, accelerator implementation ideas
Procedia PDF Downloads 158
19481 FLIME - Fast Low Light Image Enhancement for Real-Time Video
Authors: Vinay P., Srinivas K. S.
Abstract:
Low light image enhancement is of utmost importance in computer vision based tasks. Applications include vision systems for autonomous driving, night vision devices for defence systems, and low light object detection tasks. Many of the existing deep learning methods are resource intensive during the inference step and take considerable time for processing. The algorithm should take considerably less than 41 milliseconds in order to process a real-time video feed with 24 frames per second, and even less for a video with 30 or 60 frames per second. The paper presents a fast and efficient solution which has two main advantages: it has the potential to be used for a real-time video feed, and it can be used in low compute environments because of its lightweight nature. The proposed solution is a pipeline of three steps: the first is the use of a simple function to map input RGB values to output RGB values, the second is to balance the colors, and the final step is to adjust the contrast of the image. Accordingly, a custom dataset is carefully prepared using images taken in low and bright lighting conditions. The preparation of the dataset, the proposed model, and the processing time are discussed in detail, and the quality of the enhanced images using different methods is shown. Keywords: low light image enhancement, real-time video, computer vision, machine learning
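The three-step pipeline outlined above can be sketched as follows. The abstract does not specify the actual RGB mapping function, so a power-law (gamma) curve, a grey-world colour balance, and a percentile-based contrast stretch are used as stand-ins; all three are cheap pointwise operations, in line with the stated real-time budget.

```python
import numpy as np

def map_rgb(img, gamma=0.45):
    """Step 1: brighten dark pixels with a simple power-law mapping (assumed form)."""
    return np.power(np.clip(img, 0.0, 1.0), gamma)

def balance_colors(img):
    """Step 2: grey-world colour balance."""
    means = img.reshape(-1, 3).mean(axis=0)
    return np.clip(img * (means.mean() / np.maximum(means, 1e-6)), 0.0, 1.0)

def adjust_contrast(img, low_pct=1.0, high_pct=99.0):
    """Step 3: linear stretch between low/high intensity percentiles."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    return np.clip((img - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

def enhance(img):
    """Full pipeline on a float RGB frame in [0, 1]."""
    return adjust_contrast(balance_colors(map_rgb(img)))

# Usage on one synthetic dark frame
frame = np.random.default_rng(4).uniform(0.0, 0.15, (480, 640, 3))
enhanced = enhance(frame)
```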
Procedia PDF Downloads 201
19480 Numerical Investigation of Dynamic Stall over a Wind Turbine Pitching Airfoil by Using OpenFOAM
Authors: Mahbod Seyednia, Shidvash Vakilipour, Mehran Masdari
Abstract:
Computations for two-dimensional flow past a stationary and a harmonically pitching wind turbine airfoil at a moderate value of the Reynolds number (400,000) are carried out by progressively increasing the angle of attack for the stationary airfoil and at fixed pitching frequencies for the oscillating one. The incompressible Navier-Stokes equations, in conjunction with the Unsteady Reynolds-Averaged Navier-Stokes (URANS) equations for turbulence modeling, are solved with the OpenFOAM package to investigate the aerodynamic phenomena occurring under stationary and pitching conditions on a NACA 6-series wind turbine airfoil. The aim of this study is to enhance the accuracy of numerical simulation in predicting the aerodynamic behavior of an oscillating airfoil in OpenFOAM. Hence, for turbulence modelling, k-ω-SST with low-Reynolds correction is employed to capture the unsteady phenomena occurring in the stationary and oscillating motion of the airfoil. Using aerodynamic and pressure coefficients along with flow patterns, the unsteady aerodynamics at pre-, near-, and post-static-stall regions are analyzed for the harmonically pitching airfoil, and the results are validated with the corresponding experimental data possessed by the authors. The results indicate that implementing the mentioned turbulence model leads to accurate prediction of the static stall angle for the stationary airfoil, and of flow separation, the dynamic stall phenomenon, and reattachment of the flow on the surface of the airfoil for the pitching one. Due to the geometry of the studied 6-series airfoil, the vortex on the upper surface of the airfoil during upstrokes is formed at the trailing edge. Therefore, the flow pattern obtained from our numerical simulations represents the formation and change of the trailing-edge vortex in the near- and post-stall regions, where this process determines the dynamic stall phenomenon. Keywords: CFD, moderate Reynolds number, OpenFOAM, pitching oscillation, unsteady aerodynamics, wind turbine
Procedia PDF Downloads 200
19479 Development of Quasi Real-Time Comprehensive System for Earthquake Disaster
Authors: Zhi Liu, Hui Jiang, Jin Li, Kunhao Chen, Langfang Zhang
Abstract:
Fast acquisition of seismic information and accurate assessment of the earthquake disaster are the key problems for emergency rescue after a destructive earthquake. In order to meet the requirements of earthquake emergency response and rescue for cities and counties, a quasi real-time comprehensive evaluation system for earthquake disaster is developed. Based on the monitoring data of a Micro-Electro-Mechanical Systems (MEMS) strong motion network, the structure database of a county area, and the real-time disaster information reported via mobile terminals after an earthquake, a fragility analysis method and a dynamic correction algorithm are combined in the developed system. Real-time evaluation of the seismic disaster in the county region is thereby realized, providing a scientific basis for seismic emergency command, rescue, and decision support. Keywords: quasi real-time, earthquake disaster data collection, MEMS accelerometer, dynamic correction, comprehensive evaluation
Procedia PDF Downloads 211
19478 The Optimization of the Parameters for Eco-Friendly Leaching of Precious Metals from Waste Catalyst
Authors: Silindile Gumede, Amir Hossein Mohammadi, Mbuyu Germain Ntunka
Abstract:
Goal 12 of the 17 Sustainable Development Goals (SDGs) encourages sustainable consumption and production patterns. This necessitates achieving the environmentally safe management of chemicals and all wastes throughout their life cycle and the proper disposal of pollutants and toxic waste. Fluid catalytic cracking (FCC) catalysts are widely used in refineries to convert heavy feedstocks to lighter ones. During the refining processes, the catalysts are deactivated and discarded as hazardous toxic solid waste. Spent catalysts (SCs) contain high-cost metals, and the recovery of metals from SCs is a tactical plan for supplying part of the demand for these substances and minimizing the environmental impacts. Leaching, followed by solvent extraction, has been found to be the most efficient method to recover valuable metals with high purity from spent catalysts. However, the use of inorganic acids during the leaching process causes a secondary environmental issue. Therefore, it is necessary to explore other alternative leaching agents that are efficient, economical and environmentally friendly. In this study, the waste catalyst was collected from a domestic refinery and was characterised using XRD, ICP, XRF, and SEM. Response surface methodology (RSM) and a Box-Behnken design were used to model and optimize the influence of some parameters affecting the acidic leaching process. The parameters selected in this investigation were the acid concentration, temperature, and leaching time. From the characterisation results, it was found that the spent catalyst contains high concentrations of vanadium (V) and nickel (Ni); hence, this study focuses on the leaching of Ni and V using a biodegradable acid to eliminate the formation of secondary pollution. Keywords: eco-friendly leaching, optimization, metal recovery, leaching
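The response-surface workflow described above can be sketched as a three-factor Box-Behnken design over acid concentration, temperature, and leaching time followed by a quadratic model fit. The coded-to-real ranges and the recovery values in the sketch are illustrative placeholders, not the experimental data.

```python
import itertools
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def box_behnken_3factor(n_center=3):
    """Coded Box-Behnken design for 3 factors: 12 edge-midpoint runs + centre replicates."""
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0, 0, 0]] * n_center
    return np.array(runs, dtype=float)

coded = box_behnken_3factor()
ranges = {"conc": (0.5, 2.0), "temp": (40.0, 80.0), "time": (1.0, 5.0)}   # assumed M, °C, h
df = pd.DataFrame(coded, columns=list(ranges))
for k, (lo, hi) in ranges.items():
    df[k] = (lo + hi) / 2.0 + df[k] * (hi - lo) / 2.0    # decode to real settings

# Placeholder response (metal recovery, %) so the quadratic fit runs end to end
rng = np.random.default_rng(5)
df["recovery"] = (60.0 + 8.0 * coded[:, 0] + 5.0 * coded[:, 1] + 3.0 * coded[:, 2]
                  - 4.0 * coded[:, 0] ** 2 + rng.normal(0.0, 1.0, len(df)))

quad = smf.ols("recovery ~ conc + temp + time + I(conc**2) + I(temp**2) + I(time**2)"
               " + conc:temp + conc:time + temp:time", data=df).fit()
print(quad.params)
```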
Procedia PDF Downloads 66
19477 Investigation of Doping Effects on Nonradiative Recombination Parameters in Bulk GaAs
Authors: Soufiene Ilahi
Abstract:
We have used photothermal deflection spectroscopy (PTD) to investigate the impact of doping on the electronic properties of bulk GaAs. The extraction of these parameters is performed by fitting the theoretical curves to the experimental PTD ones. We have found that the electron mobility in p-type C-doped GaAs is about 300 cm²/V·s. Accordingly, the minority carrier diffusion length is equal to 5 µm (± 7%), 5 µm (± 4.4%) and 1.42 µm (± 7.2%) for the Cr-, C- and Si-doped GaAs, respectively. The surface recombination velocity varies considerably and is found to be around 7942 m/s, 100 m/s and 153 m/s for GaAs doped with Si, Cr and C, respectively. Keywords: nonradiative lifetime, mobility of minority carrier, diffusion length, surface and interface recombination in GaAs
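The parameter-extraction step (fitting theoretical curves to the experimental PTD ones) can be sketched with a generic least-squares fit. The true PTD model is not reproduced here; the single-exponential decay below is only a placeholder so that the fitting loop runs end to end, and the parameter names and data are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def placeholder_model(t, amplitude, lifetime_us, offset):
    """Stand-in for the theoretical PTD signal; replace with the full model."""
    return amplitude * np.exp(-t / lifetime_us) + offset

t = np.linspace(0.0, 50.0, 200)                       # microseconds, assumed
rng = np.random.default_rng(6)
measured = placeholder_model(t, 1.0, 8.0, 0.05) + rng.normal(0.0, 0.01, t.size)

popt, pcov = curve_fit(placeholder_model, t, measured, p0=[0.5, 5.0, 0.0])
perr = np.sqrt(np.diag(pcov))                         # 1-sigma uncertainties
for name, value, err in zip(["amplitude", "lifetime_us", "offset"], popt, perr):
    print(f"{name} = {value:.3f} ± {err:.3f}")
```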
Procedia PDF Downloads 69