Search results for: radiation processing

3173 Cognitive Deficits and Association with Autism Spectrum Disorder and Attention Deficit Hyperactivity Disorder in 22q11.2 Deletion Syndrome

Authors: Sinead Morrison, Ann Swillen, Therese Van Amelsvoort, Samuel Chawner, Elfi Vergaelen, Michael Owen, Marianne Van Den Bree

Abstract:

22q11.2 Deletion Syndrome (22q11.2DS) is caused by the deletion of approximately 60 genes on chromosome 22 and is associated with high rates of neurodevelopmental disorders such as Attention Deficit Hyperactivity Disorder (ADHD) and Autism Spectrum Disorders (ASD). The presentation of these disorders in 22q11.2DS is reported to be comparable to idiopathic forms, and the syndrome therefore presents a valuable model for understanding mechanisms of neurodevelopmental disorders. Cognitive deficits are thought to be a core feature of neurodevelopmental disorders and possibly manifest in behavioural and emotional problems. Findings in 22q11.2DS have been mixed on whether the presence of ADHD or ASD is associated with greater cognitive deficits. Furthermore, the influence of developmental stage has never been taken into account. The aim was therefore to examine whether the presence of ADHD or ASD was associated with cognitive deficits in childhood and/or adolescence in 22q11.2DS. We conducted the largest study of this kind in 22q11.2DS to date. The same battery of tasks measuring processing speed, attention and spatial working memory was completed by 135 participants with 22q11.2DS. Wechsler IQ tests were completed, yielding Full Scale (FSIQ), Verbal (VIQ) and Performance IQ (PIQ). Age-standardised difference scores were produced for each participant. Developmental stages were defined as childhood (6-10 years) and adolescence (10-18 years). ADHD diagnosis was ascertained from a semi-structured interview with a parent. ASD status was ascertained from a questionnaire completed by a parent. Interaction and main effects of ADHD or ASD diagnosis and developmental stage (childhood or adolescence) on cognitive performance were tested with 2x2 ANOVAs. Significant interactions were followed up with t-tests of simple effects. Adolescents with ASD displayed greater deficits on all measures (processing speed, p = 0.022; sustained attention, p = 0.016; working memory, p = 0.006) than adolescents without ASD; there was no difference between children with and without ASD. There were no significant differences on IQ measures. Both children and adolescents with ADHD displayed greater deficits on sustained attention (p = 0.002) than those without ADHD. There were no significant differences on any other measures for ADHD. The magnitude of cognitive deficit in individuals with 22q11.2DS varied by cognitive domain, developmental stage and presence of a neurodevelopmental disorder. Adolescents with 22q11.2DS and ASD showed greater deficits on all measures, which may indicate a sensitive period for acquiring these domains in childhood, or may reflect increasing social and academic demands in adolescence. The finding of poorer sustained attention in children and adolescents with ADHD supports previous research and suggests a specific deficit that can be separated from processing speed and working memory. This research provides unique insights into the association of ASD and ADHD with cognitive deficits in a group at high genomic risk of neurodevelopmental disorders.
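
As a minimal illustration of the 2x2 ANOVA and simple-effects follow-up described above (a sketch, not the authors' analysis code), the snippet below assumes a tidy data table with hypothetical column names: 'score' for an age-standardised difference score, 'asd' for ASD status, and 'stage' for developmental stage.

```python
# Hedged sketch of a 2x2 ANOVA (ASD status x developmental stage) with a simple-effects t-test.
# Column names and the file name are assumptions for illustration only.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from scipy import stats

df = pd.read_csv("cognition_22q11.csv")  # hypothetical file

# Main effects of ASD and stage plus their interaction on the cognitive score
model = smf.ols("score ~ C(asd) * C(stage)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))

# Follow-up simple effect of ASD within adolescents (run if the interaction is significant)
adol = df[df["stage"] == "adolescent"]
t, p = stats.ttest_ind(adol.loc[adol["asd"] == "yes", "score"],
                       adol.loc[adol["asd"] == "no", "score"])
print(f"Simple effect of ASD in adolescents: t = {t:.2f}, p = {p:.3f}")
```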

Keywords: 22q11.2 deletion syndrome, attention deficit hyperactivity disorder, autism spectrum disorder, cognitive development

Procedia PDF Downloads 136
3172 Integrated Life Skill Training and Executive Function Strategies in Children with Autism Spectrum Disorder in Qatar: A Study Protocol for a Randomized Controlled Trial

Authors: Bara M Yousef, Naresh B Raj, Nadiah W Arfah, Brightlin N Dhas

Abstract:

Background: Executive function (EF) impairment is common in children with autism spectrum disorder (ASD). EF strategies are considered effective in improving the therapeutic outcomes of children with ASD. Aims: This study primarily aims to explore whether EF strategies integrated with regular occupational therapy intervention are more effective in improving daily life skills (DLS) and sensory integration/processing (SI/SP) skills than regular occupational therapy alone in children with ASD; it secondarily aims to assess treatment outcomes for visual motor integration (VMI) skills. Procedures: A total of 92 children with ASD will be recruited and, following baseline assessments, randomly assigned to either the treatment group (45-min once-weekly individual occupational therapy plus EF strategies) or the control group (45-min once-weekly individual therapy sessions alone). Results and Outcomes: All children will be evaluated systematically by assessing SI/SP, DLS, and VMI skills at baseline and after 7 and 14 weeks of treatment. Data will be analyzed using ANCOVA and t-tests. Conclusions and Implications: This single-blind, randomized controlled trial will provide empirical evidence for the effectiveness of EF strategies when combined with regular occupational therapy programs. Based on the trial results, EF strategies could be recommended in multidisciplinary programs for children with ASD. Trial Registration: The trial is registered at ClinicalTrials.gov (protocol ID: MRC-01-22-509; identifier: NCT05829577), registered 25 April 2023.
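
As a minimal sketch of the planned ANCOVA (not the trial's statistical analysis plan), the snippet below assumes hypothetical column names: 'dls_14wk' for the 14-week outcome, 'dls_base' for the baseline covariate, and 'group' for treatment allocation.

```python
# Hedged ANCOVA sketch: group effect on the 14-week outcome, adjusting for the baseline score.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("asd_trial_outcomes.csv")  # hypothetical file

model = smf.ols("dls_14wk ~ C(group) + dls_base", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # F-test for the adjusted group effect
print(model.summary())                  # the C(group) coefficient gives the adjusted group difference
```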

Keywords: autism spectrum disorder, executive function strategies, daily life skills, sensory integration/processing, visual motor integration, occupational therapy, effectiveness

Procedia PDF Downloads 95
3171 Entrepreneurial Orientation and Business Performance: The Case of Micro Scale Food Processors Operating in a War-Recovery Environment

Authors: V. Suganya, V. Balasuriya

Abstract:

The functioning of Micro and Small Scale (MSS) businesses in the northern part of Sri Lanka was made vulnerable by three decades of internal conflict, and the subsequent post-war economic opening has resulted in new market prospects for MSS businesses. MSS businesses survive and operate with limited resources and struggle to access finance, raw materials, markets, and technology. This study attempts to identify the manner in which entrepreneurial orientation is put into practice by business operators to overcome these business challenges. Business operators in the traditional food processing sector are taken for this study, as this sub-sector of the food industry is developing at a rapid pace. A review of the literature was done to clarify the concept of entrepreneurial orientation, the definition of MSS businesses and the manner in which business performance is measured. A direct interview method supported by a structured questionnaire was used to collect data from 80 respondents selected using a fixed-interval random sampling technique. This study reveals that more than half of the business operators opted to commence their business ventures as a result of identifying a market opportunity. Forty-one per cent of the business operators are highly entrepreneurially oriented on a scale of 1 to 5. Entrepreneurial orientation shows a significant relationship with, and is strongly correlated with, business performance. Pro-activeness, innovativeness and competitive aggressiveness show a significant relationship with business performance, while risk taking is negatively related and autonomy is not significantly related to business performance. It is evident that entrepreneurially oriented business practices contribute to better business performance, even though 70 per cent of operators prefer the ideas/views of support agencies over those of stakeholders when making business decisions. It is recommended that appropriate training be introduced to develop entrepreneurial skills, focusing on improving business networks so that new business opportunities and innovative business practices are identified.

Keywords: Micro and Small Scale (MSS) businesses, entrepreneurial orientation (EO), food processing, business operators

Procedia PDF Downloads 476
3170 Study of Radioactivity of Oil and Gas

Authors: Harish Aryal, Thalia Balderas, Alondra Rodriguez

Abstract:

Radioactivity present in nature poses a major challenge to public health and raises occupational concerns. Even at low doses, Naturally Occurring Radioactive Material (NORM) can cause radiation-induced cancers, heritable diseases, genetic defects, etc. There have not been enough radiological studies and, consequently, there is a lack of supportive data. In addition, there is no universal medical surveillance program for low-level doses, and NORM management guidelines are needed for appropriate control. NORM is present throughout oil/gas exploration, yet there are currently limited data available to quantify its radioactivity. This research presents a study of radioactivity in different areas of the United States, intended to encourage further study in Texas or similar areas within the oil and gas industry. Many materials found in the oil and gas industry are NORM, comprising various radionuclides, including radium-226, radium-228, and radon-222. Efforts to characterize the geographic distribution of NORM have been limited by poor statistical representation in this area of study. In addition, the fate of NORM in the environment has not been fully defined, and few human health risk assessments have been conducted. To further comprehend how to measure radioactivity in oil and gas, it will be essential to understand the amount and type of radioactivity released to the water and soil of the industry.

Keywords: NORM, radium 226, radon 222, radionuclides, geological formations

Procedia PDF Downloads 64
3169 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure

Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer

Abstract:

The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed based on video data, aimed at uncovering so-called “multimodal gestalts”, patterns of linguistic and embodied conduct that reoccur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using videos) have so far depended on time- and resource-intensive processes of manually transcribing each component from video materials. Automating these tasks requires advanced programming skills, which are often not within the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, which are suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data. The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and the export to other formats used in the field, while integrating different data source formats in a way that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search of many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (one that can also extend to other fields using video materials). It will allow large amounts of data to be processed automatically and quantitative analyses to be implemented in combination with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken and grammatical information from videos, to correlate those different levels, and to perform queries and analyses.

Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition

Procedia PDF Downloads 92
3168 Influence of Processing Parameters in Selective Laser Melting on the Microstructure and Mechanical Properties of Ti/TiN Composites With in-situ and ex-situ Reinforcement

Authors: C. Sánchez de Rojas Candela, A. Riquelme, P. Rodrigo, M. D. Escalera-Rodríguez, B. Torres, J. Rams

Abstract:

Selective laser melting is one of the most commonly used additive manufacturing (AM) techniques. In it, a thin layer of metallic powder is deposited, and a laser is used to melt selected zones. The accumulation of layers, each one melted in the preselected zones, gives rise to the formation of a 3D sample with a nearly arbitrary design. To ensure that the properties of the final parts match those of the powder, the whole process is carried out in an inert atmosphere, preferentially Ar, although this gas could be substituted. The Ti6Al4V alloy is widely used in multiple industrial applications, such as aerospace, maritime transport and biomedicine, due to its properties. However, owing to the demanding requirements of these applications, greater hardness and wear resistance are necessary, together with better machinability, which currently limits its commercialization. To improve these properties, in this study Selective Laser Melting (SLM) is used to manufacture Ti/TiN metal matrix composites with in-situ and ex-situ titanium nitride reinforcement, where the scanning speed is varied (from 28.5 up to 65 mm/s) to study the influence of the processing parameters in SLM. A one-step method of nitriding the Ti6Al4V alloy is carried out to create in-situ TiN reinforcement in a reactive atmosphere, and it is compared with ex-situ composites manufactured by previously mixing the titanium alloy powder and the ceramic reinforcement particles. The microstructure and mechanical properties of the different Ti/TiN composite materials have been analyzed. As a result, the existence of a similar matrix has been confirmed in the in-situ and ex-situ fabrications, and the growth mechanisms of the nitrides have been studied. An increase in the mechanical properties with respect to the initial alloy has been observed in both cases and related to changes in their microstructure. Specifically, a greater improvement (around 30.65%) has been identified in composites manufactured by the in-situ method at low speeds, although other properties such as porosity must be improved for future industrial applicability.

Keywords: in-situ reinforcement, nitriding reaction, selective laser melting, titanium nitride

Procedia PDF Downloads 69
3167 Examining the Influence of Firm Internal Level Factors on Performance Variations among Micro and Small Enterprises: Evidence from Tanzanian Agri-Food Processing Firms

Authors: Pulkeria Pascoe, Hawa P. Tundui, Marcia Dutra de Barcellos, Hans de Steur, Xavier Gellynck

Abstract:

A majority of Micro and Small Enterprises (MSEs) experience low or no growth. Understanding of their performance remains incomplete and fragmented, as there is no consensus on the factors influencing it, especially in developing countries. Using the Resource-Based View (RBV) as the theoretical background, this cross-sectional study employed four regression models to examine the influence of firm-level factors (firm-specific characteristics, firm resources, manager socio-demographic characteristics, and selected management practices) on the overall performance variations among 442 Tanzanian micro and small agri-food processing firms. The results confirmed the RBV argument that intangible resources make a larger contribution to overall performance variations among firms than tangible resources. Firms' tangible and intangible resources together explained 34.5% of overall performance variations (intangible resources accounted for 19.4% of the overall performance variability, compared to 15.1% for tangible resources), ranking first in explaining the overall performance variance. Firm-specific characteristics ranked second, explaining 29.0% of the variation in overall performance. Selected management practices ranked third (6.3%), while the manager's socio-demographic factors were last on the list, accounting for only 5.1% of the overall performance variability among firms. The study also found that firms that focus on proper utilization of tangible resources (financial and physical), set targets, and undertake better working capital management practices performed better than their counterparts (low and average performers). Furthermore, accumulation and proper utilization of intangible resources (relational, organizational, and reputational), undertaking performance monitoring practices, the age of the manager, and the choice of firm location and activity were the dominant significant factors influencing the variations between average and high performers, relative to low performers. Entrepreneurial background was a significant factor influencing variations between average and low-performing firms, indicating that entrepreneurial skills are crucial to achieving average levels of performance. Firm age, size, legal status, source of start-up capital, gender, education level, and total business experience of the manager were not statistically significant variables influencing the overall performance variations among the agri-food processors under study. The study has identified both significant and non-significant factors influencing performance variations among low-, average-, and high-performing micro and small agri-food processing firms in Tanzania. Therefore, the results from this study will help managers, policymakers and researchers to identify areas where more attention should be placed in order to improve the overall performance of MSEs in the agri-food industry.
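
As a rough illustration of how blocks of firm-level predictors can be compared by the share of performance variation they explain (a sketch under assumed, hypothetical variable names, not the authors' four regression models):

```python
# Hedged sketch: fit one OLS model per predictor block and compare R-squared values.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("agrifood_firms.csv")  # hypothetical file with hypothetical columns

blocks = {
    "firm_characteristics": "firm_age + firm_size + C(legal_status)",
    "tangible_resources": "financial_resources + physical_resources",
    "intangible_resources": "relational + organizational + reputational",
    "management_practices": "target_setting + working_capital_mgmt + perf_monitoring",
}

for name, rhs in blocks.items():
    r2 = smf.ols(f"performance ~ {rhs}", data=df).fit().rsquared
    print(f"{name}: R^2 = {r2:.3f}")  # share of performance variation explained by this block
```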

Keywords: firm-level factors, micro and small enterprises, performance, regression analysis, resource-based-view

Procedia PDF Downloads 68
3166 Investigation of Stabilized Turbulent Diffusion Flames Using Synthesis Fuel with Different Burner Configurations

Authors: Moataz Medhat, Essam Khalil, Hatem Haridy

Abstract:

The present study investigates the flame structure of a turbulent diffusion flame of synthesis fuel in a 300 kW swirl-stabilized burner. The three-dimensional model adopts a realizable k-ε turbulence scheme interacting with a two-dimensional PDF combustion scheme based on the flamelet concept. The study reveals further characteristics of the turbulent diffusion flame of synthesis fuel when the inlet air swirl number and the burner quarl angle are changed. Moreover, it examines the effect of flue gas recirculation and staging, taking radiation into consideration. A comparison with natural gas was also investigated. The study showed two recirculation zones: the primary one is at the center of the furnace, and the location of the secondary one varies with the quarl angle of the burner. The results revealed an increase in temperature in the external recirculation zone as a result of increasing the swirl number of the inlet air stream. It was also found that recirculating part of the combustion products decreases pollutant formation, especially nitrogen monoxide. The predicted results showed good agreement with the experiments.

Keywords: gas turbine, syngas, analysis, recirculation

Procedia PDF Downloads 261
3165 Processing Big Data: An Approach Using Feature Selection

Authors: Nikat Parveen, M. Ananthi

Abstract:

Big data is an emerging technology in which data are collected from various sensors and used in many fields. Data retrieval is one of the major issues, as there is a need to extract exactly the data required. In this paper, a large data set is processed using feature selection. Feature selection helps to choose the data that are actually needed to process and execute the task. The key value is what helps to point out the exact data available in the storage space. Here, the available data are streamed, and R-Center is proposed to achieve this task.
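
A generic feature-selection sketch (not the paper's proposed R-Center method) showing how a large feature set can be reduced to the columns actually needed for a downstream task; the data here are synthetic.

```python
# Hedged sketch: univariate feature selection on synthetic "sensor" data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif

# Synthetic stand-in for streamed sensor records: 10,000 samples with 100 features
X, y = make_classification(n_samples=10_000, n_features=100, n_informative=10, random_state=0)

selector = SelectKBest(score_func=f_classif, k=10)  # keep the 10 most informative features
X_reduced = selector.fit_transform(X, y)

print(X.shape, "->", X_reduced.shape)
print("Selected feature indices:", np.flatnonzero(selector.get_support()))
```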

Keywords: big data, key value, feature selection, retrieval, performance

Procedia PDF Downloads 323
3164 A Case Study Report on Acoustic Impact Assessment and Mitigation of the Hyprob Research Plant

Authors: D. Bianco, A. Sollazzo, M. Barbarino, G. Elia, A. Smoraldi, N. Favaloro

Abstract:

The activities described in the present paper have been conducted in the framework of the HYPROB-New Program, carried out by the Italian Aerospace Research Centre (CIRA) and promoted and funded by the Italian Ministry of University and Research (MIUR) in order to improve the national background on rocket engine systems for space applications. The Program has the strategic objective of improving national system and technology capabilities in the field of liquid rocket engines (LRE) for future space propulsion system applications, with specific regard to LOX/LCH4 technology. The main purpose of the HYPROB program is to design and build a Propulsion Test Facility (HIMP) allowing test activities on liquid thrusters. The development of skills in liquid rocket propulsion can only pass through extensive test campaigns. Following its mission, CIRA has planned the development of new testing facilities and infrastructures for space propulsion, characterized by adequate sizes and instrumentation. The IMP test cell is devoted to testing articles representative of small combustion chambers, fed with oxygen and methane, in both liquid and gaseous phases. This article describes the activities that have been carried out for the evaluation of the acoustic impact and its consequent mitigation. The impact of the simulated acoustic disturbance has been evaluated, first, using an approximate method based on experimental data by Baumann and Coney, included in “Noise and Vibration Control Engineering” edited by Vér and Beranek. This methodology, used to evaluate the free-field radiation of a jet in an ideal acoustic medium, analyzes the jet noise in detail and assumes that the sources act at the same time. It considers the jet mixing noise, caused by the turbulent mixing of the jet gas and the ambient medium, as the principal radiation source. Empirical models allowing a direct calculation of the Sound Pressure Level are commonly used for rocket noise simulation. The model named after K. Eldred is probably one of the most exploited in this area. In this paper, an improvement of the Eldred standard model has been used for a detailed investigation of the acoustic impact of the Hyprob facility. This new formulation contains an explicit expression for the acoustic pressure of each equivalent noise source, in terms of amplitude and phase, allowing the investigation of source correlation effects and their propagation through wave equations. In order to enhance the evaluation of the facility's acoustic impact, including an assessment of the mitigation strategies to be set in place, a more advanced simulation campaign has been conducted using both an in-house code for noise propagation and scattering, and a commercial code for industrial noise environmental impact, CadnaA. The noise prediction obtained with the revised Eldred-based model has then been used to formulate an empirical/BEM (Boundary Element Method) hybrid approach allowing the evaluation of the barrier mitigation effect at the design stage. This approach has been compared with the analogous empirical/ray-acoustics approach, implemented within CadnaA using a customized definition of sources and directivity factors. The resulting impact evaluation study is reported here, along with the design-level barrier optimization for noise mitigation.
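
For orientation only, the sketch below illustrates the kind of empirical estimate that underlies Eldred-type rocket-noise models (overall acoustic power taken as a small acoustic-efficiency fraction of the jet mechanical power, then spread spherically); it is not the revised model used in the paper, and every number is a placeholder rather than facility data.

```python
# Hedged sketch of an Eldred-style order-of-magnitude noise estimate; placeholder inputs only.
import math

m_dot = 1.0      # propellant mass flow rate, kg/s (placeholder)
u_e = 2500.0     # exhaust velocity, m/s (placeholder)
eta = 0.005      # assumed acoustic efficiency (~0.5%, a typical textbook value)

W_mech = 0.5 * m_dot * u_e**2               # jet mechanical power, W
W_acoustic = eta * W_mech                   # radiated acoustic power, W
L_W = 10 * math.log10(W_acoustic / 1e-12)   # sound power level, dB re 1 pW

r = 100.0  # receiver distance, m (placeholder)
# Free-field SPL assuming uniform spherical spreading (no directivity, ground or barrier effects)
L_p = L_W - 10 * math.log10(4 * math.pi * r**2)
print(f"L_W = {L_W:.1f} dB, SPL at {r:.0f} m = {L_p:.1f} dB")
```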

Keywords: acoustic impact, industrial noise, mitigation, rocket noise

Procedia PDF Downloads 130
3163 Verbal Working Memory in Sequential and Simultaneous Bilinguals: An Exploratory Study

Authors: Archana Rao R., Deepak P., Chayashree P. D., Darshan H. S.

Abstract:

Cognitive abilities in bilinguals have been widely studied over the last few decades. Bilingualism has been found to extensively facilitate the ability to store and manipulate information in Working Memory (WM). The mechanism of WM includes primary memory, attentional control, and secondary memory, each of which makes a contribution to WM. Much research has been done in an attempt to measure WM capabilities through both verbal (phonological) and nonverbal (visuospatial) tasks. Since there is much speculation regarding the relationship between WM and bilingualism, further investigation is required to understand the nature of WM in bilinguals, i.e., with respect to sequential and simultaneous bilinguals. Hence, the present study aimed to highlight the verbal working memory abilities of sequential and simultaneous bilinguals with respect to the processing and recall of nouns and verbs. Two groups of bilinguals aged between 18-30 years were considered for the study. Group 1 consisted of 20 (10 males and 10 females) sequential bilinguals who had acquired L1 (Kannada) before the age of 3 and had been exposed to L2 (English) for a period of 8-10 years. Group 2 consisted of 20 (10 males and 10 females) simultaneous bilinguals who had acquired both L1 and L2 before the age of 3. Working memory abilities were assessed using two tasks and a set of stimuli presented in gradations of complexity; the stimuli included frequent and infrequent nouns and verbs. The tasks required the participants to judge the correctness of each sentence while simultaneously remembering its last word, and participants were instructed to recall the words at the end of each set. The results indicated no significant difference between sequential and simultaneous bilinguals in processing nouns and verbs, which could be attributed to the participants' proficiency level in L1 and the similar cognitive abilities of the two groups. Recall of nouns was better than recall of verbs, possibly because of the complex argument structure involved in verbs. Similarly, the frequency of occurrence of nouns and verbs was found to have an effect on WM abilities. Differences were also found across gradations of complexity due to the load imposed on the central executive and the phonological loop.

Keywords: bilinguals, nouns, verbs, working memory

Procedia PDF Downloads 111
3162 LCA/CFD Studies of Artisanal Brick Manufacture in Mexico

Authors: H. A. Lopez-Aguilar, E. A. Huerta-Reynoso, J. A. Gomez, J. A. Duarte-Moller, A. Perez-Hernandez

Abstract:

The environmental performance of artisanal brick manufacture in Mexico was studied using Life Cycle Assessment (LCA) methodology and Computational Fluid Dynamics (CFD) analysis. The main objective of this paper is to evaluate the environmental impact of artisanal brick manufacture. An LCA cradle-to-gate approach was complemented with CFD analysis to carry out an Environmental Impact Assessment (EIA). The life cycle includes the stages of extraction, baking and transportation to the gate. The functional unit of this study was the production of a single brick in Chihuahua, Mexico, and the impact categories studied were carcinogens, respiratory organics and inorganics, climate change, radiation, ozone layer depletion, ecotoxicity, acidification/eutrophication, land use, mineral use and fossil fuels. Laboratory techniques for fuel characterization, in situ gas measurements, and AP-42 emission factors were employed in order to calculate gas emissions for the inventory data. The results revealed that the categories with the greatest impacts are ecotoxicity and carcinogens. The CFD analysis is helpful in predicting the thermal diffusion and contaminants from a defined source. The LCA-CFD synergy complemented the EIA and allowed us to identify the problem of thermal efficiency within the system.
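
As a minimal sketch of the standard AP-42 emission-factor calculation used to build such an inventory, E = A x EF x (1 - ER/100); the activity data and factor below are placeholders, not values from this study.

```python
# Hedged AP-42 style emission estimate; all inputs are illustrative placeholders.
def emission_kg(activity_t, emission_factor_kg_per_t, control_efficiency_pct=0.0):
    """Emissions [kg] from activity data [t of fuel burned] and an emission factor [kg/t]."""
    return activity_t * emission_factor_kg_per_t * (1.0 - control_efficiency_pct / 100.0)

fuel_burned_t = 0.35    # tonnes of fuel per kiln firing (placeholder)
ef_pm_kg_per_t = 9.0    # particulate-matter emission factor, kg per tonne (placeholder)

print(f"PM per firing: {emission_kg(fuel_burned_t, ef_pm_kg_per_t):.2f} kg")
```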

Keywords: LCA, CFD, brick, artisanal

Procedia PDF Downloads 378
3161 Comparison of Tribological and Mechanical Properties of White Metal Produced by Laser Cladding and Conventional Methods

Authors: Jae-Il Jeong, Hoon-Jae Park, Jung-Woo Cho, Yang-Gon Kim, Jin-Young Park, Joo-Young Oh, Si-Geun Choi, Seock-Sam Kim, Young Tae Cho, Chan Gyu Kim, Jong-Hyoung Kim

Abstract:

Bearing components are strongly required to show reduced vibration and wear in order to achieve high durability and long lifetimes. In industry, bearing durability is improved by surface treatment of the bearing surface using centrifugal casting or gravity casting production methods. However, these manufacturing methods cause problems such as long processing times, high defect rates, and harmful health effects. To solve this problem, laser cladding deposition treatment, which provides fast processing and good adhesion, can be used. Therefore, optimum conditions for white metal laser deposition should be studied to minimize bearing contact axis wear using laser cladding techniques. In this study, we deposit a soft white metal layer on SCM440, which is mainly used for shafts and bolts. In the laser deposition process, the laser power, powder feed rate, and laser head speed are controlled to find the optimal conditions. We also measure hardness using a micro-Vickers tester and perform FE-SEM (Field Emission Scanning Electron Microscopy) and EDS (Energy Dispersive Spectroscopy) analyses to study the mechanical properties and surface characteristics as these parameters are varied. Furthermore, this paper suggests optimum laser cladding deposition conditions for application in industrial fields. This work was supported by the Industrial Innovation Project of the Korea Evaluation Institute of Industrial Technology (KEIT), granted financial resources from the Ministry of Trade, Industry & Energy, Republic of Korea (Research no. 10051653).

Keywords: laser deposition, bearing, white metal, mechanical properties

Procedia PDF Downloads 249
3160 Establishment of Precision System for Underground Facilities Based on 3D Absolute Positioning Technology

Authors: Yonggu Jang, Jisong Ryu, Woosik Lee

Abstract:

The study aims to address the limitations of existing underground facility exploration equipment in terms of exploration depth range, relative depth measurement, data processing time, and human-centered interpretation of ground penetrating radar (GPR) images. To this end, it proposes the use of 3D absolute positioning technology to develop a precision underground facility exploration system that can accurately survey up to a depth of 5 m and measure the 3D absolute location of underground facilities. Both software and hardware technologies were developed to build the system. The software technologies include absolute positioning, ground surface location synchronization for the GPR exploration equipment, AI interpretation of GPR exploration images, and composite data processing based on integrated underground space maps. The hardware comprises a vehicle-type exploration system and a cart-type exploration system. Data were collected using the developed exploration system, the GPR exploration images were analyzed using AI technology, and the three-dimensional location information of the surveyed underground facilities was compared to the integrated underground space map. The resulting system builds a precise 3D DEM, synchronizes the GPR sensor's ground surface 3D location coordinates, automatically analyzes and detects underground facility information in GPR exploration images, and improves accuracy through comparative analysis of the three-dimensional location information. The proposed precision exploration system contributes significantly to establishing precise location information for underground facilities, which is crucial for underground safety management in Korea, and improves the accuracy and efficiency of exploration.

Keywords: 3D absolute positioning, AI interpretation of GPR exploration images, complex data processing, integrated underground space maps, precision exploration system for underground facilities

Procedia PDF Downloads 47
3159 In silico Repopulation Model of Various Tumour Cells during Treatment Breaks in Head and Neck Cancer Radiotherapy

Authors: Loredana G. Marcu, David Marcu, Sanda M. Filip

Abstract:

Advanced head and neck cancers are aggressive tumours which require aggressive treatment. Treatment efficiency is often hindered by cancer cell repopulation during radiotherapy, which is due to various mechanisms triggered by the loss of tumour cells and involves both stem and differentiated cells. The aim of the current paper is to present in silico simulations of radiotherapy schedules on a virtual head and neck tumour grown with biologically realistic kinetic parameters. Using the linear quadratic formalism of cell survival after radiotherapy, altered fractionation schedules employing various treatment breaks for normal tissue recovery are simulated, and a repopulation mechanism is implemented in order to evaluate the impact of the various cancer cell contributions on tumour behaviour during irradiation. The model has shown that the timing of treatment breaks is an important factor influencing tumour control in rapidly proliferating tissues such as squamous cell carcinomas of the head and neck. Furthermore, not only stem cells but also differentiated cells, via the mechanism of abortive division, can contribute to malignant cell repopulation during treatment.
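
A minimal sketch of the linear quadratic cell-survival formalism referred to above, extended with a simple exponential repopulation term; the parameter values are illustrative textbook-style assumptions, not the kinetic parameters used in the paper.

```python
# Hedged LQ-model sketch with a time factor for repopulation; illustrative parameters only.
import math

alpha = 0.3      # Gy^-1, intrinsic radiosensitivity (illustrative)
beta = 0.03      # Gy^-2 (illustrative; alpha/beta = 10 Gy)
d = 2.0          # dose per fraction, Gy
n = 35           # number of fractions
T = 49           # overall treatment time, days (including breaks)
T_k = 21         # kick-off time for accelerated repopulation, days (illustrative)
T_pot = 4.0      # potential doubling time, days (illustrative)

sf_no_repop = math.exp(-n * (alpha * d + beta * d**2))        # survival without repopulation
repop = math.exp(math.log(2) * max(T - T_k, 0) / T_pot)       # cell gain from repopulation
print(f"Surviving fraction with repopulation: {sf_no_repop * repop:.3e}")
```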

Keywords: radiation, tumour repopulation, squamous cell carcinoma, stem cell

Procedia PDF Downloads 255
3158 A Generalized Framework for Adaptive Machine Learning Deployments in Algorithmic Trading

Authors: Robert Caulk

Abstract:

A generalized framework for adaptive machine learning deployments in algorithmic trading is introduced, tested, and released as open-source code. The presented software aims to test the hypothesis that recent data contains enough information to form a probabilistically favorable short-term price prediction. Further, the framework contains various adaptive machine learning techniques that are geared toward generating profit during strong trends and minimizing losses during trend changes. Results demonstrate that this adaptive machine learning approach is capable of capturing trends and generating profit. The presentation also discusses the importance of defining the parameter space associated with the dynamic training data set and using the parameter space to identify and remove outliers from prediction data points. Meanwhile, the generalized architecture enables common users to exploit the powerful machinery while focusing on high-level feature engineering and model testing. The presentation also highlights common strengths and weaknesses associated with the presented technique and presents a broad range of well-tested starting points for feature set construction, target setting, and statistical methods for enforcing risk management and maintaining probabilistically favorable entry and exit points. The presentation also describes the end-to-end data processing tools associated with FreqAI, including automatic data fetching, data aggregation, feature engineering, safe and robust data pre-processing, outlier detection, custom machine learning and statistical tools, data post-processing, adaptive training backtest emulation, and deployment of adaptive training in live environments. Finally, the generalized user interface is also discussed in the presentation. Feature engineering is simplified so that users can seed their feature sets with common indicator libraries (e.g., TA-lib, pandas-ta). The user also feeds data expansion parameters to fill out a large feature set for the model, which can contain as many as 10,000+ features. The presentation describes the various object-oriented programming techniques employed to make FreqAI agnostic to third-party libraries and external data sources. In other words, the back-end is constructed in such a way that users can leverage a broad range of common regression libraries (CatBoost, LightGBM, Sklearn, etc.) as well as common neural network libraries (TensorFlow, PyTorch) without worrying about the logistical complexities associated with data handling and API interactions. The presentation finishes by drawing conclusions about the most important parameters associated with a live deployment of the adaptive learning framework and provides the road map for future development in FreqAI.
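
The generic sketch below illustrates the adaptive idea described above: periodically retrain a regressor on a sliding window of recent data, filter outliers from the training window, and predict the next short-term return. It is an illustration only and does not use FreqAI's actual API; the data, window sizes and thresholds are synthetic assumptions.

```python
# Hedged sketch of sliding-window adaptive retraining with a crude outlier filter.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 2_000)) + 100.0   # synthetic price series

def make_features(p, lookback=10):
    """Simple lagged-return features; real deployments would use indicator libraries."""
    rets = np.diff(p) / p[:-1]
    X = np.lib.stride_tricks.sliding_window_view(rets[:-1], lookback)
    y = rets[lookback:]
    return X, y

window = 500                                          # sliding training window length
X, y = make_features(prices)
for t in range(window, len(y), 100):                  # retrain every 100 steps
    X_tr, y_tr = X[t - window:t], y[t - window:t]
    keep = np.all(np.abs(X_tr - X_tr.mean(0)) < 3 * X_tr.std(0), axis=1)  # drop 3-sigma outliers
    model = GradientBoostingRegressor().fit(X_tr[keep], y_tr[keep])
    print(f"t={t}: predicted next return {model.predict(X[t:t+1])[0]:+.5f}")
```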

Keywords: machine learning, market trend detection, open-source, adaptive learning, parameter space exploration

Procedia PDF Downloads 75
3157 Arabic Light Word Analyser: Roles with Deep Learning Approach

Authors: Mohammed Abu Shquier

Abstract:

This paper introduces a word segmentation method using the novel BP-LSTM-CRF architecture for processing semantic output training. The objective of web morphological analysis tools is to link a formal morpho-syntactic description to a lemma, along with morpho-syntactic information, a vocalized form, a vocalized analysis with morpho-syntactic information, and a list of paradigms. A key objective is to continuously enhance the proposed system through an inductive learning approach that considers semantic influences. The system is currently under construction and development based on data-driven learning. To evaluate the tool, an experiment on homograph analysis was conducted. The tool also encompasses the assumption of deep binary segmentation hypotheses, the arbitrary choice of trigram or n-gram continuation probabilities, language limitations, and morphology for both Modern Standard Arabic (MSA) and Dialectal Arabic (DA), which provide justification for updating this system. Most Arabic word analysis systems are based on the phonotactic morpho-syntactic analysis of a word transmitted using lexical rules, which are mainly used in MENA language technology tools, without taking into account contextual or semantic morphological implications. Therefore, it is necessary to have an automatic analysis tool that takes into account the word sense and not only the morpho-syntactic category. Moreover, such systems are also based on statistical/stochastic models. These stochastic models, such as HMMs, have shown their effectiveness in different NLP applications: part-of-speech tagging, machine translation, speech recognition, etc. As an extension, we focus on language modeling using Recurrent Neural Networks (RNNs); given that morphological analysis coverage is very low for Dialectal Arabic, it is important to investigate in depth how dialect data influence the accuracy of these approaches by developing dialectal morphological processing tools, showing that dialectal variability can help improve the analysis.

Keywords: NLP, DL, ML, analyser, MSA, RNN, CNN

Procedia PDF Downloads 23
3156 Intelligent Process and Model Applied for E-Learning Systems

Authors: Mafawez Alharbi, Mahdi Jemmali

Abstract:

E-learning is a developing area, especially in education. E-learning can provide several benefits to learners. An intelligent system that collects all the components satisfying user preferences is therefore important. This research presents an approach that is capable of personalizing e-information and giving users what they need according to their preferences. The proposed system can build knowledge after further evaluations made by the user. In addition, it can learn from the user's habits. Finally, we present a walk-through to demonstrate how the intelligent process works.

Keywords: artificial intelligence, architecture, e-learning, software engineering, processing

Procedia PDF Downloads 175
3155 The Design and Analysis of a Novel Type High Gain Microstrip Patch Antenna System for the Satellite Communication

Authors: Shahid M. Ali, Zakiullah

Abstract:

A single-feed, compact, novel-shaped, dual-band microstrip patch antenna is proposed in this manuscript. Three triangular slots are introduced on three edges of the patch, and a small feed line on the remaining edge is used to obtain the dual band. The antenna has a compact structure, with a patch of approximately 8.5 mm by 7.96 mm by 1.905 mm, giving excellent bandwidths covering 13.15 GHz to 13.72 GHz and 16.04 GHz to 16.58 GHz. The return loss (RL) reaches -19.00 dB at the first resonant frequency of 13.61 GHz and -28.69 dB at the second resonant frequency of 16.33 GHz. The stable average peak gains observed across the lower and higher operating bands are 3.53 dB and 5.562 dB, respectively. The radiation patterns are omnidirectional with moderate gain in both operating bands. Dual-frequency operation is demonstrated at 13.62 GHz for the downlink and 16.33 GHz for the uplink. The low-profile, simple configuration of the proposed antenna allows easy fabrication and makes it suitable for satellite and wireless communication applications.

Keywords: dual band, microstrip patch antenna, HFSS, Ku band, satellite

Procedia PDF Downloads 346
3154 Experimental and Numerical Analysis of Built-In Thermoelectric Generator Modules with Elliptical Pin-Fin Heat Sink

Authors: J. Y Jang, C. Y. Tseng

Abstract:

A three-dimensional numerical model of thermoelectric generator (TEG) modules attached to a large chimney plate is proposed and solved numerically using a control-volume-based finite difference formulation. The TEG module consists of a thermoelectric generator, an elliptical pin-fin heat sink, and a cold plate for water cooling. In the chimney, the temperature of the flue gases is 450-650 K. Therefore, the effects of convection and radiation heat transfer are considered. Although the TEG hot-side temperature, and thus the electric power output, can be increased by inserting an elliptical pin-fin heat sink into the chimney tunnel to increase the heat transfer area, the pin-fin heat sink incurs extra pumping power at the same time. The main purpose of this study is to analyze the effects of geometrical parameters on the electric power output and chimney pressure drop characteristics. In addition, the effects of different operating conditions, including various inlet velocities (Vin = 1, 3, 5 m/s) and inlet temperatures (Tgas = 450, 550, 650 K), are discussed in detail. The predicted numerical data for the power vs. current (P-I) curve are in good agreement (within 11%) with the experimental data.

Keywords: thermoelectric generator, waste heat recovery, pin-fin heat sink, experimental and numerical analysis

Procedia PDF Downloads 368
3153 Integrating Natural Language Processing (NLP) and Machine Learning in Lung Cancer Diagnosis

Authors: Mehrnaz Mostafavi

Abstract:

The assessment and categorization of incidental lung nodules present a considerable challenge in healthcare, often necessitating resource-intensive multiple computed tomography (CT) scans for growth confirmation. This research addresses this issue by introducing a distinct computational approach leveraging radiomics and deep-learning methods. However, understanding local services is essential before implementing these advancements. With diverse tracking methods in place, there is a need for efficient and accurate identification approaches, especially in the context of managing lung nodules alongside pre-existing cancer scenarios. This study explores the integration of text-based algorithms in medical data curation, indicating their efficacy in conjunction with machine learning and deep-learning models for identifying lung nodules. Combining medical images with text data has demonstrated superior data retrieval compared to using each modality independently. While deep learning and text analysis show potential in detecting previously missed nodules, challenges persist, such as increased false positives. The presented research introduces a Structured-Query-Language (SQL) algorithm designed for identifying pulmonary nodules in a tertiary cancer center, externally validated at another hospital. Leveraging natural language processing (NLP) and machine learning, the algorithm categorizes lung nodule reports based on sentence features, aiming to facilitate research and assess clinical pathways. The hypothesis posits that the algorithm can accurately identify lung nodule CT scans and predict concerning nodule features using machine-learning classifiers. Through a retrospective observational study spanning a decade, CT scan reports were collected, and an algorithm was developed to extract and classify data. Results underscore the complexity of lung nodule cohorts in cancer centers, emphasizing the importance of careful evaluation before assuming a metastatic origin. The SQL and NLP algorithms demonstrated high accuracy in identifying lung nodule sentences, indicating potential for local service evaluation and research dataset creation. Machine-learning models exhibited strong accuracy in predicting concerning changes in lung nodule scan reports. While limitations include variability in disease group attribution, the potential for correlation rather than causality in clinical findings, and the need for further external validation, the algorithm's accuracy and potential to support clinical decision-making and healthcare automation represent a significant stride in lung nodule management and research.
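
As a generic sketch of classifying radiology-report sentences as nodule-related or not (an illustration of the idea, not the study's SQL/NLP pipeline), the snippet below uses TF-IDF features with a simple classifier; the example sentences and labels are invented.

```python
# Hedged sketch: TF-IDF + logistic regression on invented report sentences.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

sentences = [
    "A 6 mm pulmonary nodule is noted in the right upper lobe.",
    "No focal lung lesion is identified.",
    "Stable 4 mm nodule in the left lower lobe, unchanged from prior.",
    "The heart size is within normal limits.",
]
labels = [1, 0, 1, 0]  # 1 = nodule sentence, 0 = other

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(sentences, labels)

# Likely predicts [1], since 'nodule' occurs only in the positive training examples
print(clf.predict(["New 5 mm nodule seen in the lingula."]))
```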

Keywords: lung cancer diagnosis, structured-query-language (SQL), natural language processing (NLP), machine learning, CT scans

Procedia PDF Downloads 65
3152 Multiscale Process Modeling Analysis for the Prediction of Composite Strength Allowables

Authors: Marianna Maiaru, Gregory M. Odegard

Abstract:

During the processing of high-performance thermoset polymer matrix composites, chemical reactions occur during elevated pressure and temperature cycles, causing the constituent monomers to crosslink and form a molecular network that can gradually sustain stress. As the crosslinking process progresses, the material naturally experiences gradual shrinkage due to the increase in covalent bonds in the network. Once the cured composite completes the cure cycle and is brought to room temperature, the thermal expansion mismatch between the fibers and matrix causes additional residual stresses to form. These compounded residual stresses can compromise the reliability of the composite material and affect the composite strength. Composite process modeling is greatly complicated by the multiscale nature of the composite architecture. At the molecular level, the degree of cure controls the local shrinkage and thermal-mechanical properties of the thermoset. At the microscopic level, the local fiber architecture and packing affect the magnitudes and locations of residual stress concentrations. At the macroscopic level, the layup sequence controls the nature of crack initiation and propagation due to residual stresses. The goal of this research is to use molecular dynamics (MD) and finite element analysis (FEA) to predict the residual stresses in composite laminates and the corresponding effect on composite failure. MD is used to predict the polymer shrinkage and thermomechanical properties as a function of the degree of cure. This information is used as input into FEA to predict the residual stresses at the microscopic level resulting from the complete cure process. Virtual testing is subsequently conducted to predict strength allowables. Experimental characterization is used to validate the modeling.

Keywords: molecular dynamics, finite element analysis, processing modeling, multiscale modeling

Procedia PDF Downloads 80
3151 Modeling the Elastic Mean Free Path of Electron Collision with Pyrimidine: The Screen Corrected Additivity Rule Method

Authors: Aouina Nabila Yasmina, Chaoui Zine El Abiddine

Abstract:

This study presents a comprehensive investigation of the elastic mean free path (EMFP) of electrons colliding with pyrimidine, a precursor of the pyrimidine bases in DNA, employing the Screen Corrected Additivity Rule (SCAR) method. The SCAR method is introduced as a novel approach that combines classical and quantum mechanical principles to elucidate the interaction of electrons with pyrimidine. One of the most fundamental properties characterizing the propagation of a particle in a medium is its mean free path. Knowledge of the elastic mean free path is essential to accurately predict the effects of radiation on biological matter, as it relates to the distances between collisions. Additionally, the mean free path plays a role in the interpretation of almost all experiments in which an excited electron moves through a solid. Pyrimidine, the precursor of the pyrimidine bases of DNA, has interesting physicochemical properties, which make it an interesting molecule to study from a fundamental point of view. These include a relatively large dipole polarizability and dipole moment and an electronic charge cloud with a significant spatial extension, which justify its choice for the present study.
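
For context, a mean free path estimate follows the relation lambda = 1/(n*sigma), where n is the number density of target molecules and sigma is the integral elastic cross section; the sketch below uses placeholder values for illustration, not results obtained with the SCAR method.

```python
# Hedged sketch of the lambda = 1/(n*sigma) relation; placeholder density and cross section.
AVOGADRO = 6.022e23

def mean_free_path_cm(density_g_cm3, molar_mass_g_mol, cross_section_cm2):
    n = density_g_cm3 / molar_mass_g_mol * AVOGADRO   # molecules per cm^3
    return 1.0 / (n * cross_section_cm2)

# Liquid-like pyrimidine target (C4H4N2, M ~ 80 g/mol) and an assumed cross section of 1e-15 cm^2
lam_cm = mean_free_path_cm(1.02, 80.09, 1e-15)
print(f"lambda ~ {lam_cm * 1e8:.1f} angstrom")
```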

Keywords: elastic mean free path, elastic collision, pyrimidine, SCAR

Procedia PDF Downloads 48
3150 Bactericidal Efficacy of Quaternary Ammonium Compound on Carriers with Food Additive Grade Calcium Hydroxide against Salmonella Infantis and Escherichia coli

Authors: M. Shahin Alam, Satoru Takahashi, Mariko Itoh, Miyuki Komura, Mayuko Suzuki, Natthanan Sangsriratanakul, Kazuaki Takehara

Abstract:

Cleaning and disinfection are key components of routine biosecurity in livestock farming and the food processing industry. The use of suitable disinfectants at the proper concentration is an important factor for a successful biosecurity program. Disinfectants have optimum bactericidal and virucidal efficacies at temperatures above 20°C, but very few studies on the application and effectiveness of disinfectants at low temperatures have been done. In the present study, the bactericidal efficacies of food additive grade calcium hydroxide (FdCa(OH)₂), a quaternary ammonium compound (QAC) and their mixture were investigated under different conditions, including contact time, organic material (fetal bovine serum: FBS) and temperature, in both suspension and carrier tests. Salmonella Infantis and Escherichia coli, which are the most prevalent gram-negative bacteria in commercial poultry housing and the food processing industry, were used in this study. Initially, we evaluated these disinfectants at two temperatures (4°C and room temperature (RT, 25°C ± 2°C)) and seven contact times (0, 5 and 30 sec; 1, 3, 20 and 30 min) in suspension tests, either in the presence or absence of 5% FBS. Secondly, we investigated the bactericidal efficacies of these disinfectants in carrier tests (rubber, stainless steel and plastic) at the same temperatures and four contact times (30 sec, 1, 3, and 5 min). We then compared the bactericidal efficacies of each disinfectant with those of their mixture, as follows. When QAC was diluted with redistilled water (dW2) at 1:500 (QACx500) to obtain a final concentration of didecyl-dimethylammonium chloride (DDAC) of 200 ppm, it could inactivate Salmonella Infantis within 5 sec at RT, either with or without 5% FBS, in the suspension test; however, at 4°C it required 30 min in the presence of 5% FBS. FdCa(OH)₂ solution alone could inactivate the bacteria within 1 min both at RT and at 4°C, even with 5% FBS. When FdCa(OH)₂ powder was added at a final concentration of 0.2% to QACx500 (Mix500), the mixture could inactivate the bacteria within 30 sec and 5 sec, with or without 5% FBS, respectively, at 4°C. The findings from the suspension test indicated that low temperature inhibited the bactericidal efficacy of QAC, whereas Mix500 was effective regardless of short contact time and low temperature, even with 5% FBS. In the carrier test, a single disinfectant required a bit more time to inactivate the bacteria on rubber and plastic surfaces than on stainless steel. However, Mix500 could inactivate S. Infantis on rubber, stainless steel and plastic surfaces within 30 sec and 1 min at RT and 4°C, respectively; for E. coli, it required only 30 sec at both temperatures. Thus, synergistic effects were observed on different carriers at both temperatures. For a successful enhancement of biosecurity during winter, disinfectants should be selected that have short contact times and optimum efficacy against the target pathogen. The present findings help farmers to formulate proper strategies for the application of disinfectants in livestock farming and the food processing industry.

Keywords: carrier, food additive grade calcium hydroxide (FdCa(OH)₂), quaternary ammonium compound, synergistic effects

Procedia PDF Downloads 282
3149 Phytochemicals and Photosynthesis of Grape Berry Exocarp and Seed (Vitis vinifera, cv. Alvarinho): Effects of Foliar Kaolin and Irrigation

Authors: Andreia Garrido, Artur Conde, Ana Cunha, Ric De Vos

Abstract:

Climate change predictions point to increases in abiotic stress for crop plants in Portugal, such as pronounced temperature variation and decreased precipitation, which will have a negative impact on grapevine physiology and, consequently, on grape berry and wine quality. Short-term mitigation strategies have therefore been implemented to alleviate the impacts caused by adverse climatic periods. These strategies include foliar application of kaolin, an inert mineral with radiation-reflecting properties that decrease the stress caused by excessive heat/radiation absorbed by the leaves, as well as smart irrigation strategies to avoid water stress. However, little is known about the influence of these mitigation measures on grape berries, either on the photosynthetic activity or on the photosynthesis-related metabolic profiles of their various tissues. Moreover, the role of fruit photosynthesis in berry quality is poorly understood. The main objective of our work was to assess the effects of kaolin and irrigation treatments on the photosynthetic activity of grape berry tissues (exocarp and seeds) and on their global metabolic profile, also investigating their possible relationship. We therefore collected berries from field-grown plants of the white grape variety Alvarinho from two distinct microclimates, i.e., from clusters exposed to high light (HL, 150 µmol photons m⁻² s⁻¹) and low light (LL, 50 µmol photons m⁻² s⁻¹), from both kaolin-treated and non-kaolin (control) plants at three fruit developmental stages (green, véraison and mature). Irrigation was applied after harvesting the green berries, which also enabled comparison of véraison and mature berries from irrigated and non-irrigated growth conditions. Photosynthesis was assessed by pulse-amplitude-modulated chlorophyll fluorescence imaging analysis, and the metabolite profile of both tissues was assessed by complementary metabolomics approaches. Foliar kaolin application resulted in, for instance, increased photosynthetic activity of the exocarp of LL-grown berries at the green developmental stage, as compared to the control non-kaolin treatment, with a concomitant increase in the levels of several lipid-soluble isoprenoids (chlorophylls, carotenoids, and tocopherols). The exocarp of mature berries grown in the HL microclimate on kaolin-sprayed, non-irrigated plants had a higher total sugar content than in all other treatments, suggesting that foliar application of this mineral results in an increased accumulation of photoassimilates in mature berries. Unbiased liquid chromatography-mass spectrometry-based profiling of semi-polar compounds followed by ASCA (ANOVA simultaneous component analysis) and ANOVA statistical analysis indicated that kaolin had no or an inconsistent effect on the flavonoid and phenylpropanoid composition in both seed and exocarp at any developmental stage; in contrast, both microclimate and irrigation influenced the level of several of these compounds depending on berry ripening stage. Overall, our study provides more insight into the effects of mitigation strategies on berry tissue photosynthesis and phytochemistry under contrasting conditions of cluster light microclimate. We hope that this may contribute to developing sustainable management in vineyards and to maintaining high-quality grape berries and wines even under increasing abiotic stress challenges.

Keywords: climate change, grape berry tissues, metabolomics, mitigation strategies

Procedia PDF Downloads 98
3148 MHD Stagnation-Point Flow over a Plate

Authors: H. Niranjan, S. Sivasankaran

Abstract:

Heat and mass transfer in the steady stagnation-point boundary layer flow of a viscous incompressible fluid through a porous medium along a vertical plate is studied in the presence of magnetohydrodynamic (MHD) effects. The flow is steady, laminar, incompressible and two-dimensional. The coupled nonlinear parabolic partial differential equations of continuity, momentum, energy and species diffusion are converted into non-similar boundary layer equations using a similarity transformation, which are then solved numerically using the Runge-Kutta method together with a shooting technique. The effects of the conjugate heat transfer parameter, the porous medium parameter, the permeability parameter, the mixed convection parameter, the magnetic parameter, and the thermal radiation parameter on the velocity and temperature profiles, as well as on the local skin friction and local heat transfer, are presented and analyzed. The validity of the methodology and analysis is checked by comparing the results obtained for some specific cases with those available in the literature. The effects of the various parameters on the local skin friction and the heat and mass transfer rates are presented in tabular form.
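As an illustration of the Runge-Kutta-plus-shooting procedure, the sketch below solves a classical Hiemenz-type stagnation-point momentum equation with a magnetic parameter, f''' + f f'' + 1 − f'² + M(1 − f') = 0, with f(0) = 0, f'(0) = 0 and f'(∞) = 1. This is a simplified stand-in for the paper's full coupled momentum, energy and species system; the value of M and the far-field cut-off are assumptions.

```python
# Minimal shooting-method sketch for an illustrative MHD stagnation-point
# (Hiemenz-type) boundary layer equation, not the paper's full system.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

M = 0.5        # magnetic parameter (assumed illustrative value)
ETA_MAX = 8.0  # finite stand-in for the far-field boundary

def rhs(eta, y):
    # State vector y = [f, f', f''].
    f, fp, fpp = y
    return [fp, fpp, -(f * fpp + 1.0 - fp**2 + M * (1.0 - fp))]

def blow_up(eta, y):
    # Stop early if a poor guess makes the solution diverge.
    return abs(y[1]) - 10.0
blow_up.terminal = True

def residual(fpp0):
    # Integrate from the wall with a guessed wall shear f''(0) and return
    # the mismatch in the free-stream condition f'(inf) = 1.
    sol = solve_ivp(rhs, (0.0, ETA_MAX), [0.0, 0.0, fpp0],
                    events=blow_up, rtol=1e-8, atol=1e-10)
    return sol.y[1, -1] - 1.0

# The wall shear f''(0), proportional to the local skin friction, is the
# value that drives the free-stream mismatch to zero.
fpp_wall = brentq(residual, 0.5, 3.0)
print(f"f''(0) = {fpp_wall:.5f}")
```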

Keywords: MHD, porous medium, slip, convective boundary condition, stagnation point

Procedia PDF Downloads 290
3147 Applying ASHRAE Standards on the Hospital Buildings of UAE

Authors: Hanan M. Taleb

Abstract:

Energy consumption associated with buildings has a significant impact on the environment. In this respect, as the interface between inside and outside and between the building and the urban space, the building skin plays an especially important role. It provides protection from the elements, demarcates private property and creates privacy. More importantly, it controls the admission of solar radiation. Designing the building skin sustainably will therefore help to achieve optimal performance in terms of both energy consumption and thermal comfort. Unfortunately, amid accelerating construction expansion, many recent buildings pay little attention to envelope design. This piece of research highlights the importance of this element of building design by providing evidence of the significant reduction in energy consumption achievable when envelopes are redesigned. Consequently, the aim of this paper is to enhance the performance of the hospital envelope in order to achieve sustainable performance. A hospital building in Abu Dhabi, UAE, has been chosen as a case study. A detailed analysis of the annual energy performance of the case study will be performed using computerised simulation in order to explore its energy performance shortcomings. The energy consumption of the base case will then be compared with that resulting from the newly proposed building skin. The results will inform architects and designers of the savings potential of various strategies.
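A back-of-the-envelope version of the base-case versus proposed-envelope comparison is sketched below. The study itself relies on full dynamic simulation in IES; this sketch only estimates steady conduction through the skin, so the areas, U-values, cooling degree-hours and chiller COP are all hypothetical and the resulting numbers purely illustrative.

```python
# Hypothetical steady-state comparison of envelope conduction gains:
# component -> (area in m^2, base-case U, proposed U) in W/(m^2*K).
envelope = {
    "walls":   (3200.0, 2.1, 0.45),
    "roof":    (1800.0, 1.8, 0.30),
    "glazing": ( 900.0, 5.6, 1.80),
}

cooling_degree_hours = 60_000.0  # K*h per year (assumed for a hot climate)
cop = 3.0                        # assumed chiller coefficient of performance

def annual_cooling_electricity(u_idx):
    # u_idx = 1 selects the base-case U-values, u_idx = 2 the proposed ones.
    # Conduction gain Q = U * A * CDH (Wh), converted to kWh of electricity.
    gain_wh = sum(vals[0] * vals[u_idx] * cooling_degree_hours
                  for vals in envelope.values())
    return gain_wh / 1000.0 / cop

base = annual_cooling_electricity(1)
proposed = annual_cooling_electricity(2)
print(f"base case:     {base:,.0f} kWh/yr")
print(f"proposed skin: {proposed:,.0f} kWh/yr")
print(f"saving:        {100 * (base - proposed) / base:.1f} %")
```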

Keywords: ASHRAE, building skin, building envelopes, hospitals, Abu Dhabi, UAE, IES software

Procedia PDF Downloads 347
3146 Pulse Method for Investigation of Zr-C Phase Diagram at High Carbon Content Domain under High Temperatures

Authors: Arseniy M. Kondratyev, Sergey V. Onufriev, Alexander I. Savvatimskiy

Abstract:

The microsecond electrical pulse heating technique, which provides uniform energy input into the investigated specimen, is considered. In the present study we investigated ZrC+C carbide specimens in the form of thin layers (about 5 microns thick) produced by magnetron sputtering on insulating substrates. Specimens contained (at. %): Zr–17.88; C–67.69; N–8.13; O–5.98. The current through the specimen, the voltage drop across it and the radiation at a wavelength of 856 nm were recorded in the experiments. This enabled us to calculate the input energy, the specific heat (from 2300 to 4500 K) and the resistivity (referred to the initial dimensions of the specimen). To obtain the true temperature, a blackbody specimen was used. The temperatures of the onset and completion of the solid–liquid phase transition were measured. The temperature of the onset of melting was 3150 K at an input energy of 2.65 kJ/g; the temperature of the completion of melting was 3450 K at an input energy of 5.2 kJ/g. The specific heat of the solid phase of the investigated carbide, calculated using our data on temperature and imparted energy, is close to 0.75 J/(g·K) in the temperature range 2100–2800 K. Our results are considered together with the equilibrium Zr-C phase diagram.
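The data reduction described above, turning the recorded current, voltage and temperature traces into specific input energy, specific heat and resistivity referred to the initial dimensions, could look roughly like the sketch below. All arrays and specimen dimensions are hypothetical placeholders, not the authors' measurements.

```python
# Hypothetical reduction of pulse-heating records to E(t), c_p and rho.
import numpy as np

# Hypothetical digitized records (SI units): time, current, voltage drop,
# and pyrometer-derived temperature of the blackbody specimen.
t = np.linspace(0.0, 5e-6, 500)             # s
I = 4.0e3 * np.ones_like(t)                 # A, idealized flat-top pulse
U = np.linspace(5.0, 40.0, t.size)          # V
T = np.linspace(2100.0, 4500.0, t.size)     # K

mass = 2.0e-7            # kg, specimen mass (hypothetical)
length = 5.0e-3          # m, initial specimen length (hypothetical)
area = 5.0e-6 * 1.0e-3   # m^2, 5 um film x 1 mm width (hypothetical)

# Specific input energy E(t) = (1/m) * integral of I*U dt, in J/g
# (trapezoidal accumulation of the instantaneous Joule power).
power = I * U
energy_j = np.concatenate(
    ([0.0], np.cumsum(0.5 * (power[1:] + power[:-1]) * np.diff(t))))
E = energy_j / (mass * 1e3)

# Specific heat c_p = dE/dT along the heating trajectory, in J/(g*K).
c_p = np.gradient(E, T)

# Resistivity referred to the initial dimensions, in Ohm*m.
rho = (U / I) * area / length

print(f"mean c_p over the record: {c_p.mean():.3f} J/(g*K)")
print(f"resistivity at the end of the pulse: {rho[-1]:.2e} Ohm*m")
```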

Keywords: pulse heating, zirconium carbide, high temperatures, melting

Procedia PDF Downloads 309
3145 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory

Authors: Xiaochen Mu

Abstract:

Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic model continues the traditional "one object, one right" theory, while the process model, through contractual relationships in data processing, recognizes rights that are more adaptable to the needs of data circulation and value release. The design of the data property rights system has a hierarchical character, aimed at decoupling raw data from data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for the different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with its mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of rights holders with the prosperity and long-term development of public learning and the entire field of science, culture, and the arts. The intellectual property granting mechanism therefore provides both protection and limitations for the rights holder, which aligns well with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although it is not the only path, granting data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property legislation. This paper analyzes data protection under the virtual property layer theory and a two-fold virtual property rights system. Based on the "bundle of rights" theory, it establishes a specific three-level structure of data rights. The paper analyzes the cases Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN and Imerman v Tchenquiz, and concludes that recognizing property rights over personal data and protecting data within the framework of intellectual property will help to establish the tort of misuse of personal information.

Keywords: data protection, property rights, intellectual property, big data

Procedia PDF Downloads 10
3144 Building Atmospheric Moisture Diagnostics: Environmental Monitoring and Data Collection

Authors: Paula Lopez-Arce, Hector Altamirano, Dimitrios Rovas, James Berry, Bryan Hindle, Steven Hodgson

Abstract:

Efficient mould remediation and accurate diagnosis of the moisture conditions that lead to condensation and mould growth in dwellings remain largely untapped. A number of factors contribute to the rising trend of excessive moisture in homes, mainly linked with modern living, increased levels of occupation and rising fuel costs, as well as efforts to make homes more energy efficient. Environmental monitoring by means of data collection through logger sensors and survey forms has been performed in a range of buildings from different UK regions. Air and surface temperature and relative humidity values of residential areas affected by condensation and/or mould issues were recorded. Additional measurements were taken in different trials, varying the type, location, and position of the loggers. In some instances, IR thermal images and ventilation rates were also acquired. Results have been interpreted together with key environmental parameters by processing and connecting data from loggers and survey questionnaires, both in buildings with and without moisture issues. Monitoring exercises carried out during winter and spring show the importance of developing and following accurate protocols to obtain consistent, repeatable and comparable results and to improve the performance of environmental monitoring. A model and a protocol are being developed to build a diagnostic tool with the goal of performing simple but precise residential atmospheric moisture diagnostics to distinguish the cause of condensation and mould generation, i.e., a ventilation, insulation or heating system issue. This research shows the relevance of monitoring and processing environmental data to assign moisture risk levels and determine the origin of condensation or mould when dealing with excess atmospheric moisture in a building.
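A minimal sketch of the kind of rule such a diagnostic tool might apply to logged readings is given below, using the Magnus approximation for saturation vapour pressure to compare the surface temperature with the dew point and to estimate surface relative humidity. This is not the project's actual model; the 80% surface-RH mould threshold and the example reading are assumptions.

```python
# Hypothetical condensation / mould-risk check for one logger reading.
import math

A, B = 17.62, 243.12  # Magnus coefficients for water (temperature in deg C)

def saturation_vp(t_c):
    # Saturation vapour pressure in Pa (Magnus approximation).
    return 611.2 * math.exp(A * t_c / (B + t_c))

def dew_point(t_air_c, rh_air):
    # Dew-point temperature in deg C from air temperature and RH (%).
    gamma = math.log(rh_air / 100.0) + A * t_air_c / (B + t_air_c)
    return B * gamma / (A - gamma)

def moisture_risk(t_air_c, rh_air, t_surface_c):
    # RH the room air would have at the (colder) surface temperature.
    rh_surface = rh_air * saturation_vp(t_air_c) / saturation_vp(t_surface_c)
    if t_surface_c <= dew_point(t_air_c, rh_air):
        return "condensation"
    if rh_surface >= 80.0:  # assumed mould-growth threshold
        return "mould risk"
    return "low risk"

# Example logger reading: 20 deg C air at 65 % RH next to a 12 deg C wall.
print(moisture_risk(20.0, 65.0, 12.0))
```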

Keywords: environmental monitoring, atmospheric moisture, protocols, mould

Procedia PDF Downloads 126