Search results for: Dirichlet eta lambda beta functions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2926


2656 Development of β-Ti Alloy Powders for Additive Manufacturing for Application in Patient-Specific Orthopedic Implants

Authors: Eugene Ivanov, Eduardo del-Rio, Igor Kapchenko, Maija Nyström, Juha Kotila

Abstract:

Series of low-modulus beta Ti alloy billets and powders can be produced in commercial quantities using a combination of electron beam melting (EBM) and EIGA atomization processes. In the present study, TNZT alloy powder was produced and processed in the EOSINT M290 laser sintering system to produce parts for mechanical testing. Post heat treatments, such as diffusion annealing to reduce internal stresses or hot isostatic pressing to remove closed pores, were not applied. The density can visually be estimated to be > 99.9%. According to an EDS study, Nb, Zr, and Ta are distributed homogeneously throughout the printed sample. There are no indications of any segregation or chemical inhomogeneity, i.e., variation of the element distribution. This points to the fact that, under the applied experimental conditions, the melt generated by the laser cools down rapidly in the SLM (Selective Laser Melting) process. The selective laser sintering yielded dense structures with relatively good surface quality. The mechanical properties, especially the elongation (24%) along with the tensile strength (> 500 MPa) and modulus of elasticity (~60 GPa), were found to be promising compared to titanium alloys in general.

Keywords: beta titanium alloys, additive manufacturing, powder, implants

Procedia PDF Downloads 204
2655 Supply Chain Competitiveness with the Perspective of Service Performance Between Supply Chain Actors and Functions: A Theoretical Model

Authors: Umer Mukhtar

Abstract:

Supply chain competitiveness is the capability of a supply chain to deliver value to the customer for the sake of competitive advantage. Service performance and quality intervene between supply chain actors, including functions inside the firm, in a significant way for the supply chain to achieve a competitive position in the market and gain competitive advantage. Supply chain competitiveness is a current issue of interest because competition now takes place between supply chains rather than between firms. A theoretical model is developed by extracting and integrating different theories to pursue further inquiry based on case studies and survey design. It is also intended to develop a scale of service performance for the functions of the focal firm, which is the hub around which the whole supply chain revolves.

Keywords: supply chain competitiveness, service performance in supply chain, service quality in supply chain, competitive advantage by supply chain, networks and supply chain, customer value, value supply chain, value chain

Procedia PDF Downloads 573
2654 Orientational Pair Correlation Functions Modelling of LiCl·6H2O by the Hybrid Reverse Monte Carlo: Using an Environment Dependence Interaction Potential

Authors: Mohammed Habchi, Sidi Mohammed Mesli, Rafik Benallal, Mohammed Kotbi

Abstract:

On the basis of four partial correlation functions and some geometric constraints obtained from neutron scattering experiments, a Reverse Monte Carlo (RMC) simulation has been performed in the study of the aqueous electrolyte LiCl·6H2O at the glassy state. The obtained 3-dimensional model allows computing pair radial and orientational distribution functions in order to explore the structural features of the system. Unrealistic features appeared in some coordination peaks. To remedy this, we use the Hybrid Reverse Monte Carlo (HRMC) method, which incorporates an energy constraint in addition to the usual constraints derived from experiments. The energy of the system is calculated using an Environment Dependence Interaction Potential (EDIP). Ion effects are studied by comparing correlations between water molecules in the solution and in pure water at room temperature. Our results show good agreement between experimental and computed partial distribution functions (PDFs) as well as a significant improvement in the orientational distribution curves.
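The HRMC scheme described above can be sketched as a Metropolis-style acceptance test that adds an energy penalty to the usual chi-squared misfit against experiment. A minimal sketch, assuming a simple additive cost with an illustrative weight `w` (not the authors' actual implementation):

```python
import math
import random

def hrmc_accept(d_chi2, d_energy, kT, w=1.0):
    """Accept or reject a trial atomic move in Hybrid Reverse Monte Carlo.

    d_chi2   : change in the chi-squared misfit to the experimental data
    d_energy : change in the potential energy (e.g. from an EDIP potential)
    kT       : temperature parameter scaling the energy constraint
    w        : illustrative weight between the two constraints (assumption)
    """
    cost = 0.5 * d_chi2 + w * d_energy / kT
    if cost <= 0:
        return True                               # move lowers the combined cost
    return random.random() < math.exp(-cost)      # otherwise accept probabilistically
```

Moves that slightly worsen the fit to experiment can still be accepted when they lower the potential energy, which is what suppresses the unphysical coordination peaks seen with RMC alone.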

Keywords: LiCl·6H2O, glassy state, RMC, HRMC

Procedia PDF Downloads 436
2653 Poor Cognitive Flexibility as Suggested Basis for Learning Difficulties among Children with Moderate-to-Severe Asthma: Evidence from WCST Performance

Authors: Haitham Taha

Abstract:

The cognitive flexibility of 27 asthmatic children with learning difficulties was tested using the Wisconsin Card Sorting Test (WCST) and compared to the performance of 30 non-asthmatic children who also have persistent learning difficulties. The results revealed that the asthmatic group performed poorly on all the WCST psychometric parameters, and especially on perseverative errors. The results were discussed in light of the postulation that poor executive functions, and specifically poor cognitive flexibility, underlie the learning difficulties of asthmatic children. A neurophysiological framework was suggested for explaining the etiology of poor executive functions and cognitive flexibility among children with moderate-to-severe asthma.

Keywords: asthma, learning disabilities, executive functions, cognitive flexibility, WCST

Procedia PDF Downloads 474
2652 Epistemological Functions of Emotions and Their Relevance to the Formation of Citizens and Scientists

Authors: Dení Stincer Gómez, Zuraya Monroy Nasr

Abstract:

The pedagogy of science has historically given priority to teaching strategies that mobilize cognitive mechanisms, leaving out the emotional ones. Modern epistemology, cognitive psychology, and psychoanalysis have begun to argue and demonstrate that emotions serve relevant epistemological functions. These are 1) the selection function, which allows perception and reason to choose, among multiple alternative explanations of a particular fact, those that are relevant and discard those that are not; 2) the heuristic function, which is related to the activation of cognitive processes that are effective in the process of knowing; and 3) the so-called content-carrier function, whereby emotions give reasoning the material that is later transformed into linguistic propositions. According to these hypotheses, scientific knowledge seems to arise from emotions that fulfill these functions. In this paper I argue that science education should start from the presence of certain emotions in the learner if it is to form scientifically literate citizens or future scientists.

Keywords: epistemic emotions, science education, formation of citizens and scientists, philosophy of emotions

Procedia PDF Downloads 96
2651 Inheritance, Stability, and Validation of Provitamin A Markers in Striga Hermonthica-Resistant Maize

Authors: Fiston Masudi Tambwe, Lwanga Charles, Arfang Badji, Unzimai Innocent

Abstract:

The development of maize varieties combining provitamin A (PVA), high yield, and Striga resistance is an effective and affordable strategy to contribute to food security in sub-Saharan Africa, where maize is a staple food crop. There has been limited research on introgressing PVA genes into Striga-resistant maize genotypes. The objectives of this study were to: i) determine the mode of gene action controlling PVA carotenoid accumulation in Striga-resistant maize, ii) identify Striga-resistant maize hybrids with high PVA content and stable yield, and iii) validate the presence of PVA functional markers in offspring. Six elite Striga-resistant inbred females were crossed with six high-PVA inbred males in a North Carolina Design II, and their offspring were evaluated in four environments, following a 5x8 alpha lattice design with four hybrid checks. Results revealed that both additive and non-additive gene action control carotenoid accumulation, with a predominance of non-additive gene effects for PVA. Hybrids STR1004xCLHP0352 and STR1004xCLHP0046, identified as Striga-resistant because they supported fewer Striga plants, were the highest-yielding genotypes, with moderate PVA concentrations of 5.48 and 5.77 µg/g, respectively. However, these two hybrids were not stable in terms of yield across all environments. Hybrid STR1007xCLHP0046, by contrast, supported fewer Striga plants, had a yield of 4.52 t/ha and a PVA concentration of 4.52 µg/g, and was also stable. Gel-based marker systems of CrtRB1 and LCYE were used to screen the hybrids, and favorable alleles of the CrtRB1 primers were detected in 20 hybrids, confirming good levels of PVA carotenoids. Hybrids with favorable alleles of LCYE had the highest concentration of non-PVA carotenoids. These findings will contribute to the development of high-yielding PVA-rich maize varieties in Uganda.

Keywords: gene action, stability, striga resistance, provitamin A markers, beta-carotene hydroxylase 1, CrtRB1, beta-carotene, beta-cryptoxanthin, lycopene epsilon cyclase, LCYE

Procedia PDF Downloads 39
2650 Exploring Neural Responses to Urban Spaces in Older People Using Mobile EEG

Authors: Chris Neale, Jenny Roe, Peter Aspinall, Sara Tilley, Steve Cinderby, Panos Mavros, Richard Coyne, Neil Thin, Catharine Ward Thompson

Abstract:

This research directly assesses older people’s neural activation in response to walking through a changing urban environment, as measured by electroencephalography (EEG). As the global urban population is predicted to grow, there is a need to understand the role that the urban environment may play in the health of its older inhabitants. There is a large body of evidence suggesting green space has a beneficial restorative effect, but this effect remains largely understudied both in older people and with neuroimaging assessment. For this study, participants aged 65 years and over were required to walk between a busy urban built environment and a green urban environment, in a counterbalanced design, wearing an Emotiv EEG headset to record real-time neural responses to place. Here we report on the outputs for these responses derived both from the proprietary Affectiv Suite software, which creates emotional parameters with a real-time value assigned to them, and from the raw EEG output, focusing on alpha and beta changes, associated with changes in relaxation and attention respectively. Each walk lasted around fifteen minutes and was undertaken at the natural walking pace of the participant. The two walking environments were compared using a form of high-dimensional correlated component regression (CCR) on difference data between the urban busy and urban green spaces. For the Emotiv parameters, results showed that levels of ‘engagement’ increased in the urban green space (with a corresponding decrease in the urban busy built space), whereas levels of ‘excitement’ increased in the urban busy environment (with a corresponding decrease in the urban green space). In the raw data, low beta (13–19 Hz) increased in the urban busy space with a corresponding decrease in the green space, similar to the pattern shown in the ‘excitement’ result. Alpha activity (9–13 Hz) correlates with low beta but shows no dependent change in the regression model. This suggests that alpha is acting as a suppressor variable. These results suggest that there are neural signatures associated with the experience of urban spaces, which may reflect the age of the cohort or the spatiality of the settings themselves. These are shown both in the outputs of the proprietary software and in the raw EEG output. Busy built urban spaces appear to induce neural activity associated with vigilance and low-level stress, while this effect is ameliorated in the urban green space, potentially suggesting a beneficial effect of urban green space on attentional capacity in this participant group. The interaction between low beta and alpha requires further investigation, in particular the role of alpha in this relationship.

Keywords: ageing, EEG, green space, urban space

Procedia PDF Downloads 199
2649 On Boundary Values of Hardy Space Banach Space-Valued Functions

Authors: Irina Peterburgsky

Abstract:

Let T be the unit circle of the complex plane, E be a Banach space, and E* and E** be its conjugate and second conjugate spaces, respectively. In general, a Hardy space Hp(E), p ≥ 1, of functions acting from the open unit disk to E, may contain a function for which even a weak nontangential (angular) boundary value in the space E** does not exist at any point of the unit circle T (C. Grossetete). The situation is "better" when certain restrictions are applied to the Banach space of values (more or less resembling the classical case of scalar-valued functions, depending on the constraints, as shown by R. Ryan). This paper shows that, nevertheless, for a Banach space of general type, the following positive statement is true: Proposition. For any function f(z) from Hp(E), p ≥ 1, there exists a function F(eiθ) on the unit circle T with values in E** whose Poisson integral (in the Pettis sense) regains the function f(z) on the open unit disk. Some characteristics of the function F(eiθ) are demonstrated.
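For orientation, the Poisson integral in the proposition has the familiar scalar form below; in the vector-valued statement the integral is taken in the Pettis sense for the E**-valued boundary function F:

```latex
% Poisson integral of the boundary function F (Pettis sense in the
% Banach-space-valued case), z = re^{i\theta} in the open unit disk:
f(re^{i\theta}) = \frac{1}{2\pi}\int_{0}^{2\pi}
    \frac{1-r^{2}}{\,1-2r\cos(\theta-t)+r^{2}\,}\, F(e^{it})\, dt,
\qquad 0 \le r < 1.
```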

Keywords: Hardy spaces, Banach space-valued function, boundary values, Pettis integral

Procedia PDF Downloads 218
2648 Seismic Performance Assessment of Pre-70 RC Frame Buildings with FEMA P-58

Authors: D. Cardone

Abstract:

Past earthquakes have shown that seismic events may cause large economic losses in buildings. FEMA P-58 provides engineers with a practical tool for the seismic performance assessment of buildings. In this study, FEMA P-58 is applied to two typical Italian pre-1970 reinforced concrete frame buildings, characterized by plain rebars as steel reinforcement and by masonry infills and partitions. Given that suitable tools for these buildings are missing in FEMA P-58, specific fragility curves and loss functions are first developed. Next, building performance is evaluated following a time-based assessment approach. Finally, expected annual losses for the selected buildings are derived and compared with past applications to old RC frame buildings representative of the US building stock.
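In a time-based assessment of the kind mentioned above, the expected annual loss (EAL) is obtained by integrating the expected loss at each intensity level against the slope of the site hazard curve. A minimal numerical sketch with hypothetical hazard and loss values (not taken from the study):

```python
def expected_annual_loss(rates, losses):
    """EAL as the sum over intensity bins of E[loss | IM] * |d(lambda)|.

    rates  : annual exceedance rates lambda(IM), monotonically decreasing
    losses : expected loss given IM at the same intensity grid points
    """
    eal = 0.0
    for i in range(len(rates) - 1):
        d_lambda = rates[i] - rates[i + 1]            # annual rate of events in this bin
        avg_loss = 0.5 * (losses[i] + losses[i + 1])  # trapezoidal average loss
        eal += avg_loss * d_lambda
    return eal

# hypothetical hazard curve (events/year) and loss values (currency units)
rates = [0.1, 0.01, 0.001]
losses = [0.0, 100.0, 1000.0]
eal = expected_annual_loss(rates, losses)
```

With these illustrative numbers the frequent, low-loss events and the rare, high-loss events contribute comparably to the EAL, which is the usual motivation for reporting it.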

Keywords: FEMA P-58, RC frame buildings, plain rebars, masonry infills, fragility functions, loss functions, expected annual loss

Procedia PDF Downloads 300
2647 Changes of First-Person Pronoun Pragmatic Functions in Three Historical Chinese Texts

Authors: Cher Leng Lee

Abstract:

The existence of multiple first-person pronouns (1PPs) in classical Chinese is an issue that has not been resolved, despite linguists approaching it from a grammatical perspective. This paper proposes pragmatics as a viable solution. There is also a lack of research exploring the evolving usage patterns of 1PPs within the historical context of Chinese language use. Such research can help us comprehend the changes and developments of these linguistic elements. To fill these research gaps, we use a diachronic pragmatics approach to contrast the functions of Chinese 1PPs in three representative texts from three different historical periods: The Analects (the Spring and Autumn Period), The Grand Scribe’s Records (Grand Records) (the Qin and Han Period), and A New Account of Tales of the World (New Account) (the Wei, Jin and Southern and Northern Dynasties Period). The 1PPs of these texts are manually identified and classified according to their pragmatic functions in the given contexts, to observe their historical changes, understand the factors that contribute to these changes, and provide possible answers to how wo became the only 1PP in today’s spoken Mandarin.

Keywords: historical, Chinese, pronouns, pragmatics

Procedia PDF Downloads 22
2646 Method of Synthesis of Controlled Generators of Balanced Strict Avalanche Criterion Functions

Authors: Ali Khwaldeh, Nimer Adwan

Abstract:

In this paper, a method for constructing a controlled balanced Boolean function satisfying the Strict Avalanche Criterion (SAC) is proposed. The proposed method is based on the use of three orthogonal nonlinear components, unlike high-order SAC functions. The generator synthesized by the proposed method therefore has separate sets of control and information inputs. The proposed method is simple and readily implementable, and it allows synthesizing a SAC function generator with fixed control and information inputs. This ensures greater efficiency of the built generator compared to high-order SAC functions used as generators. Accordingly, the method is completely formalized and implemented as a software product.
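The Strict Avalanche Criterion itself is easy to state operationally: flipping any single input bit must flip the output for exactly half of all inputs. A small self-contained checker; the example function is a textbook bent function, not one of the paper's generators:

```python
from itertools import product

def satisfies_sac(f, n):
    """Check the Strict Avalanche Criterion for an n-input Boolean function f.

    For every single-bit input flip, the output must change for exactly
    half of all 2^n inputs.
    """
    half = 2 ** (n - 1)
    for i in range(n):
        flips = 0
        for bits in product((0, 1), repeat=n):
            flipped = list(bits)
            flipped[i] ^= 1                       # flip input bit i
            if f(bits) != f(tuple(flipped)):
                flips += 1
        if flips != half:
            return False
    return True

# f(x) = x1*x2 XOR x3*x4 is bent and hence satisfies SAC; parity does not,
# since flipping any bit always flips the parity output.
bent = lambda x: (x[0] & x[1]) ^ (x[2] & x[3])
parity = lambda x: x[0] ^ x[1] ^ x[2] ^ x[3]
```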

Keywords: Boolean function, controlled balanced Boolean function, strict avalanche criterion, orthogonal nonlinear components

Procedia PDF Downloads 131
2645 Cerebral Pulsatility Mediates the Link Between Physical Activity and Executive Functions in Older Adults with Cardiovascular Risk Factors: A Longitudinal NIRS Study

Authors: Hanieh Mohammadi, Sarah Fraser, Anil Nigam, Frederic Lesage, Louis Bherer

Abstract:

A chronically higher cerebral pulsatility is thought to damage the cerebral microcirculation, leading to cognitive decline in older adults. Although it is widely known that regular physical activity is linked to improvement in some cognitive domains, including executive functions, the mediating role of cerebral pulsatility in this link remains to be elucidated. This study assessed the impact of 6 months of regular physical activity upon changes in an optical index of cerebral pulsatility and the role of physical activity in the improvement of executive functions. 27 older adults (aged 57-79, 66.7% women) with cardiovascular risk factors (CVRF) were enrolled in the study. The participants completed the behavioral Stroop test, extracted from the Delis-Kaplan executive function system battery, at baseline (T0) and after 6 months (T6) of physical activity. Near-infrared spectroscopy (NIRS) was applied as an innovative approach to indexing cerebral pulsatility in the brain microcirculation at T0 and T6. The participants were at standing rest while a NIRS device recorded hemodynamic data from frontal and motor cortex subregions at T0 and T6. The cerebral pulsatility index of interest was cerebral pulse amplitude, extracted from the pulsatile component of the NIRS data. Our data indicated that 6 months of physical activity was associated with a reduction in response time for the executive functions, including the inhibition (T0: 56.33 ± 18.2 to T6: 53.33 ± 15.7, p = 0.038) and switching (T0: 63.05 ± 5.68 to T6: 57.96 ± 7.19, p < 0.001) conditions of the Stroop test. Also, physical activity was associated with a reduction in cerebral pulse amplitude (T0: 0.62 ± 0.05 to T6: 0.55 ± 0.08, p < 0.001). Notably, cerebral pulse amplitude was a significant mediator of the link between physical activity and response to the Stroop test for both the inhibition (β = 0.33 (0.61, 0.23), p < 0.05) and switching (β = 0.42 (0.69, 0.11), p < 0.01) conditions.
This study suggests that regular physical activity may support cognitive functions through the improvement of cerebral pulsatility in older adults with CVRF.
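The mediation result above follows the standard product-of-coefficients logic: the indirect effect is a (slope of the mediator on the predictor) times b (slope of the outcome on the mediator, controlling for the predictor). A self-contained sketch on synthetic data; variable names and effect sizes are illustrative, not the study's data:

```python
import random

def indirect_effect(x, m, y):
    """Product-of-coefficients mediation estimate: a * b.

    a : OLS slope of the mediator m on the predictor x
    b : partial OLS slope of the outcome y on m, controlling for x
    """
    n = len(x)
    mx, mm, my = sum(x) / n, sum(m) / n, sum(y) / n
    xc = [v - mx for v in x]
    mc = [v - mm for v in m]
    yc = [v - my for v in y]
    sxx = sum(v * v for v in xc)
    smm = sum(v * v for v in mc)
    sxm = sum(a_ * b_ for a_, b_ in zip(xc, mc))
    sxy = sum(a_ * b_ for a_, b_ in zip(xc, yc))
    smy = sum(a_ * b_ for a_, b_ in zip(mc, yc))
    a = sxm / sxx                                   # slope of m ~ x
    det = sxx * smm - sxm * sxm
    b = (sxx * smy - sxm * sxy) / det               # partial slope of m in y ~ x + m
    return a * b

# synthetic example: true indirect effect = 0.8 * 0.5 = 0.4
random.seed(1)
x = [random.gauss(0, 1) for _ in range(2000)]
m = [0.8 * xi + random.gauss(0, 0.3) for xi in x]
y = [0.5 * mi + 0.2 * xi + random.gauss(0, 0.3) for xi, mi in zip(x, m)]
effect = indirect_effect(x, m, y)
```

In practice the significance of such an indirect effect is usually assessed with bootstrap confidence intervals rather than a point estimate alone.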

Keywords: near-infrared spectroscopy, cerebral pulsatility, physical activity, cardiovascular risk factors, executive functions

Procedia PDF Downloads 166
2644 Thermal Decomposition Behaviors of Hexafluoroethane (C2F6) Using Zeolite/Calcium Oxide Mixtures

Authors: Kazunori Takai, Weng Kaiwei, Sadao Araki, Hideki Yamamoto

Abstract:

HFC and PFC gases have been commonly and widely used as refrigerants in air conditioners and as etching agents in semiconductor manufacturing processes because of their high heat of vaporization and chemical stability. On the other hand, HFC and PFC gases have a high global warming effect on the earth. Therefore, these gases emitted from chemical apparatus such as refrigerators have to be decomposed. Until now, disposal of these gases was carried out mainly by combustion methods such as rotary kiln treatment. However, this treatment needs an extremely high temperature, over 1000 °C. In recent years, in order to reduce energy consumption, hydrolytic decomposition methods using catalysts and plasma decomposition treatment have attracted much attention as new disposal treatments. However, the decomposition of fluorine-containing gases under wet conditions cannot avoid the generation of hydrofluoric acid. Hydrofluoric acid is a corrosive gas, and it deteriorates catalysts in the decomposition process. Moreover, an additional process for the neutralization of hydrofluoric acid is also indispensable. In this study, the decomposition of C2F6 using zeolite and zeolite/CaO mixtures as reactants was evaluated under dry conditions at 923 K. The effect of the chemical structure of the zeolite on the decomposition reaction was examined by using H-Y, H-Beta, H-MOR and H-ZSM-5. The formation of CaF2 in the zeolite/CaO mixtures after the decomposition reaction was confirmed by XRD measurements. The decomposition of C2F6 using zeolite as the reactant showed closely similar behaviors regardless of the type of zeolite (MOR, Y, ZSM-5, Beta type). There was no difference in the XRD patterns of each zeolite before and after reaction. On the other hand, a difference in the C2F6 decomposition for each zeolite/CaO mixture was observed. These results suggested that the rate-determining process for the C2F6 decomposition on zeolite alone is the removal of fluorine from the reactive site. In other words, the C2F6 decomposition for the zeolite/CaO mixture improved compared with that for the zeolite alone through the removal of fluorine from the reactive site. H-MOR/CaO showed 100% decomposition for 3.5 h, a significant improvement over the zeolite alone. On the other hand, the Y-type zeolite mixture showed no improvement, that is, almost the same value as the Y-type zeolite alone. The descending order of C2F6 decomposition was MOR, ZSM-5, Beta and Y-type zeolite. This order is similar to the acid strength characterized by NH3-TPD. Hence, it is considered that the C–F bond cleavage is closely related to the acid strength.

Keywords: hexafluoroethane, zeolite, calcium oxide, decomposition

Procedia PDF Downloads 439
2643 Tsunami Vulnerability of Critical Infrastructure: Development and Application of Functions for Infrastructure Impact Assessment

Authors: James Hilton Williams

Abstract:

Recent tsunami events, including the 2011 Tohoku Tsunami, Japan, and the 2015 Illapel Tsunami, Chile, have highlighted the potential for tsunami impacts on the built environment. International research in the tsunami impacts domain has been largely focused toward impacts on buildings and casualty estimations, while only limited attention has been placed on the impacts on infrastructure which is critical for the recovery of impacted communities. New Zealand, with 75% of the population within 10 km of the coast, has a large amount of coastal infrastructure exposed to local, regional and distant tsunami sources. To effectively manage tsunami risk for New Zealand critical infrastructure, including energy, transportation, and communications, the vulnerability of infrastructure networks and components must first be determined. This research develops infrastructure asset vulnerability, functionality and repair-cost functions based on international post-event tsunami impact assessment data from technologically similar countries, including Japan and Chile, and adapts these to New Zealand. These functions are then utilized within a New Zealand based impact framework, allowing for cost benefit analyses, effective tsunami risk management strategies and mitigation options for exposed critical infrastructure to be determined, which can also be applied internationally.
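Vulnerability functions of the kind developed here are commonly expressed as lognormal fragility curves: the probability of an asset reaching a damage state given a tsunami demand such as flow depth. A minimal sketch; the median and dispersion values are illustrative placeholders, not the fitted New Zealand parameters:

```python
import math

def lognormal_fragility(demand, median, beta):
    """P(damage state is reached | demand), as a lognormal CDF.

    demand : tsunami intensity measure, e.g. flow depth in metres
    median : demand at which the damage state is reached with P = 0.5
    beta   : lognormal standard deviation (dispersion)
    """
    if demand <= 0:
        return 0.0
    z = (math.log(demand) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# illustrative curve: a component with a 2.0 m median flow depth, dispersion 0.5
p_low = lognormal_fragility(1.0, 2.0, 0.5)
p_med = lognormal_fragility(2.0, 2.0, 0.5)
p_high = lognormal_fragility(4.0, 2.0, 0.5)
```

Summing expected repair costs over such curves for each network component is what enables the cost-benefit analyses mentioned in the abstract.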

Keywords: impact assessment, infrastructure, tsunami impacts, vulnerability functions

Procedia PDF Downloads 129
2642 Effects of Partial Sleep Deprivation on Prefrontal Cognitive Functions in Adolescents

Authors: Nurcihan Kiris

Abstract:

Restricted sleep is common in young adults and adolescents, yet the results of the few objective studies of sleep deprivation and cognitive performance remain unclear. In particular, the effect of sleep deprivation on the cognitive functions associated with the frontal lobe, such as attention, executive functions, and working memory, is not well known. The aim of this study is to investigate experimentally the effect of partial sleep deprivation in adolescents on frontal lobe cognitive tasks including working memory, strategic thinking, simple attention, continuous attention, executive functions, and cognitive flexibility. Subjects of the study were recruited from voluntary students of Cukurova University. Eighteen adolescents underwent four consecutive nights of monitored sleep restriction (6–6.5 hr/night) and four nights of sleep extension (10–10.5 hr/night), in counterbalanced order, separated by a washout period. Following each sleep period, cognitive performance was assessed, at a fixed morning time, using a computerized neuropsychological battery based on frontal lobe function tasks, a timed test providing both accuracy and reaction time outcome measures. Only the spatial working memory performance was found to be statistically lower in the restricted sleep condition than in the extended sleep condition. On the other hand, there was no significant difference in the performance of cognitive tasks evaluating simple attention, continuous attention, executive functions, and cognitive flexibility. It is thought that the spatial working memory and strategic thinking skills of adolescents, in particular, may be susceptible to sleep deprivation. On the other hand, adolescents are predicted to be optimally successful under ideal sleep conditions, especially in circumstances requiring the short-term storage of visual information, processing of stored information, and strategic thinking. These findings may also point to possible negative functional effects of partial sleep deprivation on the processing of academic, social and emotional inputs in adolescents. Acknowledgment: This research was supported by the Cukurova University Scientific Research Projects Unit.

Keywords: attention, cognitive functions, sleep deprivation, working memory

Procedia PDF Downloads 116
2641 The Impact of Political Connections on the Function of Independent Directors

Authors: Chih-Lin Chang, Tzu-Ching Weng

Abstract:

The purpose of this study is to explore the relationship between corporate political ties and the functions of independent directors. With reference to variables from the literature, such as the characteristics of the board of directors in past studies, a single comprehensive function indicator is established as a proxy for the function of independent directors, and the impact of political connections on independent directors is further discussed. This research takes Taiwanese listed enterprises from 2014 to 2020 as the main research object and conducts empirical research through descriptive statistics, correlation and regression analysis. The empirical results show that companies with political connections have a positive impact on the number of independent directors; political connections also have a significant positive relationship with the functions of independent directors. This means that in companies with political connections, the seats and functions of independent directors receive more attention, and their oversight functions increase.

Keywords: political, connection, independent, director, function

Procedia PDF Downloads 69
2640 Segmented Pupil Phasing with Deep Learning

Authors: Dumont Maxime, Correia Carlos, Sauvage Jean-François, Schwartz Noah, Gray Morgan

Abstract:

Context: The concept of the segmented telescope is unavoidable for building extremely large telescopes (ELTs) in the quest for spatial resolution, but it also allows one to fit a large telescope within a reduced volume of space (JWST) or into an even smaller volume (standard CubeSat). CubeSats have tight constraints on the available computational budget and on the payload volume allowed. At the same time, they undergo thermal gradients leading to large and evolving optical aberrations. Pupil segmentation nevertheless comes with an obvious difficulty: co-phasing the different segments. The CubeSat constraints prevent the use of a dedicated wavefront sensor (WFS), making the focal-plane images acquired by the science detector the most practical alternative. Yet, one of the challenges for wavefront sensing is the non-linearity between the image intensity and the phase aberrations. Moreover, for Earth observation, the object is unknown and unrepeatable. Recently, several studies have suggested neural networks (NNs) for wavefront sensing, especially convolutional NNs, which are well known for being non-linear and image-friendly problem solvers. Aims: We study in this paper the prospect of using NNs to measure the phasing aberrations of a segmented pupil directly from the focal-plane image, without dedicated wavefront sensing. Methods: In our application, we take the case of a deployable telescope fitting in a CubeSat for Earth observation, which triples the aperture size (compared to the 10 cm CubeSat standard) and therefore triples the angular resolution capacity. In order to reach the diffraction-limited regime at visible wavelengths, a wavefront error below lambda/50 is typically required. The telescope focal-plane detector, used for imaging, will be used as a wavefront sensor. In this work, we study a point source, i.e. the point spread function (PSF) of the optical system, as the input of a VGG-net neural network, an architecture designed for image regression/classification. Results: This approach shows some promising results (about 2 nm RMS of residual WFE, which is below lambda/50, with 40-100 nm RMS of input WFE) with a relatively fast computational time of less than 30 ms, which translates to a small computational burden. These results call for further study with higher aberrations and noise.
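The non-linearity mentioned above can be illustrated with a toy two-segment pupil: the on-axis focal-plane intensity (Strehl ratio) is a non-linear, sign-ambiguous function of the piston error, which is why a plain intensity measurement cannot invert the phase and a learned model is attractive. A minimal sketch with an idealized two-segment pupil, not the paper's optical model:

```python
import cmath
import math

def strehl(piston):
    """On-axis PSF intensity for a pupil split into two equal segments,
    one carrying a piston phase error (radians), relative to a flat pupil.
    Analytically this equals cos^2(piston / 2)."""
    phases = [0.0] * 50 + [piston] * 50                   # 50 samples per segment
    field = sum(cmath.exp(1j * p) for p in phases) / len(phases)
    return abs(field) ** 2

# intensity alone cannot tell +p from -p: the sign information is lost
same = strehl(1.0) == strehl(-1.0)
```

The intensity drops from 1 at zero piston to 0 at half a wave of piston, and is identical for positive and negative errors; resolving that ambiguity requires extra information, which the NN extracts from the full PSF shape.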

Keywords: wavefront sensing, deep learning, deployable telescope, space telescope

Procedia PDF Downloads 74
2639 Comparison of Parallel CUDA and OpenMP Implementations of Memetic Algorithms for Solving Optimization Problems

Authors: Jason Digalakis, John Cotronis

Abstract:

Memetic algorithms (MAs) are useful for solving optimization problems, but it is quite difficult to search the space of an optimization problem with large dimensions, and it is a challenge to use all the cores of the system. In this study, a sequential implementation of the memetic algorithm is converted into a concurrent version, which is executed on the cores of both the CPU and the GPU. For this purpose, the OpenMP and CUDA libraries are applied to the parallel algorithm to enable concurrent execution on the CPU and GPU, respectively. The aim of this study is to compare the CPU and GPU implementations of the memetic algorithm. To this end, fourteen benchmark functions are selected as test problems. The obtained results indicate that our approach leads to speedups of up to five thousand times compared to a single CPU thread, while maintaining reasonable result quality. This clearly shows that GPUs have the potential to accelerate MAs and allow them to solve much more complex tasks.
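A memetic algorithm couples a population-based global search with an individual local refinement step (the "meme"). A minimal sequential sketch on the sphere benchmark, the kind of kernel the paper parallelizes with OpenMP/CUDA; all parameter values are illustrative:

```python
import random

def sphere(x):
    """Unimodal benchmark: global minimum 0 at the origin."""
    return sum(v * v for v in x)

def local_search(x, f, step=0.1, iters=20):
    """Hill-climbing refinement of one individual (the 'meme')."""
    best, fb = list(x), f(x)
    for _ in range(iters):
        cand = [v + random.uniform(-step, step) for v in best]
        fc = f(cand)
        if fc < fb:
            best, fb = cand, fc
    return best

def memetic(f, dim=5, pop_size=20, gens=50):
    random.seed(0)                              # reproducible run
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=f)                         # elitist selection of the top half
        parents = pop[:pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)    # intermediate crossover + mutation
            child = [(u + v) / 2 + random.gauss(0, 0.1) for u, v in zip(a, b)]
            children.append(local_search(child, f))   # local refinement step
        pop = parents + children
    return min(pop, key=f)

best = memetic(sphere)
```

The fitness evaluations inside the generation loop are independent per individual, which is exactly what makes the algorithm amenable to OpenMP threads or CUDA blocks.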

Keywords: memetic algorithm, CUDA, GPU-based memetic algorithm, open multi-processing, multimodal functions, unimodal functions, non-linear optimization problems

Procedia PDF Downloads 56
2638 EEG Analysis of Brain Dynamics in Children with Language Disorders

Authors: Hamed Alizadeh Dashagholi, Hossein Yousefi-Banaem, Mina Naeimi

Abstract:

The current study was established for EEG signal analysis in patients with language disorder. Language disorder can be defined as a meaningful delay in the use or understanding of spoken or written language. The disorder can include the content or meaning of language, its form, or its use. Here we applied Z-score, power spectrum, and coherence methods to discriminate the language disorder data from healthy data. The power spectrum of each channel in the alpha, beta, gamma, delta, and theta frequency bands was measured. In addition, the intra-hemispheric Z-score was obtained by a scoring algorithm. The obtained results showed high Z-scores and power spectra in posterior regions. Therefore, we can conclude that people with language disorder have high brain activity in the frontal region of the brain in comparison with healthy people. Results showed that high coherence correlates with irregularities in the ERP and is often found during complex tasks, whereas low coherence is often found in pathological conditions. The Z-score analysis of the brain dynamics showed a higher Z-score peak frequency in the delta, theta and beta sub-bands of language disorder patients. In this analysis there were activity signs in both hemispheres, and the left (dominant) hemisphere was more active than the right.
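The per-band power spectrum used above can be sketched with a naive DFT; real EEG pipelines typically use Welch averaging over epochs, but the band-summing idea is the same. The band edges below are common conventions, and the test signal is synthetic, not patient data:

```python
import cmath
import math

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def power_spectrum(signal, fs):
    """Naive DFT power spectrum: (frequency, power) pairs up to Nyquist."""
    n = len(signal)
    spec = []
    for k in range(n // 2 + 1):
        s = sum(x * cmath.exp(-2j * math.pi * k * i / n)
                for i, x in enumerate(signal))
        spec.append((k * fs / n, abs(s) ** 2 / n))
    return spec

def band_powers(signal, fs):
    """Sum spectral power inside each conventional EEG band."""
    spec = power_spectrum(signal, fs)
    return {name: sum(p for f, p in spec if lo <= f < hi)
            for name, (lo, hi) in BANDS.items()}

# synthetic 10 Hz (alpha-band) oscillation sampled at 128 Hz for 2 s
fs = 128
sig = [math.sin(2 * math.pi * 10 * i / fs) for i in range(2 * fs)]
powers = band_powers(sig, fs)
```

A pure 10 Hz sine concentrates nearly all its power in the alpha band, which is the sanity check one would run before comparing patient and control channels.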

Keywords: EEG, electroencephalography, coherence methods, language disorder, power spectrum, z-score

Procedia PDF Downloads 385
2637 Analysis of Brushless DC Motor with Trapezoidal Back EMF Using Matlab

Authors: Taha Ahmed Husain

Abstract:

The dynamic characteristics, such as speed and torque, as well as the voltages and currents of a PWM brushless DC motor inverter, are analyzed with a MATLAB model. The contribution of the external load torque and the friction torque is monitored. The switching function technique is adopted for the current control of the embedded three-phase inverter that drives the brushless DC motor. With switching functions, power conversion circuits can be modeled according to their functions rather than their circuit topologies, which simplifies the overall power conversion functions. The trapezoidal back EMF is used in the model, as it has lower switching loss compared with the sinusoidal type. Results show a reliable time analysis for speed, torque, and phase and line voltages and currents, and the effect of current commutation is clearly observed.
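The trapezoidal back EMF referred to above can be written as a simple piecewise shape function. The MATLAB model itself is not reproduced here; the following Python sketch, with illustrative amplitudes and angles, shows the per-phase back EMF and the resulting electromagnetic torque:

```python
def trap(theta_deg):
    """Ideal trapezoidal back-EMF shape: flat top/bottom spanning 120
    electrical degrees with 60-degree linear ramps, amplitude +/-1."""
    t = theta_deg % 360.0
    if t < 120.0:
        return 1.0
    if t < 180.0:
        return 1.0 - (t - 120.0) / 30.0   # ramp down over 60 degrees
    if t < 300.0:
        return -1.0
    return -1.0 + (t - 300.0) / 30.0      # ramp back up

def phase_bemfs(theta_deg, ke, omega):
    """Per-phase back-EMF e = ke * omega * shape(theta), phases 120 deg apart."""
    return tuple(ke * omega * trap(theta_deg - shift) for shift in (0.0, 120.0, 240.0))

# Electromagnetic torque T = (e_a*i_a + e_b*i_b + e_c*i_c) / omega.
ke, omega = 0.1, 100.0                    # illustrative constants
ea, eb, ec = phase_bemfs(90.0, ke, omega)
ia, ib, ic = 5.0, -5.0, 0.0               # illustrative phase currents
torque = (ea * ia + eb * ib + ec * ic) / omega
print(torque)  # 0.5
```

In the simulation, the switching functions select which phase pair conducts as the rotor angle advances, producing the commutation effects observed in the results.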

Keywords: BLDC motor, brushless DC motors, PWM inverter, DC motor control, trapezoidal back EMF, ripple torque in brushless DC motor

Procedia PDF Downloads 560
2636 The Sensitization Profile of Children Allergic to IgE-mediated Cow's Milk Proteins

Authors: Gadiri Sabiha

Abstract:

Introduction: IgE-mediated cow's milk protein allergy (CMPA) is one of the most common allergies in children and is among the three most frequent allergies observed in children under 6 years of age. Its natural evolution is most often towards resolution. The objective is to determine the sensitization profile of patients allergic to cow's milk (CM). Material and method: A retrospective study was carried out on a pediatric population (age < 12 years) over a period of four years (2018-2021) in the context of a suspected food allergy to cow's milk proteins, involving 121 children aged between 8 months and 12 years. The search for specific IgE was carried out by an immunodot test (EUROLINE Pediatric; EUROIMMUN), which allows a semi-quantitative determination of specific IgE. Results: 36 patients (29.7%) had a cow's milk protein allergy, with a slight female predominance (58.33% girls vs. 41.66% boys). The main clinical signs were acute diarrhoea, vomiting, intense abdominal pain, and cutaneous signs (pruritus/urticaria), with respective frequencies of 72%, 58%, 44%, and 19%. The three major specific CM allergens identified were beta-lactoglobulin (59%), caseins (51%), and alpha-lactalbumin (29.7%). The sensitization profile varies according to age: in infants before 1 year, anti-casein IgE are predominant (83.3%), followed by beta-lactoglobulin (66.66%) and alpha-lactalbumin (50%). Conclusion: CMPA is a frequent pathology which ranks among the three most common food allergies in children. It is the first to appear, most often starting in infants under 6 months old.

Keywords: specific IgE, food allergy, cow's milk, child

Procedia PDF Downloads 49
2635 Effect of Heat Treatment on Nutrients, Bioactive Contents and Biological Activities of Red Beet (Beta vulgaris L.)

Authors: Amessis-Ouchemoukh Nadia, Salhi Rim, Ouchemoukh Salim, Ayad Rabha, Sadou Dyhia, Guenaoui Nawel, Hamouche Sara, Madani Khodir

Abstract:

The cooking method is a key factor influencing the quality of vegetables. In this study, the effect of the most common cooking methods on the nutritional composition, phenolic content, pigment content, and antioxidant activities (evaluated by the DPPH, ABTS, CUPRAC, FRAP, reducing power, and phosphomolybdenum methods) of fresh, steamed, and boiled red beet was investigated. The fresh samples showed the highest nutritional and bioactive composition compared to the cooked ones. The boiling method led to a significant reduction (p < 0.05) in the content of phenolics, flavonoids, and flavanols and in the DPPH, ABTS, FRAP, CUPRAC, phosphomolybdenum, and reducing power capacities. This effect was less pronounced when steam cooking was used, and the losses of bioactive compounds were lower. As a result, steam cooking resulted in greater retention of bioactive compounds and antioxidant activity compared to boiling. Overall, this study suggests that steam cooking is the better method in terms of retention of pigments, bioactive compounds, and antioxidant activity of beetroot.

Keywords: beta vulgaris, cooking methods, bioactive compounds, antioxidant activities

Procedia PDF Downloads 20
2634 Associations between Surrogate Insulin Resistance Indices and the Risk of Metabolic Syndrome in Children

Authors: Mustafa M. Donma, Orkide Donma

Abstract:

A well-defined insulin resistance (IR) is one of the requirements for a good understanding and evaluation of metabolic syndrome (MetS). However, the underlying causes of the development of IR are not clear. Endothelial dysfunction also participates in the pathogenesis of this disease. IR indices are being determined in various obesity groups and also in diagnosing MetS. The components of MetS have been well established and used in adult studies; however, there are some ambiguities, particularly in the field of pediatrics. The aims of this study were to compare the performance of fasting blood glucose (FBG), one of the MetS components, with some other IR indices and to check whether FBG may be replaced by some other parameter or ratio for a better evaluation of pediatric MetS. Five hundred and forty-nine children were involved in the study, and six groups were constituted: 109, 40, 100, 166, 110, and 24 children were included in the normal-body mass index (N-BMI), overweight (OW), obese (OB), morbid obese (MO), MetS with two components (MetS2), and MetS with three components (MetS3) groups, respectively. Age- and sex-adjusted BMI percentiles tabulated by the World Health Organization were used for the classification of the obesity groups. MetS components were determined. Aside from one of the MetS components, FBG, eight measures of IR [homeostatic model assessment of IR (HOMA-IR), homeostatic model assessment of beta cell function (HOMA-%β), alanine transaminase-to-aspartate transaminase ratio (ALT/AST), alanine transaminase (ALT), insulin (INS), insulin-to-FBG ratio (INS/FBG), the product of fasting triglyceride and glucose (TyG) index, and the McAuley index] were evaluated. Statistical analyses were performed, and a p value less than 0.05 was accepted as statistically significant. Mean BMI values of the groups were 15.7 kg/m2, 21.0 kg/m2, 24.7 kg/m2, 27.1 kg/m2, 28.7 kg/m2, and 30.4 kg/m2 for N-BMI, OW, OB, MO, MetS2, and MetS3, respectively. Differences between the groups were significant (p < 0.001); the only exception was the MetS2-MetS3 couple, in spite of an increase detected in the MetS3 group. Waist-to-hip circumference ratios differed significantly only for the N-BMI vs. OB, MO, and MetS2; OW vs. MO; and OB vs. MO and MetS2 couples. ALT and ALT/AST did not differ significantly among MO-MetS2-MetS3. HOMA-%β differed only between MO and MetS2. INS/FBG, the McAuley index, and TyG were not significant between MetS2 and MetS3. HOMA-IR and FBG were not significant between MO and MetS2. INS was the only parameter that showed statistically significant differences between MO-MetS2, MO-MetS3, and MetS2-MetS3. In conclusion, these findings suggest that FBG, presently considered one of the five MetS components, may be replaced by INS during the evaluation of pediatric morbid obesity and MetS.
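Several of the indices above are closed-form functions of fasting laboratory values. A brief Python sketch of the standard formulas, assuming glucose in mg/dL, insulin in µU/mL, and triglycerides in mg/dL (the example inputs are hypothetical, not study data):

```python
import math

def homa_ir(glucose_mg_dl, insulin_uU_ml):
    """HOMA-IR with fasting glucose in mg/dL and insulin in microU/mL."""
    return glucose_mg_dl * insulin_uU_ml / 405.0

def homa_beta(glucose_mg_dl, insulin_uU_ml):
    """HOMA-%B (beta-cell function), mg/dL form of the standard formula."""
    return 360.0 * insulin_uU_ml / (glucose_mg_dl - 63.0)

def tyg(glucose_mg_dl, triglycerides_mg_dl):
    """TyG index: ln(fasting TG [mg/dL] x fasting glucose [mg/dL] / 2)."""
    return math.log(triglycerides_mg_dl * glucose_mg_dl / 2.0)

# Illustrative fasting values (hypothetical, not from the study).
g, ins, tg = 90.0, 12.0, 110.0
print(round(homa_ir(g, ins), 2))    # 2.67
print(round(homa_beta(g, ins), 1))  # 160.0
print(round(tyg(g, tg), 2))         # 8.51
```

INS/FBG and ALT/AST are simple ratios of the measured values; the McAuley index additionally requires fasting triglycerides and insulin on a log scale.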

Keywords: children, insulin resistance indices, metabolic syndrome, obesity

Procedia PDF Downloads 100
2633 Integrating Process Planning, WMS Dispatching, and WPPW Weighted Due Date Assignment Using a Genetic Algorithm

Authors: Halil Ibrahim Demir, Tarık Cakar, Ibrahim Cil, Muharrem Dugenci, Caner Erden

Abstract:

Conventionally, process planning, scheduling, and due-date assignment functions are performed separately and sequentially. The interdependence of these functions requires integration. Although integrated process planning and scheduling, and scheduling with due-date assignment, are popular research topics, only a few works address the integration of all three functions. This work focuses on the integration of process planning, WMS scheduling, and WPPW due-date assignment. Another novelty of this work is the use of weighted due-date assignment: in the literature, due dates are generally assigned without considering the importance of customers, whereas in this study more important customers get closer due dates. Typically, only tardiness is punished, but the JIT philosophy punishes both earliness and tardiness; in this study, all weighted earliness, tardiness, and due-date-related costs are penalized. As no customer desires distant due dates, such due dates should be penalized as well. Various levels of integration of these three functions are tested, and genetic search and random search are compared both with each other and with ordinary solutions. Higher integration levels proved superior, search was always useful, and genetic searches outperformed random searches.
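The weighted objective described above can be sketched as a simple cost function. In this sketch the WPPW weighting scheme is simplified to per-job weights, and the job tuples are hypothetical:

```python
def schedule_cost(jobs):
    """Total weighted cost of a schedule under a JIT-style objective:
    each job pays for earliness, tardiness, and the length of its assigned
    due date (distant due dates are undesirable). Weights are per-job, so
    important customers can carry higher penalties and closer due dates.
    Each job: (completion_time, due_date, w_early, w_tardy, w_due)."""
    total = 0.0
    for completion, due, w_e, w_t, w_d in jobs:
        earliness = max(0.0, due - completion)
        tardiness = max(0.0, completion - due)
        total += w_e * earliness + w_t * tardiness + w_d * due
    return total

# Hypothetical three-job example; weights are illustrative, not from the paper.
jobs = [
    (10.0, 12.0, 1.0, 2.0, 0.1),  # early by 2
    (15.0, 12.0, 1.0, 2.0, 0.1),  # tardy by 3
    (20.0, 20.0, 1.0, 2.0, 0.1),  # just in time
]
print(round(schedule_cost(jobs), 2))  # 12.4 = 2*1 + 3*2 + 0.1*(12+12+20)
```

A genetic or random search would then vary the process plans, dispatching decisions, and assigned due dates to minimize this total cost.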

Keywords: process planning, weighted scheduling, weighted due-date assignment, genetic algorithm, random search

Procedia PDF Downloads 355
2632 A Simple Finite Element Method for Glioma Tumor Growth Model with Density Dependent Diffusion

Authors: Shangerganesh Lingeshwaran

Abstract:

In this presentation, we perform numerical simulations of a reaction-diffusion equation with various nonlinear density-dependent diffusion operators and proliferation functions. The mathematical model, represented by a parabolic partial differential equation, is considered in order to study the invasion of gliomas (the most common type of brain tumor) and to describe the growth of cancer cells and their response to treatment. The unknown quantity of the given reaction-diffusion equation is the density of cancer cells, and the mathematical model is based on the proliferation and migration of glioma cells. A standard Galerkin finite element method is used to perform the numerical simulations of the given model. Finally, important observations on each of the nonlinear diffusion and proliferation functions are presented with the help of computational results.
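A minimal sketch of this method class, assuming for simplicity a 1D Fisher-KPP equation with constant diffusion, linear elements, a lumped mass matrix, and explicit Euler time stepping (the paper treats more general density-dependent diffusion operators):

```python
import numpy as np

# Galerkin FEM (linear elements, lumped mass) for a 1D Fisher-KPP model
# u_t = D u_xx + rho * u * (1 - u) on [0, 1] with zero-flux boundaries.
# Parameters are illustrative, not taken from the paper.
N, D, rho = 51, 1e-3, 1.0
x = np.linspace(0.0, 1.0, N)
h = x[1] - x[0]

# Stiffness matrix K and lumped mass vector m for linear "hat" functions.
K = np.zeros((N, N))
for e in range(N - 1):                    # assemble element by element
    ke = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    K[e:e + 2, e:e + 2] += ke
m = np.full(N, h)
m[0] = m[-1] = h / 2.0                    # row-sum (lumped) mass matrix

u = np.exp(-((x - 0.5) ** 2) / 0.005)     # initial tumor-cell density bump
dt, steps = 0.01, 1000                    # explicit Euler in time
for _ in range(steps):
    u = u + dt * (-(D * (K @ u)) / m + rho * u * (1.0 - u))

print(round(float(u.max()), 3))           # density saturates near carrying capacity 1
```

Density-dependent diffusion would replace the constant D with a coefficient evaluated from u inside the element assembly at each time step.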

Keywords: glioma invasion, nonlinear diffusion, reaction-diffusion, finite element method

Procedia PDF Downloads 203
2631 Aerodynamic Design of a UAV with Application to Agricultural Spraying Using Genetic Algorithm Optimization

Authors: Saul A. Torres Z., Eduardo Liceaga C., Alfredo Arias M.

Abstract:

Agriculture worldwide is among the main sources of economic activity and global needs, so crop care is extremely important for owners and workers; one of the major causes of product loss is pest infection by different types of organisms. We seek to develop a UAV for agricultural spraying at a maximum altitude of 5000 meters above sea level, with a payload of 100 liters of fumigant. To develop the aerodynamic design of the aircraft, computational tools are used, such as the "Athena Vortex Lattice" software, MATLAB, ANSYS FLUENT, and the XFoil package, among others. Structured programming and an exhaustive analysis of optimization and search methods are also employed. The results have a very low margin of error, and the multi-objective problems can be helpful for future developments. The program has 10 functions developed in MATLAB; these functions are related to each other to enable the development of the design, and all of them are controlled by the principal code "Master.m".

Keywords: aerodynamic design, optimization, genetic algorithm, multi-objective problem, stability, vortex

Procedia PDF Downloads 506
2630 Restricted Boltzmann Machines and Deep Belief Nets for Market Basket Analysis: Statistical Performance and Managerial Implications

Authors: H. Hruschka

Abstract:

This paper presents the first comparison of the performance of the restricted Boltzmann machine and the deep belief net on binary market basket data relative to binary factor analysis and the two best-known topic models, namely latent Dirichlet allocation and the correlated topic model. This comparison shows that the restricted Boltzmann machine and the deep belief net are superior to both binary factor analysis and the topic models. Managerial implications that differ between the investigated models are treated as well. The restricted Boltzmann machine is defined as a joint Boltzmann distribution of hidden variables and observed variables (purchases). It comprises one layer of observed variables and one layer of hidden variables; variables of the same layer are not connected. The comparison also includes deep belief nets with three layers. The first layer is a restricted Boltzmann machine based on category purchases. Hidden variables of the first layer are used as input variables by the second-layer restricted Boltzmann machine, which then generates second-layer hidden variables. Finally, in the third layer, hidden variables are related to purchases. A public data set is analyzed which contains one month of real-world point-of-sale transactions in a typical local grocery outlet. It consists of 9,835 market baskets referring to 169 product categories. This data set is randomly split into two halves: one half is used for estimation, and the other serves as holdout data. Each model is evaluated by the log likelihood for the holdout data. The performance of the topic models is disappointing, as the holdout log likelihood of the correlated topic model (which is better than that of latent Dirichlet allocation) is lower by more than 25,000 compared to the best binary factor analysis model. On the other hand, binary factor analysis on its own is clearly surpassed by both the restricted Boltzmann machine and the deep belief net, whose holdout log likelihoods are higher by more than 23,000. Overall, the deep belief net performs best. We also interpret the hidden variables discovered by binary factor analysis, the restricted Boltzmann machine, and the deep belief net. Hidden variables, characterized by the product categories to which they are related, differ strongly between these three models. To derive managerial implications, we assess the effect of promoting each category on total basket size, i.e., the number of purchased product categories, due to each category's interdependence with all the other categories. The investigated models lead to very different implications, as they disagree about which categories are associated with higher basket size increases due to a promotion. Of course, recommendations based on better performing models should be preferred. The impressive performance advantages of the restricted Boltzmann machine and the deep belief net suggest continuing research with appropriate extensions. Including predictors, especially marketing variables such as price, seems to be an obvious next step. It might also be feasible to take a more detailed perspective by considering purchases of brands instead of product categories.
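The restricted Boltzmann machine itself can be sketched compactly. The following NumPy example trains a tiny Bernoulli RBM on toy binary baskets with a mean-field variant of one-step contrastive divergence (CD-1); the data, layer sizes, and learning schedule are illustrative, not those of the grocery data set:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 6, 3             # 6 "product categories", 3 hidden units
W = 0.01 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)              # visible (category) biases
b_h = np.zeros(n_hidden)               # hidden biases

# Toy baskets: categories 0-2 tend to co-occur, as do categories 3-5.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [1, 1, 0, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1],
                 [0, 0, 0, 1, 0, 1]], dtype=float)

lr = 0.1
for _ in range(2000):
    v0 = data
    ph0 = sigmoid(v0 @ W + b_h)        # hidden probabilities (positive phase)
    pv1 = sigmoid(ph0 @ W.T + b_v)     # mean-field reconstruction
    ph1 = sigmoid(pv1 @ W + b_h)       # negative phase
    W += lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b_v += lr * (v0 - pv1).mean(axis=0)
    b_h += lr * (ph0 - ph1).mean(axis=0)

recon = sigmoid(sigmoid(data @ W + b_h) @ W.T + b_v)
err = np.abs(data - recon).mean()
print(round(float(err), 3))            # reconstruction error after training
```

A deep belief net stacks such machines: the trained hidden probabilities become the "visible" data for the next layer's RBM.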

Keywords: binary factor analysis, deep belief net, market basket analysis, restricted Boltzmann machine, topic models

Procedia PDF Downloads 167
2629 Applying Element Free Galerkin Method on Beam and Plate

Authors: Mahdad M’hamed, Belaidi Idir

Abstract:

This paper develops a meshless approach, called the Element-Free Galerkin (EFG) method, which is based on the weak form of the governing partial differential equations and employs Moving Least Squares (MLS) interpolation to construct the meshless shape functions. The variational weak form is used in the EFG, where the trial and test functions are approximated by the MLS approximation. Since the shape functions constructed by this discretization have the weight function property based on the randomly distributed points, the essential boundary conditions can be implemented easily. The local weak form of the governing partial differential equations is obtained by the weighted residual method within a simple local quadrature domain. A spline function with high continuity is used as the weight function. The presently developed EFG method is a truly meshless method, as it requires no mesh, either for the construction of the shape functions or for the integration of the local weak form. Several numerical examples of two-dimensional static structural analysis are presented to illustrate the performance of the present EFG method. They show that the EFG method is highly efficient to implement and highly accurate in computation. The present method is used to analyze the static deflection of beams and plates with holes.
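The MLS construction at the heart of EFG can be illustrated in 1D. The sketch below (node layout, support radius, and the cubic spline weight are illustrative choices) builds the shape functions from a linear basis and checks two defining properties, partition of unity and linear reproduction:

```python
import numpy as np

def cubic_spline_weight(r):
    """Compactly supported cubic spline weight, r = |x - x_i| / radius."""
    r = np.abs(r)
    w = np.zeros_like(r)
    m1 = r <= 0.5
    m2 = (r > 0.5) & (r <= 1.0)
    w[m1] = 2.0 / 3.0 - 4.0 * r[m1] ** 2 + 4.0 * r[m1] ** 3
    w[m2] = 4.0 / 3.0 - 4.0 * r[m2] + 4.0 * r[m2] ** 2 - (4.0 / 3.0) * r[m2] ** 3
    return w

def mls_shape(x, nodes, radius):
    """MLS shape function values phi_i(x) for a linear basis p = [1, x]."""
    w = cubic_spline_weight((x - nodes) / radius)
    P = np.column_stack([np.ones_like(nodes), nodes])  # basis at the nodes
    A = (P * w[:, None]).T @ P                         # moment matrix
    px = np.array([1.0, x])
    return px @ np.linalg.solve(A, (P * w[:, None]).T)

nodes = np.linspace(0.0, 1.0, 11)        # scattered in general; uniform here
phi = mls_shape(0.37, nodes, radius=0.25)
print(round(float(phi.sum()), 6))        # 1.0 (partition of unity)
u = 2.0 * nodes + 1.0
print(round(float(phi @ u), 6))          # 1.74 = 2*0.37 + 1 (linear reproduction)
```

In the 2D structural examples, these shape functions replace element interpolation in the Galerkin weak form, with the same assembly and quadrature machinery but no mesh connectivity.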

Keywords: numerical computation, element-free Galerkin (EFG), moving least squares (MLS), meshless methods

Procedia PDF Downloads 262
2628 Towards a Strategic Framework for State-Level Epistemological Functions

Authors: Mark Darius Juszczak

Abstract:

While epistemology, as a sub-field of philosophy, is generally concerned with theoretical questions about the nature of knowledge, the explosion in digital media technologies has resulted in an exponential increase in the storage and transmission of human information. That increase has produced a particular non-linear dynamic: digital epistemological functions are radically altering how and what we know. Neither the rate of that change nor its consequences have been well studied or taken into account in developing state-level strategies for epistemological functions. At the current time, US federal policy, like that of virtually all other countries, maintains clearly defined boundaries at the national level between various epistemological agencies, that is, agencies that, in one way or another, mediate the functional use of knowledge. These agencies can take the form of patent and trademark offices, national library and archive systems, departments of education, departments such as the FTC, university systems and regulations, military research systems such as DARPA, federal scientific research agencies, medical and pharmaceutical accreditation agencies, federal funding for scientific research, and legislative committees and subcommittees that attempt to alter the laws that govern epistemological functions. All of these agencies are in the constant process of creating, analyzing, and regulating knowledge. Those processes are, at the most general level, epistemological functions: they act upon and define what knowledge is. At the same time, however, there are no high-level strategic epistemological directives or frameworks that define those functions. The only time in US history when a proxy state-level epistemological strategy existed was between 1961 and 1969, when the Kennedy Administration committed the United States to the Apollo program. While that program had a singular technical objective as its outcome, that objective was so technologically advanced for its day and so complex that it required a massive redirection of state-level epistemological functions; in essence, a broad and diverse set of state-level agencies suddenly found themselves working together towards a common epistemological goal. This paper does not call for a repeat of the Apollo program. Rather, its purpose is to investigate the minimum structural requirements for a national state-level epistemological strategy in the United States. In addition, this paper seeks to analyze how the epistemological work of the multitude of national agencies within the United States would be affected by such a high-level framework. This paper is an exploratory study of this type of framework. The primary hypothesis of the author is that such a function is possible but would require extensive re-framing and reclassification of traditional epistemological functions at the respective agency level. In much the same way that, for example, the DHS (Department of Homeland Security) evolved to respond to a new type of security threat to the United States, it is theorized that a lack of coordination and alignment in epistemological functions will equally result in a strategic threat to the United States.

Keywords: strategic security, epistemological functions, epistemological agencies, Apollo program

Procedia PDF Downloads 53
2627 Estimation of Hydrogen Production from PWR Spent Fuel Due to Alpha Radiolysis

Authors: Sivakumar Kottapalli, Abdesselam Abdelouas, Christoph Hartnack

Abstract:

Spent nuclear fuel generates a mixed field of ionizing radiation in the surrounding water. This radiation field is generally dominated by gamma rays and a limited flux of fast neutrons, while the fuel cladding effectively attenuates beta and alpha particle radiation. A small fraction of spent nuclear fuel exhibits some degree of cladding penetration due to pitting corrosion or mechanical failure; breaches in the cladding expose small volumes of water in the cask to alpha and beta ionizing radiation. The safety of the transport of radioactive material is assured by the package complying with the IAEA Requirements for the Safe Transport of Radioactive Material SSR-6. It is of high interest to avoid the generation of hydrogen inside the cavity, which may lead to an explosive mixture, so the risk of hydrogen production, along with other radiolysis gases, should be analyzed for a typical spent fuel for safety reasons. This work aims to perform a realistic study of the production of hydrogen by radiolysis assuming the most penalizing initial conditions. It consists of the calculation of the radionuclide inventory of a pellet, taking into account the burnup and decays. Westinghouse 17x17 PWR fuel has been chosen, and data have been analyzed for different sets of enrichment, burnup, cycles of irradiation, and storage conditions. The inventory is calculated as the entry point for the simulation studies of hydrogen production with radiolysis kinetic models in MAKSIMA-CHEMIST. Dose rates decrease strongly within ~45 μm from the fuel surface towards the solution (water) in the case of alpha radiation, while the decrease is slower for beta radiation and even slower for gamma radiation. Calculations are carried out to obtain spectra as a function of time, and radiation dose rate profiles are taken as the input data for the iterative calculations. The hydrogen yield has been found to be around 0.02 mol/L. Calculations have also been performed for a realistic scenario considering a capsule containing the spent fuel rod, and the resulting hydrogen yield is discussed. Experiments are in progress to validate the hydrogen production rate using a cyclotron at > 5 MeV (at ARRONAX, Nantes).
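Order-of-magnitude hydrogen yields can be related to dose rate through a radiolytic G-value. In the sketch below, the G(H2) value, dose rate, and exposure time are assumed for illustration, and back reactions are ignored, so the result should not be read as reproducing the study's 0.02 mol/L figure:

```python
# Back-of-the-envelope radiolytic H2 yield from a dose rate and a G-value.
EV_PER_JOULE = 6.241509e18
AVOGADRO = 6.02214076e23

def h2_molar_per_litre(g_value, dose_rate_gy_s, seconds, density_kg_l=1.0):
    """Moles of H2 per litre of water after `seconds` of irradiation.
    g_value: molecules per 100 eV absorbed; dose rate in Gy/s (J/kg/s)."""
    ev_per_l_s = dose_rate_gy_s * density_kg_l * EV_PER_JOULE
    molecules_per_l_s = g_value / 100.0 * ev_per_l_s
    return molecules_per_l_s * seconds / AVOGADRO

# Assumed: G(H2) ~ 1.3 molecules/100 eV for alpha radiolysis of water,
# a constant 1 mGy/s dose rate, and one year of exposure.
c = h2_molar_per_litre(1.3, 1e-3, 3.156e7)
print(f"{c:.4f} mol/L")  # a few millimol per litre under these assumptions
```

A kinetic code such as MAKSIMA-CHEMIST refines this estimate by tracking the full set of radiolysis species and their recombination reactions.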

Keywords: radiolysis, spent fuel, hydrogen, cyclotron

Procedia PDF Downloads 492