Search results for: finished volumes
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 599

389 Age-Dependent Anatomical Abnormalities of the Amygdala in Autism Spectrum Disorder and their Implications for Altered Socio-Emotional Development

Authors: Gabriele Barrocas, Habon Issa

Abstract:

The amygdala is one of several brain regions that tend to be pathological in individuals with autism spectrum disorder (ASD). ASD is a prevalent and heterogeneous developmental disorder that affects all ethnic and socioeconomic groups and encompasses a broad range of severity, etiology, and behavioral symptoms. Common features of ASD include, but are not limited to, repetitive behaviors, obsessive interests, and anxiety. Neuroscientists view the amygdala as the core of the neural system that regulates behavioral responses to anxiogenic and threatening stimuli. Despite this consensus, many previous studies and literature reviews of the amygdala's alterations in individuals with ASD have reported inconsistent findings. In this review, we will address these conflicts by highlighting recent studies which reveal that the anatomical and related socio-emotional differences detected between individuals with and without ASD are highly age-dependent. We will specifically discuss studies using functional magnetic resonance imaging (fMRI), structural MRI, and diffusion tensor imaging (DTI) to provide insights into the neuroanatomical substrates of ASD across development, with a focus on amygdala volumes, cell densities, and connectivity.

Keywords: autism, amygdala, development, abnormalities

Procedia PDF Downloads 125
388 Agile Supply Chains and Their Dependency on Air Transport Mode: A Case Study in Amazon

Authors: Fabiana Lucena Oliveira, Aristides da Rocha Oliveira Junior

Abstract:

This article discusses the dependence of agile supply chains on the air transport mode. Agile supply chains result from the analysis of the supply chain uncertainty model, which classifies supply chains according to their respective products. Understanding the uncertainty model and the life cycles of products considered standard and innovative is therefore critical, since the innovative character of a supply chain, derived from the uncertainty model, determines its most appropriate transport mode; availability, security, and freight cost are considered the determining variables in this choice. The research problem is therefore: how do agile supply chains maintain logistics competitiveness when they depend on the air transport mode? A case study was conducted in the Manaus Industrial Pole (MIP), an agglomeration model comprising six hundred industries of different backgrounds and revenues, located in the Brazilian Amazon. The sampled companies are those whose products are classified in agile supply chains as innovative and that therefore live with uncertainty in the demand for inputs or the supply of finished products. The results confirm the hypothesis that the level of dependency on the air transport mode is greater than fifty percent. It follows that (1) maintaining an agile supply chain far from its supplier base is expensive, (2) the continuity analysis needs to be redone every twenty-four months, (3) additional freight, handling, and storage must be counted among the logistics costs, and (4) comparisons with emerging agile supply chains elsewhere in the world need to consider the location effect.

Keywords: uncertainty model, air transport mode, competitiveness, logistics

Procedia PDF Downloads 511
387 A Model of Foam Density Prediction for Expanded Perlite Composites

Authors: M. Arifuzzaman, H. S. Kim

Abstract:

Multiple sets of variables associated with expanded perlite particle consolidation in foam manufacturing were analyzed to develop a model for predicting perlite foam density. The consolidation of perlite particles based on the flotation method and compaction involves numerous variables leading to the final perlite foam density. The variables include binder content, compaction ratio, perlite particle size, various perlite particle densities and porosities, and various volumes of perlite at different stages of the process. The developed model was found to be useful not only for the prediction of foam density but also for optimization between compaction ratio and binder content to achieve a desired density. Experimental verification was conducted using a range of foam densities (0.15–0.5 g/cm3), produced with a range of compaction ratios (1.5–3.5), a range of sodium silicate contents (0.05–0.35 g/ml) in dilution, a range of expanded perlite particle sizes (1–4 mm), and various perlite densities (such as skeletal, material, bulk, and envelope densities). A close agreement between predictions and experimental results was found.

Keywords: expanded perlite, flotation method, foam density, model, prediction, sodium silicate

Procedia PDF Downloads 408
386 Characteristics of Different Volumes of Waste Cellular Concrete Powder-Cement Paste for Sustainable Construction

Authors: Mohammed Abed, Rita Nemes

Abstract:

Cellular concrete powder (CCP) is not yet widely used as a supplementary cementitious material, but the literature shows its efficiency when it is used as a partial replacement of cement in concrete mixtures. In this study, different amounts of raw CCP (CCP as a waste material, without any industrial modification) will be used to investigate the characteristics of cement pastes and the effects of CCP on their properties. It is an attempt to produce a green binder paste useful for sustainable construction applications. The fresh and hardened properties of a number of CCP-blended cement pastes will be tested at different ages, and the optimal CCP content will be reported, with more extensive investigations of durability properties. Different mass percentages of the cement, from low to high, will be replaced (0%, 10%, 15%, 20%, 30%, 40%, 50%, 60%, 70%, 80%, and 90%). Consistency, flexural strength, and compressive strength will be the base indicators for the further investigations of properties. Replacements of up to 50% have been tested up to 7 days, and the initial results show a linear relationship between strength and the replacement percentage, which is an encouraging indicator for further replacement percentages of waste CCP.

Keywords: cellular concrete powder, supplementary cementitious material, sustainable construction, green concrete

Procedia PDF Downloads 326
385 Association between Polygenic Risk of Alzheimer's Dementia, Brain MRI and Cognition in UK Biobank

Authors: Rachana Tank, Donald M. Lyall, Kristin Flegal, Joey Ward, Jonathan Cavanagh

Abstract:

Alzheimer's Research UK estimates that by 2050, 2 million individuals will be living with late-onset Alzheimer's disease (LOAD). However, individuals experience considerable cognitive deficits and brain pathology over decades before reaching clinically diagnosable LOAD, and studies have used approaches such as genome-wide association studies (GWAS) and polygenic risk (PGR) scores to identify high-risk individuals and potential pathways. This investigation aims to determine whether high genetic risk of LOAD is associated with worse brain MRI and cognitive performance in healthy older adults within the UK Biobank cohort. Previous studies investigating associations of PGR for LOAD with measures of MRI or cognitive functioning have focused on specific aspects of hippocampal structure, in relatively small sample sizes and with poor controlling for confounders such as smoking. Both the sample size of this study and the discovery GWAS sample are, to our knowledge, larger than in previous studies. Genetic interactions between the loci showing the largest effects in GWAS have not been extensively studied, and it is known that APOE e4 poses the largest genetic risk of LOAD, with potential gene-gene and gene-environment interactions of e4; for this reason, we also analyse genetic interactions of PGR with the APOE e4 genotype. We hypothesise that high genetic loading, based on a polygenic risk score of 21 SNPs for LOAD, is associated with worse brain MRI and cognitive outcomes in healthy individuals within the UK Biobank cohort. Summary statistics from the Kunkle et al. GWAS meta-analyses (cases: n=30,344; controls: n=52,427) will be used to create polygenic risk scores based on 21 SNPs, and analyses will be carried out in N=37,000 participants in the UK Biobank. This will be the largest study to date investigating PGR of LOAD in relation to MRI. MRI outcome measures include white matter (WM) tracts and structural volumes. Cognitive function measures include reaction time, pairs matching, trail making, digit symbol substitution, and prospective memory. The interaction of the APOE e4 alleles and PGR will be analysed by including APOE status as an interaction term coded as 0, 1, or 2 e4 alleles. Models will be partially adjusted for age, BMI, sex, genotyping chip, smoking, depression, and social deprivation. Preliminary results suggest the PGR score for LOAD is associated with decreased hippocampal volumes, including the hippocampal body (standardised beta = -0.04, P = 0.022) and tail (standardised beta = -0.037, P = 0.030), but not with the hippocampal head. There were also associations of genetic risk with decreased cognitive performance, including fluid intelligence (standardised beta = -0.08, P < 0.01) and reaction time (standardised beta = 2.04, P < 0.01). No genetic interactions were found between APOE e4 dose and PGR score for MRI or cognitive measures. The generalisability of these results is limited by selection bias within the UK Biobank, as participants are less likely to be obese, to smoke, or to be socioeconomically deprived, and have fewer self-reported health conditions than the general population. The lack of a unified approach or standardised method for calculating genetic risk scores may also be a limitation of these analyses. Further discussion and results are pending.
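The polygenic risk score mentioned above is, at its core, a weighted sum of risk-allele counts. The sketch below illustrates that computation in Python; the SNP identifiers and effect sizes are invented placeholders, not values from the Kunkle et al. meta-analysis.

```python
# Hypothetical sketch of a polygenic risk (PGR) score: a weighted sum of
# risk-allele dosages (0, 1, or 2) with GWAS effect sizes as weights.
# SNP IDs and effect sizes below are illustrative only.
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}

def polygenic_risk_score(dosages, betas):
    """Sum of beta_i * dosage_i over the SNPs included in the score."""
    return sum(betas[snp] * dosages.get(snp, 0) for snp in betas)

participant = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = polygenic_risk_score(participant, effect_sizes)
print(round(score, 3))  # 2*0.12 + 1*(-0.05) + 0*0.30 = 0.19
```

In practice such scores are standardised across the cohort before being entered into regression models alongside the covariates listed above.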

Keywords: Alzheimer's dementia, cognition, polygenic risk, MRI

Procedia PDF Downloads 114
384 Mechanical Characterization and Impact Study on the Environment of Raw Sediments and Sediments Dehydrated by Addition of Polymer

Authors: A. Kasmi, N. E. Abriak, M. Benzerzour, I. Shahrour

Abstract:

Large volumes of river sediments are dredged each year in Europe in order to maintain harbour activities and prevent floods. The management of these sediments has become increasingly complex. Several European projects were implemented to find environmentally sound solutions for these materials. The main objective of this study is to show the ability of river sediment to be used in road construction. Since the sediments contain a high amount of water, a dehydrating treatment by addition of a flocculation aid was used. Firstly, a set of physical characteristics was measured and discussed for a better identification of the raw sediment and of the sediment dehydrated by addition of the flocculation aid. The identified parameters are, for example, the initial water content, the density, the organic matter content, the grain size distribution, the liquid and plastic limits, and geotechnical parameters. The environmental impacts of the material were also evaluated. The results show that there is only a slight change in the physico-chemical and geotechnical characteristics of the sediment after dehydration by the addition of polymer. However, these sediments cannot be used in road construction.

Keywords: river sediment, dehydration, flocculation aid or polymer, characteristics, treatments, valorisation, road construction

Procedia PDF Downloads 380
383 Comparative Assessment of a Distributed Model and a Lumped Model for Estimating Sediment Yield in Small Urban Areas

Authors: J. Zambrano Nájera, M. Gómez Valentín

Abstract:

Increases in urbanization during the twentieth century have brought increased sediment production as a major problem. Hydraulic erosion is one of the major causes of the increase of sediments in small urban catchments. Such increments in sediment yield in headwater urban catchments can cause obstruction of drainage systems, making it impossible to capture urban runoff, increasing runoff volumes, and thus exacerbating urban flooding. For these reasons, it is increasingly important to study sediment production in urban watersheds in order to properly analyze and solve sediment-related problems. The study of sediment production has improved with the use of mathematical modeling. For that reason, a new physically based model is proposed, applicable to small headwater urban watersheds, that retains the advantages of distributed physically based models but with more realistic data requirements. Additionally, in this paper the proposed model is compared with a lumped model, reviewing the results and the advantages and disadvantages of each.

Keywords: erosion, hydrologic modeling, urban runoff, sediment modeling, sediment yielding, urban planning

Procedia PDF Downloads 348
382 Development of a Thermodynamic Model for Ladle Metallurgy Steel Making Processes Using Factsage and Its Macro Facility

Authors: Prasenjit Singha, Ajay Kumar Shukla

Abstract:

To produce high-quality steel in larger volumes, dynamic control of composition and temperature throughout the process is essential. In this paper, we developed a mass transfer model based on thermodynamics to simulate the ladle metallurgy steel-making process using FactSage and its macro facility. The overall heat and mass transfer processes consist of one equilibrium chamber, two non-equilibrium chambers, and one adiabatic reactor. The flow of material, as well as heat transfer, occurs across four interconnected unit chambers and a reactor. We used the macro programming facility of FactSage™ software to understand the thermochemical model of the secondary steel making process. In our model, we varied the oxygen content during the process and studied their effect on the composition of the final hot metal and slag. The model has been validated with respect to the plant data for the steel composition, which is similar to the ladle metallurgy steel-making process in the industry. The resulting composition profile serves as a guiding tool to optimize the process of ladle metallurgy in steel-making industries.

Keywords: desulphurization, degassing, factsage, reactor

Procedia PDF Downloads 218
381 Predictive Analysis for Big Data: Extension of Classification and Regression Trees Algorithm

Authors: Ameur Abdelkader, Abed Bouarfa Hafida

Abstract:

Since its inception, predictive analysis has revolutionized the IT industry through its robustness and decision-making facilities. It involves the application of a set of data processing techniques and algorithms in order to create predictive models. Its principle is based on finding relationships between explanatory variables and the predicted variables: past occurrences are exploited to predict and derive the unknown outcome. With the advent of big data, many studies have suggested the use of predictive analytics in order to process and analyze big data. Nevertheless, they have been curbed by the limits of classical methods of predictive analysis in the case of large amounts of data. In fact, because of its volume, its nature (semi-structured or unstructured), and its variety, it is impossible to analyze big data efficiently via classical methods of predictive analysis. The authors attribute this weakness to the fact that predictive analysis algorithms do not allow the parallelization and distribution of computation. In this paper, we propose to extend the predictive analysis algorithm Classification And Regression Trees (CART) in order to adapt it for big data analysis. The major changes to this algorithm are presented, and a version of the extended algorithm is then defined in order to make it applicable to huge quantities of data.
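For readers unfamiliar with CART, the split criterion the algorithm repeatedly evaluates is the Gini impurity of candidate partitions. A minimal sketch of that criterion follows (the data is illustrative, and the paper's parallel/distributed extension is not shown):

```python
# Gini-impurity split criterion at the heart of CART.
def gini(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def split_gini(values, labels, threshold):
    """Weighted Gini impurity after splitting on value <= threshold."""
    left = [y for x, y in zip(values, labels) if x <= threshold]
    right = [y for x, y in zip(values, labels) if x > threshold]
    n = len(labels)
    return len(left) / n * gini(left) + len(right) / n * gini(right)

x = [1, 2, 3, 4, 5, 6]
y = [0, 0, 0, 1, 1, 1]
print(split_gini(x, y, 3))  # perfect split -> 0.0
```

CART grows the tree by choosing, at each node, the feature and threshold minimising this weighted impurity; it is this per-node search that the paper seeks to parallelize.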

Keywords: predictive analysis, big data, predictive analysis algorithms, CART algorithm

Procedia PDF Downloads 142
380 Understanding the Role of Gas Hydrate Morphology on the Producibility of a Hydrate-Bearing Reservoir

Authors: David Lall, Vikram Vishal, P. G. Ranjith

Abstract:

Numerical modeling of gas production from hydrate-bearing reservoirs requires the solution of various thermal, hydrological, chemical, and mechanical phenomena in a coupled manner. Among the various reservoir properties that influence gas production estimates, the distribution of permeability across the domain is one of the most crucial parameters since it determines both heat transfer and mass transfer. The aspect of permeability in hydrate-bearing reservoirs is particularly complex compared to conventional reservoirs since it depends on the saturation of gas hydrates and hence, is dynamic during production. The dependence of permeability on hydrate saturation is mathematically represented using permeability-reduction models, which are specific to the expected morphology of hydrate accumulations (such as grain-coating or pore-filling hydrates). In this study, we demonstrate the impact of various permeability-reduction models, and consequently, different morphologies of hydrate deposits on the estimates of gas production using depressurization at the reservoir scale. We observe significant differences in produced water volumes and cumulative mass of produced gas between the models, thereby highlighting the uncertainty in production behavior arising from the ambiguity in the prevalent gas hydrate morphology.
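One widely cited pore-filling permeability-reduction relation is the Masuda-type power law, k = k0 (1 - S_h)^N, with grain-coating morphologies typically represented by a much stronger reduction. The sketch below is a minimal illustration with assumed exponents; it is not the specific set of models compared in the study.

```python
# Illustrative permeability-reduction forms for hydrate-bearing sediment.
# Exponents (n) and the grain-coating form here are assumptions for contrast.
def k_pore_filling(k0, s_h, n=3.0):
    """Pore-filling morphology: k = k0 * (1 - S_h)^n (Masuda-type)."""
    return k0 * (1.0 - s_h) ** n

def k_grain_coating(k0, s_h, n=10.0):
    """Grain-coating morphology: same power law, much stronger exponent."""
    return k0 * (1.0 - s_h) ** n

k0 = 1.0  # normalized intrinsic permeability
for s_h in (0.0, 0.3, 0.6):
    print(s_h, round(k_pore_filling(k0, s_h), 4),
          round(k_grain_coating(k0, s_h), 4))
```

Because S_h evolves during depressurization, these curves feed back into the coupled flow solution, which is why the choice of reduction model so strongly affects predicted gas and water production.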

Keywords: gas hydrate morphology, multi-scale modeling, THMC, fluid flow in porous media

Procedia PDF Downloads 221
379 Evaluation of Random Forest and Support Vector Machine Classification Performance for the Prediction of Early Multiple Sclerosis from Resting State FMRI Connectivity Data

Authors: V. Saccà, A. Sarica, F. Novellino, S. Barone, T. Tallarico, E. Filippelli, A. Granata, P. Valentino, A. Quattrone

Abstract:

The aim of this work was to evaluate how well Random Forest (RF) and Support Vector Machine (SVM) algorithms can support the early diagnosis of Multiple Sclerosis (MS) from resting-state functional connectivity data. In particular, we wanted to explore their ability to distinguish between controls and patients using the mean signals extracted from ICA components corresponding to 15 well-known networks. Eighteen patients with early MS (mean age 37.42±8.11, 9 females) were recruited according to the McDonald and Polman criteria and matched for demographic variables with 19 healthy controls (mean age 37.55±14.76, 10 females). MRI was acquired on a 3T scanner with an 8-channel head coil: (a) whole-brain T1-weighted; (b) conventional T2-weighted; (c) resting-state functional MRI (rsFMRI), 200 volumes. The estimated total lesion load (ml) and the number of lesions were calculated using the LST toolbox from the corrected T1 and FLAIR images. All rsFMRI data were pre-processed using tools from the FMRIB Software Library as follows: (1) discarding of the first 5 volumes to remove T1 equilibrium effects, (2) skull-stripping of images, (3) motion and slice-time correction, (4) denoising with a high-pass temporal filter (128 s), (5) spatial smoothing with a Gaussian kernel of FWHM 8 mm. No statistically significant differences (t-test, p < 0.05) were found between the two groups in the mean Euclidean distance or the mean Euler angle. WM and CSF signals, together with 6 motion parameters, were regressed out of the time series. We applied an independent component analysis (ICA) with the GIFT toolbox, using the Infomax approach with the number of components set to 21. Fifteen mean components were visually identified by two experts. The resulting z-score maps were thresholded and binarized to extract the mean signal of the 15 networks for each subject. Statistical and machine learning analyses were then conducted on this dataset, composed of 37 rows (subjects) and 15 features (mean signal in each network), with the R language. The dataset was randomly split into training (75%) and test sets, and two different classifiers were trained: RF and RBF-SVM. We used the intrinsic feature selection of RF, based on the Gini index, and recursive feature elimination (rfe) for the SVM, to obtain a ranking of the most predictive variables. Thus, we built two new classifiers on only the most important features, and we evaluated the accuracies (with and without feature selection) on the test set. The classifiers trained on all the features showed very poor accuracies on the training (RF: 58.62%, SVM: 65.52%) and test sets (RF: 62.5%, SVM: 50%). Interestingly, when feature selection by RF and rfe-SVM was performed, the most important variable was the sensori-motor network I in both cases. Indeed, with only this network, the RF and SVM classifiers reached an accuracy of 87.5% on the test set. More interestingly, the only misclassified patient turned out to have the lowest lesion volume. We showed that, with two different classification algorithms and feature selection approaches, the best discriminant network between controls and early MS was the sensori-motor I. Similar importance values were obtained for the sensori-motor II, cerebellum, and working memory networks. These findings, in accordance with the early manifestation of motor/sensory deficits in MS, could represent an encouraging step toward translation to clinical diagnosis and prognosis.
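The classification workflow can be sketched as follows. This is a Python/scikit-learn analogue of the authors' R pipeline, run on synthetic data of the same shape (37 subjects, 15 features); the data and parameters are assumptions, and the numbers it produces do not reproduce the reported accuracies.

```python
# Sketch: 75/25 split, RF and RBF-SVM classifiers, and RF's intrinsic
# (Gini-based) feature importances used to rank predictors.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for the 37-subject x 15-network-feature dataset.
X, y = make_classification(n_samples=37, n_features=15, n_informative=3,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
svm = SVC(kernel="rbf").fit(X_tr, y_tr)
print("RF test accuracy: ", rf.score(X_te, y_te))
print("SVM test accuracy:", svm.score(X_te, y_te))

# Rank features by RF's intrinsic (Gini) importance, as in the paper;
# the top-ranked feature would then be used to retrain reduced classifiers.
ranking = np.argsort(rf.feature_importances_)[::-1]
print("Most predictive feature index:", ranking[0])
```

The paper's rfe-SVM step is not shown here, since recursive feature elimination needs a linear importance measure rather than an RBF kernel in this framework.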

Keywords: feature selection, machine learning, multiple sclerosis, random forest, support vector machine

Procedia PDF Downloads 241
378 Graph-Based Traffic Analysis and Delay Prediction Using a Custom-Built Dataset

Authors: Gabriele Borg, Alexei Debono, Charlie Abela

Abstract:

There is a constant rise in the availability of high volumes of data gathered from multiple sources, resulting in an abundance of unprocessed information that can be used to monitor patterns and trends in user behaviour. Similarly, year after year, Malta is experiencing ongoing population growth and an increase in mobility demand. This research takes advantage of data which is continuously being sourced and converts it into useful information about the traffic problem on the Maltese roads. The scope of this paper is to provide a methodology for creating a custom dataset (MalTra - Malta Traffic), compiled from multiple participants at various locations across the island, to identify the most commonly taken routes and expose the main areas of activity. Such use of big data underpins what are referred to as Intelligent Transportation Systems (ITSs), and it has been concluded that there is significant potential in utilising such data sources on a nationwide scale. Furthermore, a series of traffic-prediction graph neural network models is evaluated to compare MalTra with large-scale traffic datasets.
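Identifying the most common routes from participant traces reduces, in the simplest case, to frequency counting over ordered sequences of road segments. A toy sketch follows; the place names and traces are invented, not drawn from MalTra.

```python
from collections import Counter

# Each trace is an ordered tuple of road-segment identifiers (invented here).
traces = [
    ("Msida", "Birkirkara", "Mosta"),
    ("Msida", "Birkirkara", "Mosta"),
    ("Valletta", "Floriana", "Msida"),
]

# Count identical routes and report the most frequently taken one.
route_counts = Counter(traces)
most_common_route, count = route_counts.most_common(1)[0]
print(most_common_route, count)
```

In a graph-based formulation, each segment becomes a node and these counts become edge or path weights that a traffic-prediction model can consume.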

Keywords: graph neural networks, traffic management, big data, mobile data patterns

Procedia PDF Downloads 133
377 Prevalence of Anxiety and Depression: A Descriptive Cross-Sectional Study among Individuals with Substance-Related Disorders in Argentina

Authors: Badino Manuel, Farias María Alejandra

Abstract:

Anxiety and depression are considered the main mental health issues found in people with substance-related disorders. Furthermore, substance-related disorders, anxiety-related disorders, and depressive disorders are among the leading causes of disability and are associated with increased mortality. The co-occurrence of substance-related disorders and these mental health conditions affects the accuracy of diagnosis, the treatment plan, and the recovery process. The aim is to describe the prevalence of anxiety and depression in patients with substance-related disorders in a mental health service in Córdoba, Argentina. A descriptive cross-sectional study was conducted among patients with substance-related disorders (N=305). Anxiety and depression were assessed using the Patient Health Questionnaire-4 (PHQ-4) during the period from December 2021 to March 2022. Of the 305 participants, 71.8% were male, 25.6% female, and 2.6% non-binary. As regards marital status, 51.5% were single, 21.6% in a couple, 5.9% married, 15.4% separated, and 5.6% divorced. In relation to educational status, 26.2% had finished university, 56.1% high school, 16.4% only primary school, and 1.3% had no formal schooling. Regarding age, 10.8% were young, 84.3% were adults, and 4.9% were elderly. In-person treatment accounted for 64.6% of service users, and 35.4% were seen through teleconsultation. On the PHQ-4, 15.7% of service users scored 3 or higher for anxiety, 32.1% scored 3 or higher for depression, and 13.1% scored 3 or higher for both. It is recommended to identify anxiety and depression among patients with substance-related disorders to improve the quality of diagnosis, treatment, and recovery. It is suggested to apply the PHQ-4 and PHQ-9 within the protocol of care for these patients.
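The PHQ-4 cut-off logic used in the study can be made concrete: the instrument has four items scored 0-3, the first two forming the anxiety subscale (GAD-2) and the last two the depression subscale (PHQ-2), with a subscale score of 3 or more counted as a positive screen. A minimal sketch:

```python
# Sketch of PHQ-4 scoring: two items per subscale, each rated 0-3,
# with a subscale cut-off of >= 3 for a positive screen.
def score_phq4(items):
    """items: list of four responses, each 0-3, in questionnaire order."""
    assert len(items) == 4 and all(0 <= i <= 3 for i in items)
    anxiety = items[0] + items[1]        # GAD-2 subscale
    depression = items[2] + items[3]     # PHQ-2 subscale
    return {"anxiety": anxiety,
            "depression": depression,
            "anxiety_positive": anxiety >= 3,
            "depression_positive": depression >= 3}

print(score_phq4([2, 1, 1, 0]))  # anxiety screen positive, depression negative
```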

Keywords: addiction, anxiety, depression, mental health

Procedia PDF Downloads 102
376 Photocatalytic Self-Cleaning Concrete Production Using Nano-Size Titanium Dioxide

Authors: Amin Akhnoukh, Halla Elea, Lawrence Benzmiller

Abstract:

The objective of this research is to evaluate the possibility of using nano-sized materials, mainly titanium dioxide (TiO2), in producing economical self-cleaning concrete through the photocatalysis process. In photocatalysis, the nano-particles react with and dissolve smog, dust, and dirt particles in the presence of sunlight, resulting in a cleaned concrete surface. To date, the Italian cement company Italcementi produces a proprietary self-cleaning cementitious material that is currently used in government buildings and major highways in Europe. The high initial cost of the proprietary product represents a major obstacle to the widespread use of self-cleaning concrete in industrial and commercial projects. In this research project, titanium dioxide nano-sized particles are infused into the top layer of a concrete pour before the concrete surface is finished. Once hardened, a blue dye is applied to the concrete surface to simulate the effect of smog and dirt. The concrete surface is subjected to direct light to investigate the effectiveness of the nano-sized titanium dioxide in cleaning the concrete surface. The outcome of this research project proved that titanium dioxide, when infused into the surface concrete layer, can successfully reduce the smog and dirt particles attached to the concrete. The majority of the cleansing effect occurs within 24 hours of the start of the photocatalysis process. The non-proprietary mix can be used in highway, industrial, and commercial projects due to its economy and ease of production.

Keywords: self-cleaning concrete, photocatalysis, smog-eating concrete, titanium dioxide

Procedia PDF Downloads 355
375 “Octopub”: Geographical Sentiment Analysis Using Named Entity Recognition from Social Networks for Geo-Targeted Billboard Advertising

Authors: Oussama Hafferssas, Hiba Benyahia, Amina Madani, Nassima Zeriri

Abstract:

Although data nowadays has multiple forms, from text to images and from audio to video, text is still the most widely used form at the public level. At the academic and research level, and unlike the other forms, text can be considered the easiest form to process. Therefore, a branch of data mining research, called 'Text Mining', has always operated in its shadow. Its concept is just like data mining's: finding valuable patterns in large collections and tremendous volumes of data, in this case text. Named entity recognition (NER) is one of Text Mining's disciplines; it aims to extract and classify references such as proper names, locations, expressions of time and dates, organizations, and more in a given text. Our approach, 'Octopub', does not aim to find new ways to improve the named entity recognition process itself; rather, it is about finding a new, and yet smart, way to use NER so that we can extract the sentiments of millions of people, using social networks as a limitless information source, with marketing for product promotion as the main application domain.
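As a toy illustration of the kind of extraction NER performs on social-media text (real NER systems use trained statistical models; the patterns and the example post below are naive assumptions, not Octopub's method):

```python
import re

# Toy extraction of date-like expressions and capitalized multi-word spans
# from a social-media post (the post text is invented).
text = "Visited the Blue Lagoon Pub in Algiers on 12/03/2023, amazing night!"

dates = re.findall(r"\b\d{1,2}/\d{1,2}/\d{4}\b", text)
names = re.findall(r"\b(?:[A-Z][a-z]+ )+[A-Z][a-z]+\b", text)

print(dates)  # ['12/03/2023']
print(names)  # ['Blue Lagoon Pub']
```

A real pipeline would then attach a sentiment score to each extracted entity and aggregate the scores by location for geo-targeted advertising.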

Keywords: text mining, named entity recognition (NER), sentiment analysis, social media networks (SN, SMN), business intelligence (BI), marketing

Procedia PDF Downloads 590
374 Quantitative Analysis of Presence, Consciousness, Subconsciousness, and Unconsciousness

Authors: Hooshmand Kalayeh

Abstract:

The human brain consists of the reptilian, mammalian, and thinking brains, and the mind consists of conscious, subconscious, and unconscious parallel neural-net programs. The primary objective of this paper is to propose a methodology for the quantitative analysis of the neural nets associated with these mental activities in the neocortex. The secondary objective is to suggest a methodology for the quantitative analysis of presence; the proposed methodologies can be used as a first step to measure, monitor, and understand consciousness and presence. The methodology is based on neural networks (NN), the number of neurons in each NN associated with consciousness, subconsciousness, and unconsciousness, and the number of neurons in the neocortex. It is assumed that the number of neurons in each NN is correlated with the associated area and volume. Therefore, online and offline visualization techniques can be used to identify these neural networks, and online and offline measurement methods can be used to measure the areas and volumes associated with them. So, instead of the number of neurons in each NN, the associated area or volume can also be used in the proposed methodology. This quantitative analysis, with its associated online and offline measurements and visualizations of different neural networks, enables us to rewire the connections in our brain for a more balanced living.

Keywords: brain, mind, consciousness, presence, sub-consciousness, unconsciousness, skills, concentrations, attention

Procedia PDF Downloads 315
373 Agile Implementation of 'PULL' Principles in a Manufacturing Process Chain for Aerospace Composite Parts

Authors: Torsten Mielitz, Dietmar Schulz, York C. Roth

Abstract:

Market forecasts show a significant increase in the demand for aircraft within the next two decades, and production rates will be adapted accordingly. Improvements and optimizations in the industrial system are becoming more important to cope with future challenges in manufacturing and assembly. The highest quality standards have to be met for aerospace parts, while cost-effective production in industrial systems and methodologies is also a key driver. A look at other industries, e.g., automotive, shows well-established processes for streamlining existing manufacturing systems. In this paper, the implementation of 'PULL' principles in an existing manufacturing process chain for a large-scale composite part is presented. A nonlinear extrapolation based on Little's Law showed a risk of a significant increase in the number of parts needed in the process chain to meet future demand. A project was set up to mitigate this risk, and the methodology was changed from a traditional milestone approach at the beginning towards an agile way of working at the end, in order to facilitate immediate benefits on the shop floor. Finally, delivery rates could be increased while avoiding more semi-finished parts in the process chain (work in progress and inventory) through the successful implementation of the 'PULL' philosophy on the shop floor between the work stations. Lessons learned during the running project, as well as during the implementation and operations phases, are discussed in order to share best practices.
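Little's Law, on which the paper's extrapolation is based, states that average work-in-progress L equals throughput λ times average lead time W. A minimal numeric sketch (the rates and lead times below are illustrative, not the program's actual figures):

```python
# Little's Law, L = lambda * W: average work-in-progress equals throughput
# times average lead time. All numbers are illustrative.
def wip(throughput_per_week, lead_time_weeks):
    return throughput_per_week * lead_time_weeks

# If demand (throughput) rises while lead time stays flat, WIP grows in
# proportion; 'PULL' aims to cut lead time so WIP does not balloon as
# production rates increase.
print(wip(2.0, 6.0))  # 12 parts in the chain at 2 parts/week, 6-week lead time
print(wip(4.0, 6.0))  # doubling the rate doubles WIP unless lead time drops
```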

Keywords: aerospace composite part manufacturing, PULL principles, shop-floor implementation, lessons learned

Procedia PDF Downloads 174
372 Developing a Virtual Reality System to Assist in Anatomy Teaching and Evaluating the Effectiveness of That System

Authors: Tarek Abdelkader, Suresh Selvaraj, Prasad Iyer, Yong Mun Hin, Hajmath Begum, P. Gopalakrishnakone

Abstract:

Nowadays, more and more educational institutes, as well as students, rely on 3D anatomy programs as an important tool that helps students correlate the actual locations of anatomical structures in three dimensions. Lately, virtual reality (VR) has been gaining favor with younger generations due to its highly interactive mode. As a result, using virtual reality as a gamified learning platform for anatomy became the current goal. We present a model in which a Virtual Human Anatomy Program (VHAP) was developed to assist with the anatomy learning experience of students. The anatomy module has been built mostly from real patient CT scans. Segmentation and surface rendering were used to create the 3D model by segmenting the CT scans for each organ individually and exporting the result as a 3D file. After acquiring the 3D files for all needed organs, the files were introduced into a virtual reality environment as a complete body anatomy model. In this ongoing experiment, students from different Allied Health orientations are testing the VHAP. Specifically, the cardiovascular system has been selected as the focus system of study since all of our students finished learning about it in the first trimester. The initial results suggest that the VHAP system is adding value to the learning process of our students, encouraging them to get more involved and to ask more questions. Comments from participating students show that they are excited about the VHAP system, remarking on its interactivity as well as the ability to use it on their own as a self-learning aid in combination with the lectures. Some students also experienced minor side effects such as dizziness.

Keywords: 3D construction, health sciences, teaching pedagogy, virtual reality

Procedia PDF Downloads 158
371 Performing the Landscape: Temporary and Performative Practices in Landscape Production

Authors: Miguel Costa

Abstract:

Although "time" is an intrinsic element of work with the landscape, execution and completion also often depend on external factors, i.e., the slow bureaucratic procedures required for the implementation of a project. In the urban areas of the city, these conditions are even more present: some landscape projects are articulated with the architectural/urban design, bringing with them long, expensive, and inflexible processes amid the constant transformations of contemporary urban culture, where needs and expectations may change before the project is finished. However, despite the renewed interest and growing concern for issues related to landscapes (particularly since the European Landscape Convention extended its scope and fields of action to all landscapes and not just selected ones), there is still a need for greater inclusion of citizens in landscape protection and construction processes, as well as for greater transparency and clarity about the consequences and results of their active participation. This article aims to reflect on the production processes of urban landscapes, on their completion time, and on their relationship with citizens by introducing temporary projects as a fieldwork methodology, as well as by drawing on the contribution of different professional practices and bodies of knowledge for monitoring, execution, and implementation. These strategies adopt a more interdisciplinary, transdisciplinary, and performative approach, building not only on the ephemeral experience of objects and actions but also on the processes and dynamic events that are organized from these objects and actions over the landscape. The goal is to discuss the results of these approaches along their different dimensions: critical; experimental and strategic; pedagogical; political; and cultural.

Keywords: landscape fieldwork, interdisciplinarity, public inclusion, public participation, temporary projects, transdisciplinarity

Procedia PDF Downloads 324
370 Cerebrovascular Modeling: A Vessel Network Approach for Fluid Distribution

Authors: Karla E. Sanchez-Cazares, Kim H. Parker, Jennifer H. Tweedy

Abstract:

The purpose of this work is to develop a simple compartmental model of cerebral fluid balance including blood and cerebrospinal fluid (CSF). At the first level, the cerebral arteries and veins are modelled as bifurcating trees with constant scaling factors between generations, connected through a homogeneous microcirculation. The arteries and veins are assumed to be non-rigid, and the cross-sectional area, resistance, and mean pressure in each generation are determined as functions of blood volume flow rate. From the mean pressure and further assumptions about the variation of wall permeability, the transmural fluid flux can be calculated. The results suggest the next level of modelling, where the cerebral vasculature is divided into four compartments: the large arteries, the small arteries, the capillaries, and the veins, with effective compliances and permeabilities derived from the detailed vascular model. These vascular compartments are then linked to other compartments describing the different CSF spaces, the cerebral ventricles and the subarachnoid space. This compartmental model is used to calculate the distribution of fluid in the cranium. Known volumes and flows for normal conditions are used to determine reasonable parameters for the model, which can then be used to help understand pathological behaviour and suggest clinical interventions.
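A rough illustration of the first modelling level, assuming Poiseuille flow through a symmetric bifurcating tree with constant per-generation scaling factors; all names and values below are illustrative assumptions, not the authors' parameters:

```python
import math

def generation_resistance(mu, L0, r0, alpha, beta, n):
    # Poiseuille resistance of one vessel in generation n, with vessel
    # length and radius scaled by constant factors alpha and beta per
    # generation: R = 8*mu*L / (pi * r^4).
    L = L0 * alpha ** n
    r = r0 * beta ** n
    return 8 * mu * L / (math.pi * r ** 4)

def tree_resistance(mu, L0, r0, alpha, beta, generations):
    # Symmetric bifurcation: 2**n identical vessels in parallel at
    # generation n, and the generations are in series.
    total = 0.0
    for n in range(generations):
        total += generation_resistance(mu, L0, r0, alpha, beta, n) / 2 ** n
    return total
```

With scaling factors of 1 (no shrinkage), each deeper generation halves its contribution because twice as many vessels run in parallel; realistic radius scaling (beta < 1) shifts most of the resistance to the small vessels, which is the motivation for the separate small-artery compartment.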

Keywords: cerebrovascular, compartmental model, CSF model, vascular network

Procedia PDF Downloads 276
369 Applying Concurrent Development Process for the Web Using Aspect-Oriented Approach

Authors: Hiroaki Fukuda

Abstract:

This paper presents a concurrent development process for modern web applications, called Rich Internet Applications (RIAs), and describes its effect through a non-trivial application development. In recent years, RIA technologies such as Ajax and Flex have become popular, based mainly on high-speed networks. RIAs provide sophisticated interfaces and user experiences; therefore, the development of an RIA requires two kinds of engineers: a developer who implements business logic, and a designer who designs the interface and experience. Although collaborative work is becoming important for the development of RIAs, shared resources such as source code make it difficult. For example, if the design of an interface is modified after developers have finished the business logic implementation, they need to repeat the same implementation, and also the tests that verify the application's behavior. MVC architecture and object-oriented programming (OOP) make it possible to divide an application into modules such as interfaces and logic; however, developers and/or designers have to write pieces of code (e.g., event handlers) that make these modules work together as an application. On the other hand, aspect-oriented programming (AOP) is expected to address the complexity of modern application software development. AOP provides methods to separate crosscutting concerns, i.e., scattered pieces of code, from primary concerns. In this paper, we provide a concurrent development process for RIAs by introducing the AOP concept. This process makes it possible to reduce the resources shared between developers and designers, so they can perform their tasks concurrently. In addition, we describe our experience developing a practical application using the proposed development process to show its viability.
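Although the paper targets RIA platforms, the core AOP idea of weaving crosscutting advice around primary logic can be sketched with Python decorators as an analogy; the class and handler below are purely illustrative, not the authors' implementation:

```python
import functools

log = []  # stand-in for a view the designer owns

def on_change(handler):
    # A minimal "aspect": weave a view-update handler around a model
    # method, keeping the crosscutting UI concern out of the business logic.
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(self, *args, **kwargs):
            result = fn(self, *args, **kwargs)
            handler(self)  # advice runs after the primary concern
            return result
        return wrapper
    return decorator

class Cart:
    def __init__(self):
        self.items = []

    @on_change(lambda self: log.append(len(self.items)))
    def add(self, item):
        self.items.append(item)  # primary concern only
```

The developer owns `Cart.add`; the designer can change what `on_change` does to the view without touching, or re-testing, the business logic, which is the shared-resource reduction the process aims at.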

Keywords: aspect-oriented programming, concurrent, development process, rich internet application

Procedia PDF Downloads 301
368 Multi-Objective Electric Vehicle Charge Coordination for Economic Network Management under Uncertainty

Authors: Ridoy Das, Myriam Neaimeh, Yue Wang, Ghanim Putrus

Abstract:

Electric vehicles are a popular transportation mode renowned for their potential environmental benefits. However, large and uncontrolled charging volumes can negatively impact distribution networks. Smart charging is widely recognized as an efficient solution for achieving both improved renewable energy integration and grid relief. Nevertheless, different decision-makers may pursue diverse and conflicting objectives. In this context, this paper proposes a multi-objective optimization framework to control electric vehicle charging so as to achieve both energy cost reduction and peak shaving. A weighted-sum method is developed due to its intuitiveness and efficiency. Monte Carlo simulations are implemented to investigate the impact of uncertain electric vehicle driving patterns and to provide decision-makers with a robust outcome in terms of prospective cost and network loading. The results demonstrate that there is a conflict between energy cost efficiency and peak shaving, and that the decision-makers need to reach a collaborative decision.
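The weighted-sum method scalarizes the two objectives into a single score; sweeping the weights traces the trade-off the decision-makers must negotiate. A minimal sketch in which the weights, scales, and candidate schedules are hypothetical stand-ins for the paper's optimization model:

```python
def weighted_sum(cost, peak, w_cost, w_peak, cost_scale, peak_scale):
    # Scalarize two objectives (both to be minimized) into one score.
    # Dividing by a scale normalizes objectives with different units.
    return w_cost * cost / cost_scale + w_peak * peak / peak_scale

# Hypothetical charging schedules: (energy cost, peak load).
schedules = {"cheap": (10.0, 8.0), "flat": (14.0, 5.0)}

def pick(w_cost, w_peak):
    return min(schedules, key=lambda s: weighted_sum(
        *schedules[s], w_cost=w_cost, w_peak=w_peak,
        cost_scale=14.0, peak_scale=8.0))

cost_focused = pick(0.8, 0.2)   # favors the low-cost schedule
grid_focused = pick(0.2, 0.8)   # favors the peak-shaving schedule
```

That the chosen schedule flips as the weights move is exactly the cost-versus-peak conflict the results report.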

Keywords: electric vehicles, multi-objective optimization, uncertainty, mixed integer linear programming

Procedia PDF Downloads 180
367 Evaluation of Liquid Fermentation Strategies to Obtain a Biofertilizer Based on Rhizobium sp.

Authors: Andres Diaz Garcia, Ana Maria Ceballos Rojas, Duvan Albeiro Millan Montano

Abstract:

This paper describes the initial technological development stages, in the area of liquid fermentation, required to produce the quantities of biomass of the biofertilizer microorganism Rhizobium sp. strain B02 needed for the application of the downstream unit stages at laboratory scale. In the first stage, the adjustment and standardization of the fermentation process in conventional batch mode were carried out. In the second stage, various fed-batch and continuous fermentation strategies were evaluated in a 10 L bioreactor in order to optimize the yields in concentration (colony-forming units/ml·h) and biomass (g/l·h), so as to make the application of the downstream unit operations feasible. The growth kinetics, the evolution of dissolved oxygen, and the pH profile generated by each of the strategies were monitored and used to make sequential adjustments. Once the fermentation was finished, the final concentration and viability of the obtained biomass were determined, and performance parameters were calculated in order to select the optimal operating conditions that significantly improved the baseline results. Under the adjusted and standardized batch-mode conditions, concentrations of 6.67E9 CFU/ml were reached after 27 hours of fermentation, followed by a noticeable decrease associated with basification of the culture medium. By applying fed-batch and continuous strategies, significant increases in yields were achieved, but with similar concentration levels, which led to the design of several production scenarios based on the availability of equipment usage time and the required batch volume.
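The volumetric productivity yields mentioned (CFU/ml·h and g/l·h) amount to a change in concentration over elapsed time. A trivial sketch, with the inoculum concentration below a hypothetical value (the abstract reports only the final titre and run time):

```python
def volumetric_productivity(final_conc, initial_conc, hours):
    # Yield per unit volume and time, e.g. CFU/(ml*h) or g/(l*h).
    return (final_conc - initial_conc) / hours

# Illustrative use with the reported batch endpoint (6.67E9 CFU/ml at
# 27 h) and an assumed inoculum of 1E7 CFU/ml:
batch_yield = volumetric_productivity(6.67e9, 1e7, 27.0)
```

Comparing this figure across batch, fed-batch, and continuous runs is how the yield improvements the abstract reports would be quantified.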

Keywords: biofertilizer, liquid fermentation, Rhizobium sp., standardization of processes

Procedia PDF Downloads 177
366 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization

Authors: Angad Arora

Abstract:

In production operations/manufacturing, a cell or line is typically a group of similar machines (computer numerical control (CNC) machines, advanced cutting, 3D printing, or special-purpose machines). For qualifying a typical manufacturing line/cell/new process, ideally we need a sample of parts that can be run through the process, after which we make a judgment on the health of the line/cell. However, with huge volumes and mass-production scope, as in the mobile phone industry, for example, the actual cells or lines can number in the thousands, and qualifying each of them with statistical confidence means using samples that are very large, which adds to the product/manufacturing cost, plus huge waste if the parts are not intended to be shipped to customers. To solve this, we propose a two-step statistical approach. We start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we see them in the early samples, then there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this hypothesis and developed a two-step binomial testing approach. Further, we show through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
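A minimal sketch of a two-step binomial screen in the spirit described, using only the standard library; the target defect rate and significance level are illustrative assumptions, not the author's exact procedure:

```python
from math import comb

def binom_tail(k, n, p):
    # P(X >= k) for X ~ Binomial(n, p): how surprising k or more
    # defects would be if the process truly met defect rate p.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def two_step_qualification(defects1, n1, defects2=None, n2=0,
                           p_target=0.05, alpha=0.05):
    # Step 1: if this many defects is already implausible under the
    # target rate, stop early -- no point spending more samples.
    if binom_tail(defects1, n1, p_target) < alpha:
        return "reject early"
    if defects2 is None:
        return "take second sample"
    # Step 2: judge the pooled evidence from both samples.
    if binom_tail(defects1 + defects2, n1 + n2, p_target) < alpha:
        return "reject"
    return "qualify"
```

The sample-size saving comes from the "reject early" branch: clearly bad cells are caught on the small first sample, and only borderline cells consume the second sample.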

Keywords: statistics, data science, manufacturing process qualification, production planning

Procedia PDF Downloads 98
365 Pharmaceutical Scale up for Solid Dosage Forms

Authors: A. Shashank Tiwari, S. P. Mahapatra

Abstract:

Scale-up is defined as the process of increasing batch size. Scale-up of a process can also be viewed as a procedure for applying the same process to different output volumes. There is a subtle difference between these two definitions: batch-size enlargement does not always translate into a size increase of the processing volume. In mixing applications, scale-up is indeed concerned with increasing the linear dimensions from the laboratory to the plant size. On the other hand, processes exist (e.g., tableting) where the term 'scale-up' simply means enlarging the output by increasing the speed. To complete the picture, one should point out special procedures where an increase of scale is counterproductive and 'scale-down' is required to improve the quality of the product. In moving from research and development (R&D) to production scale, it is sometimes essential to have an intermediate batch scale. This is achieved at the so-called pilot scale, which is defined as the manufacture of the drug product by a procedure fully representative of and simulating that used at full manufacturing scale. This scale also makes it possible to produce enough product for clinical testing and to manufacture samples for marketing. However, inserting an intermediate step between the R&D and production scales does not, in itself, guarantee a smooth transition. A well-defined process may generate a perfect product both in the laboratory and the pilot plant and then fail quality assurance tests in production.

Keywords: scale up, research, size, batch

Procedia PDF Downloads 414
364 Inducing Flow Experience in Mobile Learning: An Experiment Using a Spanish Learning Mobile Application

Authors: S. Jonsson, D. Millard, C. Bokhove

Abstract:

Smartphones are ubiquitous and frequently used as learning tools, which makes the design of educational apps an important area of research. A key issue is designing apps that encourage engagement while maintaining a focus on the educational aspects of the app. Flow experience, which refers to a mental state of cognitive absorption and positive emotion, is a promising lens for addressing this issue. Flow experience has been shown to be associated with positive emotion and increased learning performance, and studies have shown that immediate feedback is an antecedent to Flow. This experiment investigates the effect of immediate feedback on Flow experience. An app teaching Spanish phrases was developed, and 30 participants completed both a 10-minute session with immediate feedback and a 10-minute session with delayed feedback. The app contained a task in which the user assembles Spanish phrases by pressing bricks with Spanish words. Immediate feedback was implemented by having incorrect bricks recoil, while correct bricks moved to form part of the finished phrase. In the delayed feedback condition, the user did not know whether the bricks they pressed were correct until the phrase was complete. The level of Flow experienced by the participants was measured after each session using the Flow Short Scale. The results showed that higher levels of Flow were experienced in the immediate feedback session. It was also found that 14 of the participants indicated that the demands of the task were 'just right' in the immediate feedback session, while only one did in the delayed feedback session. These results have implications for the design of educational technology and open up questions about how Flow experience can be used to increase performance and engagement.

Keywords: feedback timing, flow experience, L2 language learning, mobile learning

Procedia PDF Downloads 135
363 Quality Approaches for Mass-Produced Fashion: A Study in Malaysian Garment Manufacturing

Authors: N. J. M. Yusof, T. Sabir, J. McLoughlin

Abstract:

The garment manufacturing industry involves sequential processes that are subject to uncontrollable variation. The industry depends on the skill of labour in handling a variety of fabrics and accessories, machines, and a complicated sewing operation. For these reasons, garment manufacturers have created systems to monitor and control product quality regularly by conducting quality approaches to minimize variation. The aims of this research were to ascertain the quality approaches deployed by Malaysian garment manufacturers in three key areas: quality systems and tools; quality control and types of inspection; and the sampling procedures chosen for garment inspection. This research also aimed to distinguish the quality approaches used by companies supplying finished garments to the domestic market from those supplying the international market. Feedback from each company's representative was obtained via an online survey comprising five sections and 44 questions on the organizational profile and the quality approaches used in the garment industry. The results revealed that almost all companies had established their own mechanisms of process control by conducting a series of quality inspections of daily production, whether formally set up or not. Quality inspection was the predominant quality control activity in garment manufacturing, and the level of complexity of these activities was substantially dictated by the customers. AQL-based sampling was used by companies dealing with the export market, whilst almost all companies concentrating solely on the domestic market were comfortable using their own sampling procedures for garment inspection. This research provides an insight into the implementation of quality approaches that were perceived as important and useful in the garment manufacturing sector, which is truly labour-intensive.
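AQL-based single sampling accepts a lot when the number of defectives in a sample of n garments is at most an acceptance number c, so the acceptance probability is a binomial sum. A sketch with a hypothetical plan; the sample size and acceptance number below are illustrative, not figures from the study:

```python
from math import comb

def acceptance_probability(n, c, defect_rate):
    # Probability of accepting a lot under a single-sampling plan:
    # accept if the number of defectives in the sample of n is <= c.
    return sum(comb(n, k) * defect_rate**k * (1 - defect_rate)**(n - k)
               for k in range(c + 1))

# Hypothetical plan: inspect 32 garments, accept on 2 or fewer
# defectives. Lots at a low true defect rate are accepted with high
# probability; worse lots are increasingly likely to be rejected.
p_good_lot = acceptance_probability(32, 2, 0.025)
p_bad_lot = acceptance_probability(32, 2, 0.10)
```

Plotting this probability against the true defect rate gives the plan's operating characteristic curve, which is how an AQL level translates into inspection risk for exporter and customer.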

Keywords: garment manufacturing, quality approaches, quality control, inspection, Acceptance Quality Limit (AQL), sampling

Procedia PDF Downloads 445
362 Training Burnout and Leisure Participation of Athletes in College

Authors: An-Hsu Chen

Abstract:

The study explores how athletic training (12 hours per day, four days per week) impacts athlete burnout and leisure participation. The connection between athlete burnout and the leisure participation of collegiate athletes is also discussed. Athlete burnout and leisure participation questionnaires were administered, and 186 valid responses were collected. The data were analyzed with descriptive statistics, t-tests, one-way ANOVA, and the Pearson product-moment correlation coefficient. The results suggest that athlete burnout differs significantly among collegiate athletes with different specialties. Participants who train more days per week are more likely to participate in entertainment activities, while those who train more hours per day tend to avoid knowledge-based activities. The research also finds a significant positive correlation between athlete burnout and the leisure participation of collegiate athletes, while sport devaluation is negatively correlated with sport activities in leisure participation. Hence, adjusting and carefully arranging training quality and quantity may help to avoid overtraining. Away training, unloading training volumes, and group leisure activities should be arranged properly to allow athletes to cope with the burnout and stress caused by long-term training and periodic competitions.

Keywords: emotional and physical exhaustion, leisure activities, sport devaluation, training hours

Procedia PDF Downloads 333
361 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading

Authors: Peter Shi

Abstract:

Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volumes in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the Efficient Market Hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of the conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for 'low' and 'high' are precisely derived using a max-min operation on Bayes' error. This explicit connection harmonizes the Efficient Market Hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics. The amalgamation represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the buy-low and sell-high algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.
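The paper derives its 'low' and 'high' thresholds from a max-min operation on Bayes' error; as a far simpler illustrative stand-in, a threshold rule around a moving-average estimate of the rational price might look like the following (the window, band, and single-unit position are all assumptions for illustration, not the paper's method):

```python
def buy_low_sell_high(prices, window=5, band=0.02):
    # Trade one unit: buy when the price falls `band` below a moving
    # average (a crude stand-in for the rational price), sell when it
    # rises `band` above it. Illustrative only -- not the Bayes-derived
    # thresholds of the paper.
    cash, units = 0.0, 0
    for t in range(window, len(prices)):
        fair = sum(prices[t - window:t]) / window
        if prices[t] < fair * (1 - band) and units == 0:
            cash -= prices[t]   # buy low
            units = 1
        elif prices[t] > fair * (1 + band) and units == 1:
            cash += prices[t]   # sell high
            units = 0
    return cash + units * prices[-1]  # mark any open position to market
```

The paper's contribution is precisely that the band is not an arbitrary constant like this one but is derived from the error structure of the price model.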

Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market

Procedia PDF Downloads 72
360 Achieving Shear Wave Elastography by a Three-element Probe for Wearable Human-machine Interface

Authors: Jipeng Yan, Xingchen Yang, Xiaowei Zhou, Mengxing Tang, Honghai Liu

Abstract:

The shear elastic modulus of skeletal muscle can be obtained by shear wave elastography (SWE) and has been shown to be linearly related to muscle force. However, SWE is currently implemented using array probes. The price and bulk of these probes and their driving equipment prevent SWE from being used in wearable human-machine interfaces (HMIs). Moreover, the beamforming processing required for array probes reduces real-time performance. To achieve SWE with wearable HMIs, a customized three-element probe is adopted in this work, with one element for acoustic radiation force generation and the other two for shear wave tracking. In-phase/quadrature demodulation and 2D autocorrelation are used to estimate the velocities of tissues along the sound beams of the latter two elements. Shear wave speeds are calculated from the phase shift between the tissue velocities. Three agar phantoms with different elasticities were made by varying the weight of agar. The shear elastic modulus of the phantoms was measured as 8.98, 23.06, and 36.74 kPa at a depth of 7.5 mm, respectively. This work verifies the feasibility of measuring shear elastic modulus with wearable devices.
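With two tracking beams separated by a known distance, the phase shift of the shear wave at a given frequency yields a time delay and hence a propagation speed, and for near-incompressible soft tissue the modulus follows as mu = rho * c^2. A sketch with hypothetical values (beam spacing, wave frequency, and density below are assumptions, not the probe's actual parameters):

```python
import math

def shear_wave_speed(dx_m, phase_shift_rad, freq_hz):
    # Time delay between the two tracking beams from the phase shift
    # of the shear wave at freq_hz, then speed = distance / delay.
    dt = phase_shift_rad / (2 * math.pi * freq_hz)
    return dx_m / dt

def shear_modulus_kpa(speed_m_s, density_kg_m3=1000.0):
    # Near-incompressible soft tissue: mu = rho * c^2 (Pa), here in kPa.
    return density_kg_m3 * speed_m_s ** 2 / 1000.0
```

As a sanity check against the reported range, the softest phantom's 8.98 kPa corresponds to a shear wave speed of about 3 m/s at a density of 1000 kg/m^3.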

Keywords: shear elastic modulus, skeletal muscle, ultrasound, wearable human-machine interface

Procedia PDF Downloads 162