Search results for: reliable facility location model
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20157


12687 Consumption and Diffusion Based Model of Tissue Organoid Development

Authors: Elena Petersen, Inna Kornienko, Svetlana Guryeva, Sergey Simakov

Abstract:

In vitro organoid cultivation requires the simultaneous provision of the necessary vascularization and nutrient perfusion of cells during organoid development. However, many aspects of this problem remain unsolved. The ingrowth of the vascular network is limited during the early stages of organoid development, since the network only becomes functional in the final stages of in vitro cultivation. Therefore, a microchannel network should be created in the hydrogel matrix during the early stages of cultivation, in order to conduct and maintain the minimally required level of nutrient perfusion for all cells in the expanding organoid. The network configuration should be designed properly in order to exclude hypoxic and necrotic zones in the expanding organoid at all stages of its cultivation. In vitro vascularization is currently the main issue in the field of tissue engineering. As perfusion and oxygen transport have direct effects on cell viability and differentiation, researchers are currently limited to tissues of a few millimeters in thickness. These limitations are imposed by mass transfer and are defined by the balance between the metabolic demand of the cellular components in the system and the size of the scaffold. Current approaches include growth factor delivery, channeled scaffolds, perfusion bioreactors, microfluidics, cell co-cultures, cell functionalization, modular assembly, and in vivo systems. These approaches may improve cell viability or generate capillary-like structures within a tissue construct. Thus, there is a fundamental disconnect between defining the metabolic needs of tissue through quantitative measurements of oxygen and nutrient diffusion and the potential ease of integration into host vasculature for future in vivo implantation.
A model is proposed for growth prognosis of organoid perfusion, based on joint simulations of general nutrient diffusion, nutrient diffusion into the hydrogel matrix through the contact surfaces and microchannel walls, and nutrient consumption by the cells of the expanding organoid, including biomatrix contraction during tissue development, which is associated with a changing consumption rate of the growing organoid's cells. The model allows computing an effective microchannel network design that provides the minimally required level of nutrient concentration in all parts of the growing organoid. It can be used for preliminary planning of the microchannel network design and for simulations of the nutrient supply rate depending on the stage of organoid development.
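The nutrient balance the abstract describes (diffusion from channel walls versus consumption by cells) can be sketched in one dimension with an explicit finite-difference scheme. This is a minimal illustrative sketch, not the authors' model: all parameter values and the linear consumption term are assumptions.

```python
# Minimal 1D sketch of the diffusion-consumption balance: nutrient diffuses
# from a channel wall (fixed concentration at x = 0) into the tissue while
# cells consume it at a rate proportional to the local concentration.
# dc/dt = D * d2c/dx2 - k * c, solved with an explicit (FTCS) scheme.
# All parameter values below are illustrative assumptions, not from the paper.

def simulate_nutrient(n=50, steps=2000, D=1.0, k=0.05, dx=0.1, dt=0.002):
    c = [0.0] * n
    c[0] = 1.0                      # channel wall: fixed nutrient supply
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            lap = (c[i - 1] - 2 * c[i] + c[i + 1]) / dx ** 2
            new[i] = c[i] + dt * (D * lap - k * c[i])
        new[-1] = new[-2]           # zero-flux boundary at the outer edge
        new[0] = 1.0
        c = new
    return c

profile = simulate_nutrient()
```

Scanning the resulting profile for locations where the concentration falls below a viability threshold is the kind of check that would flag prospective hypoxic zones for a candidate channel layout.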

Keywords: 3D model, consumption model, diffusion, spheroid, tissue organoid

Procedia PDF Downloads 307
12686 Developing a Health Promotion Program to Prevent and Solve Problem of the Frailty Elderly in the Community

Authors: Kunthida Kulprateepunya, Napat Boontiam, Bunthita Phuasa, Chatsuda Kankayant, Bantoeng Polsawat, Sumran Poontong

Abstract:

Frailty is the thin line between good health and illness. The syndrome is more common in the elderly, who transition from strong to weak (vulnerable). Frailty can be prevented, and healthy recovery promoted, before it progresses to disability. This research and development study aims to analyze the frailty situation of the elderly, develop a program, and evaluate the effect of a health promotion program to prevent and solve the problem of frailty among the elderly. The research consisted of 3 phases: 1) analysis of the frailty situation, 2) development of a model, and 3) evaluation of the effectiveness of the model. Samples of 328 and 122 elderly participants were drawn using the multi-stage random sampling method. The research instrument was a frailty questionnaire based on five main symptoms: muscle weakness, slow walking, low physical activity, fatigue, and unintentional weight loss; participants with three or more symptoms were classified as frail. Data were analyzed by descriptive statistics and the dependent t-test. The findings comprised three parts. First, 23.05% of the elderly were frail and 56.70% were pre-frail. Second, a health promotion program to prevent and solve the problem of frailty in the elderly was developed, combining the Nine-Square Exercise, Elastic Band Exercise, and Elastic Coconut Shell. Third, the effectiveness of the model was evaluated by comparing the elderly's get up and go test times: the average time was 14.42 before using the program and 8.57 after, a difference statistically significant at the .05 level. In conclusion, the findings can be used to develop guidelines to promote the health of the frail elderly.
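The before/after comparison of get up and go times uses a dependent (paired) t-test. A minimal sketch of that statistic follows; the individual measurements are made-up example values, since the abstract reports only the group means (14.42 and 8.57).

```python
import math
from statistics import mean, stdev

# Paired (dependent) t statistic: mean of the per-person differences divided
# by the standard error of those differences.  The data below are invented
# illustrative values, not the study's raw measurements.

def paired_t(before, after):
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

before = [15.1, 13.8, 14.9, 14.0, 14.3]   # get up and go times, seconds
after = [8.9, 8.2, 8.8, 8.4, 8.6]
t_stat = paired_t(before, after)
```

The computed t value is then compared with the critical value for n-1 degrees of freedom at the chosen significance level (.05 in the study).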

Keywords: elderly, fragile, nine-square exercise, elastic coconut shell

Procedia PDF Downloads 102
12685 Prevalence of Endemic Goiter in School Children and Women of Reproductive Age Group during Post Salt Iodization Period in Andro Constituency, Imphal-East District, Manipur, India

Authors: Y. Suchitra Devi, L. Hemchandra Singh

Abstract:

Background: Because of its geographical location, Manipur lies in the conventional goiter endemic belt. During the post salt iodization period, endemic goiter was prevalent in the valley districts of Manipur without iodine deficiency. Objectives: The present study aims at the prevalence of goiter among school children (6-12 years) and women of reproductive age (above 20 years) of Andro Assembly Constituency, Imphal-East, Manipur, India. Method: A total of 3,992 individuals were clinically examined for thyroid enlargement. Hormones (TSH, FT₄, FT₃) and autoantibodies (anti-TPO, anti-Tg) were tested, along with urinary iodine concentration (UIC), urinary thiocyanate concentration (USCN), and iodine in water and salt. Result: The total goiter prevalence was 13.98%, the median urinary iodine level was 166.0 µg/l, the mean urinary thiocyanate concentration was 0.726 ± 0.408, the mean water iodine concentration was 3.843 ± 2.291, and all salt samples were above 15 ppm. Six out of 41 children and 93 out of 176 women were autoantibody positive. The 41 children and 176 women tested for TSH, FT₄, and FT₃ showed disturbances in hormone levels. Conclusion: The present study showed that the region is mildly goiter endemic without biochemical iodine deficiency.

Keywords: goiter, TSH, FT₄, FT₃, anti-TPO, anti-Tg, UIC, USCN, school children and women of reproductive age

Procedia PDF Downloads 108
12684 Is the Okun's Law Valid in Tunisia?

Authors: El Andari Chifaa, Bouaziz Rached

Abstract:

The central focus of this paper is to check whether Okun's law is valid in Tunisia. For this purpose, we have used quarterly time series data for the period 1990Q1-2014Q1. Firstly, we applied the error correction model instead of the difference version of Okun's law: the Engle-Granger and Johansen tests were employed to find a long-run association between unemployment and output, and an error correction mechanism (ECM) was used for the short-run dynamics. Secondly, we used the gap version of Okun's law, where the estimation is done with three band-pass filters, mathematical tools used in macroeconomics and especially in business cycle theory. The findings of the study indicate that the inverse relationship between unemployment and output is verified in the short and long term, and that Okun's law holds for the Tunisian economy, but with an Okun's coefficient lower than required. Therefore, our empirical results have important implications for structural and cyclical policymakers in Tunisia seeking to promote economic growth in a context of lower unemployment growth.
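For reference, the difference version of Okun's law that the paper replaces with an ECM is a simple regression of the change in the unemployment rate on output growth. A minimal OLS sketch follows; the series are invented illustrative numbers with a built-in negative relationship, not Tunisian data.

```python
from statistics import mean

# Difference version of Okun's law: du_t = alpha + beta * g_t, where du is
# the change in the unemployment rate and g is real output growth.  beta is
# the Okun coefficient and is expected to be negative.

def ols(x, y):
    """Return (intercept, slope) of a simple least-squares fit."""
    mx, my = mean(x), mean(y)
    beta = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
           sum((xi - mx) ** 2 for xi in x)
    return my - beta * mx, beta

growth = [1.0, 2.5, 3.0, 0.5, 4.0, 2.0]        # output growth, %
du = [0.3, -0.2, -0.4, 0.5, -0.7, -0.1]        # change in unemployment, pp
alpha, okun_coeff = ols(growth, du)
```

The gap version instead regresses the cyclical (band-pass filtered) components of the two series on each other; the regression step itself is the same.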

Keywords: Okun’s law, validity, unit root, cointegration, error correction model, bandpass filters

Procedia PDF Downloads 311
12683 Territorial Analysis of the Public Transport Supply: Case Study of Recife City

Authors: Cláudia Alcoforado, Anabela Ribeiro

Abstract:

This paper is part of an ongoing PhD thesis. It seeks to develop a model to identify the spatial failures of the public transport supply and, in constructing the model, to detect the social needs arising from transport disadvantage. The case study is carried out for the Brazilian city of Recife. Currently, Recife has a population density of 7,039.64 inhabitants per km², yet only 46.9% of urban households are on public roads with adequate urbanization. Allied to this reality, the poorest population tends to occupy the peripheries, a pattern that has been consolidated in Brazil and Latin America, straining families' incomes, since greater distances to basic activities entail higher transport costs. In this way, great impacts are caused when public transport is supplied to locations with low demand or lacking urban infrastructure. The model under construction uses methods such as Currie's gap assessment, associated with London's Public Transport Access Level and the Public Transport Accessibility Index developed by Saghapour. This paper presents the stage of the thesis at which the spatial/need gaps of the neighborhoods of Recife have already been detected, drawing on the capabilities of a geographic information system. It should be noted that gaps are determined from transport supply indices, in this case considering the presence of walking catchment areas. For the detection of gaps, the relevant demand index is also determined; this, in turn, is calculated through indicators that reflect social needs. By using the smallest Brazilian geographical unit, the census sector, and including population density in the study areas, the model should produce more consolidated results.
Based on the results achieved, an analysis of transport disadvantage as a factor of social exclusion in the study area will be carried out. The results obtained so far already indicate a strong concentration of public transport supply in areas of higher income classes, leading to the understanding that the most disadvantaged population migrates to those neighborhoods in search of employment.
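The core of a gap assessment of this kind is comparing a zone's need-weighted demand index with its supply index and flagging zones where need far exceeds supply. The sketch below is an illustrative simplification: the zone names, index values, and the 0.5 threshold are assumptions, not Recife data.

```python
# Gap-assessment sketch: a spatial gap is flagged where the relevant
# (need-weighted) demand index is high while the transport supply index is
# low.  Indices are assumed normalised to [0, 1]; all values are invented.

def find_gaps(zones, threshold=0.5):
    """Return names of zones whose demand-supply gap exceeds the threshold."""
    gaps = []
    for name, supply, demand in zones:
        if demand - supply > threshold:
            gaps.append(name)
    return gaps

zones = [("Centro", 0.9, 0.4), ("Periferia A", 0.2, 0.9),
         ("Periferia B", 0.1, 0.8), ("Zona Sul", 0.7, 0.6)]
gap_zones = find_gaps(zones)
```

In the actual model the supply index would come from walking-catchment-based accessibility measures (PTAL/PTAI-style) and the demand index from social-needs indicators, both computed per census sector in a GIS.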

Keywords: gap assessment, public transport supply, social exclusion, spatial gaps

Procedia PDF Downloads 177
12682 Comparative Analysis of Reinforcement Learning Algorithms for Autonomous Driving

Authors: Migena Mana, Ahmed Khalid Syed, Abdul Malik, Nikhil Cherian

Abstract:

In recent years, advancements in deep learning have enabled researchers to tackle the problem of self-driving cars. Car companies use huge datasets to train their deep learning models to make autonomous cars a reality. However, this approach has a drawback: the state space of possible actions for a car is so huge that there cannot be a dataset for every possible road scenario. To overcome this problem, the concept of reinforcement learning (RL) is investigated in this research. Since the problem of autonomous driving can be modeled in a simulation, it lends itself naturally to the domain of reinforcement learning. The advantage of this approach is that we can model different and complex road scenarios in a simulation without having to deploy in the real world. The autonomous agent can learn to drive by finding the optimal policy, and this learned model can then be easily deployed in a real-world setting. In this project, we focus on three RL algorithms: Q-learning, Deep Deterministic Policy Gradient (DDPG), and Proximal Policy Optimization (PPO). To model the environment, we have used TORCS (The Open Racing Car Simulator), which provides us with a strong foundation to test our models. The inputs to the algorithms are the sensor data provided by the simulator, such as velocity, distance from the side pavement, etc. The outcome of this research project is a comparative analysis of these algorithms. Based on the comparison, the PPO algorithm gives the best results: with PPO, the reward is greater, and the acceleration, steering angle, and braking are more stable compared to the other algorithms, which means that the agent learns to drive in a better and more efficient way. Additionally, we have compiled a dataset from the training of the agent with the DDPG and PPO algorithms. It contains all the steps of the agent during one full training run in the form: (all input values, acceleration, steering angle, brake, loss, reward).
This study can serve as a base for further, more complex road scenarios. Furthermore, it can be extended into the field of computer vision, using images to find the best policy.
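Of the three algorithms compared, only tabular Q-learning fits in a few lines; the hedged sketch below shows its update rule on a toy one-dimensional "track" (states 0-4, actions left/right, reward only at the rightmost state). TORCS, DDPG, and PPO all require function approximation and are out of scope here; everything in this example is an illustrative assumption.

```python
import random

# Tabular Q-learning on a toy chain: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))
random.seed(0)
N_STATES, ACTIONS = 5, (-1, +1)
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.3      # learning rate, discount, exploration

for _ in range(500):                    # episodes
    s = 0
    while s != N_STATES - 1:
        # epsilon-greedy action selection
        a = random.choice(ACTIONS) if random.random() < eps else \
            max(ACTIONS, key=lambda x: Q[(s, x)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in ACTIONS)
                              - Q[(s, a)])
        s = s2

# greedy policy after learning: should always move right (+1)
policy = [max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES - 1)]
```

DDPG and PPO replace the Q table with neural networks and, in PPO's case, directly optimize a clipped policy objective, which is what makes them usable on continuous TORCS sensor inputs.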

Keywords: autonomous driving, DDPG (deep deterministic policy gradient), PPO (proximal policy optimization), reinforcement learning

Procedia PDF Downloads 140
12681 Investigating Trophic Relationships in Moroccan Marine Ecosystems: A Study of the Mediterranean and Atlantic Using Ecopath

Authors: Salma Aboussalam, Karima Khalil, Khalid Elkalay

Abstract:

An Ecopath model was employed to investigate the trophic structure, function, and current state of the Moroccan Mediterranean Sea ecosystem. The model incorporated 31 functional groups, including 21 fish species, 7 invertebrates, 2 primary producers, and a detritus group. The trophic interactions among these groups were analyzed, revealing an average trophic transfer efficiency of 23%. The results indicated that the ecosystem produced more energy than it consumed, with high respiration and consumption rates. Indicators of stability and development were low for the Finn cycling index (13.97), the system omnivory index (0.18), and the average Finn path length (3.09), indicating a disturbed ecosystem with a linear trophic structure. Keystone species were identified using the keystone index and mixed trophic impact analysis; demersal invertebrates, zooplankton, and cephalopods were found to have a significant impact on other groups.
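For orientation, a trophic transfer efficiency like the 23% reported above expresses the fraction of the flow entering one trophic level that is passed on to the next. The sketch below uses invented flow values purely for illustration; it is a simplified reading of the Ecopath output, not the authors' computation.

```python
# Transfer efficiency between consecutive trophic levels: the ratio of the
# flow entering level i+1 to the flow entering level i.  Flow values are
# invented for illustration (units t km^-2 yr^-1, levels II-V).

def transfer_efficiencies(flows):
    return [flows[i + 1] / flows[i] for i in range(len(flows) - 1)]

flows = [1000.0, 230.0, 52.9, 12.2]
te = transfer_efficiencies(flows)
mean_te = sum(te) / len(te)
```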

Keywords: Ecopath, food web, trophic flux, Moroccan Mediterranean Sea

Procedia PDF Downloads 98
12680 Effective Nutrition Label Use on Smartphones

Authors: Vladimir Kulyukin, Tanwir Zaman, Sarat Kiran Andhavarapu

Abstract:

Research on nutrition label use identifies four factors that impede comprehension and retention of nutrition information by consumers: the label's location on the package, the presentation of information within the label, the label's surface size, and surrounding visual clutter. In this paper, a system is presented that makes nutrition label use more effective for nutrition information comprehension and retention. The system's front end is a smartphone application; its back end is a four-node Linux cluster for image recognition and data storage. Image frames captured on the smartphone are sent to the back end for skewed or aligned barcode recognition. When barcodes are recognized, the corresponding nutrition labels are retrieved from a cloud database and presented to the user on the smartphone's touchscreen. Each displayed nutrition label is positioned centrally on the touchscreen with no surrounding visual clutter. Wikipedia links to important nutrition terms are embedded to improve comprehension and retention of nutrition information. Standard touch gestures (e.g., zoom in/out) available on mainstream smartphones are used to manipulate the label's surface size. The nutrition label database currently includes 200,000 nutrition labels compiled from public websites by a custom crawler. Stress test experiments with the node cluster are presented. Implications for proactive nutrition management and food policy are discussed.
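The abstract does not specify the barcode symbology; assuming the EAN-13 codes common on grocery packaging, a recognized digit string can be sanity-checked with the standard check-digit rule before the cloud lookup. The sketch below is illustrative: the `labels` dictionary is a toy stand-in for the cloud database, and its contents are invented.

```python
# EAN-13 check digit: weight the first 12 digits alternately 1 and 3, and
# the 13th digit must bring the weighted sum up to a multiple of 10.

def ean13_valid(code):
    """True if the 13-digit string satisfies the EAN-13 checksum."""
    if len(code) != 13 or not code.isdigit():
        return False
    digits = [int(c) for c in code]
    total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits[:12]))
    return (10 - total % 10) % 10 == digits[12]

# toy stand-in for the cloud nutrition-label database (invented entry)
labels = {"4006381333931": {"name": "example product", "kcal_per_100g": 250}}

def lookup(code):
    """Validate the recognised code, then fetch its label (or None)."""
    return labels.get(code) if ean13_valid(code) else None
```

Rejecting checksum failures on the phone avoids a round trip to the cluster for misread frames.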

Keywords: mobile computing, cloud computing, nutrition label use, nutrition management, barcode scanning

Procedia PDF Downloads 367
12679 Implicit Eulerian Fluid-Structure Interaction Method for the Modeling of Highly Deformable Elastic Membranes

Authors: Aymen Laadhari, Gábor Székely

Abstract:

This paper is concerned with the development of a fully implicit and purely Eulerian fluid-structure interaction method tailored for the modeling of the large deformations of elastic membranes in a surrounding Newtonian fluid. We consider a simplified model for the mechanical properties of the membrane, in which the surface strain energy depends on the membrane stretching. The fully Eulerian description is based on the advection of a modified surface tension tensor, and the deformations of the membrane are tracked using a level set strategy. The resulting nonlinear problem is solved by a Newton-Raphson method, featuring a quadratic convergence behavior. A monolithic solver is implemented, and we report several numerical experiments aimed at model validation and illustrating the accuracy of the presented method. We show that stability is maintained for significantly larger time steps.
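The abstract highlights the Newton-Raphson method's quadratic convergence. The scalar sketch below (on f(x) = x² - 2, which has nothing to do with the actual membrane equations) illustrates that behavior: the error roughly squares at every iteration.

```python
# Newton's method on a scalar equation, recording the iterates so the
# quadratic error decay is visible.  Purely illustrative.

def newton(f, df, x0, iters=6):
    xs = [x0]
    for _ in range(iters):
        xs.append(xs[-1] - f(xs[-1]) / df(xs[-1]))
    return xs

xs = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.5)
errors = [abs(x - 2.0 ** 0.5) for x in xs]   # error vs sqrt(2)
```

In the monolithic solver described above, the same iteration is applied to the discretized nonlinear residual, with the scalar derivative replaced by the Jacobian of the coupled fluid-membrane system.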

Keywords: finite element method, implicit, level set, membrane, Newton method

Procedia PDF Downloads 300
12678 The Future of the Architect's Profession in France with the Emergence of Building Information Modelling

Authors: L. Mercier, D. Beladjine, K. Beddiar

Abstract:

The digital transition of the building sector in France brings many changes; some professionals have been able to face them very quickly, while others are struggling to find their place and to see the value BIM can bring to their profession. BIM today has already been adopted or initiated by construction professionals. However, this change, which can be drastic for some, prevents them from integrating it definitively. This is the case with architects: the profession is divided on the practice of BIM in its work. The risk of not adopting this new working method now, and of not switching to its new digital tools, leads us to question the future of the profession in view of the gap that is likely to be created within project management. In order to address the subject efficiently, our work was based on a literature review of BIM and then of the architect's profession, which allowed us to establish links between the two subjects. Observing the economic model towards which agencies tend, and the trend in sought-after profiles, made it possible to identify the opportunities and obstacles likely to affect the future of the architect's profession. The research points towards the conclusion that the model implemented by companies does not allow BIM to be integrated within their structure. A solution hypothesis was then proposed, focusing on the development of agencies through a diversity of profiles and skills to be integrated internally, with the aim of diversifying their skills and business practices. In order to address this hypothesis of a multidisciplinary agency model, we conducted a survey of architectural firms. The model is built on that of the Anglo-Saxon countries, which do not function in the same way as the French model. The results obtained showed a risk of gradual disappearance from the market of small agencies, in favor of those that adopt this BIM working method.
This is why the architectural profession must, first of all, look at what is happening within its training before seeking to diversify the profiles integrated into its structure. This directs the study to the training of architects. French architecture schools generally lag behind engineering schools, though they are currently experiencing a slight improvement with the emergence of masters programs and BIM options during the university course. If the training of architects develops towards learning BIM, and agencies have the desire to integrate different but complementary profiles, then they will develop their skills internally and therefore open their profession to new functions. The place of BIM management on projects will allow architects to remain in control of the project thanks to their overall vision of it. In addition, the integration of BIM, and more generally of life cycle analysis of the structure, will make it possible to guarantee eco-design or eco-construction by addressing the sustainable development constraints omnipresent on the planet.

Keywords: building information modelling, BIM, BIM management, BIM manager, BIM architect

Procedia PDF Downloads 110
12677 Volatility Transmission between Oil Price and Stock Return of Emerging and Developed Countries

Authors: Algia Hammami, Abdelfatteh Bouri

Abstract:

In this work, our objective is to study the transmission of volatility between oil and stock markets in developed (USA, Germany, Italy, France, and Japan) and emerging countries (Tunisia, Thailand, Brazil, Argentina, and Jordan) for the period 1998-2015. Our methodology consists of analyzing monthly data with the GARCH-BEKK model to capture the effect, in terms of volatility, of variations in the oil price on the different stock markets. The empirical results for the emerging countries indicate that the relationships are unidirectional, from the stock market to the oil market. For the developed countries, we find that the transmission of volatility is unidirectional from the oil market to the stock market, except for the USA and Italy, where we find no transmission between the two markets. The transmission is bi-directional only in Thailand. Following our estimates, we also noticed that the emerging countries are influenced to almost the same extent as the developed countries, while in the transmission of volatility there is a big difference. The GARCH-BEKK model is more effective than the other versions in minimizing the risk of an oil-stock portfolio.
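The bivariate GARCH-BEKK model used in the paper is beyond a short sketch, but its building block is the univariate GARCH(1,1) variance recursion shown below; BEKK generalizes it to matrices so that one market's shocks enter the other's conditional variance. The parameters and return series here are illustrative assumptions, not estimates from the oil/stock data.

```python
# GARCH(1,1) conditional variance recursion:
#   h_t = omega + alpha * r_{t-1}^2 + beta * h_{t-1}
# started at the unconditional variance omega / (1 - alpha - beta).

def garch_variances(returns, omega=0.1, alpha=0.1, beta=0.85):
    h = [omega / (1 - alpha - beta)]
    for r in returns[:-1]:
        h.append(omega + alpha * r ** 2 + beta * h[-1])
    return h

returns = [0.5, -2.0, 1.0, 0.3, -0.1, 0.2]   # invented monthly returns, %
h = garch_variances(returns)                  # variance jumps after the -2.0 shock
```

In the BEKK form, omega, alpha, and beta become matrices, and the off-diagonal alpha and beta elements are exactly what the volatility-transmission tests above examine.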

Keywords: GARCH, oil prices, stock market, volatility transmission

Procedia PDF Downloads 431
12676 Digital Design and Fabrication: A Review of Trend and Its Impact in the African Context

Authors: Mohamed Al Araby, Amany Salman, Mostafa Amin, Mohamed Madbully, Dalia Keraa, Mariam Ali, Marah Abdelfatah, Mariam Ahmed, Ahmed Hassab

Abstract:

In recent years, the architecture, engineering, and construction (A.E.C.) industry has been exposed to important innovations, most notably the global integration of digital design and fabrication (D.D.F.) processes into the industry's workflow. Despite this evolution of the sector, Africa has been excluded from examinations of this development, the reason being the preconceived view of it as a developing region that still employs traditional methods of construction. The primary objective of this review is to investigate the trend of digital construction (D.C.) in the African context and the difficulties hindering its regular utilization. This objective can be attained by recognizing the notion of digital construction in Africa and evaluating the impact of the projects deploying this technology on both their immediate and broader contexts. The paper's methodology begins with the collection of data from 224 initiatives throughout Africa, from which 50 projects were selected based on the criteria of recency, typology variety, and location diversity. A literature-based comparative analysis was then undertaken. This study's findings reveal a pattern of motivation for applying digital fabrication processes. Moreover, it is essential to evaluate the socio-economic effects of these projects on the population living near the analyzed subject. The last step in this study is identifying the influence on neighboring nations.

Keywords: Africa, digital construction, digital design, fabrication

Procedia PDF Downloads 161
12675 Self-Determination among Individuals with Intellectual Disability: An Experiment

Authors: Wasim Ahmad, Bir Singh Chavan, Nazli Ahmad

Abstract:

Objectives: The present investigation attempts to find out the efficacy of training special educators in promoting self-determination among individuals with intellectual disability. Methods: The study equipped special educators with the necessary skills and knowledge to train individuals with intellectual disability to practice self-determination. Subjects: Special educators (N=25) were selected for training on self-determination among individuals with intellectual disability. After receiving the training, they intervened with selected individuals with intellectual disability (N=50). Tool: The Self-Determination Scale for Adults with Mild Mental Retardation (SDSAMR), developed by Keshwal and Thressiakutty (2010), was used. It is a reliable and valid tool used by many researchers, with 36 items distributed across five domains: personal management, community participation, recreation and leisure time, choice making, and problem solving. Analysis: The collected data were analyzed using statistical techniques such as the t-test, ANCOVA, and the post hoc Tukey test. Results: The findings reveal a significant difference at the 1% level in the pre- and post-test mean scores (t = 15.56) on self-determination concepts among the special educators, indicating that the training enhanced their command of the concept of self-determination among individuals with intellectual disability. The study also reveals that the training was effective in practice, because the trained special educators were able to impart the concept by training individuals with intellectual disability to be self-determined: there was a significant difference at the 1% level in the pre- and post-test mean scores (t = 16.61) on self-determination among the individuals with intellectual disability.
Conclusion: To conclude, the training had a remarkable impact on the performance of individuals with intellectual disability in self-determination.

Keywords: experiment, individuals with intellectual disability, self-determination, special educators

Procedia PDF Downloads 331
12674 Urban Greenery in the Greatest Polish Cities: Analysis of Spatial Concentration

Authors: Elżbieta Antczak

Abstract:

Cities offer important opportunities for economic development and for expanding access to basic services, including health care and education, for large numbers of people. Moreover, green areas (an integral part of sustainable urban development) present a major opportunity for improving urban environments, quality of life, and livelihoods. This paper examines, using spatial concentration and spatial taxonomic measures, the regional diversification of greenery in the cities of Poland. The analysis includes location quotients, the Lorenz curve, the locational Gini index, the synthetic index of greenery, and spatial statistics tools, in order to: (1) verify the occurrence of strong concentration or dispersion of the phenomenon in time and space depending on the variable category, and (2) study whether the level of greenery depends on spatial autocorrelation. The data cover the greatest Polish cities, the categories of urban greenery (parks, lawns, street greenery, green areas on housing estates, cemeteries, and forests), and the time span 2004-2015. According to the obtained estimates, most cities in Poland are already taking measures to become greener. However, there are still many barriers to well-balanced urban greenery development in the country (e.g., uncontrolled urban sprawl, poor management, and the lack of spatial urban planning systems).
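A locational Gini index of the kind used above compares each city's share of a greenery category with its share of some reference total (area or population) via the Lorenz curve. The sketch below computes it with the trapezoid rule; the four-city shares are invented for illustration.

```python
# Locational Gini: 1 minus twice the area under the Lorenz curve, with
# observations sorted by y/x so the curve is convex.  0 = greenery spread
# exactly like the reference variable; values near 1 = strong concentration.

def locational_gini(x_shares, y_shares):
    pairs = sorted(zip(x_shares, y_shares), key=lambda p: p[1] / p[0])
    cy = area = 0.0
    for x, y in pairs:
        area += x * (2 * cy + y)   # 2 * trapezoid under the Lorenz curve
        cy += y
    return 1.0 - area

# equal area shares; green space either spread evenly or concentrated
g_equal = locational_gini([0.25] * 4, [0.25] * 4)
g_conc = locational_gini([0.25] * 4, [0.85, 0.05, 0.05, 0.05])
```

Comparing such indices across greenery categories and years is what reveals whether, say, parks are more spatially concentrated than street greenery.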

Keywords: greenery, urban areas, regional spatial diversification and concentration, spatial taxonomic measure

Procedia PDF Downloads 281
12673 Evaluation of Neighbourhood Characteristics and Active Transport Mode Choice

Authors: Tayebeh Saghapour, Sara Moridpour, Russell George Thompson

Abstract:

One of the common aims of transport policy makers is to switch people's travel to active transport. For this purpose, a variety of transport goals and investments should be programmed to increase the propensity towards active transport mode choice. This paper aims to investigate whether built environment features in neighbourhoods could enhance the odds of active transportation. The present study introduces an index measuring public transport accessibility (PTAI) and a walkability index, along with socioeconomic variables, to investigate mode choice behaviour. Using travel behaviour data, an ordered logit regression model is applied to examine the impacts of the explanatory variables on walking trips. The findings indicate that high rates of active travel are consistently associated with higher levels of walkability and public transport accessibility.
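Once fitted, an ordered logit model turns a latent score x'β into probabilities over the ordered outcome categories through cut-points. The sketch below shows only that prediction step; the coefficients, cut-points, and category labels (e.g., no / some / many walking trips) are invented for illustration, not the paper's estimates.

```python
import math

# Ordered logit prediction: P(y = j) = F(tau_j - x'b) - F(tau_{j-1} - x'b),
# where F is the logistic CDF and tau are ordered cut-points.

def ordered_logit_probs(xb, cuts):
    F = lambda z: 1.0 / (1.0 + math.exp(-z))
    cdf = [F(c - xb) for c in cuts] + [1.0]
    return [cdf[0]] + [cdf[j] - cdf[j - 1] for j in range(1, len(cdf))]

beta_walk, beta_ptai = 0.8, 0.5     # assumed positive effects (illustrative)
xb_low = beta_walk * 0.2 + beta_ptai * 0.1    # low walkability, low PTAI
xb_high = beta_walk * 0.9 + beta_ptai * 0.8   # high walkability, high PTAI
p_low = ordered_logit_probs(xb_low, cuts=[0.5, 1.5])
p_high = ordered_logit_probs(xb_high, cuts=[0.5, 1.5])
```

With positive coefficients, the high-accessibility profile shifts probability mass towards the top walking category, which is the qualitative pattern the paper reports.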

Keywords: active transport, public transport accessibility, walkability, ordered logit model

Procedia PDF Downloads 347
12672 Comparative Analysis of DTC Based Switched Reluctance Motor Drive Using Torque Equation and FEA Models

Authors: P. Srinivas, P. V. N. Prasad

Abstract:

Since torque ripple is the main cause of noise and vibration, the performance of the Switched Reluctance Motor (SRM) can be improved by minimizing its torque ripple using a control technique called Direct Torque Control (DTC). In the DTC technique, torque is controlled directly through control of the magnitude of the flux and the change in speed of the stator flux vector, with the flux and torque maintained within set hysteresis bands. The DTC of the SRM is analysed by two methods: in one, the actual torque is computed by conducting Finite Element Analysis (FEA) on the design specifications of the motor; in the other, the torque is computed by a simplified torque equation. The variation of peak current, average current, torque ripple, and speed settling time obtained with the simplified torque equation model is compared with that of the FEA-based model.
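The hysteresis-band logic at the heart of DTC can be sketched with a bang-bang controller: the torque error is kept inside a band by an output that the real controller maps to a voltage-vector/phase selection. The band width and the crude first-order "motor" response below are illustrative assumptions, not the paper's motor model.

```python
# Hysteresis comparator: switch the command only when the error leaves the
# band; inside the band, hold the previous output.  The resulting torque
# ripple is bounded by the band width (plus one step of the plant response).

def hysteresis(error, band, state):
    if error > band:
        return 1     # torque too low: command an increase
    if error < -band:
        return -1    # torque too high: command a decrease
    return state     # inside the band: keep the last command

t_ref, band = 10.0, 0.5
torque, state, history = 8.0, 1, []
for _ in range(100):
    state = hysteresis(t_ref - torque, band, state)
    torque += 0.2 * state            # crude torque response to the command
    history.append(torque)

ripple = max(history[50:]) - min(history[50:])   # after settling
```

Tightening the band reduces ripple at the cost of a higher switching frequency, which is the basic trade-off both DTC models in the paper inherit.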

Keywords: direct torque control, simplified torque equation, finite element analysis, torque ripple

Procedia PDF Downloads 474
12671 A Parametric Study on the Backwater Level Due to a Bridge Constriction

Authors: S. Atabay, T. A. Ali, Md. M. Mortula

Abstract:

This paper presents the results and findings of a parametric study on the water surface elevation upstream of a bridge constriction for subcritical flow. The influence on the backwater level of the Manning roughness coefficients of the main channel (nmc) and of the floodplain (nfp), the bridge opening (b), the flow rate (Q), and the contraction (kcon) and expansion (kexp) coefficients was investigated. Deck bridge models with different span widths and without any pier were investigated within a two-stage channel with various roughness conditions, using the widely used one-dimensional HEC-RAS model. This study showed that the effects of the main channel roughness (nmc) and flow rate (Q) on the backwater level are much greater than those of the floodplain roughness (nfp), while the bridge opening (b) and the contraction (kcon) and expansion (kexp) coefficients have very little effect on the backwater level within this range of parameters.
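The dominance of main-channel roughness is consistent with Manning's equation, on which HEC-RAS conveyance calculations are based: the depth needed to pass a given discharge rises directly with n. The sketch below solves for normal depth in a wide rectangular channel; the channel geometry, slope, and discharge are illustrative assumptions, not the paper's test cases.

```python
# Manning's equation (SI): Q = (1/n) * A * R^(2/3) * S^(1/2).
# For a rectangular section, A = width*depth and R = A / (width + 2*depth).

def manning_discharge(n, depth, width=50.0, slope=0.001):
    area = width * depth
    radius = area / (width + 2 * depth)      # hydraulic radius
    return area * radius ** (2.0 / 3.0) * slope ** 0.5 / n

def normal_depth(Q, n, tol=1e-6):
    """Bisection for the depth that conveys discharge Q."""
    lo, hi = 1e-6, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if manning_discharge(n, mid) < Q else (lo, mid)
    return 0.5 * (lo + hi)

d_smooth = normal_depth(100.0, n=0.025)   # smoother main channel
d_rough = normal_depth(100.0, n=0.045)    # rougher main channel: deeper flow
```

The same mechanism raises the water surface upstream of a constriction, so a rougher main channel directly amplifies the backwater level.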

Keywords: bridge backwater, parametric study, waterways, HEC-RAS model

Procedia PDF Downloads 300
12670 Data and Biological Sharing Platforms in Community Health Programs: Partnership with Rural Clinical School, University of New South Wales and Public Health Foundation of India

Authors: Vivian Isaac, A. T. Joteeshwaran, Craig McLachlan

Abstract:

The University of New South Wales (UNSW) Rural Clinical School has a strategic collaborative focus on chronic disease and public health. Our objective is to understand rural environmental and biological interactions in vulnerable community populations. The UNSW Rural Clinical School translational model is a spoke-and-hub network that connects rural data and biological specimens with city-based collaborative public health research networks. Similar spoke-and-hub models are prevalent across research centers in India. An Australia-India Council grant was awarded so we could establish sustainable public health and community research collaborations. As part of the collaborative network, we are developing strategies around data and biological sharing platforms between the Indian Institute of Public Health, Public Health Foundation of India (PHFI), Hyderabad, and the Rural Clinical School UNSW. The key objective is to understand how research collaborations are conducted in India and how data can be shared and tracked with external collaborators such as ourselves. A framework to improve data sharing for research collaborations, including DNA data, was proposed as a project outcome. The complexities of sharing biological data have been investigated via a visit to India. A flagship sustainable project between the Rural Clinical School UNSW and PHFI would illustrate a model of data sharing platforms.

Keywords: data sharing, collaboration, public health research, chronic disease

Procedia PDF Downloads 446
12669 High Purity Germanium Detector Characterization by Means of Monte Carlo Simulation through Application of Geant4 Toolkit

Authors: Milos Travar, Jovana Nikolov, Andrej Vranicar, Natasa Todorovic

Abstract:

Over the years, High Purity Germanium (HPGe) detectors have proved to be an excellent practical tool and, as such, are widely used today in low-background γ-spectrometry. One advantage of gamma-ray spectrometry is its easy sample preparation, as chemical processing and separation of the studied subject are not required; thus, a single measurement can simultaneously provide both qualitative and quantitative analysis. One of the most prominent features of HPGe detectors, besides their excellent efficiency, is their superior resolution. This feature allows a researcher to perform a thorough analysis by discriminating photons of similar energies in the studied spectra, where they would otherwise superimpose within a single-energy peak and could potentially compromise the analysis and produce wrongly assessed results. Naturally, this feature is of great importance when radionuclides and their activity concentrations are being identified and high precision is a necessity. In measurements of this nature, in order to reproduce good and trustworthy results, one has to have initially performed an adequate full-energy peak (FEP) efficiency calibration of the equipment. However, experimental determination of the response, i.e., the efficiency curve for a given detector-sample configuration and geometry, is not always easy and requires a set of reference calibration sources in order to cover the broader energy ranges of interest. To overcome these difficulties, many researchers have turned to software toolkits that implement the Monte Carlo method (e.g., MCNP, FLUKA, PENELOPE, Geant4, etc.), which has proven time and again to be a very powerful tool. In the process of creating a reliable model, one has to have well-established and described specifications of the detector.
Unfortunately, the documentation that manufacturers provide alongside the equipment is rarely sufficient for this purpose. Furthermore, certain parameters tend to evolve and change over time, especially in older equipment. Deterioration of these parameters decreases the active volume of the crystal and can affect the efficiencies by a large margin if it is not properly taken into account. In this study, the optimisation of models of two HPGe detectors through the Geant4 toolkit developed by CERN is described, with the goal of improving simulation accuracy in calculations of FEP efficiencies by investigating the influence of certain detector variables (e.g., crystal-to-window distance, dead layer thicknesses, inner crystal void dimensions, etc.). The detectors on which the optimisation procedures were carried out were a standard traditional co-axial extended range detector (XtRa HPGe, CANBERRA) and a broad energy range planar detector (BEGe, CANBERRA). The optimised models were verified through comparison with experimentally obtained data from measurements of a set of point-like radioactive sources. The results for both detectors displayed good agreement with the experimental data, falling within an average statistical uncertainty of ∼ 4.6% for the XtRa and ∼ 1.8% for the BEGe detector within the energy ranges of 59.4−1836.1 keV and 59.4−1212.9 keV, respectively.
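The optimisation loop described, tuning a detector parameter such as dead-layer thickness until simulated FEP efficiencies match measured ones, can be sketched as follows. The toy efficiency function stands in for a full Geant4 run, and all energies, efficiencies, and coefficients are invented for illustration:

```python
# Toy stand-in for a Geant4 simulation: scan the dead-layer thickness
# to minimize the deviation between "simulated" and "measured"
# full-energy-peak (FEP) efficiencies. All numbers are illustrative.
import math

def fep_efficiency(energy_kev, dead_layer_mm):
    """Hypothetical FEP efficiency: falls with energy and dead-layer loss."""
    geometric = 0.05
    attenuation = math.exp(-0.8 * dead_layer_mm * (59.4 / energy_kev) ** 0.5)
    return geometric * (59.4 / energy_kev) ** 0.7 * attenuation

# Hypothetical point-source measurements: energy in keV -> efficiency
measured = {59.4: 0.0462, 661.7: 0.00903, 1332.5: 0.00557}

def mean_abs_dev(dl_mm):
    """Average relative deviation between model and measurement."""
    return sum(abs(fep_efficiency(e, dl_mm) - eff) / eff
               for e, eff in measured.items()) / len(measured)

# Grid search over 0.00 .. 2.00 mm in 0.01 mm steps
dev, best_dl = min((mean_abs_dev(i / 100.0), i / 100.0) for i in range(201))
print(f"best-fit dead layer ≈ {best_dl:.2f} mm, mean deviation {dev:.2%}")
```

A real study replaces the toy function with a Geant4 geometry whose dead layer is varied, but the outer fitting loop has this shape.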

Keywords: HPGe detector, γ spectrometry, efficiency, Geant4 simulation, Monte Carlo method

Procedia PDF Downloads 114
12668 Portfolio Optimization under a Hybrid Stochastic Volatility and Constant Elasticity of Variance Model

Authors: Jai Heui Kim, Sotheara Veng

Abstract:

This paper studies the portfolio optimization problem for a pension fund under a hybrid model of stochastic volatility and constant elasticity of variance (CEV) using an asymptotic analysis method. When the volatility component is fast mean-reverting, asymptotic approximations for the value function and the optimal strategy can be derived for general utility functions. Explicit solutions are given for the exponential and hyperbolic absolute risk aversion (HARA) utility functions. The study also shows that using the leading-order optimal strategy recovers the value function not only to leading order but also up to the first-order correction term. A practical strategy that does not depend on the unobservable volatility level is suggested. The result is an extension of Merton's solution when stochastic volatility and elasticity of variance are considered simultaneously.
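As a hedged sketch of the model class (the paper's exact specification is not reproduced here, and the notation is assumed), the risky asset can be written as a CEV diffusion whose volatility is driven by a fast mean-reverting Ornstein-Uhlenbeck factor:

```latex
% hybrid SV-CEV dynamics (assumed form)
dS_t = \mu S_t \, dt + f(Y_t) \, S_t^{\theta} \, dW_t^{1}, \qquad
dY_t = \frac{1}{\varepsilon}\,(m - Y_t)\, dt
     + \frac{\nu \sqrt{2}}{\sqrt{\varepsilon}}\, dW_t^{2},
\qquad 0 < \varepsilon \ll 1 .
```

The asymptotic method then expands the value function in powers of $\sqrt{\varepsilon}$, $V = V_0 + \sqrt{\varepsilon}\,V_1 + O(\varepsilon)$, which is how the strategy built from the leading-order term can attain the value function up to the first-order correction, as stated above.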

Keywords: asymptotic analysis, constant elasticity of variance, portfolio optimization, stochastic optimal control, stochastic volatility

Procedia PDF Downloads 295
12667 Modern Information Security Management and Digital Technologies: A Comprehensive Approach to Data Protection

Authors: Mahshid Arabi

Abstract:

With the rapid expansion of digital technologies and the internet, information security has become a critical priority for organizations and individuals. The widespread use of digital tools such as smartphones and internet networks facilitates the storage of vast amounts of data, but vulnerabilities and security threats have increased significantly at the same time. The aim of this study is to examine and analyze modern methods of information security management and to develop a comprehensive model to counteract threats and information misuse. The study employs a mixed-methods approach, including both qualitative and quantitative analyses. Initially, a systematic review of previous articles and research in the field of information security was conducted. Then, using the Delphi method, interviews with 30 information security experts were conducted to gather their insights on security challenges and solutions. Based on the results of these interviews, a comprehensive model for information security management was developed. The proposed model includes advanced encryption techniques, machine-learning-based intrusion detection systems, and network security protocols. AES and RSA encryption algorithms were used for data protection, and machine learning models such as random forests and neural networks were utilized for intrusion detection. Statistical analyses were performed using SPSS software. To evaluate the effectiveness of the proposed model, t-tests and ANOVA were employed, and results were measured using the accuracy, sensitivity, and specificity of the models. Additionally, multiple regression analysis was conducted to examine the impact of various variables on information security. The findings of this study indicate that the proposed comprehensive model reduced cyber-attacks by an average of 85%.
Statistical analysis showed that the combined use of encryption techniques and intrusion detection systems significantly improves information security. Based on these results, it is recommended that organizations continuously update their information security systems and use a combination of multiple security methods to protect their data. Additionally, educating employees and raising public awareness about information security can serve as an effective tool in reducing security risks. This research demonstrates that effective and up-to-date information security management requires a comprehensive and coordinated approach, including the development and implementation of advanced techniques and continuous training of human resources.
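The finding that combining encryption with intrusion detection improves security can be illustrated with a simple independence argument; the miss rates below are invented for illustration and are not figures from the study:

```python
# An attack succeeds only if every independent defensive layer misses it,
# so layering multiplies the miss probabilities. Rates are hypothetical.

def combined_breach_probability(layer_miss_rates):
    """Probability an attack slips past all independent layers."""
    p = 1.0
    for miss in layer_miss_rates:
        p *= miss
    return p

single = combined_breach_probability([0.30])          # IDS alone
layered = combined_breach_probability([0.30, 0.20])   # IDS + encryption layer
print(f"single-layer breach probability {single:.2f}, layered {layered:.2f}")
```

The multiplicative drop (0.30 to 0.06 here) is the intuition behind recommending multiple security methods rather than any single one, though real layers are rarely fully independent.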

Keywords: data protection, digital technologies, information security, modern management

Procedia PDF Downloads 24
12666 Localization of Frontal and Temporal Speech Areas in Brain Tumor Patients by Their Structural Connections with Probabilistic Tractography

Authors: B.Shukir, H.Woo, P.Barzo, D.Kis

Abstract:

Preoperative brain mapping in tumors involving the speech areas plays an important role in reducing surgical risks. Functional magnetic resonance imaging (fMRI) is the gold-standard method for localizing cortical speech areas preoperatively, but its availability in clinical routine is limited. Diffusion-MRI-based probabilistic tractography is available with head MRI and is used to segment cortical subregions by their structural connectivity. In our study, we used probabilistic tractography to localize the frontal and temporal cortical speech areas. 15 patients with left frontal tumors were enrolled in our study. Speech fMRI and diffusion MRI were acquired preoperatively. The standard automated anatomical labelling atlas 3 (AAL3) was used to define 76 left frontal and 118 left temporal potential speech areas. Four types of tractography were run according to the structural connections of these regions to the left arcuate fascicle (FA), in order to localize the cortical areas with speech function: (1) frontal through FA; (2) frontal with FA; (3) temporal to FA; (4) temporal with FA. Thresholds of 1%, 5%, 10%, and 15% were applied. At each level, the number of frontal and temporal regions identified by fMRI and by tractography was determined, and sensitivity and specificity were calculated. The 1% threshold showed the best results: sensitivity was 61.6 ± 31.4% and 67.15 ± 23.12%, and specificity was 87.2 ± 10.4% and 75.6 ± 11.37% for the frontal and temporal regions, respectively. From our study, we conclude that probabilistic tractography is a reliable preoperative technique for localizing cortical speech areas. However, its results are not yet dependable enough for the neurosurgeon to rely on during the operation.
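The sensitivity/specificity computation described, with fMRI-positive regions as ground truth and tractography-positive regions as the test result, can be sketched as follows; the region labels and set memberships are hypothetical placeholders, not the AAL3 outcome:

```python
# Sketch: per-patient sensitivity and specificity of tractography-based
# speech localization against fMRI as ground truth. Region labels and
# memberships below are invented for illustration.

fmri_positive = {"F3op", "F3tri", "PrG"}                    # fMRI speech areas
all_regions = fmri_positive | {"F1", "F2", "F3orb", "SMA"}  # atlas candidates
tract_positive = {"F3op", "F3tri", "F1"}                    # above threshold

tp = len(tract_positive & fmri_positive)        # found by both
fp = len(tract_positive - fmri_positive)        # tractography only
fn = len(fmri_positive - tract_positive)        # missed speech areas
tn = len(all_regions - tract_positive - fmri_positive)

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
print(f"sensitivity={sensitivity:.2f} specificity={specificity:.2f}")
```

Averaging these two ratios over patients, at each threshold, yields mean ± SD values of the kind reported above.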

Keywords: brain mapping, brain tumor, fMRI, probabilistic tractography

Procedia PDF Downloads 154
12665 Tunneling Current Switching in the Coupled Quantum Dots by Means of External Field

Authors: Vladimir Mantsevich, Natalya Maslova, Petr Arseyev

Abstract:

We investigated the peculiarities of the tunneling current in a system of two quantum dots (QDs) coupled by an external field and weakly connected to the electrodes, in the presence of Coulomb correlations between localized electrons, by means of Heisenberg equations for pseudo-operators with constraint. The special role of multi-electron states was demonstrated. Various locations of the single-electron levels relative to the sample Fermi level and to the applied bias in a symmetric tunneling contact were investigated. Tuning the Rabi frequency changes the single-electron energy level spacing. We revealed the appearance of negative tunneling conductivity and demonstrated multiple switching "on" and "off" of the tunneling current depending on the value of the Coulomb correlations, the Rabi frequency amplitude, and the energy level spacing. We proved that Coulomb correlations strongly influence the behavior of the system. We demonstrated the presence of multi-stability in coupled QDs with Coulomb correlations, where a single value of the tunneling current amplitude corresponds to two values of the Rabi frequency when both single-electron energy levels are located slightly above eV and close to each other. This effect disappears as the single-electron energy level spacing increases.

Keywords: Coulomb correlations, negative tunneling conductivity, quantum dots, Rabi frequency

Procedia PDF Downloads 448
12664 The Power of Inferences and Assumptions: Using a Humanities Education Approach to Help Students Learn to Think Critically

Authors: Randall E. Osborne

Abstract:

A four-step 'humanities' thought model has been used in an interdisciplinary course for almost two decades and has been proven to aid students in becoming more inclusive in their world view. Lack of tolerance for ambiguity can interfere with this progression, so we developed an assignment that appears to have assisted students in developing more tolerance for ambiguity and, therefore, opened them up to make more progress on the critical thought model. The four-step critical thought model (built from a humanities education approach) is used in an interdisciplinary course on prejudice, discrimination, and hate in an effort to minimize egocentrism and promote sociocentrism in college students. A fundamental barrier to this progression is a lack of tolerance for ambiguity. The approach to the course is built on the assumption that tolerance for ambiguity (characterized by a dislike of uncertain or ambiguous situations, or of situations in which expected behaviors are uncertain) will serve as a barrier (if tolerance is low) or facilitator (if tolerance is high) of active 'engagement' with assignments. Given that active engagement with course assignments is necessary to promote an increase in critical thought and multicultural attitude change, intolerance for ambiguity inhibits critical thinking and, ultimately, multicultural attitude change. As expected, those students showing the least decrease (or even an increase) in intolerance across the semester earned lower grades in the course than those students who showed a significant decrease in intolerance, t(1,19) = 4.659, p < .001. Students who demonstrated the most change in their tolerance for ambiguity (an increasing ability to tolerate ambiguity) earned the highest grades in the course. This is especially significant because faculty did not know student scores on this measure until after all assignments had been graded and course grades assigned.
An assignment designed to make students' assumption and inference processes visible, so they could be explored, was implemented with the goal of promoting greater tolerance for ambiguity, which, as already outlined, promotes critical thought. The assignment offers students two options and then requires them to explore what they have learned about inferences and/or assumptions. This presentation outlines the assignment, demonstrates the humanities model and what students learn from particular assignments, and shows how the assignment fosters a change in tolerance for ambiguity, which serves as the foundational component of critical thinking.

Keywords: critical thinking, humanities education, sociocentrism, tolerance for ambiguity

Procedia PDF Downloads 270
12663 Digital Elevation Model Analysis of Potential Prone Flood Disaster Watershed Citarum Headwaters Bandung

Authors: Faizin Mulia Rizkika, Iqbal Jabbari Mufti, Muhammad R. Y. Nugraha, Fadil Maulidir Sube

Abstract:

Flooding is ponding of water on the flat area around a river when overflow exceeds what the river can accommodate, and it may damage the infrastructure of a region. This study analyzed Digital Elevation Model (DEM) data for information relevant to mapping flood-prone zones and mapped the distribution of such zones in the upstream Citarum watershed using secondary data and software (ArcGIS, MapInfo). From this assessment a flood distribution map was produced: 13 districts in Bandung contain flood-prone areas, and the most vulnerable are the Baleendah-Dayeuhkolot-Bojongsoang-Banjaran areas. This area has a low slope, borders the rivers directly, and has excessive land use, so the water catchment area is reduced.
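The low-slope criterion noted above, near-flat cells beside the river flooding first, can be sketched directly from DEM data; the tiny synthetic elevation grid, the 30 m cell size, and the 2° cutoff are illustrative assumptions:

```python
# Sketch: flag low-slope DEM cells as flood-prone. The synthetic DEM,
# grid spacing, and slope cutoff are hypothetical illustrations.
import numpy as np

dem = np.array([[100, 100, 101, 103],
                [100, 100, 101, 104],
                [100, 101, 102, 106],
                [101, 102, 104, 109]], dtype=float)   # elevations in metres
cell = 30.0                                            # grid spacing, metres

dz_dy, dz_dx = np.gradient(dem, cell)                  # elevation gradients
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
prone = slope_deg < 2.0                                # near-flat cells
print(f"{prone.sum()} of {prone.size} cells flagged as flood-prone")
```

A GIS workflow applies the same slope computation to the full watershed DEM and intersects the result with river buffers and land-use layers.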

Keywords: mitigation, flood, Citarum, DEM

Procedia PDF Downloads 382
12662 An Efficient Activated Carbon for Copper (II) Adsorption Synthesized from Indian Gooseberry Seed Shells

Authors: Somen Mondal, Subrata Kumar Majumder

Abstract:

Removal of metal pollutants by efficient activated carbon is a challenging research topic in the present-day scenario. In the present study, the characteristic features of an efficient activated carbon (AC) synthesized from Indian gooseberry seed shells for copper (II) adsorption are reported. A three-step chemical activation method consisting of impregnation, carbonization, and subsequent activation was used to produce the activated carbon. The copper adsorption kinetics and isotherms onto the activated carbon were analyzed. In the present investigation, the Indian gooseberry seed shell AC showed a BET surface area of 1359 m²/g. The maximum adsorption capacity of the activated carbon, at a pH value of 9.52, was found to be 44.84 mg/g at 30°C. The adsorption process followed the pseudo-second-order kinetic model along with the Langmuir adsorption isotherm. This AC could serve as a favorable and cost-effective copper (II) adsorbent in wastewater treatment for removing metal contaminants.
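Fitting the Langmuir isotherm named here typically uses its standard linearized form, Ce/qe = Ce/qmax + 1/(qmax·KL). A minimal sketch follows, with equilibrium data generated from a Langmuir curve at the reported qmax of 44.84 mg/g (the concentration points and KL value are hypothetical):

```python
# Sketch: recovering qmax from the linearized Langmuir plot Ce/qe vs Ce,
# whose slope is 1/qmax. Data are synthetic, generated from the model.

def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

qmax_true, KL = 44.84, 0.15            # mg/g and L/mg, KL assumed
Ce = [5.0, 10.0, 20.0, 40.0, 80.0]     # equilibrium concentrations, mg/L
qe = [qmax_true * KL * c / (1.0 + KL * c) for c in Ce]

slope, intercept = linfit(Ce, [c / q for c, q in zip(Ce, qe)])
qmax_fit = 1.0 / slope
print(f"fitted qmax ≈ {qmax_fit:.2f} mg/g")
```

The pseudo-second-order kinetic model is handled the same way via its linearized form t/qt = 1/(k2·qe²) + t/qe.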

Keywords: activated carbon, adsorption isotherm, kinetic model, characterization

Procedia PDF Downloads 157
12661 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the dimensionality of the feature space of the pattern samples. In this method, each grayscale image is first considered in its entirety as the measurement matrix. Then, the principal components (PCs) of the row vectors of this matrix and the variance of these row vectors along the PCs are estimated. This ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of bases in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match a test image with the training set, a cosine-similarity-based Bayesian classification was used. The proposed method was tested on the Cohn-Kanade and JAFFE databases. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
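The first stage described, taking principal components of an image's row vectors and the variance of the rows along each PC, can be sketched as follows; a random matrix stands in for a face image, and the top-10 truncation is an assumption:

```python
# Sketch: PCA on the row vectors of a grayscale image matrix, keeping
# the variance along each principal component as a spatial feature.
import numpy as np

rng = np.random.default_rng(0)
image = rng.random((64, 64))                 # stand-in for a face image

rows = image - image.mean(axis=0)            # center the row observations
cov = rows.T @ rows / (rows.shape[0] - 1)    # covariance of row vectors
eigvals, eigvecs = np.linalg.eigh(cov)       # PCs = eigenvectors of cov
order = np.argsort(eigvals)[::-1]            # sort by descending variance
pcs, variances = eigvecs[:, order], eigvals[order]

feature = variances[:10]                     # variance along the top-10 PCs
print(feature.round(4))
```

The full method goes on to build eigen-filters from the PCs and then applies FDA in the reduced space; this sketch covers only the row-vector PCA step.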

Keywords: facial expression recognition, principal component analysis (PCA), Fisher discriminant analysis (FDA), eigen-filter, cosine similarity, Bayesian classifier, F-measure

Procedia PDF Downloads 423
12660 Co-Culture of Neonate Mouse Spermatogonial Stem Cells with Sertoli Cells: Inductive Role of Melatonin following Transplantation: Adult Azoospermia Mouse Model

Authors: Mehdi Abbasi, Shadan Navid, Mohammad Pourahmadi, M. Majidi Zolbin

Abstract:

We have recently reported that melatonin, as an antioxidant, enhances the efficiency of colonization of spermatogonial stem cells (SSCs) and plays a vital role in their development in vitro. This study aimed to evaluate the simultaneous effect of Sertoli cells and melatonin on SSC proliferation following transplantation into the testes of adult busulfan-treated azoospermia model mice. SSCs and Sertoli cells were isolated from the testes of three- to six-day-old male mice. To determine purity, flow cytometry with a PLZF antibody was performed. Isolated testicular cells were cultured in αMEM medium in the absence (control group) or presence (experimental group) of Sertoli cells and melatonin extract for 2 weeks. We then transplanted the SSCs by injection into the testes of the azoospermia model mice. Higher viability, proliferation, and Id4 and Plzf expression were observed in the presence of both Sertoli cells and melatonin in vitro. Moreover, immunocytochemistry showed higher Oct4 expression in this group. Eight weeks after transplantation, injected cells were localized at the base of the seminiferous tubules in the recipient testes. The number of spermatogonia and the weight of the testes were higher in the experimental group than in the control group. The results of our study suggest that this new protocol can improve colonization and that transplantation of these cells can be useful in the treatment of male infertility.

Keywords: colonization, melatonin, spermatogonial stem cell, transplantation

Procedia PDF Downloads 167
12659 Development of Star Image Simulator for Star Tracker Algorithm Validation

Authors: Zoubida Mahi

Abstract:

A successful satellite mission in space requires a reliable attitude and orbit control system to command, control, and position the satellite in appropriate orbits. Several sensors are used for attitude control, such as magnetic sensors, earth sensors, horizon sensors, gyroscopes, and solar sensors. The star tracker is the most accurate of these sensors and can offer high-accuracy attitude determination without the need for prior attitude information. There are three main approaches in star sensor research: digital simulation, hardware-in-the-loop simulation, and field tests of star observation. In the digital simulation approach, all processes are done in software, including star image simulation. Hence, it is necessary to develop star image simulation software that can simulate real space environments and various star sensor configurations. In this paper, we present a new stellar image simulation tool used to test and validate star sensor algorithms. The developed tool can simulate stellar images with several types of noise, such as background noise, Gaussian noise, Poisson noise, and multiplicative noise, and several scenarios that occur in space, such as the presence of the moon, optical system defects, stray illumination, and false objects. We also present a new star extraction algorithm based on a new centroid calculation method. We compared our algorithm with other star extraction algorithms from the literature, and the results obtained show the star extraction capability of the proposed algorithm.
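The core operation behind star extraction, estimating a star spot's centroid from a noisy simulated image, can be sketched with a plain intensity-weighted centroid. The paper's own centroid method is not public, so this is a generic baseline; the spot parameters, noise level, and threshold are assumptions:

```python
# Sketch: intensity-weighted centroid of a simulated Gaussian star spot
# with additive Gaussian background noise. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
size, true_x, true_y, sigma = 32, 17.3, 12.6, 1.5
yy, xx = np.mgrid[0:size, 0:size]
star = 255.0 * np.exp(-((xx - true_x) ** 2 + (yy - true_y) ** 2)
                      / (2.0 * sigma ** 2))
image = star + rng.normal(0.0, 2.0, star.shape)   # add Gaussian noise

mask = image > 20.0                               # simple threshold
w = np.where(mask, image, 0.0)                    # keep star pixels only
cx = (w * xx).sum() / w.sum()
cy = (w * yy).sum() / w.sum()
print(f"estimated centroid ({cx:.2f}, {cy:.2f}) vs true ({true_x}, {true_y})")
```

A simulator of the kind described varies the noise type and adds false objects, then checks how far the extracted centroids drift from the injected star positions.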

Keywords: star tracker, star simulation, star detection, centroid, noise, scenario

Procedia PDF Downloads 91
12658 Developing a Model to Objectively Assess the Culture of Individuals and Teams in Order to Effectively and Efficiently Achieve Sustainability in the Manpower

Authors: Ahmed Mohamed Elnady Mohamed Elsafty

Abstract:

This paper explains a developed, applied, objective model to measure culture qualitatively and quantitatively, whether in individuals or in teams, in order to be able to use culture correctly or modify it efficiently. The model provides precise measurements and consistent interpretations by being comprehensive, updateable, and protected from being misled by imitations. Methodically, the model divides culture into seven dimensions (43 cultural factors in total). The first dimension is outcome-orientation, which consists of five factors and should be highest in leaders. The second dimension is details-orientation, which consists of eight factors and should be highest in intelligence members. The third dimension is team-orientation, which consists of five factors and should be highest in instructors or coaches. The fourth dimension is change-orientation, which consists of five factors and should be highest in soldiers. The fifth dimension is people-orientation, which consists of eight factors and should be highest in media members. The sixth dimension is masculinity, which consists of seven factors and should be highest in hard workers. The last dimension is stability, which consists of seven factors and should be highest in soft workers. In this paper, the details of all cultural factors are explained. Practically, collecting information about each cultural factor in the targeted person or team is essential in order to calculate the degrees of all cultural factors using the suggested equation: multiplying 'the score of factor presence' by 'the score of factor strength'. In this paper, the details of how to build each score are explained. Based on the highest degrees, which identify the prominent cultural dimension, placing the tested individual or team in the right position at the right time provides a chance to align everyone to the organization's objectives with minimal effort.
In other words, making everyone self-motivated by setting him or her at the right source of motivation is the most effective and efficient method of achieving high levels of competency, commitment, and sustainability. A team culture can be modified by excluding or including members with relatively high or low degrees in specific cultural factors. In conclusion, culture can be considered the software of human beings, and it is one of the major constraining factors on managerial discretion. It represents the behaviors, attitudes, and motivations of the human resources, which are vital for enhancing quality and safety, expanding market share, and defending against attacks from external environments. Thus, it is tremendously useful to use such a comprehensive model to measure, use, and modify culture.
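The scoring rule quoted above, the degree of each cultural factor being 'the score of factor presence' times 'the score of factor strength', can be sketched as follows; the factor names, dimension mapping, and scores are invented for illustration:

```python
# Degree of each factor = presence score x strength score, per the model.
# Factor names, dimension labels, and scores are hypothetical examples.
factors = {
    "goal focus":    {"presence": 4, "strength": 5},  # outcome-orientation
    "precision":     {"presence": 3, "strength": 2},  # details-orientation
    "collaboration": {"presence": 5, "strength": 5},  # team-orientation
}

degrees = {name: s["presence"] * s["strength"] for name, s in factors.items()}
prominent = max(degrees, key=degrees.get)
print(degrees, "-> prominent factor:", prominent)
```

The highest degrees, aggregated by dimension, would then indicate the prominent cultural dimension used for placement decisions.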

Keywords: culture dimensions, culture factors, culture measurement, cultural analysis, cultural modification, self-motivation, alignment to objectives, competency, sustainability

Procedia PDF Downloads 162