Search results for: RLS identification algorithm
1620 Structural Protein-Protein Interactions Network of Breast Cancer Lung and Brain Metastasis Corroborates Conformational Changes of Proteins Lead to Different Signaling
Authors: Farideh Halakou, Emel Sen, Attila Gursoy, Ozlem Keskin
Abstract:
Protein–Protein Interactions (PPIs) mediate major biological processes in living cells. Studying PPIs as networks and analyzing the network properties contributes to the identification of genes and proteins associated with diseases. In this study, we have created the sub-networks of brain and lung metastasis from the primary tumor in breast cancer. To do so, we used seed genes known to cause metastasis and produced their interactions through a network-topology-based prioritization method named GUILDify. In order to have experimental support for the sub-networks, we further curated them using the STRING database. We proceeded by modeling structures for the interactions lacking complex forms in the Protein Data Bank (PDB). The functional enrichment analysis shows that KEGG pathways associated with the immune system and infectious diseases, particularly the chemokine signaling pathway, are important for lung metastasis. On the other hand, pathways related to genetic information processing are more involved in brain metastasis. The structural analyses of the sub-networks clearly demonstrated their differences in terms of the specific interfaces used in lung and brain metastasis. Furthermore, the topological analysis identified genes such as RPL5, MMP2, CCR5 and DPP4, which are already known to be associated with lung or brain metastasis. Additionally, we found 6 and 9 putative genes that are specific for lung and brain metastasis, respectively. Our analysis suggests that variations in the genes and pathways contributing to these different breast metastasis types may arise due to changes in the tissue microenvironment. To show the benefits of using structural PPI networks instead of the traditional node-and-edge representation, we inspect two case studies showing the mutual exclusiveness of interactions and the effects of mutations on protein conformation, which lead to different signaling.
Keywords: breast cancer, metastasis, PPI networks, protein conformational changes
Procedia PDF Downloads 244
1619 Ensuring Quality in DevOps Culture
Authors: Sagar Jitendra Mahendrakar
Abstract:
Integrating quality assurance (QA) practices into DevOps culture has become increasingly important in modern software development environments. Collaboration, automation and continuous feedback characterize the seamless integration of DevOps development and operations teams to achieve rapid and reliable software delivery. In this context, quality assurance plays a key role in ensuring that software products meet the highest quality, performance and reliability standards throughout the development life cycle. This brief explores key principles, challenges, and best practices related to quality assurance in a DevOps culture. It emphasizes the importance of embedding quality in the development process, with quality control integrated into every step of the DevOps pipeline. Automation is the cornerstone of DevOps quality assurance, enabling continuous testing, integration and deployment and providing rapid feedback for early problem identification and resolution. In addition, the summary addresses the cultural and organizational challenges of implementing quality assurance in DevOps, emphasizing the need to foster collaboration, break down silos, and promote a culture of continuous improvement. It also discusses the importance of toolchain integration and capability development to support effective QA practices in DevOps environments. Overall, this collection works at the intersection of QA and DevOps culture, providing insights into how organizations can use DevOps principles to improve software quality, accelerate delivery, and meet the changing demands of today's dynamic software landscape.
Keywords: quality engineer, devops, automation, tool
Procedia PDF Downloads 58
1618 Numerical Simulation of Air Pollutant Using Coupled AERMOD-WRF Modeling System over Visakhapatnam: A Case Study
Authors: Amit Kumar
Abstract:
Accurate identification of deteriorated air quality regions is very helpful in devising better environmental practices and mitigation efforts. In the present study, an attempt has been made to identify the dispersion patterns of air pollutants, especially NOX, due to vehicular and industrial sources over a rapidly developing urban city, Visakhapatnam (17°42' N, 83°20' E), India, during April 2009. Using the emission factors of different vehicles as well as industry, a high-resolution 1 km x 1 km gridded emission inventory has been developed for Visakhapatnam city. The dispersion model AERMOD, with explicit representation of planetary boundary layer (PBL) dynamics and offline coupled through a developed coupler mechanism with the high-resolution mesoscale model WRF-ARW, is used in this work for simulating the dispersion patterns of NOX. The meteorological and PBL parameters obtained by employing two PBL schemes of the WRF-ARW model, viz. the non-local Yonsei University (YSU) and the local Mellor-Yamada-Janjic (MYJ) schemes, which reasonably represent the boundary layer parameters, are considered for integrating AERMOD. Significantly different dispersion patterns of NOX have been noticed between summer and winter months. The simulated NOX concentration is validated against six available monitoring stations of the Central Pollution Control Board, India. Statistical analysis of the model-evaluated concentrations against the observations reveals that WRF-ARW with the YSU scheme coupled with AERMOD has shown better performance. The deteriorated air quality locations over Visakhapatnam are identified based on the validated model simulations of NOX concentrations. The present study advocates the utility of the developed gridded emission inventory of NOX with the coupled WRF-AERMOD modeling system for air quality assessment over the study region.
Keywords: WRF-ARW, AERMOD, planetary boundary layer, air quality
Procedia PDF Downloads 280
1617 Experimental and Modal Determination of the State-Space Model Parameters of a Uni-Axial Shaker System for Virtual Vibration Testing
Authors: Jonathan Martino, Kristof Harri
Abstract:
In some cases, the increase in computing resources makes simulation methods more affordable. The increase in processing speed also allows real-time analysis or even faster test analysis, offering a real tool for test prediction and design process optimization. Vibration tests are no exception to this trend. The so-called 'Virtual Vibration Testing' offers solutions, among others, to study the influence of specific loads, to better anticipate the boundary conditions between the exciter and the structure under test, to study the influence of small changes in the structure under test, etc. This article will first present a virtual vibration test modeling with a main focus on the shaker model and will afterwards present the experimental parameter determination. The classical way of modeling a shaker is to consider the shaker as a simple mechanical structure augmented by an electrical circuit that makes the shaker move. The shaker is modeled as a two or three degrees of freedom lumped parameters model, while the electrical circuit takes the coil impedance and the dynamic back-electromotive force into account. The establishment of the equations of this model, describing the dynamics of the shaker, is presented in this article and is strongly related to the internal physical quantities of the shaker. Those quantities will be reduced into global parameters which will be estimated through experiments. Different experiments will be carried out in order to design an easy and practical method for the identification of the shaker parameters, leading to a fully functional shaker model. An experimental modal analysis will also be carried out to extract the modal parameters of the shaker and to combine them with the electrical measurements. Finally, this article will conclude with an experimental validation of the model.
Keywords: lumped parameters model, shaker modeling, shaker parameters, state-space, virtual vibration
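As a rough illustration of this kind of lumped-parameter electromechanical model, the sketch below builds a state-space representation in Python with scipy: a two-degree-of-freedom mechanical part (coil/table and body) coupled to a coil circuit through a force factor Bl. All numerical values, the state ordering and the chosen output are illustrative assumptions, not the identified parameters of this work.

```python
import numpy as np
from scipy import signal

# Hypothetical lumped parameters (illustrative values only)
m_c, m_b = 0.5, 5.0                # coil/table mass, body mass [kg]
k_c, k_b = 2.0e5, 1.0e4            # suspension stiffnesses [N/m]
c_c, c_b = 50.0, 200.0             # damping coefficients [N*s/m]
R, L_coil, Bl = 2.0, 1.0e-3, 15.0  # coil resistance [ohm], inductance [H], force factor [N/A]

# States: [x_c, v_c, x_b, v_b, i]; input: amplifier voltage; output: table acceleration
A = np.array([
    [0.0, 1.0, 0.0, 0.0, 0.0],
    [-k_c / m_c, -c_c / m_c, k_c / m_c, c_c / m_c, Bl / m_c],
    [0.0, 0.0, 0.0, 1.0, 0.0],
    [k_c / m_b, c_c / m_b, -(k_c + k_b) / m_b, -(c_c + c_b) / m_b, 0.0],
    [0.0, -Bl / L_coil, 0.0, 0.0, -R / L_coil],
])
B = np.array([[0.0], [0.0], [0.0], [0.0], [1.0 / L_coil]])
C = A[1:2, :]                      # table acceleration is the second state equation
D = np.zeros((1, 1))               # no direct feed-through from voltage to acceleration

shaker = signal.StateSpace(A, B, C, D)
w, mag, phase = signal.bode(shaker, w=np.logspace(1, 5, 400))
print("acceleration-per-volt response at the first frequencies [dB]:", mag[:3])
```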
Procedia PDF Downloads 269
1616 Business Feasibility of Online Marketing of Food and Beverages Products in India
Authors: Dimpy Shah
Abstract:
The global economy has substantially changed in the last three decades. Now almost all markets are transparent and visible to global customers. Corporations are no longer reliant on local markets for trade. The information technology revolution has changed business dynamics and the marketing practices of corporations. The markets are divided into two different formats: traditional and virtual. In a very short span of time, many e-commerce portals have captured the global market. This strategy is well supported by the global delivery systems of multinational logistics companies. Now the markets are dealing with global supply chain networks, which are more demand-driven and customer-oriented. Corporations have realized the importance of supply chain integration and marketing in this competitive environment. The Indian markets are also significantly affected by all these changes. In terms of population, India is in second place after China. In terms of demography, almost half of the population is young. It has been observed that Indian youth are more inclined towards e-commerce and prefer to buy goods from web portals. Initially, this trend was observed in the Indian service sector, textiles and electronic goods, and it has now extended to other product categories. The FMCG companies have also recognized this change and started integrating their supply chains with e-commerce platforms. This paper attempts to understand the contemporary marketing practices of corporations in the e-commerce business in the Indian food and beverages segment and also tries to identify innovative marketing practices for proper execution of their strategies. The findings are mainly focused on supply chain re-integration and brand building strategies with proper utilization of social media.
Keywords: FMCG (Fast Moving Consumer Goods), ISCM (Integrated supply chain management), RFID (Radio Frequency Identification), traditional and virtual formats
Procedia PDF Downloads 275
1615 Using Q-Learning to Auto-Tune PID Controller Gains for Online Quadcopter Altitude Stabilization
Authors: Y. Alrubyli
Abstract:
Unmanned Aerial Vehicles (UAVs), and more specifically quadcopters, need to be stable during their flights. Altitude stability is usually achieved by using a PID controller that is built into the flight controller software. Furthermore, the PID controller has gains that need to be tuned to reach optimal altitude stabilization during the quadcopter's flight. For that, control system engineers need to tune those gains by using extensive modeling of the environment, which might change from one environment and condition to another. As quadcopters penetrate more sectors, from the military to the consumer sector, they have been put into complex and challenging environments more than ever before. Hence, intelligent self-stabilizing quadcopters are needed to maneuver through those complex environments and situations. Here we show that by using online reinforcement learning with minimal background knowledge, altitude stability of the quadcopter can be achieved using a model-free approach. We found that by using background knowledge instead of letting the online reinforcement learning algorithm wander for a while to tune the PID gains, altitude stabilization can be achieved faster. In addition, using this approach will accelerate development by avoiding extensive simulations before applying the PID gains to the real-world quadcopter. Our results demonstrate the possibility of using the trial-and-error approach of reinforcement learning combined with background knowledge to achieve faster quadcopter altitude stabilization in different environments and conditions.
Keywords: reinforcement learning, Q-learning, online learning, PID tuning, unmanned aerial vehicle, quadcopter
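To make the tuning loop concrete, here is a minimal, self-contained sketch of tabular Q-learning nudging PID gains on a toy one-dimensional altitude model. The plant dynamics, state discretisation, reward and hyper-parameters are all illustrative assumptions rather than the setup used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Discrete actions: small increments applied to the three PID gains
actions = [(dkp, dki, dkd) for dkp in (-0.1, 0.0, 0.1)
                           for dki in (-0.01, 0.0, 0.01)
                           for dkd in (-0.05, 0.0, 0.05)]
Q = np.zeros((10, len(actions)))               # state = discretised mean tracking error

def episode_error(kp, ki, kd, target=1.0, dt=0.02, steps=250):
    """Simulate a crude 1-D altitude plant under PID control; return mean |error|."""
    z, vz, integ, prev_err, total = 0.0, 0.0, 0.0, 0.0, 0.0
    for _ in range(steps):
        err = target - z
        integ += err * dt
        deriv = (err - prev_err) / dt
        thrust = max(min(kp * err + ki * integ + kd * deriv, 50.0), -50.0)
        vz += (thrust - 0.5 * vz) * dt         # toy dynamics with linear drag
        z += vz * dt
        prev_err = err
        total += abs(err)
    return total / steps

kp, ki, kd = 1.0, 0.0, 0.1                     # "background knowledge": rough initial gains
alpha, gamma, eps, state = 0.3, 0.9, 0.2, 9
for _ in range(300):
    a = rng.integers(len(actions)) if rng.random() < eps else int(Q[state].argmax())
    dkp, dki, dkd = actions[a]
    kp, ki, kd = max(kp + dkp, 0.0), max(ki + dki, 0.0), max(kd + dkd, 0.0)
    err = episode_error(kp, ki, kd)
    new_state = min(int(err * 10), 9)          # bin the mean error into 10 states
    Q[state, a] += alpha * (-err + gamma * Q[new_state].max() - Q[state, a])
    state = new_state
print(f"tuned gains: kp={kp:.2f}, ki={ki:.3f}, kd={kd:.2f}")
```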
Procedia PDF Downloads 174
1614 The Approach of New Urbanism Model to Identify the Sustainability of 'Kampung Kota'
Authors: Nadhia Maharany Siara, Muammal, Ilham Nurhakim, Rofifah Yusadi, M. Adie Putra Tanggara, I. Nyoman Suluh Wijaya
Abstract:
Urbanization in urban areas has an impact on the demand for housing land, and development has begun to occur in the high-density settlements called Kampung Kota. Kampung Kota grows and develops organically, without planning. The existence of Kampung Kota gives identity to city development and city planning in Indonesia, but Kampung Kota is often considered a source of environmental, health, and social problems. This causes a negative perception of the sustainability of Kampung Kota. This research aims to identify the morphology and sustainability level of the Kampung Kota in Polehan Sub-District, Blimbing District, Malang City. So far, there have not been many studies that define the sustainability of Kampung Kota, especially from the perspective of Kampung Kota morphology as a part of urban housing areas. This research took place in Polehan Sub-District, Blimbing District, Malang City, which contains one of the oldest Kampung Kota in Malang City. Identification of the sustainability level in this research is done by defining the morphology of the Kampung Kota in Polehan Sub-District, Blimbing District, Malang City with a descriptive approach to the observation case (the Kampung Kota of Polehan Sub-District). After that, the sustainability level is defined by quantifying the spatial structure using the criteria from the new urbanism model, which consist of building and population density, compactness, diversity and mixed land uses, and sustainable transportation. In this case, the use of the new urbanism model approach is very appropriate. New Urbanism is a design-driven strategy that is based on traditional forms to minimize urban sprawl in the suburbs. The result obtained from this study is a sustainability level of 3.2 for the Kampung Kota in Polehan Sub-District, Blimbing District, Malang City, which can be considered good sustainability.
Keywords: Kampung Kota, new urbanism model, sustainability, urban morphology
Procedia PDF Downloads 290
1613 Vocational Rehabilitation for People with Disabilities: Employment Rates, Job Persistence and Wages
Authors: Hester Fass, Ofir Pinto
Abstract:
Research indicates gaps in education, employment rates and wages between people with disabilities and those without disabilities. One of the main tools available to reduce these gaps is vocational rehabilitation. In order to examine the effects of vocational rehabilitation, a follow-up study, based on comprehensive administrative data, was conducted. The study included 88,286 people with disabilities who participated in vocational rehabilitation of the National Insurance Institute of Israel (NII) and completed the process between 1999 and 2012. Research variables included: employment rates, job persistence and wage levels. This research, the first of its kind in Israel, has several unique aspects: a) a long-range follow-up study on people who completed vocational rehabilitation; b) examination of a broad population spectrum, including people who are not eligible for disability pensions; c) a comparison among those with work-related injuries, those injured in hostile acts and those injured in other circumstances; and finally d) the identification of the characteristics of those who are entitled to vocational rehabilitation but do not participate in any vocational rehabilitation plan. The most notable results include: 1. Vocational rehabilitation contributed to employment, job persistence and wage levels. Participation in vocational rehabilitation resulted in an employment rate of 65% within two years after completing the program, and 73% eventually. Participation in a vocational rehabilitation plan also contributed to job persistence and wage levels. 2. Vocational rehabilitation plans aimed at integration in universal frameworks increased the chances of being employed, persisting at the job and receiving a higher wage compared with vocational rehabilitation aimed at selective frameworks (such as sheltered workshops). 3. The type of disability affected the chances of integration in a vocational rehabilitation plan and in the labor market. People with a disability from birth had greater chances of integration in a vocational rehabilitation plan, while the type of disability and its severity affected a person's chances of finding employment.
Keywords: vocational rehabilitation, employment, job persistence, wages
Procedia PDF Downloads 453
1612 EEG Analysis of Brain Dynamics in Children with Language Disorders
Authors: Hamed Alizadeh Dashagholi, Hossein Yousefi-Banaem, Mina Naeimi
Abstract:
The current study was established for EEG signal analysis in patients with language disorders. A language disorder can be defined as a meaningful delay in the use or understanding of spoken or written language. The disorder can include the content or meaning of language, its form, or its use. Here we applied Z-score, power spectrum, and coherence methods to discriminate the language disorder data from healthy controls. The power spectrum of each channel in the alpha, beta, gamma, delta, and theta frequency bands was measured. In addition, the intra-hemispheric Z-score was obtained by a scoring algorithm. The obtained results showed high Z-scores and power spectra in posterior regions. Therefore, we can conclude that people with language disorders have high brain activity in the frontal region of the brain in comparison with healthy people. Results showed that high coherence correlates with irregularities in the ERP and is often found during complex tasks, whereas low coherence is often found in pathological conditions. The results of the Z-score analysis of the brain dynamics showed a higher Z-score peak frequency in the delta, theta and beta sub-bands of language disorder patients. In this analysis there were activity signs in both hemispheres, and the left-dominant hemisphere was more active than the right.
Keywords: EEG, electroencephalography, coherence methods, language disorder, power spectrum, z-score
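For readers unfamiliar with these measures, the sketch below computes band-limited power spectra, inter-channel coherence and a Z-score on surrogate data with scipy. The channel count, band edges and the normative mean and standard deviation used for the Z-score are assumptions for illustration only.

```python
import numpy as np
from scipy import signal

fs = 256                                       # assumed sampling rate [Hz]
rng = np.random.default_rng(1)
eeg = rng.standard_normal((2, 60 * fs))        # two surrogate channels, 60 s each

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
         "beta": (13, 30), "gamma": (30, 45)}

# Per-channel band power from Welch power spectra
f, psd = signal.welch(eeg, fs=fs, nperseg=4 * fs)
band_power = {name: psd[:, (f >= lo) & (f < hi)].mean(axis=1)
              for name, (lo, hi) in bands.items()}

# Inter-channel magnitude-squared coherence averaged per band
f_c, coh = signal.coherence(eeg[0], eeg[1], fs=fs, nperseg=4 * fs)
band_coh = {name: coh[(f_c >= lo) & (f_c < hi)].mean()
            for name, (lo, hi) in bands.items()}

# Z-score of each channel's band power against a hypothetical normative mean/std
norm_mean, norm_std = 1.0, 0.3
z_scores = {name: (bp - norm_mean) / norm_std for name, bp in band_power.items()}
print("theta coherence:", round(band_coh["theta"], 3), "beta z-scores:", z_scores["beta"])
```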
Procedia PDF Downloads 424
1611 Real-Time Web Map Service Based on Solar-Powered Unmanned Aerial Vehicle
Authors: Sunghun Jung
Abstract:
The existing web map service providers contract with satellite operators to update their maps by paying an astronomical amount of money, but the cost could be minimized by operating a cheap and small UAV. In contrast to satellites, UAVs only require aged battery packs to be replaced from time to time. Utilizing both a regular camera and an infrared camera mounted on a small, solar-powered, long-endurance, and hoverable UAV, daytime ground surface photographs and nighttime infrared photographs will be continuously and repeatedly uploaded to the web map server and overlapped with the existing ground surface photographs in real-time. The real-time web map service using a small, solar-powered, long-endurance, and hoverable UAV can also be applied to surveillance missions, in particular to detect border area intruders. The improved real-time image stitching algorithm is developed for the graphic map data overlapping. Also, a small home server will be developed to manage the huge volume of incoming map data. The map photographs taken from tens or hundreds of kilometers by a UAV would improve the map graphic resolution compared to the map photographs taken from thousands of kilometers by satellites, since the satellite photographs are limited by weather conditions.
Keywords: long-endurance, real-time web map service (RWMS), solar-powered, unmanned aerial vehicle (UAV)
Procedia PDF Downloads 274
1610 The Malfatti's Problem in Reuleaux Triangle
Authors: Ching-Shoei Chiang
Abstract:
The Malfatti's Problem asks for three circles fitted into a right triangle such that they are tangent to each other and each circle is also tangent to a pair of the triangle's sides. This problem has been extended to any triangle (called the general Malfatti's Problem). Furthermore, the problem has been extended to 1+2+…+n circles; we call it the extended general Malfatti's problem. The tangency graph of these circles, using the circle centers as vertices and connecting two centers with an edge if the corresponding circles are tangent to each other, has the structure of Pascal's triangle, and the exterior circles are tangent to the three sides of the triangle. In the extended general Malfatti's problem, there are closed-form solutions for n=1, 2, and the problem becomes complex when n is greater than 2. In solving the extended general Malfatti's problem (n>2), we initially give values to the radii of all circles. From the tangency graph and the current radii, we can compute the angle between two vectors. These vectors go from the center of a circle to its tangency points with surrounding elements, and these surrounding elements can be the boundary of the triangle or other circles. For each circle C, there are vectors from its center c to its tangency points with its neighbors (counted clockwise) p_i, i=0,1,2,…,n. We add all angles between cp_i and cp_((i+1) mod (n+1)), i=0,1,…,n, and call the sum sumangle(C) for circle C. Using sumangle(C), we reduce or enlarge the radii of all circles in the next iteration, until sumangle(C) is equal to 2π for all circles. With a similar idea, this paper proposes an algorithm to find the radii of circles whose tangency graph has the structure of Pascal's triangle and whose exterior circles are tangent to the unit Reuleaux triangle.
Keywords: Malfatti's problem, geometric constraint solver, computer-aided geometric design, circle packing, data visualization
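A minimal sketch of the sumangle idea is given below for the simplified case of one interior circle surrounded by mutually tangent neighbour circles (tangencies with the triangle boundary are omitted): the angle at the centre between consecutive tangency points follows from the law of cosines on the centre triangle, and the radius is adjusted until the angles sum to 2π. The neighbour radii and the bisection update are illustrative assumptions, not the paper's full algorithm.

```python
import math

def sum_angle(r, neighbour_radii):
    """Angle subtended at the centre of a circle of radius r by the tangency points
    with consecutive, mutually tangent neighbours (law of cosines)."""
    total = 0.0
    n = len(neighbour_radii)
    for i in range(n):
        rj, rk = neighbour_radii[i], neighbour_radii[(i + 1) % n]
        cos_a = ((r + rj) ** 2 + (r + rk) ** 2 - (rj + rk) ** 2) / (2 * (r + rj) * (r + rk))
        total += math.acos(max(-1.0, min(1.0, cos_a)))
    return total

def solve_radius(neighbour_radii, lo=1e-6, hi=1e3, tol=1e-10):
    """Bisection on r until sum_angle(r) == 2*pi; the angle sum decreases as r grows,
    so a too-large sum means the radius must be enlarged."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if sum_angle(mid, neighbour_radii) > 2 * math.pi:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(solve_radius([1.0, 1.0, 1.0, 1.0, 1.0, 1.0]))   # six equal unit neighbours -> r = 1
```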
Procedia PDF Downloads 132
1609 Diagnosis and Analysis of Automated Liver and Tumor Segmentation on CT
Authors: R. R. Ramsheeja, R. Sreeraj
Abstract:
A wide range of medical imaging modalities is available nowadays for viewing the internal structures of the human body, such as the liver, brain and kidneys. Computed Tomography (CT) is one of the most significant medical imaging modalities. In this paper, CT liver images are used to study automatic computer-aided techniques for calculating the volume of a liver tumor. A segmentation method for detecting the tumor from the CT scan is proposed. A Gaussian filter is used for denoising the liver image, and an adaptive thresholding algorithm is used for segmentation. A multiple Region Of Interest (ROI) based method may help to characterize the differences between features, and it has a significant impact on classification performance. Due to the characteristics of liver tumor lesions, feature selection presents inherent difficulties. For better performance, a novel system is introduced. Multiple ROI based feature selection and classification are performed. Obtaining relevant features for the Support Vector Machine (SVM) classifier is important for better generalization performance. The proposed system helps to improve classification performance, and a significant reduction in the number of features used can be seen. The diagnosis of liver cancer from computed tomography images is very difficult in nature, and early detection of liver tumors is very helpful to save human life.
Keywords: computed tomography (CT), multiple region of interest (ROI), feature values, segmentation, SVM classification
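A hedged sketch of such a pipeline (Gaussian denoising, adaptive thresholding, multiple-ROI feature extraction and SVM classification) is shown below on a synthetic image. The synthetic data, feature choices and toy labels are assumptions and do not reproduce the authors' implementation.

```python
import numpy as np
from skimage import filters, measure
from sklearn.svm import SVC

rng = np.random.default_rng(0)
ct_slice = rng.normal(0.2, 0.05, (256, 256))        # synthetic noisy CT slice
ct_slice[100:140, 90:150] += 0.4                    # bright blob standing in for a lesion

denoised = filters.gaussian(ct_slice, sigma=2)                  # Gaussian denoising
local_thr = filters.threshold_local(denoised, block_size=51)    # adaptive thresholding
mask = denoised > local_thr

# Multiple-ROI feature extraction: one feature vector per connected region
labels = measure.label(mask)
features = [[r.area, r.eccentricity, r.perimeter]
            for r in measure.regionprops(labels) if r.area >= 50]
X = np.array(features)
print(len(X), "candidate ROIs extracted")

# Toy SVM classification: large regions are treated as "tumour" for illustration only
if len(X) >= 2:
    y = (X[:, 0] > 500).astype(int)
    if len(np.unique(y)) > 1:
        clf = SVC(kernel="rbf").fit(X, y)
        print("SVM predictions:", clf.predict(X))
```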
Procedia PDF Downloads 509
1608 Prevalence of Different Poultry Parasitoses in Modern Farms in the North of Ivory Coast
Authors: Coulibaly Fatoumata, Gragnon Biego, Aka N. David, Mbari K. Benjamin, Soro Y. René, Ndiaye Jean-louis
Abstract:
Poultry is nowadays one of the most consumed sources of protein, and poultry keeping represents one of the few opportunities for savings, investment and protection against risk. It provides income for the most vulnerable sections of society, in particular women (70%) and children, who mainly practice this breeding. A study was conducted on 52 poultry farms in the commune of Korhogo, the objective of which was to determine the epidemiological situation of external and internal poultry parasitism in order to contribute to the improvement of the health status of modern poultry farms in the said commune. The method described by OIE (2005), consisting of using the standard formula n = δ² · p · (1 − p) · c / i², made it possible to calculate the size of the sample. Then, samples of droppings and ectoparasites were taken from the affected farms. After analysis and identification, two (2) species of mallophagous lice, Menopon gallinae (50%) and Menacanthus stramineus (33%), and one species of bug, Cimex lectularius (17%), were highlighted. The laying hens were more infested than broilers. Regarding gastrointestinal parasites, six different species have been identified: Trichostrongylus tenuis (17%), Syngamus trachea (19%), Heterakis sp (10%), Ascaridia sp (17%), Raillietina sp (8%) and Eimeria sp (29%). In addition, coccidiosis (Eimeria sp) proved to be the dominant pathology, representing 67% of pathologies in broiler farms and 33% in poultry farms. The presence of these parasitoses in these modern farms constitutes a major constraint on productivity and their development. In view of all these difficulties, proposals have been made in order to support the establishment of a good prophylaxis program (sanitary and medical). In addition, the Ivorian government, with the support of veterinarians, must become more involved in the organization of the health monitoring of traditional chickens and poultry in general, through supervision and training, in order to preserve public health (animal, human and environmental health).
Keywords: gastrointestinal parasites, ectoparasites, pathologies, poultry, Korhogo
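One plausible reading of the quoted sample-size formula, with assumed meanings for the symbols (δ as the confidence coefficient, p the expected prevalence, i the desired absolute precision, c a design-effect correction), is worked through below; the numerical inputs are illustrative only.

```python
# Assumed symbol meanings: delta = confidence coefficient (1.96 for 95%),
# p = expected prevalence, i = desired absolute precision, c = design-effect correction.
delta, p, i, c = 1.96, 0.5, 0.13, 1.0
n = (delta ** 2) * p * (1 - p) * c / (i ** 2)
print(round(n), "farms required at these illustrative settings (the study surveyed 52)")
```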
Procedia PDF Downloads 85
1607 Automatic Detection of Defects in Ornamental Limestone Using Wavelets
Authors: Maria C. Proença, Marco Aniceto, Pedro N. Santos, José C. Freitas
Abstract:
A methodology based on wavelets is proposed for the automatic location and delimitation of defects in limestone plates. Natural defects include dark colored spots, crystal zones trapped in the stone, areas of abnormal contrast colors, cracks or fracture lines, and fossil patterns. Although some of these may or may not be considered defects according to the intended use of the plate, the goal is to pair each stone with a map of defects that can be overlaid on a computer display. These layers of defects constitute a database that will allow the preliminary selection of matching tiles of a particular variety, with specific dimensions, for a requirement of N square meters, to be done on a desktop computer rather than by a two-hour search in the storage park, with human operators manipulating stone plates as large as 3 m x 2 m, weighing about one ton. Accident risks and work times are reduced, with a consequent increase in productivity. The basis of the algorithm is wavelet decomposition executed on two instances of the original image, to detect both hypotheses: dark and clear defects. The existence and/or size of these defects is the gauge used to classify the quality grade of the stone products. The tuning of the parameters available in the wavelet framework corresponds to different levels of accuracy in the drawing of the contours and the selection of the defect size, which allows the map of defects to be used to cut a selected stone into tiles with minimum waste, according to the defect dimensions allowed.
Keywords: automatic detection, defects, fracture lines, wavelets
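The sketch below illustrates a simplified, single-pass variant of this wavelet idea on a synthetic plate image: the coarse approximation coefficients are thresholded to flag abnormally dark or light zones, and the detail-coefficient energy is thresholded to flag crack-like features. The wavelet choice, threshold rule and synthetic defects are assumptions, not the parameters used in this work.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
plate = np.full((256, 256), 0.6) + rng.normal(0.0, 0.02, (256, 256))
plate[60:70, 60:70] -= 0.3            # dark spot
plate[180:184, 20:220] += 0.25        # light fracture-like line

def wavelet_defect_masks(image, wavelet="db2", level=2, k=3.0):
    """Flag dark/light zones via the coarse approximation and crack-like features
    via the detail-coefficient energy at the coarsest decomposition level."""
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    approx = coeffs[0]
    cH, cV, cD = coeffs[1]
    detail_energy = cH ** 2 + cV ** 2 + cD ** 2
    dark = approx < approx.mean() - k * approx.std()
    light = approx > approx.mean() + k * approx.std()
    cracks = detail_energy > detail_energy.mean() + k * detail_energy.std()
    return dark, light, cracks

dark, light, cracks = wavelet_defect_masks(plate)
print("dark coeffs:", int(dark.sum()), "light coeffs:", int(light.sum()),
      "crack-like coeffs:", int(cracks.sum()))
```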
Procedia PDF Downloads 248
1606 Comparison of Bactec plus Blood Culture Media to BacT/Alert FAN plus Blood Culture Media for Identification of Bacterial Pathogens in Clinical Samples Containing Antibiotics
Authors: Recep Kesli, Huseyin Bilgin, Ela Tasdogan, Ercan Kurtipek
Abstract:
Aim: The aim of this study was to compare the resin-based Bactec Plus aerobic/anaerobic blood culture bottles (Becton Dickinson, MD, USA) and the polymeric-bead-based BacT/Alert FA/FN Plus blood culture bottles (bioMerieux, NC, USA) in terms of microorganism recovery rates and time to detection (TTD) in patients receiving antibiotic treatment. Method: Blood culture samples were taken from patients who were admitted to the intensive care unit and received antibiotic treatment. Forty milliliters of blood from each patient were equally distributed into four types of bottles: Bactec Plus aerobic, Bactec Plus anaerobic, BacT/Alert FA Plus, and BacT/Alert FN Plus. The Bactec Plus and BacT/Alert Plus media were compared with respect to culture recovery rates and TTD. Results: Blood culture samples were collected from 382 patients hospitalized in the intensive care unit, and 245 patients who were diagnosed as having bloodstream infections were included in the study. A total of 1528 Bactec Plus aerobic, Bactec Plus anaerobic, BacT/Alert FA Plus, and BacT/Alert FN Plus blood culture bottles were analyzed, and 176, 144, 154, and 126 bacteria or fungi were isolated, respectively. Gram-negative and gram-positive bacteria were significantly more frequently isolated in the resin-based Bactec Plus bottles than in the polymeric-bead-based BacT/Alert Plus bottles. The Bactec Plus and BacT/Alert Plus media recovery rates were similar for fungi and anaerobic bacteria. The mean TTDs in the Bactec Plus bottles were shorter than those in the BacT/Alert Plus bottles regardless of the microorganisms. Conclusion: The results of this study showed that resin-containing media are a reliable and time-saving tool for patients who are receiving antibiotic treatment due to sepsis in the intensive care unit.
Keywords: Bactec Plus, BacT/Alert Plus, blood culture, antibiotic
Procedia PDF Downloads 146
1605 Leisure Time Physical Activity during Pregnancy and the Associated Factors Based on Health Belief Model: A Cross Sectional Study
Authors: Xin Chen, Xiao Yang, Rongrong Han, Lu Chen, Lingling Gao
Abstract:
Background: Leisure time physical activity (LTPA) benefits both pregnant women and their fetuses. Guidelines recommend that pregnant women should do at least 150 minutes of moderate-intensity aerobic physical activity throughout the week. The aim of this study was to investigate the rate of LTPA participation among Chinese pregnant women and to identify its predictors based on the health belief model. Methods: A cross-sectional study was conducted from June 2019 to September 2019 in Changchun, China. A total of 225 pregnant women aged 18 years or older with no severe physical or mental disease were recruited at the obstetric clinic. Self-administered questionnaires were used to collect data. LTPA was assessed by the pregnancy physical activity questionnaire (PPAQ). A revised pregnancy physical activity health belief scale and socio-demographic and perinatal characteristics were collected and used to predict LTPA participation. Data were analyzed using descriptive statistics and multivariate logistic regression. Results: The participants had a high level of perceived susceptibility, perceived severity, perceived benefits, and cues to action, with mean item scores above 3.5. The predictors of LTPA in Chinese pregnant women were pre-pregnancy exercise habits [OR 3.236 (95% CI: 1.632, 6.416)], perceived susceptibility score [OR 2.083 (95% CI: 1.002, 4.331)], and perceived barriers score [OR 3.113 (95% CI: 1.462, 6.626)]. Conclusions: The results of this study will lead to better identification of pregnant women who may not participate in LTPA. Healthcare professionals should be cognizant of issues that may affect LTPA participation among pregnant women, including pre-pregnancy exercise habits, perceived susceptibility, and perceived barriers.
Keywords: pregnancy, health belief model, leisure time physical activity, factors
Procedia PDF Downloads 79
1604 Multi-Layer Multi-Feature Background Subtraction Using Codebook Model Framework
Authors: Yun-Tao Zhang, Jong-Yeop Bae, Whoi-Yul Kim
Abstract:
Background modeling and subtraction in video analysis has been widely proved to be an effective method for moving object detection in many computer vision applications. Over the past years, a large number of approaches have been developed to tackle different types of challenges in this field. However, dynamic backgrounds and illumination variations are two of the most frequently occurring issues in practical situations. This paper presents a new two-layer model based on the codebook algorithm incorporated with the local binary pattern (LBP) texture measure, targeted at handling dynamic background and illumination variation problems. More specifically, the first layer is designed as a block-based codebook combining an LBP histogram with the mean values of the RGB color channels. Because of the invariance of the LBP features with respect to monotonic gray-scale changes, this layer can produce block-wise detection results with considerable tolerance of illumination variations. A pixel-based codebook is employed to reinforce the precision of the outputs of the first layer and to further eliminate false positives. As a result, the proposed approach can greatly improve the accuracy under circumstances of dynamic background and illumination changes. Experimental results on several popular background subtraction datasets demonstrate a very competitive performance compared to previous models.
Keywords: background subtraction, codebook model, local binary pattern, dynamic background, illumination change
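As an illustration of the block-level descriptor used by the first layer, the sketch below combines a uniform-LBP histogram with per-channel RGB means for each block and flags blocks whose descriptor deviates from a stored codeword. Block size, LBP settings and the matching threshold are assumptions, and the single-frame "codebook" is a toy stand-in for the real multi-codeword model.

```python
import numpy as np
from skimage.feature import local_binary_pattern

P, R, BLOCK = 8, 1, 16                 # assumed LBP neighbourhood and block size

def block_features(frame_rgb):
    """Per-block descriptor: uniform-LBP histogram concatenated with RGB means."""
    gray = frame_rgb.mean(axis=2).astype(np.uint8)
    lbp = local_binary_pattern(gray, P, R, method="uniform")
    h, w = gray.shape
    feats = []
    for y in range(0, h - BLOCK + 1, BLOCK):
        for x in range(0, w - BLOCK + 1, BLOCK):
            patch = lbp[y:y + BLOCK, x:x + BLOCK]
            hist, _ = np.histogram(patch, bins=P + 2, range=(0, P + 2), density=True)
            rgb_mean = frame_rgb[y:y + BLOCK, x:x + BLOCK].reshape(-1, 3).mean(axis=0)
            feats.append(np.concatenate([hist, rgb_mean / 255.0]))
    return np.array(feats)

rng = np.random.default_rng(0)
background = rng.integers(0, 255, (64, 64, 3), dtype=np.uint8)
frame = background.copy()
frame[16:32, 16:32] = 255              # a "moving object" patch

codebook = block_features(background)  # one codeword per block (single-frame toy case)
current = block_features(frame)
foreground = np.linalg.norm(current - codebook, axis=1) > 0.5
print("blocks flagged as foreground:", int(foreground.sum()))
```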
Procedia PDF Downloads 217
1603 A Comparative Assessment of Some Algorithms for Modeling and Forecasting Horizontal Displacement of Ialy Dam, Vietnam
Authors: Kien-Trinh Thi Bui, Cuong Manh Nguyen
Abstract:
In order to simulate and reproduce the operational characteristics of a dam visually, it is necessary to capture the displacement at different measurement points and analyze the observed movement data promptly to forecast dam safety. The accuracy of forecasts is further improved by applying machine learning methods to the data analysis process. In this study, the horizontal displacement monitoring data of the Ialy hydropower dam (Vietnam) were analyzed with three machine learning algorithms, namely Gaussian processes, multi-layer perceptron neural networks, and the M5-Rules algorithm, for modelling and forecasting the horizontal displacement of the dam. The database used in this research was built by collecting time series data from 2006 to 2021 and was divided into two parts: a training dataset and a validation dataset. The final results show that all three algorithms have high performance for both training and model validation, but the MLP is the best model. Their usability is further investigated by comparison with a benchmark model created by multi-linear regression. The results show that the performance obtained from the GP model, the MLP model and the M5-Rules model is much better; therefore, these three models should be used to analyze and predict the horizontal displacement of the dam.
Keywords: Gaussian processes, horizontal displacement, hydropower dam, Ialy dam, M5-Rules, multi-layer perceptron neural networks
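A hedged sketch of this kind of model comparison on synthetic displacement data is shown below with scikit-learn; the data generator, chronological split and hyper-parameters are assumptions, and the Weka M5-Rules algorithm is not reproduced here.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
t = np.arange(2000)
X = np.column_stack([
    50 + 10 * np.sin(2 * np.pi * t / 365),     # reservoir level proxy
    20 + 8 * np.cos(2 * np.pi * t / 365),      # temperature proxy
    t / 1000.0,                                # ageing / time effect
])
y = 0.05 * X[:, 0] - 0.03 * X[:, 1] + 0.8 * np.tanh(X[:, 2]) + rng.normal(0, 0.05, len(t))

split = 1500                                   # chronological train / validation split
X_tr, X_va, y_tr, y_va = X[:split], X[split:], y[:split], y[split:]

models = {
    "Gaussian process": GaussianProcessRegressor(alpha=1e-2, normalize_y=True),
    "MLP": MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
    "multi-linear benchmark": LinearRegression(),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    rmse = mean_squared_error(y_va, model.predict(X_va)) ** 0.5
    print(f"{name:22s} validation RMSE = {rmse:.4f}")
```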
Procedia PDF Downloads 210
1602 Differential Expression of Biomarkers in Cancer Stem Cells and Side Populations in Breast Cancer Cell Lines
Authors: Dipali Dhawan
Abstract:
Cancerous epithelial cells are confined to a primary site by the continued expression of adhesion molecules and the intact basal lamina. However, as the cancer progresses, some cells are believed to undergo an epithelial-mesenchymal transition (EMT) event, leading to increased motility, invasion and, ultimately, metastasis of the cells from the primary tumour to secondary sites within the body. These disseminated cancer cells need the ability to self-renew, as stem cells do, in order to establish and maintain a heterogeneous metastatic tumour mass. Identification of the specific subpopulation of cancer stem cells amenable to the process of metastasis is highly desirable. In this study, we have isolated and characterized cancer stem cells from luminal and basal breast cancer cell lines (MDA-MB-231, MDA-MB-453, MDA-MB-468, MCF7 and T47D) on the basis of the cell surface markers CD44 and CD24, as well as Side Populations (SP) using Hoechst 33342 dye efflux. The isolated populations were analysed for epithelial and mesenchymal markers such as E-cadherin, N-cadherin, Sfrp1 and Vimentin by Western blotting and immunocytochemistry. The MDA-MB-231 cell line contains a major population of CD44+CD24- cells, whereas the MCF7, T47D and MDA-MB-231 cell lines show a side population. We observed higher expression of N-cadherin in MCF-7 SP cells as compared to MCF-7 NSP (non-side population) cells, suggesting that the SP cells are mesenchymal-like cells with stem cell-like properties and hence express increased N-cadherin. There was expression of Sfrp1 in the MCF7-NSP cells as compared to no expression in the MCF7-SP cells, which suggests that the Wnt pathway is expressed in the MCF7-SP cells. The mesenchymal marker Vimentin was expressed only in MDA-MB-231 cells. Hence, understanding breast cancer heterogeneity would enable a better understanding of disease progression and therapeutic targeting.
Keywords: cancer stem cells, epithelial to mesenchymal transition, biomarkers, breast cancer
Procedia PDF Downloads 526
1601 Loss Function Optimization for CNN-Based Fingerprint Anti-Spoofing
Authors: Yehjune Heo
Abstract:
As biometric systems become widely deployed, the security of identification systems can be easily attacked by various spoof materials. This paper contributes to finding a reliable and practical anti-spoofing method using Convolutional Neural Networks (CNNs), based on the types of loss functions and optimizers. The types of CNNs used in this paper include AlexNet, VGGNet, and ResNet. By using various loss functions, including Cross-Entropy, Center Loss, Cosine Proximity, and Hinge Loss, and various optimizers, which include Adam, SGD, RMSProp, Adadelta, Adagrad, and Nadam, we obtained significant performance changes. We realize that choosing the correct loss function for each model is crucial since different loss functions lead to different errors on the same evaluation. By using a subset of the LivDet 2017 database, we validate our approach to compare the generalization power. It is important to note that we use a subset of LivDet and the database is the same across all training and testing for each model. This way, we can compare the performance, in terms of generalization, on the unseen data across all different models. The best CNN (AlexNet) with the appropriate loss function and optimizer results in more than a 3% performance gain over the other CNN models with the default loss function and optimizer. In addition to the highest generalization performance, this paper also reports the models' parameter counts and mean average error rates to find the model that consumes the least memory and computation time for training and testing. Although AlexNet has lower complexity than the other CNN models, it is proven to be very efficient. For practical anti-spoofing systems, the deployed version should use a small amount of memory and should run very fast with high anti-spoofing performance. For our deployed version on smartphones, additional processing steps, such as quantization and pruning algorithms, have been applied to our final model.
Keywords: anti-spoofing, CNN, fingerprint recognition, loss function, optimizer
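The grid over loss functions and optimizers can be sketched as below with PyTorch on a tiny CNN and random stand-in data; the architecture, data and hyper-parameters are illustrative assumptions and do not correspond to the AlexNet/VGGNet/ResNet experiments on LivDet 2017.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
X = torch.randn(64, 1, 32, 32)                 # stand-in fingerprint patches
y = torch.randint(0, 2, (64,))                 # 0 = live, 1 = spoof

def tiny_cnn():
    return nn.Sequential(
        nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(), nn.Linear(16 * 8 * 8, 2),
    )

losses = {"cross-entropy": nn.CrossEntropyLoss(), "multiclass hinge": nn.MultiMarginLoss()}
optimizers = {"Adam": torch.optim.Adam, "SGD": torch.optim.SGD, "RMSprop": torch.optim.RMSprop}

for loss_name, criterion in losses.items():
    for opt_name, opt_cls in optimizers.items():
        model = tiny_cnn()
        optimizer = opt_cls(model.parameters(), lr=1e-3)
        for _ in range(20):                    # a few full-batch toy epochs
            optimizer.zero_grad()
            loss = criterion(model(X), y)
            loss.backward()
            optimizer.step()
        acc = (model(X).argmax(dim=1) == y).float().mean().item()
        print(f"{loss_name:16s} + {opt_name:7s}: training accuracy = {acc:.2f}")
```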
Procedia PDF Downloads 136
1600 Breeding Cotton for Annual Growth Habit: Remobilizing End-of-season Perennial Reserves for Increased Yield
Authors: Salman Naveed, Nitant Gandhi, Grant Billings, Zachary Jones, B. Todd Campbell, Michael Jones, Sachin Rustgi
Abstract:
Cotton (Gossypium spp.) is the primary source of natural fiber in the U.S. and a major crop in the Southeastern U.S. Despite constant efforts to increase the cotton fiber yield, the yield gain has stagnated. Therefore, we undertook a novel approach to improve the cotton fiber yield by altering its growth habit from perennial to annual. In this effort, we identified genotypes with high-expression alleles of five floral induction and meristem identity genes (FT, SOC1, FUL, LFY, and AP1) from an upland cotton mini-core collection and crossed them in various combinations to develop cotton lines with an annual growth habit, optimal flowering time and enhanced productivity. To facilitate the characterization of genotypes with the desired combinations of stacked alleles, we identified markers associated with the gene expression traits via genome-wide association analysis using a 63K SNP Array (Hulse-Kemp et al. 2015 G3 5:1187). Over 14,500 SNPs showed polymorphism and were used for association analysis. A total of 396 markers showed association with expression traits. Out of these 396 markers, 159 mapped to genes, 50 to untranslated regions, and 187 to random genomic regions. A biased genomic distribution of the associated markers was observed, with more trait-associated markers mapping to the cotton D sub-genome. Many quantitative trait loci coincided at specific genomic regions. This observation has implications as these traits could be bred together. The analysis also allowed the identification of candidate regulators of the expression patterns of these floral induction and meristem identity genes, whose functions will be validated via virus-induced gene silencing.
Keywords: cotton, GWAS, QTL, expression traits
Procedia PDF Downloads 151
1599 Geometric Imperfections in Lattice Structures: A Simulation Strategy to Predict Strength Variability
Authors: Xavier Lorang, Ahmadali Tahmasebimoradi, Chetra Mang, Sylvain Girard
Abstract:
The additive manufacturing processes (e.g., selective laser melting) allow us to produce lattice structures which have lower weight, higher impact absorption capacity, and better thermal exchange properties compared to classical structures. Unfortunately, geometric imperfections (defects) in the lattice structures are by-products of the manufacturing process. These imperfections decrease the lifetime and the strength of the lattice structures and alter their mechanical responses. The objective of the paper is to present a simulation strategy which allows us to take into account the effect of the geometric imperfections on the mechanical response of the lattice structure. In the first part, an identification method for the geometric imperfection parameters of the lattice structure based on point clouds is presented. These point clouds are based on tomography measurements. The point clouds are fed into the platform LATANA (LATtice ANAlysis), developed by IRT-SystemX, to characterize the geometric imperfections. This is done by projecting the point cloud of each microbeam along the beam axis onto a 2D surface. Then, by fitting an ellipse to the 2D projections of the points, the geometric imperfections are characterized by three ellipse parameters: the semi-major and semi-minor axes and the angle of rotation. From the calculated parameters of the microbeam geometric imperfections, a statistical analysis is carried out to determine a probability density law based on a statistical hypothesis. The microbeam samples are randomly drawn from the density law and are used to generate lattice structures. In the second part, a finite element model for the lattice structure with the simplified geometric imperfections (ellipse parameters) is presented. This numerical model is used to simulate the generated lattice structures. The propagation of the uncertainties of the geometric imperfections is shown through the distribution of the computed mechanical responses of the lattice structures.
Keywords: additive manufacturing, finite element model, geometric imperfections, lattice structures, propagation of uncertainty
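The ellipse-characterisation step can be sketched as follows: project the micro-beam points onto the plane normal to the beam axis and recover the semi-axes and rotation angle from the eigen-decomposition of the 2x2 covariance of the projected points. The synthetic point cloud and the variance-to-semi-axis scaling are assumptions, not the LATANA implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic micro-beam cross-section: a nominally circular strut squashed into an
# ellipse (semi-axes 0.25 / 0.18 mm), tilted by 20 degrees, with measurement noise
n = 5000
theta = rng.uniform(0.0, 2.0 * np.pi, n)
section = np.column_stack([0.25 * np.cos(theta), 0.18 * np.sin(theta)])
rot = np.deg2rad(20.0)
Rm = np.array([[np.cos(rot), -np.sin(rot)], [np.sin(rot), np.cos(rot)]])
points_2d = section @ Rm.T + rng.normal(0.0, 0.005, (n, 2))   # projected point cloud

# Ellipse fit from second moments: for points spread over an ellipse boundary the
# variance along a principal axis is about a**2 / 2, so the semi-axis is sqrt(2) * std
cov = np.cov(points_2d, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
semi_minor, semi_major = np.sqrt(2.0 * eigvals)
angle = np.degrees(np.arctan2(eigvecs[1, 1], eigvecs[0, 1]))
print(f"semi-major {semi_major:.3f} mm, semi-minor {semi_minor:.3f} mm, rotation {angle:.1f} deg")
```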
Procedia PDF Downloads 187
1598 Diversification of Rice-Based Cropping Systems under Irrigated Condition
Authors: A. H. Nanher, N. P. Singh
Abstract:
In India, agriculture is largely based on rice-based cropping systems. These systems have shown a decline in factor productivity along with the emergence of multi-nutrient deficiencies, a buildup of soil pathogens and weed flora, because they operate on and remove nutrients from the same rooting depth. In designing alternative cropping systems, the common approaches are crop intensification, crop diversification and cultivar options. Intensification leads to the diversification of the cropping system. Intensification is achieved by introducing an additional component crop into a predominant sequential system through desirable adjustments in the cultivars of one or all of the component crops. Invariably, this results in higher land use efficiency and productivity per unit time. Crop diversification through such crops and the inclusion of fodder crops help to improve the economic situation of small and marginal farmers because of higher income. The inclusion of crops in sequential and intercropping systems reduces some obnoxious weeds through the formation of canopies due to competitive planting patterns, and thus provides an opportunity to utilize cropping systems as a tool of weed management by non-chemical means. The use of organic sources not only acts as a supplement for fertilizer (nitrogen) but also improves the physico-chemical properties of soils. The production and use of nitrogen-rich biomass offer better prospects for supplementing chemical fertilizers on a regular basis. Such biological diversity brings yield and economic stability because of its potential for compensation among components of the system. Under particular agro-climatic and resource conditions, the identification of the most suitable crop sequence is based on its productivity, stability, land use efficiency and production efficiency, and its performance is chiefly judged in terms of productivity and net return.
Keywords: integrated farming systems, sustainable intensification, system of crop intensification, wheat
Procedia PDF Downloads 424
1597 A Novel Epitope Prediction for Vaccine Designing against Ebola Viral Envelope Proteins
Authors: Manju Kanu, Subrata Sinha, Surabhi Johari
Abstract:
The viral proteins of Ebola viruses belong to one of the best studied virus groups; however, no effective prevention against EBOV has been developed. Epitope-based vaccines provide a new strategy for prophylactic and therapeutic application of pathogen-specific immunity. A critical requirement of this strategy is the identification and selection of T-cell epitopes that act as vaccine targets. This study describes current methodologies for the selection process, with Ebola virus as a model system. Hence, a great challenge in the field of Ebola virus research is to design a universal vaccine. A combination of publicly available bioinformatics algorithms and computational tools is used to screen and select antigen sequences as potential T-cell epitopes of supertype Human Leukocyte Antigen (HLA) alleles. The MUSCLE and MOTIF tools were used to find the most conserved peptide sequences of the viral proteins. Immunoinformatics tools were used for the prediction of immunogenic peptides of viral proteins in Zaire strains of Ebola virus. Putative epitopes for the viral proteins (VP) were predicted from the conserved peptide sequences of the VP. Three tools, NetCTL 1.2, BIMAS and SYFPEITHI, were used to predict the Class I putative epitopes, while three tools, ProPred, IEDB-SMM-align and NetMHCII 2.2, were used to predict the Class II putative epitopes. B cell epitopes were predicted by BCPREDS 1.0. Immunogenic peptides were identified and selected manually from the putative epitopes predicted by the online tools, individually for both MHC classes. Finally, the sequences of the predicted peptides for both MHC classes were examined for a common region, which was selected as the common immunogenic peptide. The following immunogenic peptides were found for the viral proteins of Ebola virus: the epitopes FLESGAVKY and SSLAKHGEY. These predicted peptides could be promising candidates to be used as targets for vaccine design.
Keywords: epitope, b cell, immunogenicity, ebola
Procedia PDF Downloads 314
1596 Quantification of Pollution Loads for the Rehabilitation of Pusu River
Authors: Abdullah Al-Mamun, Md. Nuruzzaman, Md. Noor Salleh, Muhammad Abu Eusuf, Ahmad Jalal Khan Chowdhury, Mohd. Zaki M. Amin, Norlida Mohd. Dom
Abstract:
Identification of pollution sources and determination of pollution loads from all areas are very important for the sustainable rehabilitation of any contaminated river. Pusu is a small river which flows through the main campus of the International Islamic University Malaysia (IIUM) at Gombak. The poor aesthetics of the river, which flows through the entrance of the campus, give a negative impression to local and international visitors. As such, this study is being conducted to find ways to rehabilitate the river in a sustainable manner. The point and non-point pollution sources of the river basin are identified. The upper part of the 12.6 km2 river basin is covered with secondary forest. However, it is the lower-middle reaches of the river basin that are being cleared for residential development and are the source of a high sediment load. The flows and concentrations of the common pollutants important for a healthy river, such as Biochemical Oxygen Demand (BOD), Chemical Oxygen Demand (COD), Suspended Solids (SS), Turbidity, pH, Ammoniacal Nitrogen (AN), Total Nitrogen (TN) and Total Phosphorus (TP), are determined. The annual pollution loading to the river was calculated based on the primary and secondary data. Concentrations of SS were high during rainy days due to the contribution from non-point sources. There are 7 ponds along the river system within the campus, which are severely affected by the high sediment load from the land clearing activities. On the other hand, concentrations of other pollutants were high during non-rainy days. The main sources of point pollution are the hostels, cafeterias and sewage treatment plants located on the campus. Therefore, both types of pollution sources need to be controlled in order to rehabilitate the river in a sustainable manner.
Keywords: river pollution, rehabilitation, point pollution source, non-point pollution sources, pollution loading
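A minimal sketch of how such point-source loads can be aggregated from flow and concentration measurements is given below; the listed sources, flows and concentrations are hypothetical values, not the Pusu River data.

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

# (source, flow [m3/s], BOD [mg/L], TN [mg/L]) -- hypothetical monitoring results
sources = [
    ("hostel outfall",    0.004, 120.0, 25.0),
    ("cafeteria outfall", 0.002, 250.0, 18.0),
    ("STP effluent",      0.010,  40.0, 12.0),
]

def annual_load_kg(flow_m3_s, conc_mg_l):
    # mg/L equals g/m3, so flow [m3/s] * conc [g/m3] * s/yr / 1000 gives kg/yr
    return flow_m3_s * conc_mg_l * SECONDS_PER_YEAR / 1000.0

total_bod = sum(annual_load_kg(q, bod) for _, q, bod, _ in sources)
total_tn = sum(annual_load_kg(q, tn) for _, q, _, tn in sources)
print(f"annual point-source BOD load ~ {total_bod:,.0f} kg/yr, TN load ~ {total_tn:,.0f} kg/yr")
```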
Procedia PDF Downloads 354
1595 STD-NMR Based Protein Engineering of the Unique Arylpropionate-Racemase AMDase G74C
Authors: Sarah Gaßmeyer, Nadine Hülsemann, Raphael Stoll, Kenji Miyamoto, Robert Kourist
Abstract:
Enzymatic racemization allows the smooth interconversion of stereocenters under very mild reaction conditions. Racemases find frequent applications in deracemization and dynamic kinetic resolutions. Arylmalonate decarboxylase (AMDase) from Bordetella bronchiseptica has high structural similarity to amino acid racemases. These cofactor-free racemases are able to break chemically strong C-H bonds under mild conditions. The racemase-like catalytic machinery of mutant G74C gives it a unique activity in the racemisation of pharmacologically relevant derivatives of 2-phenylpropionic acid (profens), which makes AMDase G74C an interesting object for the mechanistic investigation of cofactor-independent racemases. Structure-guided protein engineering achieved a variant of this unique racemase with 40-fold increased activity in the racemisation of several arylaliphatic carboxylic acids. By saturation-transfer-difference NMR spectroscopy (STD-NMR), substrate binding during catalysis was investigated. All atoms of the substrate showed interactions with the enzyme. STD-NMR measurements revealed distinct nuclear Overhauser effects in experiments with and without molecular conversion. The spectroscopic analysis led to the identification of several amino acid residues whose variation increased the activity of G74C. While single amino acid exchanges increased the activity moderately, structure-guided saturation mutagenesis yielded a quadruple mutant with a 40 times higher reaction rate. This study presents STD-NMR as a versatile tool for the analysis of enzyme-substrate interactions in catalytically competent systems and for the guidance of protein engineering.
Keywords: racemase, rational protein design, STD-NMR, structure guided saturation mutagenesis
Procedia PDF Downloads 304
1594 Hybrid Localization Schemes for Wireless Sensor Networks
Authors: Fatima Babar, Majid I. Khan, Malik Najmus Saqib, Muhammad Tahir
Abstract:
This article provides range-based improvements over a well-known single-hop range-free localization scheme, Approximate Point In Triangulation (APIT), by proposing an energy-efficient barycentric-coordinate-based Point-In-Triangulation (PIT) test along with PIT-based trilateration. These improvements result in energy efficiency, reduced localization error and improved localization coverage compared to APIT and its variants. Moreover, we propose to embed Received Signal Strength Indication (RSSI) based distance estimation in DV-Hop, which is a multi-hop localization scheme. The proposed localization algorithm achieves energy efficiency and reduced localization error compared to DV-Hop and its available improvements. Furthermore, a hybrid multi-hop localization scheme is also proposed that utilizes the barycentric-coordinate-based PIT test and both range-based (received signal strength indicator) and range-free (hop count) techniques for distance estimation. Our experimental results provide evidence that the proposed hybrid multi-hop localization scheme results in a two- to five-fold reduction in the localization error compared to DV-Hop and its variants, at reduced energy requirements.
Keywords: Localization, Trilateration, Triangulation, Wireless Sensor Networks
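For reference, a barycentric-coordinate point-in-triangle test of the kind proposed here can be sketched as follows; the anchor and node coordinates are illustrative assumptions.

```python
import numpy as np

def barycentric_pit(p, a, b, c):
    """Return True if point p lies inside (or on) the triangle formed by anchors a, b, c."""
    p, a, b, c = map(np.asarray, (p, a, b, c))
    T = np.column_stack([b - a, c - a])        # 2x2 basis spanned by two triangle edges
    l2, l3 = np.linalg.solve(T, p - a)         # barycentric weights of b and c
    l1 = 1.0 - l2 - l3                         # barycentric weight of a
    return min(l1, l2, l3) >= 0.0

anchors = ([0.0, 0.0], [10.0, 0.0], [5.0, 8.0])
print(barycentric_pit([5.0, 3.0], *anchors))   # True: the node is inside this anchor triangle
print(barycentric_pit([9.0, 7.0], *anchors))   # False: the node is outside it
```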
Procedia PDF Downloads 467
1593 Metropolis-Hastings Sampling Approach for High Dimensional Testing Methods of Autonomous Vehicles
Authors: Nacer Eddine Chelbi, Ayet Bagane, Annie Saleh, Claude Sauvageau, Denis Gingras
Abstract:
As recently stated by the National Highway Traffic Safety Administration (NHTSA), to demonstrate the expected performance of a highly automated vehicle system, test approaches should include a combination of simulation, test track, and on-road testing. In this paper, we propose a new validation method for autonomous vehicles involving on-road tests (Field Operational Tests), test track (Test Matrix) and simulation (Worst Case Scenarios). We concentrate our discussion on the simulation aspects; in particular, we extend recent work based on Importance Sampling by using a Metropolis-Hastings algorithm (MHS) to sample data collected from the Safety Pilot Model Deployment (SPMD) in lane-change scenarios. Our proposed MH sampling method will be compared to the Importance Sampling method, which does not perform well in high-dimensional problems. The importance of this study is to obtain a sampler that could be applied to high-dimensional simulation problems in order to reduce and optimize the number of test scenarios that are necessary for the validation and certification of autonomous vehicles.
Keywords: automated driving, autonomous emergency braking (AEB), autonomous vehicles, certification, evaluation, importance sampling, metropolis-hastings sampling, tests
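A minimal random-walk Metropolis-Hastings sampler over a single lane-change parameter is sketched below; the target density, proposal scale and the choice of time-to-collision as the sampled quantity are illustrative assumptions, not the SPMD-fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(ttc):
    """Un-normalised log-density that favours rare, small time-to-collision cases."""
    if ttc <= 0.0:
        return -np.inf
    return -0.5 * ((np.log(ttc) - np.log(2.0)) / 0.5) ** 2 - 2.0 * ttc

def metropolis_hastings(n_samples=20000, step=0.4, x0=2.0):
    x, log_p, samples = x0, log_target(x0), []
    for _ in range(n_samples):
        proposal = x + step * rng.standard_normal()          # symmetric random walk
        log_p_new = log_target(proposal)
        if np.log(rng.random()) < log_p_new - log_p:         # accept with prob min(1, ratio)
            x, log_p = proposal, log_p_new
        samples.append(x)
    return np.array(samples)

ttc_samples = metropolis_hastings()
print("mean TTC of the sampled lane-change scenarios:", round(ttc_samples[5000:].mean(), 3))
```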
Procedia PDF Downloads 289
1592 Cable De-Commissioning of Legacy Accelerators at CERN
Authors: Adya Uluwita, Fernando Pedrosa, Georgi Georgiev, Christian Bernard, Raoul Masterson
Abstract:
CERN is an international organisation funded by 23 countries that provides the particle physics community with excellence in particle accelerators and other related facilities. Founded in 1954, CERN has a wide range of accelerators that allow groundbreaking science to be conducted. Accelerators bring particles to high levels of energy and make them collide with each other or with fixed targets, creating specific conditions that are of high interest to physicists. A chain of accelerators is used to ramp up the energy of particles and eventually inject them into the largest and most recent one: the Large Hadron Collider (LHC). Among this chain of machines is, for instance, the Proton Synchrotron, which was started in 1959 and is still in operation. These machines, called "injectors", keep evolving over time, as does the related infrastructure. Massive decommissioning of obsolete cables started in 2015 at CERN in the frame of the so-called "injectors de-cabling project phase 1". Its goal was to replace aging cables and remove unused ones, freeing space for new cables necessary for upgrades and consolidation campaigns. To proceed with the de-cabling, a project co-ordination team was assembled. The start of this project led to the investigation of legacy cables throughout the organisation. The identification of cables stacked over half a century proved to be arduous. Phase 1 of the injectors de-cabling was implemented successfully over 3 years after overcoming some difficulties. Phase 2, started 3 years later, focused on improving safety and structure with the introduction of a quality assurance procedure. This paper discusses the implementation of this quality assurance procedure throughout phase 2 of the project and the transition between the two phases. Hundreds of kilometres of cable were removed from the injectors complex at CERN between 2015 and 2023.
Keywords: CERN, de-cabling, injectors, quality assurance procedure
Procedia PDF Downloads 92
1591 Path Planning for Orchard Robot Using Occupancy Grid Map in 2D Environment
Authors: Satyam Raikwar, Thomas Herlitzius, Jens Fehrmann
Abstract:
In recent years, the autonomous navigation of orchard and field robots has become an emerging technology of mobile robotics in agriculture. One of the core aspects of autonomous navigation is path planning, which is still a crucial issue. Generally, for a simple representation, the path planning for a mobile robot is performed in a two-dimensional space, which creates a path between the start and goal points. This paper presents an automatic path planning approach for robots used in orchards and vineyards using occupancy grid maps with field consideration. Orchards and vineyards are usually structured environments, and their topology is assumed to be constant over time; therefore, in this approach, an RGB image of a field is used as the working environment. These images undergo different image processing operations and are then discretized into two-dimensional grid matrices. Each individual cell of these grid matrices represents the occupancy of the space, i.e., whether it is free or occupied. The grid matrix represents the robot workspace for motion and path planning. After the grid matrix is described, a probabilistic roadmap (PRM) path planning algorithm is used to create an obstacle-free path over these occupancy grids. The path created by this method was successfully verified in the test area. Furthermore, this approach is used in the navigation of the orchard robot.
Keywords: orchard robots, automatic path planning, occupancy grid, probabilistic roadmap
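The two steps described above can be sketched as follows: a synthetic top-view image is discretized into an occupancy grid, and a small probabilistic roadmap is built over the free cells and searched for a path. The synthetic orchard image, grid resolution and PRM parameters are assumptions for illustration.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(0)

# Synthetic top-view image: dark pixels = tree rows (occupied), light = drivable ground
img = np.full((60, 60), 200, dtype=np.uint8)
for col in range(10, 60, 15):
    img[5:55, col:col + 3] = 30
occupancy = (img < 100).astype(np.uint8)          # 1 = occupied, 0 = free

def collision_free(p, q, grid, steps=50):
    """Check the straight segment p -> q by sampling points along it."""
    for t in np.linspace(0.0, 1.0, steps):
        r, c = np.round(p + t * (q - p)).astype(int)
        if grid[r, c]:
            return False
    return True

# PRM: sample free cells and connect near neighbours with collision-free segments
free_cells = np.argwhere(occupancy == 0)
samples = free_cells[rng.choice(len(free_cells), size=120, replace=False)].astype(float)
start, goal = np.array([2.0, 2.0]), np.array([57.0, 57.0])
nodes = np.vstack([start, goal, samples])

edges = {i: [] for i in range(len(nodes))}
for i in range(len(nodes)):
    for j in range(i + 1, len(nodes)):
        if np.linalg.norm(nodes[i] - nodes[j]) < 15 and collision_free(nodes[i], nodes[j], occupancy):
            edges[i].append(j)
            edges[j].append(i)

# Breadth-first search over the roadmap from start (node 0) to goal (node 1)
parent, queue = {0: None}, deque([0])
while queue:
    u = queue.popleft()
    if u == 1:
        break
    for v in edges[u]:
        if v not in parent:
            parent[v] = u
            queue.append(v)

if 1 in parent:
    path, u = [], 1
    while u is not None:
        path.append(nodes[u])
        u = parent[u]
    print("obstacle-free path found with", len(path), "waypoints")
else:
    print("no path found with this roadmap sample")
```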
Procedia PDF Downloads 155