Search results for: shared frailty survival models
7118 An Effective Modification to Multiscale Elastic Network Model and Its Evaluation Based on Analyses of Protein Dynamics
Authors: Weikang Gong, Chunhua Li
Abstract:
Dynamics plays an essential role in how proteins exert their functions. The elastic network model (ENM), a harmonic potential-based and cost-effective computational method, is a valuable and efficient tool for characterizing the intrinsic dynamical properties encoded in biomacromolecular structures and has been widely used to detect the large-amplitude collective motions of proteins. The Gaussian network model (GNM) and the anisotropic network model (ANM) are the two most often used ENM models, and many ENM variants have been proposed in recent years. Here, we propose a small but effective modification (denoted as the modified mENM) to the multiscale ENM (mENM), in which the least-squares fitting of the weights of the Kirchhoff/Hessian matrices is modified, since the original scheme neglects the details of pairwise interactions. We then compare it with the original mENM, the traditional ENM, and the parameter-free ENM (pfENM) in reproducing the dynamical properties of six representative proteins whose molecular dynamics (MD) trajectories are available at http://mmb.pcb.ub.es/MoDEL/. For B-factor prediction, mENM achieves the best performance among the four ENM models. Additionally, with the weights of the multiscale Kirchhoff/Hessian matrices modified, the modified mGNM/mANM still performs much better than the corresponding traditional ENM and pfENM models. As to the dynamical cross-correlation map (DCCM) calculation, taking the data obtained from MD trajectories as the standard, mENM performs the worst, while the results produced by the modified mENM and pfENM models are close to those from the MD trajectories, with the latter slightly better than the former. Generally, the ANMs perform better than the corresponding GNMs, except for the mENM. Thus, pfANM and the modified mANM, especially the former, perform excellently in dynamical cross-correlation calculation. 
Compared with the GNMs (except for mGNM), the corresponding ANMs can capture quite a number of positive correlations for residue pairs separated by nearly the largest distances, which may be due to the consideration of anisotropy in the ANMs. Further, and encouragingly, the modified mANM displays the best performance in capturing the functional motional modes, followed by the pfANM and traditional ANM models, while mANM fails in all the cases. This suggests that the consideration of long-range interactions is critical for ANM models to reproduce protein functional motions. Based on these analyses, the modified mENM is a promising method for capturing the multiple dynamical characteristics encoded in protein structures. This work helps strengthen the understanding of the elastic network model and provides a valuable guide for researchers utilizing the model to explore protein dynamics.
Keywords: elastic network model, ENM, multiscale ENM, molecular dynamics, parameter-free ENM, protein structure
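For readers unfamiliar with the GNM formalism referenced in this abstract, the standard textbook relations (general GNM background, not equations taken from the paper) connect the Kirchhoff matrix to the predicted B-factors and the cross-correlation map:

```latex
% Kirchhoff (connectivity) matrix for residue pair distance r_{ij} and cutoff r_c:
\Gamma_{ij} =
\begin{cases}
-1 & i \neq j,\ r_{ij} \le r_c \\
0  & i \neq j,\ r_{ij} > r_c \\
-\sum_{k \neq i} \Gamma_{ik} & i = j
\end{cases}
% Predicted B-factor of residue i (\gamma is the uniform spring constant):
B_i = \frac{8\pi^2 k_B T}{\gamma}\,\left[\Gamma^{-1}\right]_{ii}
% Normalized dynamical cross-correlation (DCCM entry) between residues i and j:
C_{ij} = \frac{\left[\Gamma^{-1}\right]_{ij}}
              {\sqrt{\left[\Gamma^{-1}\right]_{ii}\left[\Gamma^{-1}\right]_{jj}}}
```

Here $\Gamma^{-1}$ denotes the pseudo-inverse over the non-zero modes; the multiscale variants discussed above replace the single cutoff with a weighted combination of Kirchhoff/Hessian matrices at several cutoffs.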
Procedia PDF Downloads 121
7117 Aggregate Angularity on the Permanent Deformation Zones of Hot Mix Asphalt
Authors: Lee P. Leon, Raymond Charles
Abstract:
This paper presents a method of evaluating the effect of aggregate angularity on hot mix asphalt (HMA) properties and its relationship to permanent deformation resistance. The research concluded that aggregate particle angularity had a significant effect on permanent deformation performance, and that an increase in coarse aggregate angularity increased the resistance of mixes to permanent deformation. A comparison between measured data and the outputs of permanent deformation predictive models showed the limits of existing prediction models. The numerical analysis described the permanent deformation zones and concluded that angularity has an effect on the onset of these zones. Prediction of permanent deformation helps road agencies, and by extension economists and engineers, determine the best approach for maintenance, rehabilitation, and new construction works of the road infrastructure.
Keywords: aggregate angularity, asphalt concrete, permanent deformation, rutting prediction
Procedia PDF Downloads 405
7116 Efficient Neural and Fuzzy Models for the Identification of Dynamical Systems
Authors: Aouiche Abdelaziz, Soudani Mouhamed Salah, Aouiche El Moundhe
Abstract:
The present paper addresses the utilization of Artificial Neural Networks (ANNs) and Fuzzy Inference Systems (FISs) for the identification and control of dynamical systems with some degree of uncertainty. Because ANNs and FISs have an inherent ability to approximate functions and to adapt to changes in inputs and parameters, they can be used to control systems too complex for linear controllers. In this work, we show how ANNs and FISs can be combined to form networks that learn from external data. We then present input structures that can be used along with ANNs and FISs to model non-linear systems. Four systems were used to test the identification and control structures proposed. The results show that the ANNs and FISs used (with the back-propagation algorithm) were efficient in modeling and controlling the non-linear plants.
Keywords: non-linear systems, fuzzy set models, neural network, control law
Procedia PDF Downloads 212
7115 The Effect of Supply Chain Integration on Information Sharing
Authors: Khlif Hamadi
Abstract:
Supply chain integration has become a potentially valuable way of securing shared information and improving supply chain performance, since competition is no longer between organizations but among supply chains. This research conceptualizes and develops three dimensions of supply chain integration (integration with customers, integration with suppliers, and interorganizational integration) and tests the relationships between supply chain integration, information sharing, and supply chain performance. Four types of information sharing are considered, namely information sharing with customers, information sharing with suppliers, inter-functional information sharing, and intra-organizational information sharing; supply chain performance is represented by four constructs: costs and expenses, asset utilization, supply chain reliability, and supply chain flexibility and responsiveness. The theoretical and practical implications of the study, as well as directions for future research, are discussed.
Keywords: supply chain integration, supply chain management, information sharing, supply chain performance
Procedia PDF Downloads 261
7114 Heat Transfer Enhancement by Turbulent Impinging Jet with Jet's Velocity Field Excitations Using OpenFOAM
Authors: Naseem Uddin
Abstract:
Impinging jets are used in a variety of engineering and industrial applications. This paper is based on numerical simulations of heat transfer by a turbulent impinging jet with velocity field excitations, using different Reynolds-Averaged Navier-Stokes (RANS) models. Detached Eddy Simulations are also conducted to investigate the differences in the prediction capabilities of these two simulation approaches. The excited jet is simulated in the non-commercial CFD code OpenFOAM with the goal of understanding the influence of the dynamics of the impinging jet on heat transfer. The jet's excitation frequencies are altered keeping in view the preferred mode of the jet. The Reynolds number based on mean velocity and diameter is 23,000, and the jet's outlet-to-target wall distance is two diameters. It is found that heat transfer at the target wall can be influenced by a judicious selection of amplitude and frequencies.
Keywords: excitation, impinging jet, natural frequency, turbulence models
Procedia PDF Downloads 273
7113 The Chemistry in the Video Game No Man’s Sky
Authors: Diogo Santos, Nelson Zagalo, Carla Morais
Abstract:
No Man’s Sky (NMS) is a sci-fi video game about survival and exploration in which players fly spaceships, search for elements, and use them to survive. NMS is not a 'serious game', and not all the science in the game is presented with scientific evidence. To find out how players felt about the scientific content of the game and how they perceive the chemistry in it, a survey was sent to NMS players, collecting answers from 124 respondents in 23 countries. Chemophobia is still a common phenomenon when chemistry or chemicals are a subject of discussion, yet 68.9% of our respondents showed a positive attitude towards the presence of chemistry in NMS, with 57% stating that playing the video game motivated them to know more about science. 8% of the players stated that NMS often prompted conversations about the science in the video game between them and teachers, parents, or friends. These results give us ideas on how an entertainment game can potentially help scientists, educators, and science communicators reach a growing, evolving, vibrant, diverse, and demanding audience.
Keywords: digital games, science communication, chemistry, informal learning, No Man’s Sky
Procedia PDF Downloads 110
7112 Development of Enhanced Data Encryption Standard
Authors: Benjamin Okike
Abstract:
There is a need to hide information travelling along the information superhighway. Today, information relating to the survival of individuals, organizations, or government agencies is transmitted from one point to another. Adversaries are always on the watch along the superhighway, ready to intercept any information that would enable them to inflict psychological ‘injuries’ on their victims. But with information encryption, this can be prevented completely or at worst reduced to the barest minimum. There is no doubt that many encryption techniques have been proposed, and some of them are already being implemented. However, adversaries keep discovering loopholes in them to perpetrate their evil plans. In this work, we propose the enhanced data encryption standard (EDES), which deploys randomly generated numbers as an encryption method. Each time encryption is to be carried out, a new set of random numbers is generated, thereby making it almost impossible for cryptanalysts to decrypt any information encrypted with this newly proposed method.
Keywords: encryption, enhanced data encryption, encryption techniques, information security
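The abstract does not give the internals of EDES, but the general idea of encrypting with a freshly generated set of random numbers can be sketched as a keystream XOR (one-time-pad style). Everything below, function names included, is an illustrative assumption rather than the authors' implementation; note that such a scheme is only secure if the keystream is exchanged secretly and never reused.

```python
import secrets

def generate_keystream(length: int) -> bytes:
    # A fresh, cryptographically random set of numbers for every message.
    return secrets.token_bytes(length)

def xor_bytes(data: bytes, keystream: bytes) -> bytes:
    # XOR is its own inverse: applying the same keystream twice restores the data.
    return bytes(d ^ k for d, k in zip(data, keystream))

message = b"transmit along the superhighway"
keystream = generate_keystream(len(message))
ciphertext = xor_bytes(message, keystream)
recovered = xor_bytes(ciphertext, keystream)
```

Because a new keystream is drawn per encryption, identical plaintexts yield different ciphertexts on each run, which matches the property the abstract emphasizes.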
Procedia PDF Downloads 150
7111 A Comparison of YOLO Family for Apple Detection and Counting in Orchards
Authors: Yuanqing Li, Changyi Lei, Zhaopeng Xue, Zhuo Zheng, Yanbo Long
Abstract:
In agricultural production and breeding, implementing automatic picking robots in orchard farming to reduce human labour and error is challenging. Their core function is automatic identification based on machine vision. This paper focuses on apple detection and counting in orchards and implements several deep learning methods. Extensive datasets are used, and a semi-automatic annotation method is proposed. The proposed deep learning models belong to the state-of-the-art YOLO family. In view of the essence of the models with various backbones, a detailed multi-dimensional comparison is made in terms of counting accuracy, mAP, and model memory, laying the foundation for realising automatic precision agriculture.
Keywords: agricultural object detection, deep learning, machine vision, YOLO family
Procedia PDF Downloads 197
7110 Measuring Housing Quality Using Geographic Information System (GIS)
Authors: Silvija Šiljeg, Ante Šiljeg, Ivan Marić
Abstract:
Measuring housing quality is done on objective and subjective levels using different indicators. In this research, five urban housing indicators, formed from 58 variables drawn from different housing domains, were used. The aims of the research were to measure housing quality using a GIS-based approach and to detect critical points of housing, using the example of the Croatian coastal town of Zadar. The purposes of GIS in the research are to generate models of housing quality indexes by standardisation and aggregation of variables and to examine the accuracy of the housing quality index model. The accuracy analysis was carried out on the variable referring to the availability of educational facilities. By defining weighted coefficients and using different GIS methods, high, middle, and low housing quality zones were determined. The obtained results can be of use to town planners, spatial planners, and town authorities in the process of generating decisions, guidelines, and spatial interventions.
Keywords: housing quality, GIS, housing quality index, indicators, models of housing quality
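The standardisation-and-aggregation step described above can be sketched as a weighted composite index. The indicator names, values, and weights below are invented for illustration; the paper's actual 58 variables and weighting coefficients are not specified in this abstract.

```python
def min_max_standardize(values):
    # Rescale raw variable values to [0, 1] (benefit-type: higher is better;
    # cost-type variables should be inverted before this step).
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def housing_quality_index(indicator_columns, weights):
    # indicator_columns: indicator name -> raw values per spatial unit;
    # weights: indicator name -> weighted coefficient (summing to 1).
    standardized = {k: min_max_standardize(v) for k, v in indicator_columns.items()}
    n = len(next(iter(indicator_columns.values())))
    return [sum(weights[k] * standardized[k][i] for k in weights) for i in range(n)]

# Hypothetical benefit-type indicators for three spatial units:
indicators = {"green_area_share": [10.0, 40.0, 25.0],
              "service_accessibility": [0.2, 1.0, 0.6]}
weights = {"green_area_share": 0.5, "service_accessibility": 0.5}
index = housing_quality_index(indicators, weights)
```

High, middle, and low housing quality zones could then be obtained by thresholding the resulting index values per spatial unit.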
Procedia PDF Downloads 299
7109 A Mixed Method Approach for Modeling Entry Capacity at Rotary Intersections
Authors: Antonio Pratelli, Lorenzo Brocchini, Reginald Roy Souleyrette
Abstract:
A rotary is a traffic circle intersection where vehicles entering from the branches give priority to the circulating flow. Vehicles entering the intersection from converging roads move around the central island and weave out of the circle into their desired exiting branch. This creates merging and diverging conflicts between any entry and its successive exit, i.e., a section. Therefore, rotary capacity models are usually based on the weaving of the different movements in any section of the circle, and the maximum rate of flow is then related to each weaving section of the rotary. Nevertheless, the single-section capacity value does not yield the typical performance characteristics of the intersection, such as the entry average delay, which is directly linked to its level of service. From another point of view, modern roundabout capacity models are based on the limitation of the flow entering from a single entrance due to the amount of flow circulating in front of the entrance itself; such models generally also lead to a performance evaluation. This paper aims to incorporate a modern roundabout capacity model into an old rotary capacity method to obtain from the latter the single-entry capacity and ultimately the related performance indicators. Put simply, the main objective is to calculate the average delay of each single rotary entrance in order to apply the most common Highway Capacity Manual (HCM) criteria. The paper is organized as follows: firstly, the rotary and roundabout capacity models are sketched, and a brief introduction to the model combination technique is given with some practical instances. The next section summarizes the old TRRL rotary capacity model and the most recent HCM 7th edition modern roundabout capacity model. 
Then, the two models are combined through an iteration-based algorithm, specially set up and linked to the concept of roundabout total capacity, i.e., the value reached under a traffic flow pattern leading to the simultaneous congestion of all roundabout entrances. The solution is the average delay for each entrance of the rotary, by which its respective level of service is estimated. In view of further experimental applications, at this research stage, a collection of existing rotary intersections operating under the priority-to-circle rule has already started, both in the US and in Italy. The rotaries have been selected by direct inspection of aerial photos through a map viewer, namely Google Earth. Each instance has been recorded by location, general setting (urban or rural), and its main geometrical patterns. Finally, concluding remarks are drawn, and a discussion of some further research developments is opened.
Keywords: mixed methods, old rotary and modern roundabout capacity models, total capacity algorithm, level of service estimation
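As background for the HCM side of such a combination, modern roundabout entry capacity follows an exponential form in the conflicting (circulating) flow, and average control delay follows from capacity and demand. The sketch below uses illustrative single-lane coefficients; the calibrated values differ between HCM editions and lane configurations, and the paper's actual iteration over total capacity is not reproduced here.

```python
import math

def entry_capacity(conflicting_flow_pcph, a=1380.0, b=1.02e-3):
    # HCM-style exponential entry-capacity model: c = A * exp(-B * v_c).
    # A and B are illustrative single-lane coefficients (assumed values).
    return a * math.exp(-b * conflicting_flow_pcph)

def control_delay(entry_flow, capacity, period_h=0.25):
    # HCM-style average control delay (s/veh) for one roundabout entry,
    # with x = degree of saturation and a 15-minute analysis period.
    x = entry_flow / capacity
    return (3600.0 / capacity
            + 900.0 * period_h * (x - 1 + math.sqrt((x - 1) ** 2
                + (3600.0 / capacity) * x / (450.0 * period_h)))
            + 5.0 * min(x, 1.0))

cap = entry_capacity(500.0)        # entry facing 500 pc/h of circulating flow
delay = control_delay(400.0, cap)  # average delay for 400 pc/h of demand
```

The delay value would then be mapped to a level of service via the usual HCM thresholds; in the paper's algorithm, the circulating flows themselves are recomputed iteratively until the total-capacity condition is met.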
Procedia PDF Downloads 86
7108 Fuzzy Data, Random Drift, and a Theoretical Model for the Sequential Emergence of Religious Capacity in Genus Homo
Authors: Margaret Boone Rappaport, Christopher J. Corbally
Abstract:
The ancient ape ancestral population from which living great ape and human species evolved had demographic features affecting their evolution. The population was large, had great genetic variability, and natural selection was effective at honing adaptations. The emerging populations of chimpanzees and humans were affected more by founder effects and genetic drift because they were smaller. Natural selection did not disappear, but it was not as strong. Consequences of the 'population crash' and the human effective population size are introduced briefly. The history of the ancient apes is written in the genomes of living humans and great apes. The expansion of the brain began before the human line emerged. Coalescence times for some genes are very old – up to several million years, long before Homo sapiens. The mismatch between gene trees and species trees highlights the anthropoid speciation processes, and gives the human genome history a fuzzy, probabilistic quality. However, it suggests traits that might form a foundation for capacities emerging later. A theoretical model is presented in which the genomes of early ape populations provide the substructure for the emergence of religious capacity later on the human line. The model does not search for religion, but its foundations. It suggests a course by which an evolutionary line that began with prosimians eventually produced a human species with biologically based religious capacity. The model of the sequential emergence of religious capacity relies on cognitive science, neuroscience, paleoneurology, primate field studies, cognitive archaeology, genomics, and population genetics. And, it emphasizes five trait types: (1) Documented, positive selection of sensory capabilities on the human line may have favored survival, but also eventually enriched human religious experience. 
(2) The bonobo model suggests a possible down-regulation of aggression and increase in tolerance while feeding, as well as paedomorphism – but, in a human species that remains cognitively sharp (unlike the bonobo). The two species emerged from the same ancient ape population, so it is logical to search for shared traits. (3) An up-regulation of emotional sensitivity and compassion seems to have occurred on the human line. This finds support in modern genetic studies. (4) The authors’ published model of morality's emergence in Homo erectus encompasses a cognitively based, decision-making capacity that was hypothetically overtaken, in part, by religious capacity. Together, they produced a strong, variable, biocultural capability to support human sociability. (5) The full flowering of human religious capacity came with the parietal expansion and smaller face (klinorhynchy) found only in Homo sapiens. Details from paleoneurology suggest the stage was set for human theologies. Larger parietal lobes allowed humans to imagine inner spaces, processes, and beings, and, with the frontal lobe, led to the first theologies composed of structured and integrated theories of the relationships between humans and the supernatural. The model leads to the evolution of a small population of African hominins that was ready to emerge with religious capacity when the species Homo sapiens evolved two hundred thousand years ago. By 50-60,000 years ago, when human ancestors left Africa, they were fully enabled.
Keywords: genetic drift, genomics, parietal expansion, religious capacity
Procedia PDF Downloads 341
7107 Multi-Level Security Measures in Cloud Computing
Authors: Shobha G. Ranjan
Abstract:
Cloud computing is an emerging, on-demand, internet-based technology. A variety of services such as software, hardware, data storage, and infrastructure can be shared through cloud computing. This technology is highly reliable, cost-effective, and scalable in nature. It is a must that only authorized users access these services. Further, the time granted to access these services should be taken into account for proper accounting purposes. Currently, many organizations implement security measures in many different ways to provide the best cloud infrastructure to their clients, but such measures need not stop there. This paper presents a multi-level security measure technique that is in accordance with the OSI model. Details of the proposed multi-level security measures technique are presented along with the architecture, activities, algorithms, and the probability of success in breaking authentication.
Keywords: cloud computing, cloud security, integrity, multi-tenancy, security
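One way to reason about the "probability of success in breaking authentication" mentioned above is that independent security levels multiply: a single guess must defeat every layer at once, so each added level shrinks the odds. The keyspace sizes below are purely illustrative assumptions, not figures from the paper.

```python
from math import prod

def breach_probability(level_keyspaces):
    # Probability that one random guess defeats every authentication layer,
    # assuming the layers are independent and each guess is uniform.
    return prod(1.0 / n for n in level_keyspaces)

# Hypothetical three-level scheme: a 4-digit PIN, a token drawn from a
# million-element space, and a 256-value challenge byte.
p = breach_probability([10**4, 10**6, 256])
```

This also shows why multi-level designs degrade gracefully: compromising one layer only removes its factor from the product rather than exposing the whole system.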
Procedia PDF Downloads 501
7106 Application of Stochastic Models on the Portuguese Population and Distortion to Workers Compensation Pensioners Experience
Authors: Nkwenti Mbelli Njah
Abstract:
This research was motivated by a project requested by AXA on the topic of pensions payable under the workers' compensation (WC) line of business. There are two types of pensions: the compulsorily recoverable and the not compulsorily recoverable. A pension is compulsorily recoverable for a victim when there is less than 30% disability and the pension amount per year is less than six times the minimum national salary. The law defines that the mathematical provisions for compulsorily recoverable pensions must be calculated by applying the following bases: mortality table TD88/90 and a rate of interest of 5.25% (possibly with a management rate). Managing pensions that are not compulsorily recoverable is a more complex task because the technical bases are not defined by law and much more complex computations are required. In particular, companies have to predict the amount of payments discounted reflecting the mortality effect for all pensioners (this task is monitored monthly in AXA). The purpose of this research was thus to develop a stochastic model for the future mortality of the workers' compensation pensioners of both the Portuguese market and the AXA portfolio. Not only is past mortality modeled; projections of future mortality are also made for the general population of Portugal as well as for the two portfolios mentioned earlier. The global model was split into two parts: a stochastic model for population mortality which allows for forecasts, combined with a point estimate from a portfolio mortality model obtained through three different relational models (Cox Proportional, Brass Linear and Workgroup PLT). The one-year death probabilities for ages 0-110 for the period 2013-2113 are obtained for the general population and the portfolios. These probabilities are used to compute different life table functions as well as the not compulsorily recoverable reserves for each of the models required for the pensioners, their spouses, and children under 21. 
The results obtained are compared with the not compulsorily recoverable reserves computed using the static mortality table (TD 73/77) currently used by AXA, to see the impact on this reserve if AXA adopted the dynamic tables.
Keywords: compulsorily recoverable, life table functions, relational models, workers' compensation pensioners
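Of the three relational models named above, the Brass Linear model is the simplest to sketch: the portfolio's survivorship is assumed to be linear in the standard population's survivorship on the logit scale, so two parameters relate the portfolio to the population table. The life table values and parameters below are invented for illustration, not taken from the study.

```python
import math

def logit(lx):
    # Brass logit of the survivorship function l(x), with l(x) in (0, 1).
    return 0.5 * math.log((1.0 - lx) / lx)

def fit_brass(standard_lx, portfolio_lx):
    # Least-squares fit of Y_port = alpha + beta * Y_std (Brass linear model).
    xs = [logit(l) for l in standard_lx]
    ys = [logit(l) for l in portfolio_lx]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
    alpha = my - beta * mx
    return alpha, beta

def apply_brass(standard_lx, alpha, beta):
    # Rebuild a survivorship curve from the fitted relation (inverse logit).
    return [1.0 / (1.0 + math.exp(2.0 * (alpha + beta * logit(l))))
            for l in standard_lx]

# Toy standard table; the "portfolio" is generated with known alpha, beta,
# so the fit should recover them exactly.
standard = [0.99, 0.95, 0.85, 0.70, 0.50]
portfolio_lx = apply_brass(standard, 0.1, 1.2)
alpha, beta = fit_brass(standard, portfolio_lx)
```

In the study's setting, `standard` would come from the projected population table, and the fitted relation would transfer those dynamics to the pensioner portfolios.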
Procedia PDF Downloads 164
7105 DenseNet and Autoencoder Architecture for COVID-19 Chest X-Ray Image Classification and Improved U-Net Lung X-Ray Segmentation
Authors: Jonathan Gong
Abstract:
Purpose: AI-driven solutions are at the forefront of many pathology and medical imaging methods. Using algorithms designed to better the experience of medical professionals within their respective fields, the efficiency and accuracy of diagnosis can improve. In particular, X-rays are a fast and relatively inexpensive test that can diagnose diseases. In recent years, X-rays have not been widely used to detect and diagnose COVID-19. The underuse of X-rays is mainly due to low diagnostic accuracy and confounding with pneumonia, another respiratory disease. However, research in this field has expressed the possibility that artificial neural networks can successfully diagnose COVID-19 with high accuracy. Models and Data: The dataset used is the COVID-19 Radiography Database. This dataset includes images and masks of chest X-rays under the labels of COVID-19, normal, and pneumonia. The classification model developed uses an autoencoder and a pre-trained convolutional neural network (DenseNet201) to provide transfer learning to the model. The model then uses a deep neural network to finalize the feature extraction and predict the diagnosis for the input image. This model was trained on 4035 images and validated on 807 images separate from those used for training. The images used to train the classification model include an important feature: the pictures are cropped beforehand to eliminate distractions when training the model. The image segmentation model uses an improved U-Net architecture. This model is used to extract the lung mask from the chest X-ray image. It is trained on 8577 images and validated on a validation split of 20%. The models are then evaluated on an external dataset, and their accuracy, precision, recall, f1-score, IOU, and loss are calculated. Results: The classification model achieved an accuracy of 97.65% and a loss of 0.1234 when differentiating COVID-19-infected, pneumonia-infected, and normal lung X-rays. 
The segmentation model achieved an accuracy of 97.31% and an IOU of 0.928. Conclusion: The proposed models can detect COVID-19, pneumonia, and normal lungs with high accuracy and derive the lung mask from a chest X-ray with similarly high accuracy. The hope is for these models to elevate the experience of medical professionals and provide insight into the future of the methods used.
Keywords: artificial intelligence, convolutional neural networks, deep learning, image processing, machine learning
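The IOU figure reported for the segmentation model is intersection-over-union between the predicted and ground-truth lung masks; for binary masks it reduces to a few lines. This is a generic metric sketch with made-up example masks, not the authors' evaluation code.

```python
def iou(mask_a, mask_b):
    # Intersection over union of two binary masks (flattened sequences of 0/1).
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    union = sum(1 for a, b in zip(mask_a, mask_b) if a or b)
    # Two empty masks overlap perfectly by convention.
    return inter / union if union else 1.0

pred  = [1, 1, 0, 0, 1, 0]   # hypothetical predicted lung pixels
truth = [1, 0, 0, 0, 1, 1]   # hypothetical ground-truth lung pixels
score = iou(pred, truth)
```

An IOU of 0.928, as reported above, means the predicted and true lung regions overlap on about 93% of their combined area.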
Procedia PDF Downloads 130
7104 Neuroblastoma in Children and the Potential Involvement of Viruses in Its Pathogenesis
Authors: Ugo Rovigatti
Abstract:
Neuroblastoma (NBL) has epitomized for at least 40 years our understanding of cancer cellular and molecular biology and its potential applications to novel therapeutic strategies. This includes the discovery of the very first oncogene aberrations and of tumorigenesis suppression by differentiation in the 1980s; the potential role of suppressor genes in the 1990s; the relevance of immunotherapy in the first decade of the millennium; and the discovery of additional mutations by NGS technology in the second decade. Similar discoveries were achieved in the majority of human cancers, and similar therapeutic interventions followed the NBL discoveries. Unfortunately, targeted therapies suggested by specific mutations (such as MYCN amplification, MNA, present in a quarter to a fifth of cases) have not elicited therapeutic successes in aggressive NBL, where the prognosis is still dismal. The reasons appear to be linked to tumor heterogeneity, which is particularly evident in NBL but also a clear hallmark of aggressive human cancers generally. The new avenue of cancer immunotherapy (CIT) provided new hope for cancer patients, but we still ignore the cellular and molecular targets. CIT is emblematic of high-risk disease (HR-NBL), since passive anti-GD2 immunotherapy is still providing better survival. We recently critically reviewed and evaluated the literature depicting the genomic landscapes of HR-NBL, coming to the qualified conclusion that, among hundreds of affected genes, potential targets, or chromosomal sites, none correlated with anti-GD2 sensitivity. A better explanation is provided by the Micro-Foci inducing Virus (MFV) model, which predicts that neuroblast infection with MFV, an RNA virus isolated from a cancer cluster (space-time association) of HR-NBL cases, elicits the appearance of MNA and additional genomic aberrations with mechanisms resembling chromothripsis. 
Neuroblasts infected with low titers of MFV amplified MYCN up to 100-fold and became highly transformed and malignant, thus causing neuroblastoma in young rat pups of the SD and Fisher-344 strains and larger tumor masses in nu/nu mice. An association was discovered with GD2, since this glycosphingolipid is also the receptor for the MFV family of viruses (dsRNA viruses). It is concluded that a dsRNA virus, MFV, appears to provide better explanatory mechanisms for the genesis of i) specific genomic aberrations such as MNA; ii) extensive tumor heterogeneity and chromothripsis; and iii) the effects of passive immunotherapy with anti-GD2 monoclonals, and that this and similar models should be further investigated in both pediatric and adult cancers.
Keywords: neuroblastoma, MYCN, amplification, viruses, GD2
Procedia PDF Downloads 100
7103 Emerging Technologies in Distance Education
Authors: Eunice H. Li
Abstract:
This paper discusses and analyses a small portion of the literature that has been reviewed for research work in Distance Education (DE) pedagogies that I am currently undertaking. It begins by presenting a brief overview of Taylor's (2001) five-generation models of Distance Education. The focus of the discussion is on the 5th generation, the Intelligent Flexible Learning Model. For this generation, educational and other institutions make portal access and interactive multi-media (IMM) an integral part of their operations. The paper then takes a brief look at current trends in technologies, for example, smart-watch wearable technology such as the Apple Watch. The emergent trends in technologies carry many new features. These are compared to the features of former DE generations, as is the time span that has elapsed between the generations referred to in Taylor's model. This paper is a work in progress and therefore welcomes new insights, comparisons, and critique of the issues discussed.
Keywords: distance education, e-learning technologies, pedagogy, generational models
Procedia PDF Downloads 462
7102 A Comparative Study of Optimization Techniques and Models to Forecasting Dengue Fever
Abstract:
Dengue is a serious public health issue that causes significant annual economic and welfare burdens on nations. However, enhanced optimization techniques and quantitative modeling approaches can predict the incidence of dengue. By advocating for a data-driven approach, public health officials can make informed decisions, thereby improving the overall effectiveness of sudden disease outbreak control efforts. The National Oceanic and Atmospheric Administration and the Centers for Disease Control and Prevention are the two U.S. Federal Government agencies from which this study draws environmental data. Based on environmental data describing changes in temperature, precipitation, vegetation, and other factors known to affect dengue incidence, several predictive models are constructed that use different machine learning methods to estimate weekly dengue cases. The first step involves preparing the data, which includes handling outliers and missing values to make sure the data is ready for subsequent processing and the creation of an accurate forecasting model. In the second phase, multiple feature selection procedures are applied using various machine learning models and optimization techniques. In the third phase of the research, machine learning models such as the Huber Regressor, Support Vector Machine, Gradient Boosting Regressor (GBR), and Support Vector Regressor (SVR) are compared with several optimization techniques for feature selection, such as Harmony Search and the Genetic Algorithm. In the fourth stage, the models' performance is evaluated using Mean Square Error (MSE), Mean Absolute Error (MAE), and Root Mean Square Error (RMSE). The goal is to select the optimization strategy with the fewest errors, lowest cost, greatest productivity, or maximum potential results. Optimization is widely employed in a variety of industries, including engineering, science, management, mathematics, finance, and medicine. 
An effective optimization method based on Harmony Search and an integrated Genetic Algorithm is introduced for input feature selection, and it shows a notable improvement in the model's predictive accuracy. The predictive models built on the Huber Regressor perform best for both optimization and prediction.
Keywords: deep learning model, dengue fever, prediction, optimization
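Genetic-algorithm feature selection of the kind described above encodes each candidate feature subset as a bit vector and evolves a population toward subsets with lower validation error. The sketch below is a generic minimal GA with a toy fitness function, not the paper's integrated Harmony Search/GA method; all names and parameters are illustrative assumptions.

```python
import random

def ga_feature_selection(n_features, fitness, pop_size=20, generations=40,
                         mutation_rate=0.1, seed=0):
    # Minimal GA over feature subsets encoded as bit vectors.
    # `fitness` maps a bit vector to an error to minimise (e.g. validation RMSE).
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        def pick():
            # Binary tournament selection: the fitter of two random individuals.
            a, b = rng.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b
        children = []
        while len(children) < pop_size - 1:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_features)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < mutation_rate else g for g in child]
            children.append(child)
        pop = children + [best]                       # elitism keeps the best-so-far
        best = min(pop, key=fitness)
    return best

# Toy fitness: pretend a known subset of features is the informative one,
# and score a candidate by how many bits differ from it.
target = [1, 0, 1, 0, 0, 1, 0, 0]
toy_error = lambda bits: sum(b != t for b, t in zip(bits, target))
selected = ga_feature_selection(len(target), toy_error)
```

In the study's setting, the fitness call would instead train a model (e.g. a Huber Regressor) on the selected columns and return its validation RMSE, which is why feature selection dominates the compute cost.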
Procedia PDF Downloads 657
7101 Design Channel Non Persistent CSMA MAC Protocol Model for Complex Wireless Systems Based on SoC
Authors: Ibrahim A. Aref, Tarek El-Mihoub, Khadiga Ben Musa
Abstract:
This paper presents a Carrier Sense Multiple Access (CSMA) communication model based on the SoC design methodology. Such a model can be used to support the modelling of complex wireless communication systems; the use of such a communication model is therefore an important technique in the construction of high-performance communication. SystemC has been chosen because it provides a homogeneous design flow for complex designs (i.e., SoC and IP-based design). We use a swarm system to validate the designed CSMA model and to show the advantages of incorporating communication early in the design process. The wireless communication is created through the modelling of the CSMA protocol, which can be used to achieve communication between all the agents and to coordinate access to the shared medium (channel).
Keywords: systemC, modelling, simulation, CSMA
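The non-persistent CSMA rule being modelled can be expressed abstractly as follows (an untimed Python sketch rather than SystemC; `channel_busy` is a hypothetical carrier-sensing hook, not part of the paper's model):

```python
import random

def non_persistent_csma(channel_busy, max_attempts=10, seed=1):
    """Non-persistent CSMA: sense the channel; if it is busy, back off
    for a random interval and re-sense, instead of monitoring the
    channel continuously until it becomes idle."""
    rng = random.Random(seed)
    for attempt in range(max_attempts):
        if not channel_busy(attempt):
            return attempt        # channel idle: transmit now
        _ = rng.randint(1, 8)     # random backoff (ticks would elapse here)
    return None                   # gave up after max_attempts
```

In a timed SystemC model each backoff would consume simulated channel time; here the attempt index stands in for that clock.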
Procedia PDF Downloads 428
7100 Low SPOP Expression and High MDM2 Expression Are Associated with Tumor Progression and Predict Poor Prognosis in Hepatocellular Carcinoma
Authors: Chang Liang, Weizhi Gong, Yan Zhang
Abstract:
Purpose: Hepatocellular carcinoma (HCC) is a malignant tumor with a high mortality rate and poor prognosis worldwide. Murine double minute 2 (MDM2) regulates the tumor suppressor p53, increasing cancer risk and accelerating tumor progression. Speckle-type POX virus and zinc finger protein (SPOP), a key subunit of the Cullin-RING E3 ligase, inhibits tumorigenesis and progression through the ubiquitination of its downstream substrates. This study aimed to clarify whether SPOP and MDM2 are mutually regulated in HCC and to assess the correlation between SPOP and MDM2 expression and the prognosis of HCC patients. Methods: First, the expression of SPOP and MDM2 in HCC tissues was examined using the TCGA database. Then, 53 paired samples of HCC tumor and adjacent tissues were collected to evaluate the expression of SPOP and MDM2 using immunohistochemistry. The chi-square test or Fisher's exact test was used to analyze the relationship between clinicopathological features and the expression levels of SPOP and MDM2. In addition, Kaplan–Meier curve analysis and the log-rank test were used to investigate the effects of SPOP and MDM2 on the survival of HCC patients. Finally, a multivariate Cox proportional hazards regression model was used to analyze whether the expression levels of SPOP and MDM2 were independent risk factors for the prognosis of HCC patients. Results: Bioinformatics analysis revealed that low expression of SPOP and high expression of MDM2 were related to worse prognosis in HCC patients. The relationships between the expression of SPOP and MDM2 and tumor stem-like features showed opposite trends. Immunohistochemistry showed that SPOP protein was significantly downregulated while MDM2 protein was significantly upregulated in HCC tissue compared with para-cancerous tissue. Tumors with low SPOP expression were related to worse T stage and Barcelona Clinic Liver Cancer (BCLC) stage, whereas tumors with high MDM2 expression were related to worse T stage, M stage, and BCLC stage.
Kaplan–Meier curves showed that HCC patients with high SPOP expression and low MDM2 expression had better survival than those with low SPOP expression and high MDM2 expression (P < 0.05). A multivariate Cox proportional hazards regression model confirmed that a high MDM2 expression level was an independent risk factor for poor prognosis in HCC patients (P < 0.05). Conclusion: The expression of SPOP protein was significantly downregulated, while the expression of MDM2 was significantly upregulated, in HCC. Low expression of SPOP and high expression of MDM2 were associated with malignant progression and poor prognosis in HCC patients, indicating a potential therapeutic target for HCC patients.
Keywords: hepatocellular carcinoma, murine double minute 2, speckle-type POX virus and zinc finger protein, ubiquitination
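The Kaplan–Meier (product-limit) estimate underlying the survival comparison can be sketched as follows (an illustrative pure-Python version with made-up follow-up times, not the study's patient data):

```python
def kaplan_meier(times, events):
    """Product-limit estimator: S(t) is the product over event times of
    (1 - d/n), where d = deaths at time t and n = subjects still at
    risk just before t. events: 1 = death observed, 0 = censored."""
    surv, curve = 1.0, []
    for t in sorted({tt for tt, e in zip(times, events) if e == 1}):
        d = sum(1 for tt, e in zip(times, events) if tt == t and e == 1)
        n = sum(1 for tt in times if tt >= t)   # at risk just before t
        surv *= 1 - d / n
        curve.append((t, surv))
    return curve

# Hypothetical follow-up (months) for a small cohort
curve = kaplan_meier(times=[1, 2, 3, 4, 5], events=[1, 1, 0, 1, 0])
```

Curves for the high- and low-expression groups would then be compared with a log-rank test, as in the study.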
Procedia PDF Downloads 144
7099 Scalable Learning of Tree-Based Models on Sparsely Representable Data
Authors: Fares Hedayatit, Arnauld Joly, Panagiotis Papadimitriou
Abstract:
Many machine learning tasks, such as text annotation, usually require training over very big datasets, e.g., millions of web documents, that can be represented in a sparse input space. State-of-the-art tree-based ensemble algorithms cannot scale to such datasets, since they include operations whose running time is a function of the input space size rather than of the number of non-zero input elements. In this paper, we propose an efficient splitting algorithm to leverage input sparsity within decision tree methods. Our algorithm improves training time over sparse datasets by more than two orders of magnitude, and it has been incorporated in the current version of scikit-learn.org, the most popular open source Python machine learning library.
Keywords: big data, sparsely representable data, tree-based models, scalable learning
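The core idea of a sparsity-aware split search can be illustrated as follows (a simplified sketch of the general principle with a hypothetical storage layout, not the actual scikit-learn implementation):

```python
def split_candidates_sparse(nonzeros, n_rows):
    """Candidate split thresholds for one feature stored sparsely as
    (row_index, value) pairs. The work scales with the number of
    non-zero entries, not with the total number of rows n_rows."""
    distinct = sorted({v for _, v in nonzeros})
    # all implicit zeros contribute one extra candidate value
    if len(nonzeros) < n_rows and 0.0 not in distinct:
        distinct = sorted(distinct + [0.0])
    # split thresholds: midpoints between consecutive distinct values
    return [(a + b) / 2 for a, b in zip(distinct, distinct[1:])]

# Hypothetical feature: only rows 0 and 3 are non-zero out of 6 rows
candidates = split_candidates_sparse([(0, 2.0), (3, 5.0)], n_rows=6)
```

A dense implementation would instead scan all six row values per feature; here only the two stored non-zeros (plus a single implicit zero) are touched.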
Procedia PDF Downloads 263
7098 Numerical Simulation and Experimental Validation of the Tire-Road Separation in Quarter-car Model
Authors: Quy Dang Nguyen, Reza Nakhaie Jazar
Abstract:
The paper investigates the vibration dynamics of tire-road separation for a quarter-car model; this separation model is developed to be close to the real situation, considering that the tire is able to separate from the ground plane. A set of piecewise linear mathematical models is developed that matches the in-contact and no-contact states, to be considered as mother models for further investigations. The bound dynamics are numerically simulated in the time response and phase portraits. The separation analysis may determine which values of the suspension parameters can delay or avoid the no-contact phenomenon, which results in improved ride comfort and elimination of potentially dangerous oscillations. Finally, model verification is carried out in the MSC-ADAMS environment.
Keywords: quarter-car vibrations, tire-road separation, separation analysis, separation dynamics, ride comfort, ADAMS validation
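The piecewise in-contact/no-contact switching can be demonstrated with a minimal one-mass sketch (illustrative parameter values only; the paper's quarter-car model has sprung and unsprung masses plus suspension stiffness and damping):

```python
def simulate_separation(m=40.0, kt=2.0e5, g=9.81, x0=0.005, dt=1e-4, steps=5000):
    """One mass on a tire spring that can only push. x is the height of
    the mass above the static contact point: x >= 0 means no-contact
    (zero tire force), x < 0 means the tire is compressed."""
    x, v = x0, 0.0
    contact_states = []
    for _ in range(steps):
        tire_force = -kt * x if x < 0.0 else 0.0   # piecewise force law
        v += (tire_force / m - g) * dt             # semi-implicit Euler step
        x += v * dt
        contact_states.append(x < 0.0)
    return contact_states

states = simulate_separation()  # mass dropped from 5 mm: falls, contacts, bounces
```

The switch between the two linear sub-models happens at the zero-crossing of x, which is exactly the matching condition the piecewise models in the paper formalize.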
Procedia PDF Downloads 92
7097 Empirical and Indian Automotive Equity Portfolio Decision Support
Authors: P. Sankar, P. James Daniel Paul, Siddhant Sahu
Abstract:
A brief review of the empirical studies on the methodology of stock market decision support would indicate that they are at the threshold of validating the accuracy of the traditional models as well as the fuzzy, artificial neural network, and decision tree models. Many researchers have been attempting to compare these models using various data sets worldwide. However, the research community has yet to reach conclusive confidence in the emerging models. This paper uses automotive sector stock prices from the National Stock Exchange (NSE), India, and analyzes them for intra-sectorial support for stock market decisions. The study identifies the significant variables, and their lags, which affect the price of the stocks, using OLS analysis and decision tree classifiers.
Keywords: Indian automotive sector, stock market decisions, equity portfolio analysis, decision tree classifiers, statistical data analysis
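Testing lagged variables presupposes building a lagged design matrix from the price series; a minimal sketch (with a hypothetical toy series, not NSE data):

```python
def lag_features(series, lags):
    """Design matrix of lagged values for OLS or tree classifiers.
    The row for time t holds series[t - l] for each lag l in lags;
    the target is series[t] itself."""
    max_lag = max(lags)
    X, y = [], []
    for t in range(max_lag, len(series)):
        X.append([series[t - l] for l in lags])
        y.append(series[t])
    return X, y

# Hypothetical daily closing prices, predictors at lags 1 and 2
X, y = lag_features([10, 11, 13, 12, 14], lags=[1, 2])
```

Fitting OLS or a decision tree on (X, y) and inspecting coefficient significance or feature importance then identifies which lags matter, as the study describes.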
Procedia PDF Downloads 485
7096 Impact of Integrated Signals for Doing Human Activity Recognition Using Deep Learning Models
Authors: Milagros Jaén-Vargas, Javier García Martínez, Karla Miriam Reyes Leiva, María Fernanda Trujillo-Guerrero, Francisco Fernandes, Sérgio Barroso Gonçalves, Miguel Tavares Silva, Daniel Simões Lopes, José Javier Serrano Olmedo
Abstract:
Human Activity Recognition (HAR) is having a growing impact on the creation of new applications and is responsible for emerging new technologies. The use of wearable sensors is also an important key to exploring the behavior of the human body when performing activities; such devices are less invasive, and the person is more comfortable. In this study, a database that includes three activities is used. The activities were acquired from inertial measurement unit (IMU) sensors and motion capture (MOCAP) systems. The main objective is to differentiate the performance of four Deep Learning (DL) models: Deep Neural Network (DNN), Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), and the hybrid Convolutional Neural Network-Long Short-Term Memory (CNN-LSTM) model, when considering acceleration, velocity, and position, and to evaluate whether integrating the IMU acceleration to obtain velocity and position yields an increase in performance when used as input to the DL models. These results are also compared with the same type of data provided by the MOCAP system. Although the acceleration data are cleaned before integration, the results show a minimal increase in accuracy for the integrated signals.
Keywords: HAR, IMU, MOCAP, acceleration, velocity, position, feature maps
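The integration step discussed above (deriving velocity and position from IMU acceleration) can be sketched with cumulative trapezoidal integration (hypothetical samples; real pipelines would also filter and correct for integration drift):

```python
def integrate(signal, dt, initial=0.0):
    """Cumulative trapezoidal integration of a uniformly sampled signal."""
    out = [initial]
    for a, b in zip(signal, signal[1:]):
        out.append(out[-1] + 0.5 * (a + b) * dt)
    return out

# Hypothetical IMU acceleration samples (m/s^2) at 100 Hz
acc = [0.0, 1.0, 1.0, 0.0]
vel = integrate(acc, dt=0.01)   # acceleration -> velocity
pos = integrate(vel, dt=0.01)   # velocity -> position
```

The three signals (acc, vel, pos) would then each be windowed and fed to the DL models to compare their contribution, mirroring the study's setup.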
Procedia PDF Downloads 98
7095 Saltwater Intrusion Studies in the Cai River in the Khanh Hoa Province, Vietnam
Authors: B. Van Kessel, P. T. Kockelkorn, T. R. Speelman, T. C. Wierikx, C. Mai Van, T. A. Bogaard
Abstract:
Saltwater intrusion is a common problem in estuaries around the world, as it can hinder the freshwater supply of coastal zones. This problem is likely to grow due to climate change and sea-level rise. The influence of these factors on saltwater intrusion was investigated for the Cai River in the Khanh Hoa province in Vietnam. In addition, the Cai River has high seasonal fluctuations in discharge, leading to increased saltwater intrusion during the dry season. Sea-level rise, river discharge changes, river mouth widening, and a proposed saltwater intrusion prevention dam can all influence saltwater intrusion but have not been quantified for the Cai River estuary. This research used both an analytical and a numerical model to investigate the effects of the aforementioned factors. The analytical model was based on a model proposed by Savenije and was calibrated using limited in situ data. The numerical model was a 3D hydrodynamic model made using the Delft3D4 software. Both the analytical and numerical models agreed with the in situ data, mostly for tidally averaged data. Both models indicated a roughly similar dependence on discharge, agreeing that this parameter had the most severe influence on the modeled saltwater intrusion. Especially for discharges below 10 m3/s, the saltwater was predicted to reach further than 10 km. In the models, both sea-level rise and river widening mainly resulted in salinity increments of up to 3 kg/m3 in the middle part of the river. The sea-level rise predicted for 2070 was simulated to lead to an increase of 0.5 km in saltwater intrusion length. Furthermore, the effect of the saltwater intrusion dam seemed significant in the model used, but only for the highest position of the gate.
Keywords: Cai River, hydraulic models, river discharge, saltwater intrusion, tidal barriers
Procedia PDF Downloads 112
7094 Review of the Model-Based Supply Chain Management Research in the Construction Industry
Authors: Aspasia Koutsokosta, Stefanos Katsavounis
Abstract:
This paper reviews the model-based qualitative and quantitative Operations Management research in the context of Construction Supply Chain Management (CSCM). The construction industry has traditionally been blamed for low productivity, cost and time overruns, waste, high fragmentation, and adversarial relationships. It has been slower than other industries to employ the Supply Chain Management (SCM) concept and to develop models that support decision-making and planning. However, over the last decade there has been a distinct shift from a project-based to a supply-based approach to construction management. CSCM has emerged as a promising new management tool for construction operations, improving the performance of construction projects in terms of cost, time, and quality. Modeling the Construction Supply Chain (CSC) offers the means to reap the benefits of SCM, make informed decisions, and gain competitive advantage. Different modeling approaches and methodologies have been applied in the multi-disciplinary and heterogeneous research field of CSCM. The literature review reveals that a considerable percentage of CSC modeling consists of conceptual or process models which discuss general management frameworks and do not relate to acknowledged soft OR methods. We particularly focus on the model-based quantitative research and categorize the CSCM models depending on their scope, mathematical formulation, structure, objectives, solution approach, software used, and decision level. Although over the last few years there has clearly been an increase in research papers on quantitative CSC models, we find that the relevant literature is very fragmented, with limited applications of simulation, mathematical programming, and simulation-based optimization. Most applications are project-specific or study only parts of the supply system.
Thus, some complex interdependencies within construction are neglected and the implementation of integrated supply chain management is hindered. We conclude this paper by giving future research directions and emphasizing the need to develop robust mathematical optimization models for the CSC. We stress that CSC modeling needs a multi-dimensional, system-wide, and long-term perspective. Finally, prior applications of SCM to other industries have to be taken into account in order to model CSCs, but not without the consequent reform of generic concepts to match the unique characteristics of the construction industry.
Keywords: construction supply chain management, modeling, operations research, optimization, simulation
Procedia PDF Downloads 503
7093 Monte Carlo Simulation of X-Ray Spectra in Diagnostic Radiology and Mammography Using MCNP4C
Authors: Sahar Heidary, Ramin Ghasemi Shayan
Abstract:
The Monte Carlo N-particle radiation transport computer program (MCNP4C) was used for the simulation of x-ray spectra in diagnostic radiology and mammography. The electrons were transported until they slow down and stop in the target. Both bremsstrahlung and characteristic x-ray production were considered in this study. The x-ray spectra predicted by several computational models used in the diagnostic radiology and mammography energy range have been assessed by comparison with measured spectra, and their effect on the calculation of absorbed dose and effective dose (ED) delivered to the adult ORNL hermaphroditic phantom quantified. This comprises empirical models (TASMIP and MASMIP), semi-empirical models (X-rayb&m, X-raytbc, XCOMP, IPEM, Tucker et al., and Blough et al.), and Monte Carlo modeling (EGS4, ITS3.0, and MCNP4C). Images obtained using synchrotron radiation (SR) with both screen-film and the CR system were compared with images of the same samples obtained with digital mammography equipment. In view of the good quality of the results obtained, the CR system was used in two mammographic examinations with SR. For each mammography unit, the protocol comprised bilateral mediolateral oblique (MLO) and craniocaudal (CC) mammograms obtained in a woman with fatty breasts and a woman with dense breasts. Reviewers evaluated the resulting clinical images and noted any deficiencies that would lead to a decision to reject the images.
Keywords: mammography, monte carlo, effective dose, radiology
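The basic Monte Carlo photon-transport idea behind MCNP can be illustrated with a toy absorb-only slab model (a pedagogical sketch only, not MCNP4C's physics, which also handles scattering, bremsstrahlung, and characteristic x-ray production):

```python
import math
import random

def fraction_transmitted(mu, thickness, n=100_000, seed=42):
    """Monte Carlo estimate of photon transmission through a slab.
    Free path lengths are sampled from an exponential distribution with
    linear attenuation coefficient mu; a photon is absorbed at its
    first interaction (no scattering modeled)."""
    rng = random.Random(seed)
    passed = 0
    for _ in range(n):
        # inverse-CDF sampling: path = -ln(u)/mu with u in (0, 1]
        path = -math.log(1.0 - rng.random()) / mu
        if path > thickness:
            passed += 1
    return passed / n

# For mu = 0.5 per cm and a 2 cm slab, the estimate approaches exp(-1)
estimate = fraction_transmitted(mu=0.5, thickness=2.0)
```

Full codes like MCNP4C track each particle through many such sampled interactions, tallying energy deposition to obtain spectra and dose.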
Procedia PDF Downloads 131
7092 Evaluation of Practicality of On-Demand Bus Using Actual Taxi-Use Data through Exhaustive Simulations
Authors: Jun-ichi Ochiai, Itsuki Noda, Ryo Kanamori, Keiji Hirata, Hitoshi Matsubara, Hideyuki Nakashima
Abstract:
We conducted exhaustive simulations for data assimilation and evaluation of service quality under various settings in a new shared transportation system, called SAVS. Computational social simulation is a key technology for designing recent social services, like SAVS, as new transportation services. One open issue in SAVS was to determine the service scale through social simulation. Using our exhaustive simulation framework, OACIS, we performed data assimilation and evaluated the effects of SAVS based on actual taxi-use data at Tajimi city, Japan. Finally, we obtained the conditions for realizing the new service with reasonable service quality.
Keywords: on-demand bus system, social simulation, data assimilation, exhaustive simulation
Procedia PDF Downloads 321
7091 Analytics Model in a Telehealth Center Based on Cloud Computing and Local Storage
Authors: L. Ramirez, E. Guillén, J. Sánchez
Abstract:
Some of the main goals of telecare, such as monitoring, treatment, and telediagnosis, are achieved through the integration of applications with specific appliances. In order to achieve a coherent model that integrates software, hardware, and healthcare systems, different telehealth models with the Internet of Things (IoT), cloud computing, artificial intelligence, etc., have been implemented, and their advantages are still under analysis. In this paper, we propose an integrated model based on an IoT architecture and a cloud computing telehealth center. An analytics module is presented as a solution to support the diagnosis of certain diseases. Specific features are then compared with recently deployed conventional models in telemedicine. The main advantage of this model is the ability to control the security and privacy of patient information and to optimize the processing and acquisition of clinical parameters according to technical characteristics.
Keywords: analytics, telemedicine, internet of things, cloud computing
Procedia PDF Downloads 325
7090 Development of People's Participation in Environmental Development in Pathumthani Province
Authors: Sakapas Saengchai
Abstract:
This study on the development of people's participation in environmental development used a qualitative research method. Data were collected through participant observation, in-depth interviews, and group discussions in Pathumthani province. The study indicated that: 1) people should be aware of environmental information from government agencies; 2) people in the community should be able to brainstorm and exchange information about community environmental development; 3) people should have a role alongside community leaders; 4) people in the community should have a role to play in the implementation of projects and activities in the development of the environment; and 5) citizens, community leaders, and the village committee should direct the development of the area, maintaining the community environment through shared decision-making. The study emphasizes the process of participation, self-reliance, mutual help, and responsibility for one's own community. Community empowerment strengthens the sustainable spatial development of the environment.
Keywords: people, participation, community, environment
Procedia PDF Downloads 280
7089 A Method to Enhance the Accuracy of Digital Forensic in the Absence of Sufficient Evidence in Saudi Arabia
Authors: Fahad Alanazi, Andrew Jones
Abstract:
Digital forensics seeks to achieve the successful investigation of digital crimes by obtaining acceptable evidence from digital devices that can be presented in a court of law. The digital forensics investigation is therefore normally performed through a number of phases in order to achieve the required level of accuracy in the investigation processes. Since 1984, a number of models and frameworks have been developed to support the digital investigation processes. In this paper, we review a number of the investigation processes that have been produced over the years and introduce a proposed digital forensic model based on the scope of the Saudi Arabia investigation process. The proposed model has been integrated with existing models of the investigation processes and adds a new phase to deal with situations where there is initially insufficient evidence.
Keywords: digital forensics, process, metadata, Traceback, Saudi Arabia
Procedia PDF Downloads 359