Search results for: improvement of model accuracy and reliability
19584 Reconstruction of a Genome-Scale Metabolic Model to Simulate Uncoupled Growth of Zymomonas mobilis
Authors: Maryam Saeidi, Ehsan Motamedian, Seyed Abbas Shojaosadati
Abstract:
Zymomonas mobilis is a well-known example of the uncoupled growth phenomenon. This microorganism also has a unique metabolism that degrades glucose through the Entner–Doudoroff (ED) pathway. In this paper, a genome-scale metabolic model including 434 genes, 757 reactions and 691 metabolites was reconstructed to simulate uncoupled growth and to study its effect on flux distribution in the central metabolism. The model correctly predicted that ATPase is activated at the experimental growth yields of Z. mobilis. The flux distribution obtained from the model indicates that the major carbon flux passes through the ED pathway, resulting in the production of ethanol. Small amounts of the carbon source entered the pentose phosphate pathway and the TCA cycle to produce biomass precursors. The predicted flux distribution was in good agreement with experimental data. The model results also indicated that Z. mobilis metabolism is able to produce biomass with a maximum growth yield of 123.7 g (mol glucose)-1 if ATP synthase is coupled with growth and produces 82 mmol ATP gDCW-1h-1. Coupling growth and energy reduced ethanol secretion and changed the flux distribution towards the production of biomass precursors.
Keywords: genome-scale metabolic model, Zymomonas mobilis, uncoupled growth, flux distribution, ATP dissipation
Procedia PDF Downloads 486
19583 Cuban's Supply Chains Development Model: Qualitative and Quantitative Impact on Final Consumers
Authors: Teresita Lopez Joy, Jose A. Acevedo Suarez, Martha I. Gomez Acosta, Ana Julia Acevedo Urquiaga
Abstract:
Current trends in business competitiveness indicate the need to manage businesses as supply chains rather than in isolation. The use of strategies aimed at the maximum satisfaction of customers in a network, based on inter-company cooperation, contributes to obtaining successful joint results. In the Cuban economic context, the development of productive linkages to achieve integrated management of supply chains is considered a key aspect. In order to achieve this jump, it is necessary to develop acting capabilities in the entities that make up the chains through a systematic procedure that leads to a management model in consonance with the environment. The objective of the research is to design a model and procedure for the development of integrated management of supply chains in economic entities. The results obtained are the Model and the Procedure for the Development of Supply Chains Integrated Management (MP-SCIM). The Model is based on the development of logistics in the network actors, joint work between companies, collaborative planning and the monitoring of a main indicator defined according to the end customers. The application Procedure starts from a well-founded need for development in a supply chain and focuses on training entrepreneurs as doers. Characterization and diagnosis are carried out in order to later define the design of the network and the relationships between the companies. Feedback is taken into account as a method of updating the conditions and of focusing the objectives according to the final customers. The MP-SCIM is the result of systematic work with a supply chain approach in companies that have consolidated as coordinators of their networks. The cases of the edible oil chain and of explosives for the construction sector reflect the most remarkable advances, since they have applied this approach for more than 5 years and maintain it as a general strategy of successful development.
The edible oil trading company experienced a jump in sales. In 2006, the company started the analysis in order to define the supply chain, apply diagnosis techniques, define problems and implement solutions. The involvement of management and the progressive formation of performance capacities in the personnel allowed the application of tools according to the context. The company that coordinates the explosives chain for the construction sector shows adequate training, with independence and opportunity in the face of different situations and variations in its business environment. The appropriation of tools and techniques for the analysis and implementation of proposals is a characteristic feature of this case. The coordinating entity applies integrated supply chain management to its decisions based on the timely training of the action capabilities necessary for each situation. Other cases of study and application that validate these tools are also detailed in this paper, and they highlight the results of generalization in quantitative and qualitative improvement according to the final clients. These cases are: teaching literature in universities, agricultural products of local scope and medicine supply chains.
Keywords: integrated management, logistic system, supply chain management, tactical-operative planning
Procedia PDF Downloads 153
19582 A Static Android Malware Detection Based on Actual Used Permissions Combination and API Calls
Authors: Xiaoqing Wang, Junfeng Wang, Xiaolan Zhu
Abstract:
The Android operating system has been embraced by most application developers because of its openness and compatibility, which greatly enriches the categories of applications. However, it has become the target of malware attackers due to the lack of strict security supervision mechanisms, which leads to the rapid growth of malware and thus brings serious safety hazards to users. Therefore, it is critical to detect Android malware effectively. Generally, the permissions declared in AndroidManifest.xml reflect the function and behavior of an application to a large extent. Since the current Android system places no restrictions on the number of permissions an application can request, developers tend to request more permissions than are actually needed in order to ensure that the application runs successfully, which results in the abuse of permissions. However, some traditional detection methods only consider the requested permissions and ignore whether they are actually used, which leads to the incorrect identification of some malware. Therefore, a machine learning detection method based on the actually used permissions combination and API calls is put forward in this paper. Several experiments were conducted to evaluate the methodology. The results show that it can detect unknown malware effectively, with a higher true positive rate and accuracy while maintaining a low false positive rate. The AdaBoostM1 (J48) classification algorithm combined with an information-gain feature selection algorithm gave the best detection result, achieving an accuracy of 99.8%, a true positive rate of 99.6% and a false positive rate of 0.
Keywords: android, API Calls, machine learning, permissions combination
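The information-gain feature selection step can be sketched in a few lines of Python. This is a minimal illustration on toy data, not the authors' implementation; the two-feature matrix and labels below are hypothetical:

```python
import math

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum((labels.count(c) / n) * math.log2(labels.count(c) / n)
                for c in set(labels))

def information_gain(feature_column, labels):
    """Information gain of a binary feature (1 = permission/API actually used)."""
    n = len(labels)
    gain = entropy(labels)
    for value in (0, 1):
        subset = [lab for f, lab in zip(feature_column, labels) if f == value]
        if subset:
            gain -= len(subset) / n * entropy(subset)
    return gain

# Toy data: rows are apps, columns are actually-used permissions/API calls
X = [[1, 0], [1, 1], [0, 0], [0, 1]]
y = ["malware", "malware", "benign", "benign"]

# Rank features by information gain; the top-ranked ones feed the classifier
gains = [information_gain([row[j] for row in X], y) for j in range(len(X[0]))]
```

Here feature 0 perfectly separates the two classes (gain of 1 bit) while feature 1 is uninformative (gain 0); in the paper's setting, the selected features would then be passed to the boosted decision-tree classifier.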
Procedia PDF Downloads 329
19581 Detecting Overdispersion for Mortality AIDS in Zero-inflated Negative Binomial Death Rate (ZINBDR) Co-infection Patients in Kelantan
Authors: Mohd Asrul Affedi, Nyi Nyi Naing
Abstract:
Overdispersion is often present in count data; when it occurs, a Negative Binomial (NB) model is commonly used in place of a standard Poisson model. For the analysis of count-data events such as mortality cases, a Poisson regression model is generally appropriate; however, it becomes unsuitable when the data contain excess zeros, in which case the zero-inflated negative binomial model is appropriate. In this article, we modelled mortality cases as the dependent variable with age as a categorical covariate. The objective of this study is to determine whether overdispersion exists in the mortality data of AIDS co-infection patients in Kelantan.
Keywords: negative binomial death rate, overdispersion, zero-inflation negative binomial death rate, AIDS
Procedia PDF Downloads 463
19580 Hybrid Intelligent Optimization Methods for Optimal Design of Horizontal-Axis Wind Turbine Blades
Authors: E. Tandis, E. Assareh
Abstract:
The optimal shape of MW wind turbine blades has, in a number of cases, been designed using evolutionary algorithms coupled with mathematical modeling (Blade Element Momentum theory). Among optimization methods, evolutionary algorithms enjoy many advantages, particularly stability, but they usually require a large number of function evaluations. Since there are many local extremes, the optimization method has to find the global extreme accurately. The present paper introduces a new population-based hybrid algorithm called the Genetic-Based Bees Algorithm (GBBA), intended for designing the optimal shape of MW wind turbine blades. The method employs crossover and neighborhood-searching operators taken from the Genetic Algorithm (GA) and the Bees Algorithm (BA), respectively, to provide good performance in both accuracy and convergence speed. Twenty-one different blade designs were considered based on chord length, twist angle and tip speed ratio using GA results, and were compared with BA and GBBA optimum designs targeting the power coefficient and solidity. The results suggest that the final shape obtained by the proposed hybrid algorithm performs better than those obtained by either BA or GA. Furthermore, accuracy and convergence speed increase when the GBBA is employed.
Keywords: Blade Design, Optimization, Genetic Algorithm, Bees Algorithm, Genetic-Based Bees Algorithm, Large Wind Turbine
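Since the abstract only outlines the GBBA, the hybrid idea (GA-style crossover to recombine good candidates plus Bees-style neighborhood search around elite sites) can be sketched on a toy one-dimensional objective. Everything below, including the population sizes, search radius and the surrogate objective, is a hypothetical illustration, not the authors' blade-design code:

```python
import random

def crossover(a, b):
    """GA-style blend crossover of two real-valued candidates."""
    w = random.random()
    return w * a + (1 - w) * b

def neighborhood_search(x, radius, objective, tries=10):
    """Bees-style local search: sample around x, keep the best neighbor."""
    best = x
    for _ in range(tries):
        cand = x + random.uniform(-radius, radius)
        if objective(cand) > objective(best):
            best = cand
    return best

def gbba(objective, lo, hi, pop_size=20, generations=50):
    """Hybrid loop: refine elite sites locally, then recombine them globally."""
    random.seed(0)
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=objective, reverse=True)
        elite = pop[: pop_size // 4]
        # Bees step: local refinement around the best sites
        elite = [neighborhood_search(x, 0.1 * (hi - lo), objective) for x in elite]
        # GA step: crossover among elites refills the population
        pop = elite + [crossover(random.choice(elite), random.choice(elite))
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=objective)

# Toy surrogate for a power-coefficient curve with a single optimum at x = 7
best = gbba(lambda x: -(x - 7.0) ** 2, 0.0, 10.0)
```

In the actual blade-design problem, a candidate would be a vector of chord lengths and twist angles, and the objective would come from a BEM evaluation rather than a closed-form function.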
Procedia PDF Downloads 316
19579 A Traceability Index for Food
Authors: Hari Pulapaka
Abstract:
This paper defines and develops the notion of a traceability index for food, which may be used by any consumer (restaurant, distributor, average consumer, etc.). The concept is then extended to a region's food system as a way to measure how well a regional food system utilizes its own bounty or, at least, is connected to its food sources. With increasing emphasis on the sustainability of regional and, ultimately, global food systems, it is reasonable to accept that if we know how close (in relative terms) an end-user of a set of ingredients is to the sources, as the ingredients traverse the maze of supply chains, we may be better equipped to evaluate the quality of the set as measured by any number of qualitative and quantitative criteria. We propose a mathematical model which may be adapted to a number of contexts and sizes. Two hypothetical cases of different scope are presented which highlight how the model works as an evaluator of the steps between an end-user and the source(s) of the ingredients they consume. The variables in the model are flexible enough to be adapted to other applications beyond food systems.
Keywords: food, traceability, supply chain, mathematical model
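One possible formalization of such an index (a hypothetical sketch, not necessarily the paper's exact model) scores a set of ingredients by the number of supply-chain steps separating the end-user from each source, normalized so that fully source-direct procurement scores 1:

```python
def traceability_index(steps_per_ingredient):
    """Hypothetical index in (0, 1]: equal to 1 when every ingredient comes
    straight from its source (0 intermediate steps), and decaying as the
    average number of supply-chain steps grows."""
    n = len(steps_per_ingredient)
    avg_steps = sum(steps_per_ingredient) / n
    return 1.0 / (1.0 + avg_steps)

# A restaurant dish: farm-direct greens (0 steps), flour via a mill and a
# distributor (2 steps), an imported spice through 4 intermediaries (4 steps)
index = traceability_index([0, 2, 4])
```

A regional food system could then be scored by aggregating such indices over its end-users, which matches the paper's extension from individual consumers to regions.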
Procedia PDF Downloads 274
19578 Analytics Model in a Telehealth Center Based on Cloud Computing and Local Storage
Authors: L. Ramirez, E. Guillén, J. Sánchez
Abstract:
Some of the main goals of telecare, such as monitoring, treatment and telediagnosis, are achieved through the integration of applications with specific appliances. In order to achieve a coherent model integrating software, hardware and healthcare systems, different telehealth models with the Internet of Things (IoT), cloud computing, artificial intelligence, etc. have been implemented, and their advantages are still under analysis. In this paper, we propose an integrated model based on an IoT architecture and a cloud computing telehealth center. An analytics module is presented as a solution to support the diagnosis of certain diseases. Specific features are then compared with recently deployed conventional models in telemedicine. The main advantage of this model is the ability to control the security and privacy of patient information and to optimize the processing and acquisition of clinical parameters according to technical characteristics.
Keywords: analytics, telemedicine, internet of things, cloud computing
Procedia PDF Downloads 325
19577 A New Distribution and Application on the Lifetime Data
Authors: Gamze Ozel, Selen Cakmakyapan
Abstract:
We introduce a new model called the Marshall-Olkin Rayleigh distribution, which extends the Rayleigh distribution using the Marshall-Olkin transformation and has both increasing and decreasing shapes for the hazard rate function. Various structural properties of the new distribution are derived, including explicit expressions for the moments, the generating and quantile functions, some entropy measures, and order statistics. The model parameters are estimated by the method of maximum likelihood, and the observed information matrix is determined. The potential of the new model is illustrated by means of a real-life data set.
Keywords: Marshall-Olkin distribution, Rayleigh distribution, estimation, maximum likelihood
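The Marshall-Olkin transformation replaces the baseline survival function S(x) with αS(x)/(1-(1-α)S(x)). A short sketch of the resulting survival and hazard functions, assuming the standard Rayleigh baseline with scale σ (parameter names are ours, not necessarily the paper's):

```python
import math

def rayleigh_sf(x, sigma):
    """Baseline Rayleigh survival function S(x) = exp(-x^2 / (2 sigma^2))."""
    return math.exp(-x * x / (2.0 * sigma * sigma))

def mo_rayleigh_sf(x, sigma, alpha):
    """Marshall-Olkin Rayleigh survival: alpha*S(x) / (1 - (1-alpha)*S(x))."""
    s = rayleigh_sf(x, sigma)
    return alpha * s / (1.0 - (1.0 - alpha) * s)

def mo_rayleigh_hazard(x, sigma, alpha):
    """Hazard h(x) = g(x)/G_bar(x), using the transformed pdf
    g(x) = alpha*f(x) / (1 - (1-alpha)*S(x))^2 with baseline Rayleigh pdf f."""
    s = rayleigh_sf(x, sigma)
    f = (x / sigma ** 2) * s
    g = alpha * f / (1.0 - (1.0 - alpha) * s) ** 2
    return g / mo_rayleigh_sf(x, sigma, alpha)

# With alpha = 1 the transformation is the identity, so the plain Rayleigh
# (whose hazard x / sigma^2 is strictly increasing) is recovered; other
# values of alpha bend the hazard into the shapes the abstract mentions.
h = mo_rayleigh_hazard(1.0, 1.0, 1.0)
```

Varying alpha above and below 1 and plotting `mo_rayleigh_hazard` shows the increasing and decreasing hazard shapes claimed for the new family.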
Procedia PDF Downloads 501
19576 Range Suitability Model for Livestock Grazing in Taleghan Rangelands
Authors: Hossein Arzani, Masoud Jafari Shalamzari, Z. Arzani
Abstract:
This paper follows the FAO model of suitability analysis. Influential factors affecting extensive grazing were determined and converted into a model, with the Taleghan rangelands examined for common types of grazing animals as an example, and the advantages and limitations elicited. All components of a range ecosystem affect range suitability, but due to time and money restrictions only the most important and feasible elements were investigated, from which three sub-models were built: water accessibility, forage production and erosion sensitivity. Suitable areas at four levels of suitability were calculated using GIS. This suitability modeling approach was adopted due to its simplicity and the minimal time required for transforming and analyzing the data sets. Managers could benefit from the model to devise measures more wisely, cope with the limitations, and enhance rangeland health and condition.
Keywords: range suitability, land-use, extensive grazing, modeling, land evaluation
Procedia PDF Downloads 341
19575 Long Standing Orbital Floor Fracture Repair: Case Report
Authors: Hisham A. Hashem, Sameh Galal, Bassem M. Moeshed
Abstract:
A 36-year-old male patient presented to our unit 7 months after a motor-car accident, complaining of disfigurement and double vision. On examination and investigation, there was an orbital floor fracture of the left eye with inferior rectus muscle entrapment causing diplopia, dystopia and enophthalmos. Under general anesthesia, a sub-ciliary incision was performed, and the orbital floor fracture was repaired with a double-layer Medpor sheet (30x50x15), with removal of the fibrosis present and freeing of the inferior rectus muscle. Remarkable improvement of the dystopia was noticed; however, there was residual diplopia in upgaze and enophthalmos. He was then referred to a strabismologist, who upon examination found left hypotropia of 8 ΔD corrected by an 8 ΔD base-up prism, a positive forced duction test on elevation, and pseudoptosis. Under local anesthesia, a limbal incision approach with hang-back 4 mm recession of the inferior rectus muscle was performed after identifying the inferior rectus muscle structure. Improvement was noted shortly postoperatively, with correction of both the diplopia and the pseudoptosis. Follow-up after 1, 4 and 8 months showed a stable condition. Delayed surgery in cases of orbital floor fracture may still yield good results, provided there is proper assessment of the case and each sign is managed separately.
Keywords: diplopia, dystopia, late surgery, orbital floor fracture
Procedia PDF Downloads 227
19574 Numerical Simulation of Ultraviolet Disinfection in a Water Reactor
Authors: H. Shokouhmand, H. Sobhani, B. Sajadi, M. Degheh
Abstract:
In recent years, experimental and numerical investigation of water UV reactors has increased significantly. The main drawback of experimental methods is the confined and expensive survey of UV reactor features. In this study, a CFD model utilizing the Eulerian-Lagrangian framework is applied to analyze the disinfection performance of a closed-conduit reactor containing four UV lamps perpendicular to the flow. A discrete ordinates (DO) model was employed to evaluate the UV irradiance field. To investigate the importance of each of the lamps for the inactivation performance, several models with one or two bright lamps in various arrangements were considered in addition to the reference model (with 4 bright lamps). All results were reported for three inactivation kinetics. The results showed that the log inactivation achieved by the model with the two central lamps bright was between 88 and 99 percent, close to the reference model results. Also, the closer the lamps are to the main flow region, the greater their effect on microbial inactivation. The effects of some operational parameters, such as water flow rate, inlet water temperature and lamp power, were also studied.
Keywords: Eulerian-Lagrangian framework, inactivation kinetics, log inactivation, water UV reactor
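Although the abstract does not name the kinetic models used, a common first-order (Chick-Watson) form relates the UV fluence accumulated along a particle track to log inactivation. A sketch under that assumption, with an assumed rate constant k:

```python
import math

def log_inactivation(fluence, k):
    """First-order (Chick-Watson) UV kinetics, a common simple model:
    N/N0 = exp(-k * fluence)  ->  log10 reduction = k * fluence / ln(10).
    Fluence in mJ/cm^2; k in cm^2/mJ is organism-specific (assumed here)."""
    return k * fluence / math.log(10)

def percent_inactivated(fluence, k):
    """Fraction of microorganisms inactivated, as a percentage."""
    return 100.0 * (1.0 - 10.0 ** (-log_inactivation(fluence, k)))

# Illustrative: with an assumed k = 0.5 cm^2/mJ, the fluence a particle must
# accumulate for a 2-log (99%) reduction follows by inverting the formula
two_log_dose = 2 * math.log(10) / 0.5
```

In the CFD setting, the fluence for each Lagrangian particle would be integrated from the DO irradiance field along its trajectory before applying such a kinetic model.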
Procedia PDF Downloads 251
19573 A Deterministic Approach for Solving the Hull and White Interest Rate Model with Jump Process
Authors: Hong-Ming Chen
Abstract:
This work considers the resolution of the Hull and White interest rate model with a jump process. A deterministic process is adopted to model the random behavior of interest rate variation as deterministic perturbations depending on the time t. The Brownian motion and jump uncertainties are represented by the piecewise constant integral function w(t) and the point function θ(t), respectively. It is shown that the interest rate function and the yield function of the Hull and White interest rate model with jump process can be obtained by solving a nonlinear semi-infinite programming problem. A relaxed cutting plane algorithm is then proposed for solving the resulting optimization problem. The method is calibrated on 3-month U.S. treasury securities data and is used to analyze several effects on interest rate prices, including interest rate variability and the negative correlation between stock returns and interest rates. The numerical results illustrate that our approach generates yield functions with minimal fitting errors and small oscillation.
Keywords: optimization, interest rate model, jump process, deterministic
Procedia PDF Downloads 161
19572 Recurrent Neural Networks with Deep Hierarchical Mixed Structures for Chinese Document Classification
Authors: Zhaoxin Luo, Michael Zhu
Abstract:
In natural languages, there are always complex semantic hierarchies, and obtaining feature representations based on these hierarchies is key to the success of a model. Several RNN models have recently been proposed that use latent indicators to obtain the hierarchical structure of documents. However, a model that uses only a single layer of latent indicators cannot capture the true hierarchical structure of a language, especially a complex language like Chinese. In this paper, we propose a deep layered model that stacks arbitrarily many RNN layers equipped with latent indicators. By using EM and training the model hierarchically, we solve the computational problem of stacking RNN layers and make it possible to stack arbitrarily many of them. Our deep hierarchical model not only achieves results comparable to large pre-trained models on the Chinese short-text classification problem but also achieves state-of-the-art results on the Chinese long-text classification problem.
Keywords: natural language processing, recurrent neural network, hierarchical structure, document classification, Chinese
Procedia PDF Downloads 68
19571 Development of a Numerical Model to Predict Wear in Grouted Connections for Offshore Wind Turbine Generators
Authors: Paul Dallyn, Ashraf El-Hamalawi, Alessandro Palmeri, Bob Knight
Abstract:
In order to better understand the long term implications of the grout wear failure mode in large-diameter plain-sided grouted connections, a numerical model has been developed and calibrated that can take advantage of existing operational plant data to predict the wear accumulation for the actual load conditions experienced over a given period, thus limiting the need for expensive monitoring systems. This model has been derived and calibrated based on site structural condition monitoring (SCM) data and supervisory control and data acquisition (SCADA) data for two operational wind turbine generator substructures afflicted with this challenge, along with experimentally derived wear rates.
Keywords: grouted connection, numerical model, offshore structure, wear, wind energy
Procedia PDF Downloads 454
19570 Diagnostic Accuracy Of Core Biopsy In Patients Presenting With Axillary Lymphadenopathy And Suspected Non-Breast Malignancy
Authors: Monisha Edirisooriya, Wilma Jack, Dominique Twelves, Jennifer Royds, Fiona Scott, Nicola Mason, Arran Turnbull, J. Michael Dixon
Abstract:
Introduction: Excision biopsy has been the investigation of choice for patients presenting with pathological axillary lymphadenopathy without a breast abnormality. Core biopsy of nodes can provide sufficient tissue for diagnosis and has advantages in terms of morbidity and speed of diagnosis. This study evaluates the diagnostic accuracy of core biopsy in patients presenting with axillary lymphadenopathy. Methods: Between 2009 and 2019, 165 patients referred to the Edinburgh Breast Unit had a total of 179 axillary lymph node core biopsies. Results: 152 (92%) of the 165 initial core biopsies were deemed to contain adequate nodal tissue. Core biopsy correctly established malignancy in 75 of the 78 patients with haematological malignancy (96%) and in all 28 patients with metastatic carcinoma (100%), and correctly diagnosed benign changes in 49 of 57 (86%) patients with benign conditions. There were no false positives and no false negatives. In 67 (85.9%) of the 78 patients with haematological malignancy, there was sufficient material in the first core biopsy to allow the pathologist to make an actionable diagnosis and not request further tissue sampling prior to treatment. There were no complications of core biopsy. On follow-up, none of the patients with benign cores has been shown to have malignancy in the axilla, and none with lymphoma had their initial disease incorrectly classified. Conclusions: This study shows that core biopsy is now the investigation of choice for patients presenting with axillary lymphadenopathy, even in those suspected of having lymphoma.
Keywords: core biopsy, excision biopsy, axillary lymphadenopathy, non-breast malignancy
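The headline figures follow directly from the reported counts; a short arithmetic check (note that the remaining cases were non-diagnostic samples rather than false results, which is why no false positives or negatives arise):

```python
def detection_rate(correct, total):
    """Proportion of patients in whom core biopsy established the diagnosis."""
    return correct / total

# Counts as reported in the abstract
haem = detection_rate(75, 78)    # haematological malignancy, ~96%
mets = detection_rate(28, 28)    # metastatic carcinoma, 100%
benign = detection_rate(49, 57)  # benign conditions, ~86%

# Overall proportion of patients in whom core biopsy was diagnostic
overall = detection_rate(75 + 28 + 49, 78 + 28 + 57)
```

These per-category rates are computed over the patients in each diagnostic group; the abstract's separate 92% figure refers to adequacy of the initial cores.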
Procedia PDF Downloads 241
19569 Seismic Assessment of Old Existing RC Buildings with Masonry Infill in Madinah as Per ASCE
Authors: Tarek M. Alguhane, Ayman H. Khalil, Nour M. Fayed, Ayman M. Ismail
Abstract:
An existing RC building in Madinah is seismically evaluated with and without infill walls. Four model systems have been considered, i.e., model I (no infill), model IIA (strut infill, updated from field tests), model IIB (strut infill, ASCE/SEI 41) and model IIC (strut infill, soft storey, ASCE/SEI 41). Three-dimensional pushover analyses have been carried out using SAP 2000 software, incorporating inelastic material behavior for the concrete, steel and infill walls. The infill walls have been modeled as equivalent struts according to a suggested equation matching field test measurements and to the ASCE/SEI 41 equation. The effect of building modeling on the performance point, as well as on the capacity and demand spectra due to the earthquake design spectrum for the Madinah area, has been investigated. The response modification factor (R) for the 5-story RC building is evaluated from the capacity and demand spectra (ATC-40) for the studied models. The results are summarized and discussed.
Keywords: infill wall, pushover analysis, response modification factor, seismic assessment
Procedia PDF Downloads 393
19568 Making the Right Call for Falls: Evaluating the Efficacy of a Multi-Faceted Trust Wide Approach to Improving Patient Safety Post Falls
Authors: Jawaad Saleem, Hannah Wright, Peter Sommerville, Adrian Hopper
Abstract:
Introduction: Inpatient falls are the most commonly reported patient safety incidents and carry a significant burden in resources, morbidity and mortality. Ensuring adequate post-falls management of patients by staff is therefore paramount to maintaining patient safety, especially in out-of-hours and resource-stretched settings. Aims: This quality improvement project aims to improve the practice of falls management at Guy's and St Thomas' Hospital, London, as compared with our 2016 quality improvement project findings. Furthermore, it looks to increase junior doctors' confidence in managing falls and their use of the new guidance protocols. Methods: The multifaceted interventions implemented included: the development of new trust-wide guidelines detailing management pathways for patients post falls, available via the intranet; the production of 2000 lanyard cards, distributed amongst junior doctors and staff, which summarised these guidelines; and a 'safety signal' email sent from the Trust chief medical officer to all staff raising awareness of falls and the guidelines. Formal falls teaching was also implemented for new doctors at induction. Using an established incident database, 189 consecutive falls in 2017 were retrospectively analysed electronically and compared against the variables measured in 2016, post intervention. A separate serious incident database was used to analyse 50 falls from May 2015 to March 2018 to ascertain the statistical significance of the impact of our interventions on serious incidents. A questionnaire similar to that given to the 2016 cohort of foundation year one (FY1) doctors was administered to the 2017 cohort and the results compared. Results: Questionnaire data demonstrated improved awareness and use of the guidelines, increased confidence, and an increase in training. 97% of FY1 trainees felt that the interventions had increased their awareness of the impact of falls on patients in the trust.
Data from the incident database demonstrated that the time to review patients post fall had decreased from an average of 130 to 86 minutes. Improvement was also demonstrated in the reduced times to order and schedule X-ray and CT imaging: 3 and 5 hours, respectively. Data from the serious incident database showed that the time from fall until harm was detected was statistically significantly lower (P = 0.044) post intervention. We also showed that the incidence of significant delays in detecting harm (> 10 hours) reduced post intervention. Conclusions: Our interventions have helped to significantly reduce the average times to assess patients and to order and schedule appropriate imaging post falls. Delays of over ten hours in detecting serious injuries after falls were commonplace; since the intervention, their frequency has markedly reduced. We suggest this will lead to patient harm being identified sooner, fewer clinical incidents relating to falls, and improved overall patient safety. Our interventions have also helped increase clinical staff confidence, management and awareness of falls in the trust. Next steps include expanding the teaching sessions and improving multidisciplinary team involvement to sustain this improvement.
Keywords: patient safety, quality improvement, serious incidents, falls, clinical care
Procedia PDF Downloads 124
19567 Scorbot-ER 4U Using Forward Kinematics Modelling and Analysis
Authors: D. Maneetham, L. Sivhour
Abstract:
Robotic arm manipulators are widely used to accomplish many kinds of tasks. SCORBOT-ER 4u is a 5-degree-of-freedom (DOF) vertically articulated educational robotic arm in which all joints are revolute; it is specifically designed to perform pick-and-place tasks with its gripper. A pick-and-place task involves relating the end-effector coordinates of the robotic arm to the desired position coordinates in its workspace. This paper describes the forward kinematics modeling and analysis of the robotic end-effector motion through joint space, where forward kinematics is the transformation from joint space to Cartesian space. The Denavit-Hartenberg (D-H) convention is used to model the robotic links and joints with 4x4 homogeneous matrices. The forward kinematics model is developed and simulated in MATLAB, and the mathematical model is validated using the Robotics Toolbox in MATLAB. By this method, the end-effector coordinates of this robotic arm, and of other arms of similar type, can be obtained. The software development for the SCORBOT-ER 4u is also described here: PC- and EtherCAT-based control technology from BECKHOFF is used to control the arm to execute the pick-and-place task.
Keywords: forward kinematics, D-H model, robotic toolbox, PC- and EtherCAT-based control
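The D-H chain described above can be sketched in plain Python: one 4x4 homogeneous transform per joint, multiplied base to tip. The 2-link example parameters at the end are hypothetical and are not the SCORBOT-ER 4u's actual D-H table, which has five revolute joints:

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg 4x4 homogeneous transform for one joint."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    """4x4 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_table):
    """Chain the per-joint transforms; returns the base-to-end-effector matrix."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for row in dh_table:
        T = mat_mul(T, dh_matrix(*row))
    return T

# Hypothetical 2-link planar arm: unit link lengths, joints at +90 and -90 deg
T = forward_kinematics([(math.pi / 2, 0.0, 1.0, 0.0),
                        (-math.pi / 2, 0.0, 1.0, 0.0)])
x, y = T[0][3], T[1][3]   # end-effector position in the base frame
```

For the real arm, each `(theta, d, a, alpha)` row would come from the SCORBOT's measured link parameters, with `theta` as the joint variable; the translation column of the final matrix then gives the gripper position.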
Procedia PDF Downloads 179
19566 A Methodology for Automatic Diversification of Document Categories
Authors: Dasom Kim, Chen Liu, Myungsu Lim, Su-Hyeon Jeon, ByeoungKug Jeon, Kee-Young Kwahk, Namgyu Kim
Abstract:
Recently, numerous documents, including unstructured data and text, have been created due to the rapid increase in the usage of social media and the Internet. Each document is usually assigned a specific category for the convenience of users. In the past, this categorization was performed manually; however, manual categorization not only cannot guarantee accuracy but also requires a large amount of time and huge costs. Many studies have been conducted on the automatic creation of categories to overcome the limitations of manual categorization. Unfortunately, most of these methods cannot be applied to categorizing complex documents with multiple topics, because they assume that one document can be assigned to only one category. To overcome this limitation, some studies have attempted to categorize each document into multiple categories. However, these are limited in turn by learning processes that require training on a multi-categorized document set; they therefore cannot be applied to the multi-categorization of most documents unless multi-categorized training sets are provided. To remove the requirement for a multi-categorized training set imposed by traditional multi-categorization algorithms, we previously proposed a methodology that can extend the category of a single-categorized document to multiple categories by analyzing the relationships among categories, topics, and documents. In this paper, we design a survey-based verification scenario for estimating the accuracy of our automatic categorization methodology.
Keywords: big data analysis, document classification, multi-category, text mining, topic analysis
Procedia PDF Downloads 272
19565 Wind Interference Effects on Various Plan Shape Buildings Under Wind Load
Authors: Ritu Raj, Hrishikesh Dubey
Abstract:
This paper presents the results of experimental investigations carried out on two intricate plan-shaped buildings to evaluate their aerodynamic performance. The purpose is to study the wind-induced effects in isolated and interference conditions on models at a scale of 1:300, with a prototype height of 180 m. Experimental tests were carried out in a boundary layer wind tunnel for isolated conditions with wind directions from 0° to 180°, and for four interference conditions of twin buildings (separately for both models). The research was undertaken in Terrain Category II, the most widely occurring terrain in India. A comparative assessment of the two models is carried out in an attempt to comprehend the various consequences of the diverse conditions that may emerge in real-life situations, as well as the discrepancies amongst them. The experimental wind pressure coefficients of Model-1 and Model-2 show good agreement across the wind incidence conditions, with minute differences in the magnitudes of mean Cp. On the basis of the wind tunnel studies, it is found that the performance of Model-2 is better than that of Model-1 in both isolated and interference conditions for all wind incidences and orientations.
Keywords: interference factor, tall buildings, wind direction, mean pressure-coefficients
Procedia PDF Downloads 128
19564 Pricing Effects on Equitable Distribution of Forest Products and Livelihood Improvement in Nepalese Community Forestry
Authors: Laxuman Thakuri
Abstract:
Despite the large number of in-depth case studies focused on policy analysis, institutional arrangements, and collective action in common property resource management, the questions of how local institutions set the prices of forest products in community forest management, and what effects those decisions produce, remain largely unanswered among policy-makers and researchers alike. The study examined how local institutions set forest product prices in the lowland community forestry of Nepal and how these decisions affect the equitable distribution of benefits and livelihood improvement, which are also objectives of Nepalese community forestry. The study assumes that pricing decisions have multiple effects on equitable distribution and livelihood improvement in areas with heterogeneous socio-economic conditions. The dissertation was carried out at four community forests of lowland Nepal characterized by high-value species, mature experience of community forest management, and better record-keeping for forest product production, pricing, and distribution. Questionnaire surveys, individual and group discussions, and direct field observation were applied for data collection, and the Lorenz curve, Gini coefficient, χ²-test, and SWOT (Strengths, Weaknesses, Opportunities, and Threats) analysis were performed for data analysis and interpretation of results. The dissertation demonstrates that the low pricing strategy for high-value forest products was expected to increase the access of socio-economically weak households to, and their control over, important forest products such as timber, but proved counterproductive, as the strategy increased the access of socio-economically better-off households at a higher rate. In addition, the strategy conflicts with collecting a large-scale community fund and carrying out livelihood improvement activities in line with the community forestry objectives.
A crucial finding of the study is that, despite the low pricing strategy, timber alone contributed a large part of community fund collection. The results revealed a close relation between pricing decisions and livelihood objectives. The action research results show that positive price discrimination can slightly reduce the prevailing inequality and increase the fund. However, it fails to harness the full price of forest products and to collect a large-scale community fund. For broader outcomes of common property resource management in terms of resource sustainability, equity, and livelihood opportunity, the study suggests that local institutions harness the full price of forest products with respect to the local market. Keywords: community, equitable, forest, livelihood, socioeconomic, Nepal
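The equity analysis above rests on the Lorenz curve and Gini coefficient. As a minimal illustration of the measure (the household benefit figures below are hypothetical, not the study's data), the Gini coefficient can be computed directly from a sample using the standard closed form over the sorted values:

```python
def gini(values):
    """Gini coefficient of a distribution (0 = perfect equality, ->1 = maximal inequality).

    Uses the closed form over the sorted sample:
        G = sum_i (2i - n - 1) * x_(i) / (n * sum_i x_i),  i = 1..n
    """
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    cum = sum((2 * (i + 1) - n - 1) * x for i, x in enumerate(xs))
    return cum / (n * total)

# illustrative benefit shares (e.g. timber value received per household)
print(gini([10, 10, 10, 10]))  # → 0.0  (perfectly equal)
print(gini([0, 0, 0, 40]))     # → 0.75 (one household captures everything)
```

A falling Gini coefficient under positive price discrimination would correspond to the "slight reduction of prevailing inequality" reported in the abstract.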
Procedia PDF Downloads 537
19563 Identifying the Structural Components of Old Buildings from Floor Plans
Authors: Shi-Yu Xu
Abstract:
The top three risk factors that have contributed to building collapses during past earthquake events in Taiwan are: "irregular floor plans or elevations," "insufficient columns in single-bay buildings," and the "weak-story problem." Fortunately, these unsound structural characteristics can be directly identified from the floor plans. However, due to the vast number of old buildings, conducting manual inspections to identify these compromised structural features in all existing structures would be time-consuming and prone to human errors. This study aims to develop an algorithm that utilizes artificial intelligence techniques to automatically pinpoint the structural components within a building's floor plans. The obtained spatial information will be utilized to construct a digital structural model of the building. This information, particularly regarding the distribution of columns in the floor plan, can then be used to conduct preliminary seismic assessments of the building. The study employs various image processing and pattern recognition techniques to enhance detection efficiency and accuracy. The study enables a large-scale evaluation of structural vulnerability for numerous old buildings, providing ample time to arrange for structural retrofitting in those buildings that are at risk of significant damage or collapse during earthquakes. Keywords: structural vulnerability detection, object recognition, seismic capacity assessment, old buildings, artificial intelligence
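Once column positions have been extracted from a floor plan, simple geometric checks can flag the risk factors named above. The following is a hedged sketch, not the study's algorithm: the coordinate data, grid-line tolerance, and helper names are hypothetical. It detects the "single-bay" condition by clustering detected columns onto grid lines and counting the bays between them.

```python
def bay_counts(columns, tol=0.5):
    """Count bays along each axis from detected column coordinates.

    `columns` is a list of (x, y) column positions in metres; coordinates that
    differ by less than `tol` are treated as lying on the same grid line.
    A single bay in either direction is one of the known seismic risk factors.
    """
    def grid_lines(coords):
        lines = []
        for c in sorted(coords):
            if not lines or c - lines[-1] >= tol:
                lines.append(c)
        return lines

    x_lines = grid_lines([x for x, _ in columns])
    y_lines = grid_lines([y for _, y in columns])
    # bays = gaps between consecutive grid lines
    return len(x_lines) - 1, len(y_lines) - 1

# hypothetical floor plan: a 3-by-2 column grid (two bays in x, one bay in y)
cols = [(0.0, 0.0), (4.0, 0.0), (8.0, 0.0), (0.0, 6.0), (4.0, 6.0), (8.0, 6.0)]
nx, ny = bay_counts(cols)
print(nx, ny)                              # → 2 1
print("single-bay risk:", nx == 1 or ny == 1)  # → single-bay risk: True
```

In a full pipeline, `columns` would come from the object-recognition stage rather than being typed in by hand.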
Procedia PDF Downloads 89
19562 Development of the Academic Model to Predict Student Success at VUT-FSASEC Using Decision Trees
Authors: Langa Hendrick Musawenkosi, Twala Bhekisipho
Abstract:
The success or failure of students is a concern for every academic institution, college, and university, as well as for governments and for students themselves. Several approaches have been researched to address this concern. In this paper, the view is held that when a student enters a university, college, or other academic institution, he or she enters an academic environment. The academic environment is a unique concept used here to develop a solution for making predictions effectively. This paper presents a model to determine the propensity of a student to succeed or fail at the French South African Schneider Electric Education Center (FSASEC) at the Vaal University of Technology (VUT). The Decision Tree algorithm is used to implement the model at FSASEC. Keywords: FSASEC, academic environment model, decision trees, k-nearest neighbor, machine learning, popularity index, support vector machine
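As context for the Decision Tree approach, the core learning step greedily selects the feature and threshold that minimise weighted Gini impurity; a full tree repeats this recursively on each side of the split. Below is a minimal single-split sketch of that step; the features (entry score, attendance) and the records are hypothetical, not FSASEC data.

```python
def gini_impurity(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(rows, labels):
    """Find the (feature, threshold, score) split minimising weighted Gini impurity."""
    best = (None, None, float("inf"))
    n = len(rows)
    for f in range(len(rows[0])):
        for t in sorted({r[f] for r in rows}):
            left = [y for r, y in zip(rows, labels) if r[f] <= t]
            right = [y for r, y in zip(rows, labels) if r[f] > t]
            score = (len(left) * gini_impurity(left)
                     + len(right) * gini_impurity(right)) / n
            if score < best[2]:
                best = (f, t, score)
    return best

# hypothetical student records: (entry score, attendance %) -> outcome
rows = [(50, 60), (55, 95), (80, 70), (85, 90)]
labels = ["fail", "fail", "pass", "pass"]
print(best_split(rows, labels))  # → (0, 55, 0.0): split on entry score <= 55
```

Here the entry-score split separates the toy data perfectly (weighted impurity 0.0), which is exactly the behavior a decision tree learner exploits at every node.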
Procedia PDF Downloads 200
19561 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)
Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton
Abstract:
Cold-start is a notoriously difficult problem that can occur in recommendation systems; it arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm, the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST), is proposed, designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), whose computational cost grows more slowly with data size but which requires more data to reach an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, the algorithm is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented and systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates an increase of over 16% in user clicks at one point. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts. Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson Sampling, variational inference
Procedia PDF Downloads 108
19560 Analytical Modeling of Drain Current for DNA Biomolecule Detection in Double-Gate Tunnel Field-Effect Transistor Biosensor
Authors: Ashwani Kumar
Abstract:
This study presents an analytical modeling approach for analyzing the drain current behavior in Tunnel Field-Effect Transistor (TFET) biosensors used for the detection of DNA biomolecules. The proposed model focuses on elucidating the relationship between the drain current and the presence of DNA biomolecules, taking into account the impact of various device parameters and biomolecule characteristics. Through comprehensive analysis, the model offers insights into the underlying mechanisms governing the sensing performance of TFET biosensors, aiding in the optimization of device design and operation. A non-local tunneling model is incorporated with other essential models to accurately trace the simulated and modeled data. An experimental validation of the model is provided, demonstrating its efficacy in accurately predicting the drain current response to DNA biomolecule detection. The sensitivity attained from the analytical model is compared and contrasted with ongoing research work in this area. Keywords: biosensor, double-gate TFET, DNA detection, drain current modeling, sensitivity
Procedia PDF Downloads 57
19559 Component Lifecycle and Concurrency Model in Usage Control (UCON) System
Authors: P. Ghann, J. Shiguang, C. Zhou
Abstract:
Access control is one of the most challenging issues facing information security. Access control is defined as the ability to permit or deny access to a particular computational resource or digital information by a given user or subject. The concept of usage control (UCON) has been introduced as a unified approach that captures a number of extensions of access control models and systems. In UCON, an access decision is determined by three factors: authorizations, obligations, and conditions. Attribute mutability and decision continuity are two distinct characteristics introduced for the first time by UCON. An observation of UCON components indicates that the components are predefined and static. In this paper, we propose a new and flexible model of usage control for the creation and elimination of some of these components, for example new objects, subjects, and attributes, and integrate these with the original UCON model. We also propose a model for concurrent usage scenarios in UCON. Keywords: access control, concurrency, digital container, usage control
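As an illustration of the three-factor decision, the following sketch (hypothetical policy rules and attribute names; not the authors' implementation) evaluates authorizations, obligations, and conditions, and demonstrates attribute mutability by updating a subject attribute when access is granted:

```python
from dataclasses import dataclass, field

@dataclass
class Request:
    """A usage request; subject and object carry mutable attributes (UCON)."""
    subject: dict
    obj: dict
    right: str
    fulfilled_obligations: set = field(default_factory=set)
    environment: dict = field(default_factory=dict)

def ucon_decide(req, authorizations, obligations, conditions):
    """Grant usage only if authorizations, obligations and conditions all hold."""
    if not all(rule(req) for rule in authorizations):
        return False
    if not obligations.issubset(req.fulfilled_obligations):
        return False
    if not all(cond(req.environment) for cond in conditions):
        return False
    # attribute mutability: a granted access updates a subject attribute
    req.subject["usage_count"] = req.subject.get("usage_count", 0) + 1
    return True

# hypothetical policy: clearance check, accepted licence, business-hours only
authorizations = [lambda r: r.subject["clearance"] >= r.obj["sensitivity"]]
obligations = {"accept_licence"}
conditions = [lambda env: 9 <= env["hour"] < 17]

req = Request(subject={"clearance": 3}, obj={"sensitivity": 2}, right="read",
              fulfilled_obligations={"accept_licence"}, environment={"hour": 10})
print(ucon_decide(req, authorizations, obligations, conditions))  # → True
print(req.subject["usage_count"])  # → 1
```

In the static UCON model these rule sets are fixed in advance; the paper's proposal concerns letting such components (objects, subjects, attributes) be created and eliminated at run time.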
Procedia PDF Downloads 321
19558 Teaching English in Low Resource-Environments: Problems and Prospects
Authors: Gift Chidi-Onwuta, Iwe Nkem Nkechinyere, Chikamadu Christabelle Chinyere
Abstract:
The teaching of English is a resource-driven activity that requires resource-rich classroom settings for the delivery of effective lessons and the acquisition of the interpersonal skills needed for integration in a target-language environment. However, throughout the world, English is often taught in low-resource classrooms. This paper aims to reveal the common problems associated with teaching English in low-resource environments and the prospects for teachers who find themselves in such under-resourced teaching settings. A self-structured and validated questionnaire comprising closed-ended, open-ended, and scaling items was administered to teachers across five countries: Nigeria, Cameroon, Iraq, Turkey, and Sudan. The study adopts situational language teaching theory (SLTT), which emphasizes a performance improvement imperative. The study inclines to this model because it maintains that learning must be fun and enjoyable, like playing a favorite sport, just as in real life. Since teaching resources make learning engaging, we found this model apt for the current study. The perceptions of teachers about the accessibility and functionality of teaching material resources, the nature of teaching outcomes in resource-poor environments, their levels of involvement in improvisation, and the prospects associated with resource limitations were sourced. Data were analysed using percentages and presented in frequency tables. Results showed that a great number of teachers across these nations do not have access to sufficient productive resource materials that can aid effective English language teaching. Teaching outcomes, from the findings, are affected by low material resources; however, the results show certain advantages to teaching English with limited resources: flexibility and autonomy with students, and creativity and innovation among teachers.
Results further revealed group work, storytelling, critical thinking strategies, flex, cardboards and flashcards, dictation, and dramatization as common teaching strategies and materials adopted by teachers to overcome low-resource-related challenges in classrooms. Keywords: teaching materials, low-resource environments, English language teaching, situational language theory
Procedia PDF Downloads 131
19557 The Concept of an Agile Enterprise Research Model
Authors: Maja Sajdak
Abstract:
The aim of this paper is to present the concept of an agile enterprise model and to initiate discussion of the research assumptions behind the model presented. The implementation of the research project "The agility of enterprises in the process of adapting to the environment and its changes" began in August 2014 and is planned to last three years. The article takes the form of a work-in-progress paper intended to verify and open a debate over the proposed research model. In the literature, there are very few publications relating to research into agility; it can be concluded that the most controversial issue in this regard is the method of measuring agility. In previous studies, the operationalization of agility was often fragmentary, focusing only on selected areas of agility, for example manufacturing, or analysing only selected sectors. As a result, the measures created to date can only be treated as contributions to the development of precise measurement tools. This research project aims to fill a cognitive gap in the literature with regard to the conceptualization and operationalization of an agile company. Thus, the original contribution of the author of this project is the construction of a theoretical model that integrates manufacturing agility (consisting mainly in adaptation to the environment) and strategic agility (based on proactive measures). The author of this research project is primarily interested in the attributes of an agile enterprise which indicate that the company is able to adapt rapidly to changing circumstances and behave proactively. Keywords: agile company, acuity, entrepreneurship, flexibility, research model, strategic leadership
Procedia PDF Downloads 344
19556 Procedural Protocol for Dual Energy Computed Tomography (DECT) Inversion
Authors: Rezvan Ravanfar Haghighi, S. Chatterjee, Pratik Kumar, V. C. Vani, Priya Jagia, Sanjiv Sharma, Susama Rani Mandal, R. Lakshmy
Abstract:
The dual energy computed tomography (DECT) technique records the HU(V) values of a sample at two different voltages, V = V1, V2, and thus obtains the electron density (ρe) and effective atomic number (Zeff) of the substance. In the present paper, we aim to obtain a numerical algorithm by which (ρe, Zeff) can be obtained from the HU(100) and HU(140) data, where V = 100, 140 kVp. The idea is to use this inversion method to characterize and distinguish between lipid and fibrous coronary artery plaques. With the aim of developing the inversion algorithm for low-Zeff materials, as is the case with non-calcified coronary artery plaque, we prepared aqueous samples whose calculated values of (ρe, Zeff) lie in the ranges 2.65×10²³ ≤ ρe ≤ 3.64×10²³ per cc and 6.80 ≤ Zeff ≤ 8.90. We filled the phantom with these known samples and experimentally determined HU(100) and HU(140) for the same pixels. Knowing that the HU(V) values are related to the attenuation coefficient of the system, we present an algorithm by which (ρe, Zeff) is calibrated against (HU(100), HU(140)). The calibration is done with a known set of 20 samples; its accuracy is checked with a different set of 23 known samples. We find that the calibration gives ρe with an accuracy of ±4%, while Zeff is found within ±1% of the actual value, the confidence level being 95%. In this inversion method, the (ρe, Zeff) of a scanned sample can be found while eliminating the effects of the CT machine and ensuring that the determination of the two unknowns (ρe, Zeff) does not interfere with each other. It is found that this algorithm can be used to predict the chemical characteristics (ρe, Zeff) of unknown scanned materials with a 95% confidence level by inversion of the DECT data. Keywords: chemical composition, dual-energy computed tomography, inversion algorithm
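As a toy illustration of such a two-energy inversion (not the authors' calibration: the forward model HU(V) = aV·ρe + bV·ρe·Zeff^M, the exponent M, and all coefficient values below are assumptions), one can calibrate a linear map on known samples and then invert a measured (HU(100), HU(140)) pair for the two unknowns:

```python
M = 3.5  # assumed exponent of the photoelectric Zeff term (hypothetical)

def solve2(a11, a12, a21, a22, b1, b2):
    """Solve the 2x2 system [[a11, a12], [a21, a22]] x = (b1, b2) by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

def forward(rho_e, zeff, coeffs):
    """Hypothetical forward model: HU(V) = aV*rho_e + bV*rho_e*Zeff**M."""
    a, b = coeffs
    return a * rho_e + b * rho_e * zeff ** M

def calibrate(samples):
    """Fit (aV, bV) for both voltages from two known samples,
    each given as (rho_e, Zeff, HU100, HU140)."""
    (r1, z1, h100_1, h140_1), (r2, z2, h100_2, h140_2) = samples
    x11, x12 = r1, r1 * z1 ** M
    x21, x22 = r2, r2 * z2 ** M
    c100 = solve2(x11, x12, x21, x22, h100_1, h100_2)
    c140 = solve2(x11, x12, x21, x22, h140_1, h140_2)
    return c100, c140

def invert(hu100, hu140, c100, c140):
    """Recover (rho_e, Zeff) from one measured (HU100, HU140) pair."""
    x1, x2 = solve2(c100[0], c100[1], c140[0], c140[1], hu100, hu140)
    return x1, (x2 / x1) ** (1.0 / M)

# synthetic check: generate HU values from hypothetical coefficients,
# calibrate on two samples, then invert a third "unknown" sample
true100, true140 = (10.0, 0.02), (9.0, 0.01)
samples = [(r, z, forward(r, z, true100), forward(r, z, true140))
           for r, z in [(2.8, 7.0), (3.5, 8.8)]]
c100, c140 = calibrate(samples)
r, z = invert(forward(3.1, 8.0, true100), forward(3.1, 8.0, true140), c100, c140)
print(round(r, 3), round(z, 3))  # recovers 3.1 8.0
```

The two unknowns decouple exactly as the abstract describes: the linear solve yields ρe and the auxiliary product ρe·Zeff^M independently, and Zeff follows from their ratio.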
Procedia PDF Downloads 438
19555 Utilization of Schnerr-Sauer Cavitation Model for Simulation of Cavitation Inception and Super Cavitation
Authors: Mohammadreza Nezamirad, Azadeh Yazdi, Sepideh Amirahmadian, Nasim Sabetpour, Amirmasoud Hamedi
Abstract:
In this study, the Reynolds-Averaged Navier-Stokes (RANS) framework is utilized to investigate the flow inside a diesel injector nozzle. The flow is treated as multiphase, since the formation of vapor caused by the pressure drop is to be captured. For pressure-velocity linkage, the coupled algorithm is used. Since the cavitation phenomenon is inherently unsteady, a quasi-steady approach is utilized to save time and resources in the current study. The Schnerr-Sauer cavitation model is used, which is capable of predicting flow behavior both at the initial and final stages of the cavitation process. Two different turbulence models were used in this study to clarify which one is more capable of predicting cavitation inception and super-cavitation. It was found that the k-ε model was more compatible with the Schnerr-Sauer cavitation model; therefore, this model is used for the rest of the study. Keywords: CFD, RANS, cavitation, fuel, injector
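For reference, a commonly quoted form of the Schnerr-Sauer vaporization source term, reproduced here from standard CFD references as a sketch rather than from the paper itself, relates the interphase mass transfer rate to the vapor volume fraction α and a bubble number density n_b:

```latex
% Schnerr-Sauer vaporization source term (active where p < p_v)
\dot{m} \;=\; \frac{\rho_v \,\rho_l}{\rho}\,\alpha\,(1-\alpha)\,
              \frac{3}{R_B}\sqrt{\frac{2}{3}\,\frac{p_v - p}{\rho_l}},
\qquad
R_B \;=\; \left(\frac{\alpha}{1-\alpha}\,\frac{3}{4\pi n_b}\right)^{1/3}
```

Here ρv and ρl are the vapor and liquid densities, ρ the mixture density, p_v the vapor pressure, and R_B the bubble radius; condensation uses the same form driven by (p − p_v). The sign of (p_v − p) is what lets the model capture both cavitation inception and collapse.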
Procedia PDF Downloads 209