Search results for: accurate forecast
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2799

819 Reimagining the Management of Telco Supply Chain with Blockchain

Authors: Jeaha Yang, Ahmed Khan, Donna L. Rodela, Mohammed A. Qaudeer

Abstract:

Traditional supply chain silos still exist today due to the difficulty of establishing trust between various partners and technological barriers across industries. Companies lose opportunities and revenue and inadvertently make poor business decisions, resulting in further challenges. Blockchain technology can bring a new level of transparency by sharing information through a distributed ledger in a decentralized manner, creating a basis of trust for business. A blockchain is a loosely coupled, hub-style communication network in which trading partners work indirectly with each other for simpler integration, yet cooperate through the orchestration of their supply chain operations under a coherent, jointly developed process. A blockchain increases efficiency, lowers costs, and improves interoperability to strengthen and automate the supply chain management process while all partners share the risk. The blockchain ledger is built to track the inventory lifecycle for supply chain transparency and keeps a journal of inventory movement for real-time reconciliation. State design patterns are used to capture the lifecycle (behavior) of inventory management as a state machine, yielding a common, transparent and coherent process that creates an opportunity for trading partners to become more responsive to process changes and improvements, reconcile discrepancies, and comply with internal governance and external regulations. It enables end-to-end, inter-company visibility at the unit level for more accurate demand planning, with better insight into order fulfillment and replenishment.
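
For illustration only, the state-machine view of the inventory lifecycle described above can be sketched as a state design pattern. The state names, events, and transitions below are hypothetical examples, not the authors' schema, and a production system would record each transition as a ledger entry (e.g., in Hyperledger Fabric chaincode) rather than an in-memory list.

```python
# Illustrative sketch of the state design pattern for an inventory lifecycle.
# State names and transitions are hypothetical, not the paper's schema.

class InventoryState:
    """Base state: rejects every lifecycle event by default."""
    name = "UNDEFINED"

    def ship(self, item):
        raise ValueError(f"cannot ship from {self.name}")

    def receive(self, item):
        raise ValueError(f"cannot receive from {self.name}")

class InStock(InventoryState):
    name = "IN_STOCK"
    def ship(self, item):
        item.transition(InTransit(), "shipped")

class InTransit(InventoryState):
    name = "IN_TRANSIT"
    def receive(self, item):
        item.transition(InStock(), "received")

class InventoryItem:
    """Tracks one unit; every transition is journaled for reconciliation."""
    def __init__(self, sku):
        self.sku, self.state, self.journal = sku, InStock(), []

    def transition(self, new_state, event):
        # In a real deployment this journal entry would be written to the
        # shared distributed ledger, giving all partners the same view.
        self.journal.append((self.state.name, event, new_state.name))
        self.state = new_state

item = InventoryItem("SKU-001")
item.state.ship(item)      # IN_STOCK -> IN_TRANSIT
item.state.receive(item)   # IN_TRANSIT -> IN_STOCK
print(item.journal)
```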

Keywords: supply chain management, inventory traceability, perpetual inventory system, inventory lifecycle, blockchain, inventory consignment, supply chain transparency, digital thread, demand planning, Hyperledger Fabric

Procedia PDF Downloads 88
818 Day Ahead and Intraday Electricity Demand Forecasting in Himachal Region Using Machine Learning

Authors: Milan Joshi, Harsh Agrawal, Pallaw Mishra, Sanand Sule

Abstract:

Predicting electricity usage is a crucial aspect of organizing and controlling sustainable energy systems. Forecasting electricity load is an intricate task that requires considerable effort due to the combined impact of social, economic, technical, environmental, and cultural factors on power consumption in communities. As a result, it is important to create strong models that can handle the significantly non-linear and complex nature of the task. The objective of this study is to create and compare machine learning techniques for predicting electricity load both day-ahead and intraday, taking into account various factors such as meteorological data and social events, including holidays and festivals. The proposed methods are LightGBM, FBProphet, and a combination of FBProphet and LightGBM for day-ahead forecasting, and motif discovery (Stumpy), based on Mueen's algorithm for similarity search (MASS), for intraday forecasting. We utilize these techniques to predict electricity usage during normal days and social events in the Himachal Region. We then assess their performance by measuring the MSE, RMSE, and MAPE values. The outcomes demonstrate that the combination of FBProphet and LightGBM is the most accurate method for day-ahead forecasting, and motifs for intraday forecasting of electricity usage, surpassing other models in terms of MAPE, RMSE, and MSE. Moreover, the FBProphet-LightGBM approach proves to be highly effective in forecasting electricity load during social events, exhibiting precise day-ahead predictions. In summary, our proposed electricity forecasting techniques display excellent performance in predicting electricity usage during normal days and special events in the Himachal Region.
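
As a rough sketch of the day-ahead hybrid described above (FBProphet for trend and seasonality, LightGBM trained on the residuals), the fragment below shows one plausible shape of such a pipeline. The column names, features, and hyperparameters are assumptions for illustration, not the authors' exact configuration.

```python
# Minimal sketch of a Prophet + LightGBM hybrid for day-ahead load forecasting.
# Assumes a DataFrame with columns ds (timestamp), y (load), temperature and
# is_holiday -- illustrative names, not the paper's schema.
import pandas as pd
from prophet import Prophet
import lightgbm as lgb

def calendar_features(df: pd.DataFrame) -> pd.DataFrame:
    return pd.DataFrame({
        "hour": df["ds"].dt.hour,
        "dow": df["ds"].dt.dayofweek,
        "temperature": df["temperature"],
        "is_holiday": df["is_holiday"].astype(int),
    })

def fit_hybrid(df: pd.DataFrame):
    # 1) Prophet captures trend plus weekly/yearly seasonality.
    m = Prophet(daily_seasonality=True)
    m.fit(df[["ds", "y"]])
    base = m.predict(df[["ds"]])["yhat"].values
    # 2) LightGBM learns what Prophet missed (weather, holidays, calendar).
    resid_model = lgb.LGBMRegressor(n_estimators=300, learning_rate=0.05)
    resid_model.fit(calendar_features(df), df["y"].values - base)
    return m, resid_model

def predict_hybrid(m, resid_model, future: pd.DataFrame):
    base = m.predict(future[["ds"]])["yhat"].values
    return base + resid_model.predict(calendar_features(future))

# Intraday: stumpy's matrix profile (built on the MASS similarity search)
# locates the historical motif closest to the most recent load window, whose
# continuation then serves as the forecast, e.g.:
#   import stumpy; mp = stumpy.stump(load_series, m=window_length)
```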

Keywords: feature engineering, FBProphet, LightGBM, MASS, Motifs, MAPE

Procedia PDF Downloads 66
817 Challenges of Eradicating Neglected Tropical Diseases

Authors: Marziye Hadian, Alireza Jabbari

Abstract:

Background: Each year, tropical diseases affect large numbers of tropical or subtropical populations and give rise to irreparable financial and human damage. Among these diseases, some are known as neglected tropical diseases (NTDs): they may pose unusual dangers, yet they have not been appropriately accounted for. Given the priority of eradication, this study explored the causes of failure to eradicate neglected tropical diseases. Method: This was a systematized review conducted in January 2021 of articles related to neglected tropical diseases in the Web of Science, PubMed, Scopus, Science Direct, Ovid, ProQuest, and Google Scholar databases. The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed, together with the Critical Appraisal Skills Programme (CASP) for articles and the AACODS checklist (Authority, Accuracy, Coverage, Objectivity, Date, Significance), a six-criterion tool for judging the quality of grey literature. Findings: The challenges in controlling and eradicating neglected tropical diseases fall into four general themes, shortcomings in disease management policies and programs, environmental challenges, executive challenges in the policy field, and challenges in the research field, together with 36 sub-themes. Conclusion: To achieve the goals of eradicating forgotten tropical diseases, it seems indispensable to free up financial, human and research resources, manage health infrastructure properly, attend to migrants and refugees, set clear targets, prioritize according to local conditions, and pay special attention to political and social developments. Reducing the number of diseases should free up resources for the management of neglected tropical diseases prone to epidemics, such as dengue, chikungunya and leishmaniasis. For the purpose of global support, targeting should be accurate.

Keywords: neglected tropical disease, NTD, preventive, eradication

Procedia PDF Downloads 128
816 From News Breakers to News Followers: The Influence of Facebook on the Coverage of the January 2010 Crisis in Jos

Authors: T. Obateru, Samuel Olaniran

Abstract:

In an era when new media afford easy access to the packaging and dissemination of information, social media have become a popular avenue for sharing information, for good or ill. It is evident that the traditional role of journalists as 'news breakers' is fast being eroded. People now share information on happenings via social media like Facebook and Twitter, such that journalists themselves now get leads on events from such sources. Beyond the access to information provided by the new media is the erosion of the gatekeeping role of journalists, who, by their training and calling, are supposed to handle information with responsibility. Thus, sensitive information that journalists would normally filter is randomly shared by social media activists. This was the experience of journalists in Jos, Plateau State, in January 2010, when another of the recurring ethnoreligious crises that engulfed the state resulted in widespread killing, vandalism, looting, and displacement. In what is considered one of the high points of crisis in the state, the journalists who had the duty of covering it also relied on some of these sources to get their bearing on the violence. This paper examined the role of Facebook in the work of journalists who covered the 2010 crisis. Taking a gatekeeping perspective, it interrogated the extent to which Facebook impacted their professional duty, positively or negatively, vis-à-vis the peace journalism model. It employed a survey, with a questionnaire as the instrument, to elicit information from 50 journalists who covered the crisis. The paper revealed that the dissemination of hate messages via mobile phones and social media, especially Facebook, aggravated the crisis. Journalists became news followers rather than news breakers because many of them were kept on their toes by information (much of it inaccurate or false) circulated on Facebook. It recommended that journalists must remain true to their calling by upholding their gatekeeping role of disseminating only accurate and responsible information if they are to remain the main source of credible information on which their audience relies.

Keywords: crisis, ethnoreligious, Facebook, journalists

Procedia PDF Downloads 289
815 A Framework on Data and Remote Sensing for Humanitarian Logistics

Authors: Vishnu Nagendra, Marten Van Der Veen, Stefania Giodini

Abstract:

Effective humanitarian logistics operations are a cornerstone of successful disaster relief. To be effective, however, they need to be demand-driven and supported by adequate data for prioritization. Without this data, operations are carried out in an ad hoc manner and eventually become chaotic. The current availability of geospatial data helps in creating models for predictive damage and vulnerability assessment, which can be of great advantage to logisticians in understanding the nature and extent of disaster damage. This translates into actionable information on the demand for relief goods, the state of the transport infrastructure and, subsequently, the priority areas for relief delivery. However, due to the unpredictable nature of disasters, the accuracy of these models needs improvement, which can be achieved using remote sensing data from UAVs (unmanned aerial vehicles) or satellite imagery, although these again come with certain limitations. This research addresses the need for a framework that combines data from different sources to support humanitarian logistics operations and prediction models. The focus is on developing a workflow to combine data from satellites and UAVs after a disaster strikes. A three-step approach is followed: first, the data requirements for logistics activities are made explicit through semi-structured interviews with field logistics workers. Second, the limitations of current data collection tools are analyzed to develop workaround solutions, following a systems design approach. Third, the data requirements and the developed workaround solutions are fitted together into a coherent workflow. The outcome of this research will provide logisticians with a new method for obtaining accurate and reliable data immediately, supporting data-driven decision making.

Keywords: unmanned aerial vehicles, damage prediction models, remote sensing, data driven decision making

Procedia PDF Downloads 373
814 Molecularly Imprinted Nanoparticles (MIP NPs) as Non-Animal Antibody Substitutes for Detection of Viruses

Authors: Alessandro Poma, Kal Karim, Sergey Piletsky, Giuseppe Battaglia

Abstract:

The recent increase in the threat posed to public health by infectious influenza diseases has prompted interest in the detection of avian influenza virus (AIV) H5N1 in humans as well as animals. A variety of technologies for diagnosing AIV infection have been developed. However, various disadvantages (cost, lengthy analyses, and the need for high-containment facilities) make these methods less than ideal in practical application. Molecularly Imprinted Polymeric Nanoparticles (MIP NPs) are suited to overcoming these limitations, offering high affinity, selectivity, scalability and cost-effectiveness, together with the versatility of post-modification (fluorescent, magnetic or optical labeling), opening the way to the potential introduction of improved diagnostic tests capable of providing rapid differential diagnosis. Here we present our first results in the production and testing of MIP NPs for the detection of AIV H5N1. Recent developments in the solid-phase synthesis of MIP NPs mean that, for the first time, a reliable supply of 'soluble' synthetic antibodies can be made available for testing as potential biologically or diagnostically active molecules. The MIP NPs have the potential to detect viruses that are widely circulating in farm animals and indeed humans. Early and accurate identification of the infectious agent will expedite appropriate control measures. Thus, diagnosis at an early stage of infection of a herd, flock or individual maximizes the efficiency with which containment, prevention and possibly treatment strategies can be implemented. More importantly, substantiating the practicability of these novel reagents should lead to an initial reduction, and eventually to the potential total replacement, of the use of animals, both large and small, to raise such specific serological materials.

Keywords: influenza virus, molecular imprinting, nanoparticles, polymers

Procedia PDF Downloads 347
813 Shifting to Electronic Operative Notes in Plastic Surgery

Authors: Samar Mousa, Galini Mavromatidou, Rebecca Shirley

Abstract:

Surgeons carry out numerous operations daily in a busy burns and plastic surgery department. Writing an accurate operation note with all the essential information is crucial for communication, not only within the plastics team but also with the multi-disciplinary team looking after the patient, including other specialties, nurses and GPs. The Royal College of Surgeons of England, in its Good Surgical Practice guidelines, states that the surgeon should ensure that there are clear (preferably typed) operative notes for every procedure. The notes should accompany the patient into recovery and to the ward and should give sufficient detail to enable continuity of care by another doctor. The notes should include the date and time, elective/emergency procedure, names of the operating surgeon and assistant, name of the theatre anesthetist, operative procedure carried out, incision, operative diagnosis, operative findings, any problems/complications, any extra procedure performed and the reason why it was performed, details of tissue removed, added or altered, identification of any prosthesis used (including the serial numbers of prostheses and other implanted materials), details of the closure technique, anticipated blood loss, antibiotic prophylaxis (where applicable), DVT prophylaxis (where applicable), detailed postoperative care instructions, and a signature. Fourteen random days in December 2021 were chosen to assess the accuracy of operative notes and post-operative care instructions. A total of 163 operative notes were examined. The average completion rate across all domains was 85.4%. An electronic operative note template was designed to cover all the domains mentioned in the Royal College of Surgeons' Good Surgical Practice. It is kept on the hospital drive for all surgeons to use.

Keywords: operative notes, plastic surgery, documentation, electronic

Procedia PDF Downloads 76
812 Investigating a Deterrence Function for Work Trips for Perth Metropolitan Area

Authors: Ali Raouli, Amin Chegenizadeh, Hamid Nikraz

Abstract:

The Perth metropolitan area and its surrounding regions have been expanding rapidly in recent decades, and this growth is expected to continue in the years to come. With this rapid growth and the resulting increase in population, consideration should be given to strategic planning and modelling for the future expansion of Perth. The accurate estimation of projected traffic volumes has always been a major concern for transport modellers and planners. The development of a reliable strategic transport model depends significantly on the input data and on model parameters calibrated to reflect the existing situation. Trip distribution, the second step in four-step modelling (FSM), is complex due to its behavioral nature. The gravity model is the most common method for trip distribution. The spatial separation between origin and destination (OD) zones is reflected in the gravity model by applying deterrence functions, which provide an opportunity to include people's behavior in choosing their destinations based on the distance, time and cost of their journeys. Deterrence functions play an important role in distributing trips within a study area and govern the simulated trip distances; they should therefore be calibrated for any particular strategic transport model to correctly reflect trip behavior within the modelling area. This paper reviews the most common deterrence functions and proposes a calibrated deterrence function for work trips within the Perth Metropolitan Area, based on information obtained from the latest available household data and the Perth and Region Travel Survey (PARTS) data. As part of this study, a four-step transport model was developed in EMME software for the Perth Metropolitan Area to assist with the analysis and findings.
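
For reference, the doubly constrained gravity model to which the abstract refers distributes trips T_ij between origin i and destination j through a deterrence function f(c_ij) of the generalized travel cost. The forms below are the standard textbook ones (exponential, power, and combined/Tanner) commonly compared during calibration; they are background, not equations reproduced from the paper.

```latex
% Doubly constrained gravity model with deterrence function f(c_{ij})
T_{ij} = A_i O_i \, B_j D_j \, f(c_{ij}), \qquad
A_i = \Big[\sum_j B_j D_j f(c_{ij})\Big]^{-1}, \quad
B_j = \Big[\sum_i A_i O_i f(c_{ij})\Big]^{-1}

% Common deterrence function forms, with c the generalized travel cost:
f(c) = e^{-\beta c} \;\text{(exponential)}, \quad
f(c) = c^{-\alpha} \;\text{(power)}, \quad
f(c) = c^{\alpha} e^{-\beta c} \;\text{(combined / Tanner)}
```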

Keywords: deterrence function, four-step modelling, origin destination, transport model

Procedia PDF Downloads 163
811 Trends in the Incidence of Bloodstream Infections in Patients with Hematological Malignancies in the Period 1991–2012

Authors: V. N. Chebotkevich, E. E. Schetinkina, V. V. Burylev, E. I. Kaytandzhan, N. P. Stizhak

Abstract:

Objective: Bloodstream infections (BSI) are severe, life-threatening illnesses in immunocompromised patients with hematological malignancies. We report the trend in bloodstream infections in this group of patients over the period 1991-2012. Methods: A total of 4,742 blood samples were investigated. All blood cultures were incubated in a continuous monitoring system for 7 days before negative samples were discarded. When a culture signaled positive, the organism was identified by conventional methods. Real-time polymerase chain reaction (PCR) was used for the detection of human herpes virus 6 (HHV-6), cytomegalovirus (CMV) and Epstein-Barr virus (EBV). Results: Between 1991 and 2001, Gram-positive bacteria (Staphylococcus epidermidis, Staphylococcus aureus) were the most common organisms isolated (70.9%), while Gram-negative rods (Escherichia coli, Klebsiella spp., Pseudomonas spp.) accounted for 29.1%. In the following decade, 2002-2012, the share of Gram-negative bacteria increased to 40.2%. Bacteremia was shown to be significantly more frequent against a background of detectable cytomegalovirus- and Epstein-Barr-virus-specific DNA in blood. In recent years, an increased frequency of micromycetes has been registered in the blood of patients with hematological malignancies (Candida spp. predominant). Conclusion: Accurate and timely detection of BSI is important in determining the appropriate treatment of infectious complications in patients with hematological malignancies. The isolation of Staphylococcus epidermidis from blood cultures remains a clinical dilemma for physicians and microbiologists, but in many cases this agent is of clinical significance in immunocompromised patients with hematological malignancies. The role of CMV and EBV in the development of bacteremia was demonstrated.

Keywords: infectious complications, blood stream infections, bacteremia, hemoblastosis

Procedia PDF Downloads 346
810 A Statistical Approach to Predict and Classify the Commercial Hatchability of Chickens Using Extrinsic Parameters of Breeders and Eggs

Authors: M. S. Wickramarachchi, L. S. Nawarathna, C. M. B. Dematawewa

Abstract:

Hatchery performance is critical for the profitability of poultry breeder operations. Some extrinsic parameters of eggs and breeders increase or decrease hatchability. This study aims to identify the extrinsic parameters affecting the commercial hatchability of local chickens' eggs and to determine the most efficient model for classifying batches with a hatchability rate greater than 90%. Seven extrinsic parameters were considered: egg weight, moisture loss, breeder age, number of fertilised eggs, shell width, shell length, and shell thickness. Multiple linear regression was performed to determine the most influential variables on hatchability. First, the correlation between each parameter and hatchability was checked. Then a multiple regression model was developed, and the accuracy of the fitted model was evaluated. Linear Discriminant Analysis (LDA), Classification and Regression Trees (CART), k-Nearest Neighbors (kNN), Support Vector Machines (SVM) with a linear kernel, and Random Forest (RF) algorithms were applied to classify hatchability. This grouping was conducted using binary classification techniques. Hatchability was negatively correlated with egg weight, breeder age, shell width and shell length, and positively correlated with moisture loss, number of fertilised eggs, and shell thickness. Multiple linear regression models were more accurate than simple linear models, giving the highest coefficient of determination (R² = 94%) and the minimum AIC and BIC values. According to the classification results, RF, CART, and kNN achieved the highest accuracy values (0.99, 0.975, and 0.972, respectively) for the commercial hatchery process. Therefore, RF is the most appropriate machine learning algorithm for classifying breeder outcomes as economically profitable or not in a commercial hatchery.
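
A compact sketch of the binary classification step is given below, using scikit-learn as a stand-in implementation. The feature names mirror the seven extrinsic parameters, and the 90% threshold follows the abstract; the file name and column names are assumptions.

```python
# Sketch: classify batches as 'high hatchability' (>90%) from the seven
# extrinsic parameters named in the abstract. Data loading is assumed.
import pandas as pd
from sklearn.model_selection import cross_val_score
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier               # CART
from sklearn.neighbors import KNeighborsClassifier            # kNN
from sklearn.svm import SVC                                   # linear-kernel SVM
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis  # LDA

FEATURES = ["egg_weight", "moisture_loss", "breeder_age", "fertilised_eggs",
            "shell_width", "shell_length", "shell_thickness"]

df = pd.read_csv("hatchery.csv")                 # hypothetical data file
X = df[FEATURES]
y = (df["hatchability"] > 0.90).astype(int)      # binary target, 90% threshold

models = {
    "RF":   RandomForestClassifier(n_estimators=500, random_state=0),
    "CART": DecisionTreeClassifier(random_state=0),
    "kNN":  KNeighborsClassifier(n_neighbors=5),
    "SVM":  SVC(kernel="linear"),
    "LDA":  LinearDiscriminantAnalysis(),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```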

Keywords: classification models, egg weight, fertilised eggs, multiple linear regression

Procedia PDF Downloads 83
809 Evaluating the Water Balance of Sokoto Basement Complex to Address Water Security Challenges

Authors: Murtala Gada Abubakar, Aliyu T. Umar

Abstract:

A substantial part of Nigeria lies within the semi-arid areas of the world, underlain by basement complex (hard) rocks that are very poor in both the transmission and the storage of appreciable quantities of water. Recently, growing attention has been paid to the need to develop water resources in these areas, largely due to concerns about increasing droughts and the need to address water security challenges. While there is an ample body of knowledge that captures the hydrological behaviour of the sedimentary part of the basin, reported research that unambiguously illustrates water distribution in the basement complex of the Sokoto basin remains sparse. The growing need to meet the water requirements of those living in this region calls for accurate water balance estimations that can inform sustainable planning and development to address water security challenges in the area. To meet this task, a one-dimensional soil water balance model was developed and used to assess the state of water distribution within the Sokoto basin basement complex, using measured meteorological variables and information about the different landscapes within the complex. The model simulated soil water storage and the rates of input and output of water in response to climate and, where applicable, irrigation, using data from 2001 to 2010 inclusive. The results revealed areas within the Sokoto basin basement complex that are rich and areas that are deficient in groundwater resources. The high-potential areas identified include the fadama, the fractured rocks and the cultivated lands, while the low-potential areas are the sealed surfaces and non-fractured rocks. This study concludes that the modelling approach is a useful tool for assessing hydrological behaviour and for better understanding water resource availability within a basement complex.
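
For orientation, a one-dimensional bucket-type soil water balance of the kind described above updates storage per time step as follows; this is the generic form of the method, not an equation taken from the paper.

```latex
% Generic 1-D soil water balance over a time step
S_{t+1} = S_t + P_t + I_t - ET_t - R_t - D_t, \qquad 0 \le S \le S_{\max}
% S: soil water storage, P: precipitation, I: irrigation,
% ET: evapotranspiration, R: surface runoff, D: deep drainage (recharge)
```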

Keywords: basement complex, hydrological processes, Sokoto Basin, water security

Procedia PDF Downloads 313
808 Design of Sustainable Concrete Pavement by Incorporating RAP Aggregates

Authors: Selvam M., Vadthya Poornachandar, Surender Singh

Abstract:

Reclaimed Asphalt Pavement (RAP) aggregates are generally dumped in open areas after the demolition of asphalt pavements. The utilization of RAP aggregates in cement concrete pavements may provide several socio-economic and environmental benefits and could support the circular economy. Recycling RAP aggregates into concrete pavement could reduce the consumption of virgin aggregates and save fertile land. However, the structural as well as functional properties of RAP-concrete can be significantly lower than those of conventional Pavement Quality Concrete (PQC) pavements. This warrants judicious selection of the RAP fraction (coarse and fine aggregates), along with an accurate proportion of the same, for PQC highways. Also, the selection of the RAP fraction and its proportion should not be based solely on the mechanical properties of RAP-concrete specimens but should also be governed by the structural and functional behavior of the pavement system. In this study, an effort has been made to predict the optimum RAP fraction and its corresponding proportion for cement concrete pavements, considering both low-volume and high-volume roads. Initially, the effect of including RAP on the fresh and mechanical properties of concrete pavement mixes is mapped through an extensive literature survey, considering almost all the studies available to date. Indian Roads Congress (IRC) methods, the most widely used design methods for the analysis of concrete pavements in India, are adopted for this study. Subsequently, fatigue damage analysis is performed to evaluate the required safe slab thickness for different fractions of coarse RAP. The performance of RAP-concrete is then predicted by employing the AASHTO-1993 model for the following distress conditions: faulting, cracking, and smoothness. The performance prediction and total cost analysis indicate that the optimum proportions of coarse RAP aggregates in the PQC mix are 35% for high-volume and 50% for low-volume roads.
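
The fatigue damage analysis mentioned above, as used in IRC design practice for concrete pavements, rests on Miner's cumulative damage hypothesis, recalled here as background:

```latex
% Miner's cumulative fatigue damage criterion for the pavement slab
D_f = \sum_{i} \frac{n_i}{N_i} \le 1
% n_i: expected repetitions of axle-load group i during the design period
% N_i: allowable repetitions at the stress ratio produced by group i
```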

Keywords: concrete pavement, RAP aggregate, performance prediction, pavement design

Procedia PDF Downloads 154
807 Evaluating the Dosimetric Performance for 3D Treatment Planning System for Wedged and Off-Axis Fields

Authors: Nashaat A. Deiab, Aida Radwan, Mohamed S. Yahiya, Mohamed Elnagdy, Rasha Moustafa

Abstract:

This study evaluates the dosimetric performance of our institution's 3D treatment planning system for wedged and off-axis 6 MV photon beams, guided by the recommended QA tests documented in AAPM TG-53, the NCS Report 15 test packages, IAEA TRS-430 and ESTRO Booklet No. 7. The study was performed on an Elekta Precise linear accelerator designed for a clinical range of 4, 6 and 15 MV photon beams, with asymmetric jaws and a fully integrated multileaf collimator that enables high conformance to the target with sharp field edges. Ten tests were applied to a solid water-equivalent phantom along with a 2D array dose detection system. Doses calculated with the 3D treatment planning system PrecisePLAN were compared with measured doses to verify that the dose calculations are accurate for simple situations such as square and elongated fields, different SSDs, beam modifiers (e.g. wedges, blocks, MLC-shaped fields) and asymmetric collimator settings. The QA results showed dosimetric accuracy of the TPS within the specified tolerance limits, with two exceptions: for the large elongated wedged field, the errors on and outside the central axis were 0.2% and 0.5%, respectively, and for off-planned and off-axis elongated fields, the errors in the region outside the central axis of the beam were 0.2% and 1.1%, respectively. Overall, the investigated dosimetric results yielded differences within the recommended tolerance levels. Differences between dose values predicted by the TPS and values measured at the same point result from limitations of the dose calculation, uncertainties in the measurement procedure, or fluctuations in the output of the accelerator.
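
For clarity, the percentage errors quoted above correspond to the local dose deviation conventionally used in TPS commissioning, i.e. the difference between calculated and measured dose relative to the measurement (a standard convention, not a formula reproduced from the paper):

```latex
% Local percentage dose deviation between TPS calculation and measurement
\delta = 100\% \times \frac{D_{\mathrm{calc}} - D_{\mathrm{meas}}}{D_{\mathrm{meas}}}
```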

Keywords: quality assurance, dose calculation, wedged fields, off-axis fields, 3D treatment planning system, photon beam

Procedia PDF Downloads 435
806 Landcover Mapping Using Lidar Data and Aerial Image and Soil Fertility Degradation Assessment for Rice Production Area in Quezon, Nueva Ecija, Philippines

Authors: Eliza E. Camaso, Guiller B. Damian, Miguelito F. Isip, Ronaldo T. Alberto

Abstract:

Land-cover maps are important for many scientific, ecological and land management purposes, and during the last decades a rapid decrease in soil fertility has been observed, attributed to land use practices such as rice cultivation. High-precision land-cover maps, which are important for economic management, are not yet available for the area. To assure accurate mapping of land cover, remote sensing is a very suitable tool to carry out this task and to automate land use and cover detection. The study not only provides high-precision land-cover maps but also estimates the extent of the rice production area that has undergone chemical degradation due to fertility decline. Land cover was delineated and classified into pre-defined classes to achieve proper detection of features. After generation of the land-cover map of this area of intensive rice cultivation, a soil fertility degradation assessment of the rice production area was carried out to assess the impact of the land use on the soils under agricultural production. Using simple spatial analysis functions in ArcGIS, the land-cover map of the Municipality of Quezon in Nueva Ecija, Philippines was overlaid with the fertility decline maps from the Land Degradation Assessment Philippines - Bureau of Soils and Water Management (LADA-Philippines-BSWM) to determine the area of rice crops where nitrogen, phosphorus, zinc and sulfur deficiencies were most likely induced by high dosages of urea and imbalanced N:P fertilization. The results show that 80.00% of the fallow area and 99.81% of the rice production area exhibit high soil fertility decline.

Keywords: aerial image, landcover, LiDAR, soil fertility degradation

Procedia PDF Downloads 248
805 Numerical Simulation of Air Pollutant Using Coupled AERMOD-WRF Modeling System over Visakhapatnam: A Case Study

Authors: Amit Kumar

Abstract:

Accurate identification of regions with deteriorated air quality is very helpful in devising better environmental practices and mitigation efforts. In the present study, an attempt has been made to identify the dispersion patterns of air pollutants, especially NOX from vehicular and industrial sources, over the rapidly developing urban city of Visakhapatnam (17°42' N, 83°20' E), India, during April 2009. Using the emission factors of different vehicles as well as of industry, a high-resolution 1 km x 1 km gridded emission inventory has been developed for Visakhapatnam city. The dispersion model AERMOD, with explicit representation of planetary boundary layer (PBL) dynamics, is coupled offline, through a purpose-built coupler mechanism, with the high-resolution mesoscale model WRF-ARW and used to simulate the dispersion patterns of NOX. The meteorological and PBL parameters obtained by employing two PBL schemes of the WRF-ARW model, the non-local Yonsei University (YSU) scheme and the local Mellor-Yamada-Janjic (MYJ) scheme, both of which represent the boundary layer parameters reasonably well, are used to drive AERMOD. Significantly different dispersion patterns of NOX have been noticed between summer and winter months. The simulated NOX concentrations are validated against six available monitoring stations of the Central Pollution Control Board, India. Statistical analysis of the model-evaluated concentrations against the observations reveals that WRF-ARW with the YSU scheme coupled to AERMOD performs better. Locations with deteriorated air quality are identified over Visakhapatnam based on the validated model simulations of NOX concentrations. The present study advocates the utility of the developed gridded NOX emission inventory together with the coupled WRF-AERMOD modeling system for air quality assessment over the study region.

Keywords: WRF-ARW, AERMOD, planetary boundary layer, air quality

Procedia PDF Downloads 273
804 Prognostic Impact of Pre-transplant Ferritinemia: A Survival Analysis Among Allograft Patients

Authors: Mekni Sabrine, Nouira Mariem

Abstract:

Background and aim: Allogeneic hematopoietic stem cell transplantation is a curative treatment for several hematological diseases; however, it carries non-negligible morbidity and mortality depending on several prognostic factors, including pre-transplant hyperferritinemia. The aim of our study was to estimate the impact of hyperferritinemia on survival and on the occurrence of post-transplant complications. Methods: This was a longitudinal study conducted over 8 years, including all patients who received a first allograft. The impact of pre-transplant hyperferritinemia (ferritinemia ≥1500) on survival was studied using the Kaplan-Meier method and the Cox model for uni- and multivariate analysis. The chi-squared test and binary logistic regression were used to study the association between pre-transplant ferritinemia and post-transplant complications. Results: One hundred forty patients were included, with an average age of 26.6 years and a sex ratio (M/F) of 1.4. Hyperferritinemia was found in 33% of patients. It had no significant impact on either overall survival (p=0.9) or event-free survival (p=0.6). In multivariate analysis, only the type of disease was independently associated with overall survival (p=0.04) and event-free survival (p=0.002). As for post-allograft complications, the occurrence of early documented infections was independently associated with pre-transplant hyperferritinemia (p=0.02) and with the presence of acute graft-versus-host disease (GVHD) (p<10⁻³). The occurrence of acute GVHD was associated with early documented infection (p=0.002) and cytomegalovirus reactivation (p<10⁻³). The occurrence of chronic GVHD was associated with cytomegalovirus reactivation (p=0.006) and with the graft source (p=0.009). Conclusion: Our study showed a significant impact of pre-transplant hyperferritinemia on the occurrence of early infections but not on survival. Earlier and more accurate assessment of iron overload by other tests, such as liver magnetic resonance imaging, with initiation of chelation treatment, could prevent the occurrence of such complications after transplantation.

Keywords: allogeneic, transplants, ferritin, survival

Procedia PDF Downloads 61
803 Comparison of Two Neural Networks to Model Margarine Age and Predict Shelf-Life Using MATLAB

Authors: Phakamani Xaba, Robert Huberts, Bilainu Oboirien

Abstract:

The present study aimed at developing and comparing two neural-network-based predictive models to predict the shelf-life/product age of South African margarine, using free fatty acid (FFA), water droplet size (D3.3), water droplet distribution (e-sigma), moisture content, peroxide value (PV), anisidine value (AnV) and total oxidation (totox) value as input variables. Brick margarine products of varying ages, from fresh (week 0) to week 47, were sourced. These products, which had been stored at 10 and 25 °C, were characterized. JMP and MATLAB models to predict shelf-life/margarine age were developed and their performances compared. The key performance indicators used to evaluate the models were the correlation coefficient (CC), root mean square error (RMSE), and mean absolute percentage error (MAPE) relative to the actual data. The MATLAB-developed model showed better performance on all three indicators: its correlation coefficient was 99.86% versus 99.74% for the JMP model, its RMSE was 0.720 compared to 1.005, and its MAPE was 7.4% compared to 8.571%. The MATLAB model was therefore selected as the most accurate, and the number of hidden neurons/nodes was then optimized to develop a single predictive model. The optimized MATLAB network with 10 hidden neurons outperformed the models with 1 and 5 hidden neurons. The developed models can be used by margarine manufacturers, food research institutions, researchers and others to predict shelf-life/margarine product age, optimize the addition of antioxidants, extend the shelf-life of products and proactively troubleshoot problems related to changes that affect the shelf-life of margarine, without conducting expensive trials.
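
The MATLAB network itself is not reproduced here, but an equivalent sketch in Python shows the shape of the approach: a small feed-forward network mapping the seven measured indicators to product age, scored with the same three indicators (CC, RMSE, MAPE). The file name, column names, and network settings are illustrative assumptions.

```python
# Sketch of the shelf-life/age model: 7 chemistry inputs -> product age (weeks).
# Python/scikit-learn stand-in for the MATLAB network; names are illustrative.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

INPUTS = ["ffa", "d33", "e_sigma", "moisture", "pv", "anv", "totox"]

df = pd.read_csv("margarine.csv")            # hypothetical data file
X_tr, X_te, y_tr, y_te = train_test_split(
    df[INPUTS], df["age_weeks"], test_size=0.2, random_state=0)

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(10,),   # 10 hidden neurons, as in the study
                 max_iter=5000, random_state=0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)

# The three performance indicators used in the abstract
# (MAPE assumes ages > 0; week-0 samples would need special handling):
cc = np.corrcoef(y_te, pred)[0, 1] * 100                     # %
rmse = float(np.sqrt(np.mean((y_te - pred) ** 2)))
mape = float(np.mean(np.abs((y_te - pred) / y_te))) * 100    # %
print(f"CC={cc:.2f}%  RMSE={rmse:.3f}  MAPE={mape:.2f}%")
```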

Keywords: margarine shelf-life, predictive modelling, neural networks, oil oxidation

Procedia PDF Downloads 189
802 An Unsupervised Domain-Knowledge Discovery Framework for Fake News Detection

Authors: Yulan Wu

Abstract:

With the rapid development of social media, the issue of fake news has gained considerable prominence, drawing the attention of both the public and governments. The widespread dissemination of false information poses a tangible threat across multiple domains of society, including politics, the economy, and health. However, much research has concentrated on supervised models trained within specific domains, whose effectiveness diminishes when they are applied to identify fake news across multiple domains. To solve this problem, some approaches based on domain labels have been proposed: by assigning news to its specific domain in advance, domain-specific judgments of fake news may be more accurate. However, these approaches disregard the fact that news records can pertain to multiple domains, resulting in a significant loss of valuable information. In addition, the datasets used for training must all be domain-labeled, which creates unnecessary complexity. To solve these problems, an unsupervised domain-knowledge discovery framework for fake news detection is proposed. First, to effectively retain the multidomain knowledge of the text, a low-dimensional vector capturing domain embeddings is generated for each news text. Subsequently, a feature extraction module utilizing the unsupervisedly discovered domain embeddings extracts the comprehensive features of the news. Finally, a classifier is employed to determine the authenticity of the news. To verify the proposed framework, tests are conducted on existing, widely used datasets, and the experimental results demonstrate that this method improves detection performance for fake news across multiple domains. Moreover, even on datasets that lack domain labels, this method can still effectively transfer domain knowledge, reducing the time consumed by tagging without sacrificing detection accuracy.
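
One plausible instantiation of the framework's first two steps is sketched below, using NMF topic proportions as the unsupervised, soft, multi-domain embedding. The abstract does not specify the embedding method, so this choice, the toy corpus, and the classifier are assumptions for illustration.

```python
# Sketch: unsupervised domain embeddings for multidomain fake-news detection.
# NMF topic proportions stand in for the low-dimensional domain vector; this is
# an illustrative choice -- the paper does not commit to a specific method.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF
from sklearn.linear_model import LogisticRegression

texts = [  # toy corpus; real news records span politics, economy, health, ...
    "senate passes new budget bill",
    "miracle cure reverses aging overnight",
    "central bank raises interest rates",
    "celebrity endorses fake vaccine claim",
]
labels = np.array([0, 1, 0, 1])        # 1 = fake, 0 = real (toy labels)

tfidf = TfidfVectorizer()
X_text = tfidf.fit_transform(texts)

# Step 1: unsupervised domain discovery -- each text gets a soft, multi-domain
# membership vector instead of one hard domain label.
domain_emb = NMF(n_components=2, init="nndsvda",
                 random_state=0).fit_transform(X_text)

# Step 2: fuse the text features with the discovered domain embedding.
X = np.hstack([X_text.toarray(), domain_emb])

# Step 3: authenticity classifier.
clf = LogisticRegression().fit(X, labels)
print(clf.predict(X))
```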

Keywords: fake news, deep learning, natural language processing, multiple domains

Procedia PDF Downloads 88
801 On Voice in English: An Awareness Raising Attempt on Passive Voice

Authors: Meral Melek Unver

Abstract:

This paper aims to explore ways to help English as a Foreign Language (EFL) learners notice and revise voice in English, and to raise their awareness of when and how to use active and passive voice to convey meaning in their written and spoken work. Because passive voice is commonly preferred in certain genres, such as academic essays and news reports, despite current trends promoting active voice, it is essential for learners to be fully aware of the meaning, use and form of passive voice in order to communicate better. The participants in the study were 22 EFL learners taking a one-year intensive English course at a university, who would go on to receive English-medium instruction (EMI) in their departmental studies the following academic year. Data from students' written and oral work were collected over a four-week period, and misuse or inaccurate use of passive voice was identified. Analysis of the data showed that the students failed to make sensible decisions about when and how to use passive voice, partly because of the differences between their mother tongue and English, and partly because they were not aware that active and passive voice cannot always be substituted for one another. To overcome this, a Test-Teach-Test shape lesson, as opposed to a Present-Practice-Produce shape lesson, was designed and implemented to raise their awareness of the decisions they needed to make in choosing the voice, and to help them notice the meaning and use of passive voice through concept-checking questions. The results suggested, first, that awareness-raising activities on the meaning and use of voice in English are beneficial in eliciting accurate and meaningful output from students, and, further, that helping students notice and re-notice passive voice through carefully designed activities helps them internalize its use and form. As a result of the study, a number of activities for revising and noticing passive voice are suggested, as well as a short questionnaire to help EFL teachers self-reflect on their teaching.

Keywords: voice in English, test-teach-test, passive voice, English language teaching

Procedia PDF Downloads 218
800 Nanomaterial Based Electrochemical Sensors for Endocrine Disrupting Compounds

Authors: Gaurav Bhanjana, Ganga Ram Chaudhary, Sandeep Kumar, Neeraj Dilbaghi

Abstract:

The main sources of endocrine disrupting compounds in the ecosystem are hormones, pesticides, phthalates, flame retardants, dioxins, personal-care products, coplanar polychlorinated biphenyls (PCBs), bisphenol A, and parabens. These endocrine disrupting compounds are responsible for learning disabilities, brain development problems, deformations of the body, cancer, reproductive abnormalities in females and decreased sperm counts in males. Although the discharge of these chemical compounds into the environment cannot be stopped, their amounts can be curtailed through proper evaluation and detection techniques. The available techniques for the determination of these endocrine disrupting compounds mainly include high-performance liquid chromatography (HPLC), mass spectrometry (MS) and gas chromatography-mass spectrometry (GC-MS). These techniques are accurate and reliable but have certain limitations, such as the need for skilled personnel, long analysis times, interference, and the requirement for pretreatment steps. Moreover, these techniques are laboratory-bound, and large sample amounts are required for analysis. In view of these facts, new methods for detecting endocrine disrupting compounds should be devised that promise high specificity, ultra-sensitivity, cost-effectiveness, efficiency and ease of operation. Nowadays, electrochemical sensors/biosensors modified with nanomaterials are gaining considerable attention among researchers. The bio-element present in such a system makes the developed sensor selective towards the analyte of interest. Nanomaterials provide a large surface area, good electron communication, enhanced catalytic activity and possibilities for chemical modification. In most cases, the nanomaterials also serve as an electron mediator or electrocatalyst for some analytes.

Keywords: electrochemical, endocrine disruptors, microscopy, nanoparticles, sensors

Procedia PDF Downloads 270
799 Modified Weibull Approach for Bridge Deterioration Modelling

Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight

Abstract:

State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of the existing nonlinear-optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions of age for each condition state cannot be derived directly from a Markov model for a given bridge element group, although such functions are of interest to industry partners. This paper presents a new approach for generating homogeneous SMDM output, namely the Modified Weibull approach, which consists of a set of appropriate functions describing the percentage of bridge elements predicted to be in each condition state. These functions are combined with a Bayesian approach and a Metropolis-Hastings-algorithm (MHA) based Markov Chain Monte Carlo (MCMC) simulation technique to quantify the uncertainty in the model parameter estimates. In this study, the factors contributing to rail bridge deterioration were identified. Inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered accordingly, based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated using statistical hypothesis tests on a test data set. The results show that the proposed model is able not only to predict network-level conditions accurately but also to capture the model uncertainties within given confidence intervals.
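
To make the Bayesian machinery concrete, the sketch below runs a random-walk Metropolis-Hastings sampler to fit a Weibull-type condition curve, here the fraction of elements remaining in the best state as a function of age, to noisy inspection percentages. The synthetic data, Gaussian likelihood, and flat priors are illustrative assumptions, not the paper's specification.

```python
# Sketch: random-walk Metropolis-Hastings for a Weibull-type condition curve,
# p(state 1 | age t) = exp(-(t/beta)**alpha). Data, likelihood (Gaussian noise)
# and flat positive priors are illustrative, not the paper's model.
import numpy as np

rng = np.random.default_rng(0)
ages = np.arange(1, 16)                                 # 15 years of inspections
true = np.exp(-(ages / 20.0) ** 1.8)
obs = np.clip(true + rng.normal(0, 0.03, ages.size), 0, 1)  # synthetic data

def log_post(theta):
    alpha, beta, sigma = theta
    if alpha <= 0 or beta <= 0 or sigma <= 0:
        return -np.inf                                  # flat priors on (0, inf)
    pred = np.exp(-(ages / beta) ** alpha)
    return -ages.size * np.log(sigma) - np.sum((obs - pred) ** 2) / (2 * sigma**2)

theta = np.array([1.0, 10.0, 0.1])                      # initial state
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0, [0.05, 0.5, 0.005])    # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                                    # accept
    samples.append(theta.copy())

post = np.array(samples[5000:])                         # discard burn-in
print("posterior mean (alpha, beta, sigma):", post.mean(axis=0))
# Posterior quantiles give the credible intervals through which the Modified
# Weibull approach propagates parameter uncertainty to condition predictions.
```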

Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models

Procedia PDF Downloads 722
798 Comparison of Support Vector Machines and Artificial Neural Network Classifiers in Characterizing Threatened Tree Species Using Eight Bands of WorldView-2 Imagery in Dukuduku Landscape, South Africa

Authors: Galal Omer, Onisimo Mutanga, Elfatih M. Abdel-Rahman, Elhadi Adam

Abstract:

Threatened tree species (TTS) play a significant role in ecosystem functioning and services, land use dynamics, and other socio-economic aspects, including ecological, economic, livelihood, security and well-being benefits. The development of techniques for mapping and monitoring TTS is thus critical for understanding the functioning of ecosystems. The advent of advanced imaging systems and supervised learning algorithms has provided an opportunity to classify TTS over fragmenting landscapes. Recently, vegetation maps have been produced using advanced imaging systems such as WorldView-2 (WV-2) and robust classification algorithms such as support vector machines (SVM) and artificial neural networks (ANN). However, the delineation of TTS in a fragmenting landscape using high-resolution imagery has remained widely elusive due to the complexity of the species structure and distribution. The objective of the current study was therefore to examine the utility of advanced WV-2 data for mapping TTS in the fragmenting Dukuduku indigenous forest of South Africa using the SVM and ANN classification algorithms. The results showed the robustness of the two machine learning algorithms, with an overall accuracy (OA) of 77.00% (total disagreement = 23.00%) for SVM and 75.00% (total disagreement = 25.00%) for ANN using all eight bands of WV-2 (8B). This study concludes that the SVM and ANN classification algorithms with WV-2 8B have the potential to classify TTS in the Dukuduku indigenous forest. The study offers relatively accurate information that is important for forest managers in making informed decisions regarding the management and conservation of TTS.

Keywords: artificial neural network, threatened tree species, indigenous forest, support vector machines

Procedia PDF Downloads 509
797 Unity in Diversity: Exploring the Psychological Processes and Mechanisms of the Sense of Community for the Chinese Nation in Ethnic Inter-embedded Communities

Authors: Jiamin Chen, Liping Yang

Abstract:

In 2007, the sociologist Putnam offered a pessimistic forecast based on the United States' Social Capital Community Benchmark Survey, suggesting that "ethnic diversity would challenge social unity and undermine social cohesion." If this pessimistic assumption were proven true, it would indicate a risk of division in diverse societies. China, with 56 ethnic groups, is a multi-ethnic country. On May 26, 2014, General Secretary Xi Jinping proposed "building ethnically inter-embedded communities to promote deeper development in interactions, exchanges, and integration among ethnic groups." Researchers unanimously agree that ethnic inter-embedded communities can serve as practical arenas and pathways for solidifying the sense of community for the Chinese nation. However, no research has yet provided evidence that ethnic inter-embedded communities can foster this sense of community, and the influencing factors remain unclear. This study adopts a constructivist grounded theory approach. Convenience sampling and snowball sampling were used. Data were collected in three communities in Kunming City; twelve individuals were eventually interviewed, and the transcribed interviews totaled 187,000 words. The research obtained ethical approval from the Ethics Committee of Nanjing Normal University (NNU202310030). The data were analyzed and theory constructed using strategies such as coding, constant comparison, and theoretical sampling. The study found, first, that ethnic inter-embedded communities exhibit the characteristics of diversity, including ethnic, cultural, and linguistic diversity. Diversity has positive functions, including increased opportunities for contact, the promotion of self-expansion, and increased happiness; its negative functions include highlighting ethnic differences, causing ethnic conflicts, and reminding residents of ethnic boundaries. Second, individuals typically engage in interactions within the community using active or passive embedding strategies. Active embedding strategies include maintaining openness, focusing on similarities, and pro-diversity beliefs, which can increase external group identification and intergroup relational identity and promote ethnic integration. Individuals using passive embedding strategies tend to focus on ethnic stereotypes, perceive stigmatization of their own ethnic group, and adopt an authoritarian-oriented approach to interactions, leading to the perception of more identity threats and ultimately to the rejection of ethnic integration. Third, the commonality of the Chinese nation is reflected in the 56 ethnic groups as an "identity community" and an "interest community," and both the active and the passive embedding paths affect individuals' understanding of this commonality. Finally, community work and the community environment can influence the embedding process. The research constructed a social-psychological model of the processes and mechanisms that solidify the sense of community for the Chinese nation in ethnic inter-embedded communities. Based on this theoretical model, future research can conduct more micro-level tests of psychological mechanisms and intervention studies to enhance Chinese national cohesion.

Keywords: diversity, sense of the Chinese national community, ethnic inter-embedded communities, ethnic group

Procedia PDF Downloads 36
796 Development of Immuno-Modulators: Application of Molecular Dynamics Simulation

Authors: Ruqaiya Khalil, Saman Usmani, Zaheer Ul-Haq

Abstract:

The accurate characterization of ligand binding affinity is indispensable for designing molecules with optimized binding. Computational tools help in many ways to predict quantitative correlations between protein-ligand structure and binding affinity. Molecular dynamics (MD) simulation is a modern, state-of-the-art technique for evaluating the underlying basis of ligand-protein interactions by characterizing their dynamic and energetic properties. Autoimmune diseases arise from an abnormal immune response of the body against its own tissues. The current regimen for such conditions is limited to immunomodulators with compromised pharmacodynamic and pharmacokinetic profiles. One of the key players mediating immunity and tolerance, and thereby invoking autoimmunity, is interleukin-2, a cytokine that influences the growth of T cells. Molecular dynamics simulation techniques are applied to gain insight into the inhibitory mechanisms of newly synthesized compounds that showed immunosuppressant potential in an in silico pipeline. In addition to estimates of the free energies associated with ligand binding, MD simulation yielded a great deal of information about ligand-macromolecule interactions, allowing us to evaluate the pattern of interactions and the molecular basis of inhibition. The present study is a continuation of our efforts to identify interleukin-2 inhibitors of both natural and synthetic origin. Herein, we report molecular dynamics simulation studies of interleukin-2 complexed with different antagonists previously reported by our group. The study of protein-ligand dynamics enabled us to gain a better understanding of the contributions of different active-site residues to ligand binding. The results of the study will serve as a guide to rationalize the fragment-based synthesis of drug-like interleukin-2 inhibitors as immunomodulators.

Keywords: immuno-modulators, MD simulation, protein-ligand interaction, structure-based drug design

Procedia PDF Downloads 252
795 Numerical Investigation of Multiphase Flow in Pipelines

Authors: Gozel Judakova, Markus Bause

Abstract:

We present and analyze reliable numerical techniques for simulating complex flow and transport phenomena related to natural gas transportation in pipelines. Such problems are of high interest in the fields of petroleum and environmental engineering. Modeling and understanding natural gas flow and transformation processes during transportation is important for the sake of physical realism and for the design and operation of pipeline systems. In our approach, a two-fluid flow model based on a system of coupled hyperbolic conservation laws is considered for describing natural gas flow undergoing hydratization. The accurate numerical approximation of two-phase gas flow remains a subject of strong interest in the scientific community. Such hyperbolic problems are characterized by solutions with steep gradients or discontinuities, and their approximation by standard finite element techniques typically gives rise to spurious oscillations and numerical artefacts. Recently, stabilized and discontinuous Galerkin finite element techniques have attracted researchers' interest; they are highly adapted to the hyperbolic nature of our two-phase flow model. In the presentation, a streamline upwind Petrov-Galerkin approach and a discontinuous Galerkin finite element method for the numerical approximation of our flow model of two coupled systems of Euler equations are presented. The efficiency and reliability of the stabilized continuous and discontinuous finite element methods are then carefully analyzed, and the potential of each class of numerical schemes is investigated. In particular, standard benchmark problems of two-phase flow, such as the shock tube problem, are used for the comparative numerical study.
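
For orientation, each phase in such a two-fluid model is governed by a system of Euler equations in conservation form of the type below; the precise coupling and source terms of the hydratization model are not reproduced here.

```latex
% Euler equations in 1-D conservation form: \partial_t U + \partial_x F(U) = S(U)
U = \begin{pmatrix} \rho \\ \rho u \\ E \end{pmatrix}, \qquad
F(U) = \begin{pmatrix} \rho u \\ \rho u^2 + p \\ u\,(E + p) \end{pmatrix}, \qquad
E = \frac{p}{\gamma - 1} + \frac{1}{2}\rho u^2
% In the two-fluid model, one such system per phase is coupled through
% exchange (source) terms S(U) for mass, momentum and energy.
```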

Keywords: discontinuous Galerkin method, Euler system, inviscid two-fluid model, streamline upwind Petrov-Galerkin method, two-phase flow

Procedia PDF Downloads 322
794 Evaluation of Elemental Impurities in Drugs According to Pharmacopoeia by Use of the FESEM-EDS Technique

Authors: Rafid Doulab

Abstract:

Control of elemental impurities in the pharmaceutical industry is indispensable to ensure pharmaceutical safety with respect to 24 regulated elements. Although atomic absorption and inductively coupled plasma methods are used in the U.S. Pharmacopeia and the European Pharmacopoeia, FESEM with an energy-dispersive spectrometer can be applied as an alternative method yielding quantitative and qualitative results for a variety of elements without chemical pretreatment, unlike other techniques. This technique is characterized by short analysis times, less contamination, no reagent consumption, the generation of minimal residue or waste, limited sample preparation time, and minimal analysis error. Using simple dilution for powders or direct analysis for liquids, we evaluated the usefulness of the EDS method in testing with field emission scanning electron microscopy (FESEM, SUPRA 55, Carl Zeiss, Germany) equipped with an X-ray energy-dispersive spectrometer (XFlash 6|10, Bruker, Germany). The samples were analyzed directly, without coating, by applying 5 µl of a diluted sample of known concentration onto a carbon stub, with the accelerating voltage chosen according to the sample thickness. The result for each spot is obtained in atomic percent and converted, via an Avogadro-based conversion factor, into micrograms. Conclusion and recommendation: The conclusion of this study is that FESEM-EDS can be applied within the US Pharmacopeia and the ICH Q3D guideline as a precise and accurate method for the analysis of elemental impurities in drugs or bulk materials, determining the permitted daily exposure (PDE) in liquid or solid specimens, and can give better results than other techniques in that it does not require complex methods or chemicals for digestion, which can interfere with the final results, while allowing the sample to be kept for re-analysis at any time. The recommendation is to adopt this technique in pharmacopoeias as a standard method alongside inductively coupled plasma methods (ICP-AES/ICP-OES and ICP-MS).

Keywords: pharmacopoeia, FESEM-EDS, elemental impurities, atomic concentration

Procedia PDF Downloads 112
793 Reliability Analysis of Construction Schedule Plan Based on Building Information Modelling

Authors: Lu Ren, You-Liang Fang, Yan-Gang Zhao

Abstract:

In recent years, the application of BIM (Building Information Modelling) to construction schedule planning has attracted increasing research attention. In order to assess whether a BIM-based construction schedule plan is reasonable, that is, whether the schedule can be completed on time, some researchers have introduced reliability theory for the evaluation. In this process, the uncertain factors affecting the construction schedule are regarded as random variables, and their probability distributions are assumed to be normal, determined by two parameters evaluated from the mean and standard deviation of statistical data. In practical engineering, however, most of the uncertain influencing factors are not normally distributed, so evaluation results obtained under the normality assumption may be unreasonable. To obtain a more reasonable evaluation, it is necessary to describe the distributions of the random variables more comprehensively. For this purpose, the cubic normal distribution is introduced in this paper to describe the distribution of arbitrary random variables; it is determined by the first four moments (mean, standard deviation, skewness and kurtosis). In this paper, a BIM model is first built according to the design information of the structure, and the construction schedule plan is developed on that basis; the cubic normal distribution is then fitted to the statistical data collected for the random factors influencing the schedule. On this basis, the reliability analysis of the BIM-based construction schedule plan can be carried out more reasonably, and more accurate evaluation results can be given as a reference for implementing the actual schedule. In the last part of this paper, the efficiency and accuracy of the proposed methodology are demonstrated through a practical engineering case.
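
To make the moment-matching idea concrete, the sketch below fits a third-order polynomial of a standard normal variable to the first four moments of a sample, one common construction of a cubic normal model. The particular equations used are the classical Fleishman power-method relations, assumed here for illustration; they need not coincide with the authors' exact formulation, and the activity-duration sample and 12-day allowance are hypothetical:

```python
import numpy as np
from scipy.optimize import fsolve
from scipy.stats import skew, kurtosis

# Sketch of a four-moment "cubic normal" fit: a standardized variable is
# modelled as Y = a + b*Z + c*Z**2 + d*Z**3 with Z standard normal and
# a = -c, the coefficients matched to skewness and excess kurtosis via the
# classical Fleishman power-method equations (an assumption for this sketch;
# the paper's exact construction may differ). Data are hypothetical.

def fleishman_residuals(coef, g1, g2):
    b, c, d = coef
    r1 = b**2 + 6*b*d + 2*c**2 + 15*d**2 - 1.0                 # unit variance
    r2 = 2*c*(b**2 + 24*b*d + 105*d**2 + 2) - g1               # skewness
    r3 = 24*(b*d + c**2*(1 + b**2 + 28*b*d)
             + d**2*(12 + 48*b*d + 141*c**2 + 225*d**2)) - g2  # excess kurtosis
    return [r1, r2, r3]

# Hypothetical activity-duration sample (days), e.g. from site records
rng = np.random.default_rng(0)
sample = rng.lognormal(mean=2.3, sigma=0.25, size=500)

mu, sigma = sample.mean(), sample.std(ddof=1)
g1, g2 = skew(sample), kurtosis(sample)   # kurtosis() returns excess kurtosis

b, c, d = fsolve(fleishman_residuals, x0=[1.0, 0.0, 0.0], args=(g1, g2))
a = -c

# Probability that the activity overruns a hypothetical 12-day allowance
z = rng.standard_normal(200_000)
durations = mu + sigma * (a + b*z + c*z**2 + d*z**3)
print("P(duration > 12 days) ~", (durations > 12.0).mean())
```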

Keywords: BIM, construction schedule plan, cubic normal distribution, reliability analysis

Procedia PDF Downloads 137
792 Thermo-Mechanical Analysis of Composite Structures Utilizing a Beam Finite Element Based on Global-Local Superposition

Authors: Andre S. de Lima, Alfredo R. de Faria, Jose J. R. Faria

Abstract:

Accurate prediction of thermal stresses is particularly important for laminated composite structures, as large temperature changes may occur during fabrication and field application. The transverse normal deformation plays an important role in the prediction of such stresses, especially for problems involving thick laminated plates subjected to uniform temperature loads. Bearing this in mind, the present study investigates the thermo-mechanical behavior of laminated composite structures using a new beam element based on global-local superposition, accounting for through-the-thickness effects. The element formulation superposes, in the thickness direction, a cubic global displacement field and a linear layerwise local displacement distribution, which ensures zig-zag behavior of the stresses and displacements. By enforcing interlaminar continuity of stresses (normal and shear) and displacements, as well as traction-free conditions at the upper and lower surfaces, the number of degrees of freedom in the model is kept independent of the number of layers. Moreover, the proposed formulation allows transverse shear and normal stresses to be determined directly from the constitutive equations, without the need for post-processing. Numerical results obtained with the beam element were compared with analytical solutions, as well as with results from commercial finite element codes, yielding satisfactory agreement over a range of length-to-thickness ratios. The results confirm the need for an element with through-the-thickness capabilities and indicate that the present formulation is a promising alternative for such analyses.
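
As a point of reference for why through-the-thickness effects matter, the following sketch computes ply-level thermal stresses in a three-layer beam under uniform cooling using a classical Euler-Bernoulli laminate model, which neglects transverse normal deformation entirely; it is a baseline, not the global-local element of the paper, and the layer properties are hypothetical:

```python
import numpy as np

# Euler-Bernoulli baseline for thermal stresses in a laminated beam under a
# uniform temperature change. It neglects transverse normal deformation --
# the very effect the global-local element captures -- and is given only as
# a reference. Layer properties (bottom to top) are hypothetical.
# Each entry: (E [GPa], alpha [1e-6/K], thickness [mm])
layers = [(140.0, 0.5, 1.0), (10.0, 25.0, 1.0), (140.0, 0.5, 1.0)]
dT = -100.0  # cooling from the cure temperature, K

z = np.concatenate([[0.0], np.cumsum([t for _, _, t in layers])])
z -= z[-1] / 2.0  # measure z from the mid-plane

A = B = D = NT = MT = 0.0
for (E, alpha, _), zb, zt in zip(layers, z[:-1], z[1:]):
    a = alpha * 1e-6
    A += E * (zt - zb)                         # axial stiffness
    B += E * (zt**2 - zb**2) / 2.0             # coupling stiffness
    D += E * (zt**3 - zb**3) / 3.0             # bending stiffness
    NT += E * a * dT * (zt - zb)               # thermal force resultant
    MT += E * a * dT * (zt**2 - zb**2) / 2.0   # thermal moment resultant

# Free thermal deformation: net axial force and bending moment must vanish
eps0, kappa = np.linalg.solve([[A, B], [B, D]], [NT, MT])

for k, ((E, alpha, _), zb, zt) in enumerate(zip(layers, z[:-1], z[1:])):
    for zz in (zb, zt):
        sigma = E * (eps0 + kappa * zz - alpha * 1e-6 * dT)  # in GPa
        print(f"layer {k}, z = {zz:+.1f} mm: sigma_x = {sigma * 1e3:8.1f} MPa")
```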

Keywords: composite beam element, global-local superposition, laminated composite structures, thermal stresses

Procedia PDF Downloads 151
791 Performance Comparison and Visualization of COMSOL Multiphysics, MATLAB, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method

Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi

Abstract:

This paper presents a performance comparison of computational software for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving advanced mathematical models that predict the production of oil wells in an arbitrarily shaped, multiple-lease reservoir. The limited data available for validating that a program meets the accuracy of the mathematical model is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified and the BEM discretization is performed. In the second step, the 2D BEM discretization is implemented in COMSOL Multiphysics and the MATLAB programming language. In the last step, the numerical performance indicators of both implementations are analyzed against a Fortran implementation used for validation. The performance comparison is investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure on oil production in the multiple-lease reservoir. According to the comparison, structured programming in Fortran is the preferred alternative for implementing an accurate numerical simulation of the BEM. In conclusion, the numerical computations and performance evaluation confirm that Fortran is well suited for capturing the visualization of oil well production in an arbitrarily shaped reservoir.
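
To indicate the kind of discretization being timed, the sketch below implements a minimal 2D BEM with constant elements for the Laplace operator (the homogeneous core of the Poisson-elliptic model) on the unit disc, solving a Dirichlet problem for the boundary flux; the geometry, element count, and test solution u = x are illustrative stand-ins, not the paper's reservoir model:

```python
import numpy as np

# Minimal constant-element BEM for the Laplace operator on the unit disc:
# a Dirichlet problem is solved for the boundary flux q. The geometry,
# element count, and test solution u = x (so that q = n_x) are illustrative
# stand-ins, not the paper's multiple-lease reservoir model.

N = 64
theta = 2.0 * np.pi * np.arange(N + 1) / N
nodes = np.column_stack([np.cos(theta), np.sin(theta)])  # polygon vertices (CCW)
mid = 0.5 * (nodes[:-1] + nodes[1:])                     # collocation points
tang = nodes[1:] - nodes[:-1]
L = np.linalg.norm(tang, axis=1)
normal = np.column_stack([tang[:, 1], -tang[:, 0]]) / L[:, None]  # outward

gp, gw = np.polynomial.legendre.leggauss(4)  # 4-point Gauss rule on [-1, 1]

G = np.zeros((N, N))  # integrals of the fundamental solution u* = -ln(r)/(2*pi)
H = np.zeros((N, N))  # integrals of its normal derivative q*
for i in range(N):
    for j in range(N):
        if i == j:
            # analytic self-integral of u*; the self-integral of q* vanishes
            # on a straight element because (y - x) is tangential there
            G[i, j] = L[j] / (2.0 * np.pi) * (1.0 - np.log(L[j] / 2.0))
            continue
        for xi, w in zip(gp, gw):
            y = 0.5 * (1.0 - xi) * nodes[j] + 0.5 * (1.0 + xi) * nodes[j + 1]
            r = y - mid[i]
            r2 = r @ r
            ds = w * L[j] / 2.0
            G[i, j] += -np.log(np.sqrt(r2)) / (2.0 * np.pi) * ds
            H[i, j] += -(r @ normal[j]) / (2.0 * np.pi * r2) * ds

# Boundary integral equation with c = 1/2 on the smooth boundary:
# 0.5*u + H@u = G@q, solved for the unknown flux q given Dirichlet data u
u = mid[:, 0]  # Dirichlet data u = x on the boundary
q = np.linalg.solve(G, 0.5 * u + H @ u)
print("max |q - n_x| =", np.abs(q - normal[:, 0]).max())
```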

Keywords: performance comparison, 2D visualization, COMSOL Multiphysics, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure

Procedia PDF Downloads 487
790 A Novel Study Contrasting Traditional Autopsy with Post-Mortem Computed Tomography in Falls Leading to Death

Authors: Balaji Devanathan, Gokul G., Abilash S., Abhishek Yadav, Sudhir K. Gupta

Abstract:

Background: As an alternative to the traditional autopsy, a virtual autopsy is carried out using scanning and imaging technologies, mainly post-mortem computed tomography (PMCT). It aims to supplement traditional autopsy findings and to reduce or eliminate internal dissection in subsequent autopsies, of which the deceased's relatives have historically disapproved for emotional and religious reasons. Relatives therefore often prefer the non-invasive, objective, and body-preserving PMCT to a traditional autopsy. The study also examines the benefits and drawbacks of each technology, demonstrating the significance of contemporary imaging in forensic medicine. Results: The authors analysed one hundred fatal falls. Before autopsy, each case underwent a PMCT examination on a 16-slice multi-slice spiral CT scanner. Using specialised software, MPR and VR reconstructions were carried out after acquisition of the raw images. Fractures of the skull, facial bones, clavicle, scapula, and vertebrae were detected more accurately than at routine autopsy, and the identification of pneumothorax, pneumoperitoneum, pneumocephalus, and haemosinus was much enhanced by PMCT compared with traditional autopsy. Conclusion: PMCT-based virtual autopsy is useful for visualising skeletal damage in fall-from-height cases and is thus a valuable tool in trauma deaths. When assessing trauma victims, PMCT should be viewed as a helpful adjunct to traditional autopsy, since it can identify additional bone fractures in body parts that are challenging to examine during autopsy, such as posterior regions, helping the pathologist reconstruct the events and determine the cause of death.

Keywords: PMCT, fall from height, autopsy, fracture

Procedia PDF Downloads 31