Search results for: event series
3468 Development of Time Series Forecasting Model for Dengue Cases in Nakhon Si Thammarat, Southern Thailand
Authors: Manit Pollar
Abstract:
Identifying dengue epidemic periods early would be helpful for taking the necessary actions to prevent dengue outbreaks. Providing an accurate prediction of dengue epidemic seasons will allow local authorities sufficient time to take the necessary decisions and actions to safeguard the situation. This study aimed to develop a forecasting model for the number of dengue incidences in Nakhon Si Thammarat Province, Southern Thailand, using time series analysis. We developed Seasonal Autoregressive Integrated Moving Average (SARIMA) models on monthly data collected between 2003 and 2011 and validated the models using data collected between January and September 2012. The results of this study revealed that the SARIMA(1,1,0)(1,2,1)12 model closely described the trends and seasons of dengue incidence and captured the dengue fever cases recorded in Nakhon Si Thammarat for the years 2003-2011. The study showed that the one-step approach for predicting dengue incidences provided significantly more accurate predictions than the twelve-step approach. The model, even if based purely on statistical data analysis, can provide a useful basis for the allocation of resources for disease prevention.
Keywords: SARIMA, time series model, dengue cases, Thailand
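A minimal sketch (not the authors' code) of the modelling step described above: fitting a SARIMA(1,1,0)(1,2,1)12 model with statsmodels and contrasting one-step-ahead with multi-step forecasts over a hold-out period; the file and column names are placeholders.

```python
# Hedged sketch: SARIMA(1,1,0)(1,2,1)12 on a monthly case series; data names are hypothetical.
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

cases = pd.read_csv("dengue_monthly.csv", index_col="month", parse_dates=True)["cases"]
train, test = cases[:"2011-12"], cases["2012-01":"2012-09"]

model = SARIMAX(train, order=(1, 1, 0), seasonal_order=(1, 2, 1, 12))
fit = model.fit(disp=False)

# One-step-ahead predictions over the validation period (actual lagged values are used):
res = fit.append(test, refit=False)
one_step = res.get_prediction(start=test.index[0], dynamic=False).predicted_mean

# Pure multi-step-ahead forecast for comparison (the "twelve-step" style of forecasting):
multi_step = fit.forecast(steps=len(test))
print(one_step, multi_step, sep="\n")
```

With dynamic=False, each prediction uses the observations up to the previous month, which is what distinguishes the one-step approach from the multi-step-ahead forecast.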
Procedia PDF Downloads 358
3467 Anemia and Nutritional Status as Dominant Factor of the Event Low Birth Weight in Indonesia: A Systematic Review
Authors: Lisnawati Hutagalung
Abstract:
Background: Low birth weight (LBW) is one cause of newborn death. Babies with low birth weight tend to have slower cognitive development and growth retardation, and are more at risk of infectious disease and of death. Objective: Identifying risk factors and dominant factors that influence the incidence of LBW in Indonesia. Method: This research used several public health databases, including Google Scholar, UGM journals, UI journals and UNAND journals, covering 2012-2015. Data were filtered using the keywords ‘Risk Factors’ AND ‘Cause LBW’, yielding 2757 studies. Of these, 5 public health studies met the criteria. Results: Risk factors associated with LBW include environmental factors (exposure to cigarette smoke and residence), socio-demographic factors (age and socio-economic status) and maternal factors (anemia, placental abnormalities, maternal nutritional status, antenatal examinations, preeclampsia, parity, and complications in pregnancy). Anemia and nutritional status are the dominant factors affecting LBW. Conclusions: The risk factors that affect LBW are most commonly maternal factors. The dominant factors with a large effect on LBW are anemia and the nutritional status of the mother during pregnancy.
Keywords: low birth weight, anemia, nutritional status, the dominant factor
Procedia PDF Downloads 365
3466 The Use of Social Media Sarcasm as a Response to Media-Coverage of Iran’s Unprecedented Attack on Israel
Authors: Afif J. Arabi
Abstract:
On April 15, 2024, Iran announced its unprecedented military attack by sending waves of more than 300 drones and ballistic missiles toward Israel. The attack lasted approximately five hours and was a widely covered, distributed, and followed media event. Iran’s military action against Israel had been long awaited across the Middle East since the early days of the October 7th war on Gaza and after a long history of verbal threats. While people in many Arab countries stayed up past midnight in anticipation of watching the disastrous results of this unprecedented attack, voices on traditional and social media alike started to question the timed public announcement of the attack, which gave Israel at least a two-hour notice to prepare its defenses. When live news coverage started showing that nearly all the drones and missiles were intercepted by Israel – with help from the U.S. and other countries – and no deaths were reported, the social media response to this media event turned toward sarcasm, mockery, irony, and humor. Social media users posted sarcastic pictures, jokes, and comments mocking the Iranian offensive. This research examines this unique media event and the sarcastic response it generated on social media. The study aims to investigate the causes leading to media sarcasm in militarized political conflict, the social function of such generated sarcasm, and the role of social media as a platform for consuming frustration, dissatisfaction, and outrage passively through various media products. The study compares the serious traditional media coverage of the event with the humorous social media response among Arab countries. The research uses an eclectic theoretical approach, drawing on framing theory as a paradigm for understanding and investigating the coverage and on social functionalism theory in media studies to examine sarcasm. Social functionalism theory is a sociological perspective that views society as a complex system whose parts work together to promote solidarity and stability. In the context of media and sarcasm, this theory would suggest that sarcasm serves specific functions within society, such as reinforcing social norms, providing a means for social critique, or functioning as a safety valve for expressing social tension. The study also includes a qualitative analysis of specific examples, including the responses of social media commentators to such manifestations of political criticism. The preliminary findings of this study point to a heightened dramatization of the televised event and a widespread belief that this attack was a staged show incongruent with Iran’s official enmity and death threats toward Israel. The social media sarcasm reinforces the Arab view of Iran and Israel as mutual threats. This belief stems from the complex dynamics, historical context, and regional conflict surrounding these three parties: Iran, Israel, and the Arab states.
Keywords: social functionalism, social media sarcasm, television news framing, live militarized conflict coverage, Iran, Israel, communication theory
Procedia PDF Downloads 44
3465 Association Between Short-term NOx Exposure and Asthma Exacerbations in East London: A Time Series Regression Model
Authors: Hajar Hajmohammadi, Paul Pfeffer, Anna De Simoni, Jim Cole, Chris Griffiths, Sally Hull, Benjamin Heydecker
Abstract:
Background: There is strong interest in the relationship between short-term air pollution exposure and human health. Most studies in this field focus on serious health effects such as death or hospital admission, but air pollution exposure affects many people with less severe impacts, such as exacerbations of respiratory conditions. A lack of quantitative analysis and inconsistent findings suggest improved methodology is needed to understand these effects more fully. Method: We developed a time series regression model to quantify the relationship between daily NOₓ concentration and asthma exacerbations requiring oral steroids in primary care settings. Explanatory variables include daily NOₓ concentration measurements extracted from 8 available background and roadside monitoring stations in east London and daily ambient temperature extracted from London City Airport, located in east London. Lags of NOₓ concentrations up to 21 days (3 weeks) were used in the model. The dependent variable was the daily number of oral steroid courses prescribed for GP-registered patients with asthma in east London. A mixed distribution model was then fitted to the significant lags of the regression model. Result: Results of the time series modelling showed a significant relationship between NOₓ concentrations on each day and the number of oral steroid courses prescribed in the following three weeks. In addition, the model using only roadside stations performs better than the model with a mixture of roadside and background stations.
Keywords: air pollution, time series modeling, public health, road transport
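A hedged sketch of the kind of distributed-lag regression described here, with assumed column names (not the study's data or code):

```python
# Illustrative only: daily oral-steroid courses regressed on NOx lagged 0-21 days plus temperature.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("east_london_daily.csv", parse_dates=["date"], index_col="date")
# assumed columns: 'steroid_courses', 'nox', 'temperature'

lags = range(0, 22)  # lags 0..21 days (3 weeks)
X = pd.DataFrame({f"nox_lag{k}": df["nox"].shift(k) for k in lags})
X["temperature"] = df["temperature"]
X = sm.add_constant(X)

model = sm.OLS(df["steroid_courses"], X, missing="drop").fit()
print(model.summary())  # significant lag coefficients indicate delayed exposure effects
```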
Procedia PDF Downloads 144
3464 Short Life Cycle Time Series Forecasting
Authors: Shalaka Kadam, Dinesh Apte, Sagar Mainkar
Abstract:
The life cycle of products is becoming shorter and shorter due to increased competition in the market, shorter product development times and increased product diversity. Short life cycles are normal in the retail industry, style business, entertainment media, and the telecom and semiconductor industries. Accurate demand forecasting for short life cycle products is therefore of special interest to many researchers and organizations. Due to the short life cycle of these products, the amount of historical data available for forecasting is minimal or even absent when new or modified products are launched in the market. Companies dealing with such products want to increase the accuracy of demand forecasting so that they can utilize the full potential of the market while avoiding oversupply. This poses the challenge of developing a forecasting model that can forecast accurately while handling large variations in the data and considering the complex relationships between its parameters. Many statistical models have been proposed in the literature for forecasting time series data. Traditional time series forecasting models do not work well for short life cycles due to the lack of historical data. Artificial neural network (ANN) models, in turn, are very time consuming for forecasting. We have studied the existing models that are used for forecasting and their limitations. This work proposes an effective and powerful approach for short life cycle time series forecasting. We have proposed an approach which takes into consideration different scenarios related to data availability for short life cycle products. We then suggest a methodology which combines statistical analysis with structured judgement. The defined approach can also be applied across domains. We then describe the method of creating a profile from analogous products; this profile can then be used for forecasting new products using the historical data of analogous products. We have designed an application which combines data, analytics and domain knowledge using point-and-click technology. The forecasting results generated are compared using MAPE, MSE and RMSE error scores. Conclusion: Based on the results, it is observed that no single approach is sufficient for short life-cycle forecasting, and two or more approaches need to be combined to achieve the desired accuracy.
Keywords: forecast, short life cycle product, structured judgement, time series
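For reference, the error scores mentioned above can be computed as follows; the arrays are invented purely to show the calculation:

```python
# MAPE, MSE and RMSE of a forecast against actuals, computed with NumPy.
import numpy as np

def mape(actual, forecast):
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return np.mean(np.abs((actual - forecast) / actual)) * 100  # percentage error

def mse(actual, forecast):
    return np.mean((np.asarray(actual, float) - np.asarray(forecast, float)) ** 2)

def rmse(actual, forecast):
    return np.sqrt(mse(actual, forecast))

actual = [120, 95, 80, 60]      # e.g. weekly demand of a short life cycle product (made up)
forecast = [110, 100, 70, 65]
print(mape(actual, forecast), mse(actual, forecast), rmse(actual, forecast))
```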
Procedia PDF Downloads 358
3463 Influence of Water Reservoir Parameters on the Climate and Coastal Areas
Authors: Lia Matchavariani
Abstract:
Water reservoir construction on rivers flowing into the sea complicates coast protection: the seashore starts to degrade, causing coastal erosion and disaster against the backdrop of current climate change. The instruments of the impact of a water reservoir on the climate and coastal areas are its contact surface with the atmosphere and the area irrigated with its water or humidified with infiltrated waters. The Black Sea coastline is characterized by the highest ecological vulnerability. The type and intensity of the water reservoir impact are determined by its morphometry, type of regulation, level regime, and the geomorphological and geological characteristics of the adjoining area. Studies showed that the impact of a water reservoir on the climate and its comfort parameters is positive if it is located in a zone of insufficient humidity and, conversely, negative if the water reservoir is located in a zone of abundant humidity. There are many natural and anthropogenic factors determining the peculiarities of the impact of a water reservoir on the climate, which can be assessed with maximum accuracy by the so-called “long series” method, which operates on meteorological elements (temperature, wind, precipitation, etc.) using long series formed from stationary observation data. Such a time series consists of two periods of statistically sufficient duration: the first period covers the observations up to the formation of the water reservoir, and the second covers the observations made during its operation. If no such data are available, or the series is statistically short, an “analog” method is used. The analog water reservoir is selected based on the similarity of environmental conditions: it must be located within the zone of the designed water reservoir, under similar environmental conditions, and have a sufficient number of observations made in its coastal zone.
Keywords: coast-constituent sediment, eustasy, meteorological parameters, seashore degradation, water reservoirs impact
Procedia PDF Downloads 45
3462 A Simulation of Patient Queuing System on Radiology Department at Tertiary Specialized Referral Hospital in Indonesia
Authors: Yonathan Audhitya Suthihono, Ratih Dyah Kusumastuti
Abstract:
The radiology department in a tertiary referral hospital faces service operation challenges such as large and varied patient arrivals, which increase the probability of patient queuing. During the COVID-19 pandemic, it is mandatory to apply a social distancing protocol in the radiology department, so a strategy to prevent the accumulation of patients at one spot is required. The aim of this study is to identify an alternative solution which can reduce patient waiting time in the radiology department. Discrete event simulation (DES) is used for this study by constructing several improvement scenarios with Arena simulation software. Statistical analysis is used to test the validity of the base case scenario model and to investigate the performance of the improvement scenarios. The result of this study shows that the selected scenario is able to reduce patient waiting time significantly, which leads to more efficient services in the radiology department, allows patients to be served more effectively, and thus increases patient satisfaction. The results of the simulation can be used by the hospital management to improve the operational performance of the radiology department.
Keywords: discrete event simulation, hospital management, patient queuing model, radiology department services
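A minimal discrete-event sketch of such a queuing scenario, written in SimPy rather than Arena (which the study used); the arrival rate, service time and number of scanners are assumptions for illustration only:

```python
# Toy radiology queue: exponential arrivals and service, two scanners, one 8-hour shift.
import random
import simpy

WAITS = []

def patient(env, scanner):
    arrive = env.now
    with scanner.request() as req:
        yield req                        # wait for a free scanner
        WAITS.append(env.now - arrive)   # record waiting time
        yield env.timeout(random.expovariate(1 / 12))  # ~12 min mean service time (assumed)

def arrivals(env, scanner):
    while True:
        yield env.timeout(random.expovariate(1 / 8))   # ~8 min mean inter-arrival time (assumed)
        env.process(patient(env, scanner))

env = simpy.Environment()
scanner = simpy.Resource(env, capacity=2)              # improvement scenario: two scanners
env.process(arrivals(env, scanner))
env.run(until=8 * 60)                                  # simulate 8 hours, in minutes
print(f"mean wait: {sum(WAITS) / len(WAITS):.1f} min over {len(WAITS)} patients")
```

Running the same model with different capacities or arrival patterns is how alternative scenarios would be compared.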
Procedia PDF Downloads 119
3461 A General Framework for Measuring the Internal Fraud Risk of an Enterprise Resource Planning System
Authors: Imran Dayan, Ashiqul Khan
Abstract:
Internal corporate fraud, which is fraud carried out by internal stakeholders of a company, affects the well-being of the organisation just like its external counterpart. Even if such an act is carried out for the short-term benefit of a corporation, the act is ultimately harmful to the entity in the long run. Internal fraud is often carried out by relying upon aberrations from usual business processes. Business processes are the lifeblood of a company in the modern managerial context. Such processes are developed and fine-tuned over time as a corporation grows through its life stages. Modern corporations have embraced technological innovations into their business processes, and Enterprise Resource Planning (ERP) systems sitting at the heart of such processes are a testimony to that. Since ERP systems record a huge amount of data in their event logs, the logs are a treasure trove for anyone trying to detect any sort of fraudulent activities hidden within the day-to-day business operations and processes. This research utilises the ERP systems in place within corporations to assess the likelihood of prospective internal fraud by developing a framework for measuring fraud risk through process mining techniques, and hence finds risky designs and loose ends within these business processes. This framework helps not only in identifying existing cases of fraud in the records of the event log, but also signals the overall riskiness of certain business processes, and hence draws attention to the need to redesign such processes to reduce the chance of future internal fraud while improving internal control within the organisation. The research adds value by applying the concepts of process mining to the analysis of modern-day business process records, namely ERP event logs, and develops a framework that should be useful to internal stakeholders for strengthening internal control as well as providing external auditors with a tool of use in case of suspicion. The research demonstrates its usefulness through case studies of large corporations with complex business processes and an ERP in place.
Keywords: enterprise resource planning, fraud risk framework, internal corporate fraud, process mining
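As a loose illustration of the idea (not the authors' framework), a simple pandas check can flag purchase-to-pay cases in an ERP event log whose activity order deviates from an expected process; the schema and reference activities below are assumptions:

```python
# Naive conformance-style check: flag cases with a missing or late purchase-order approval.
import pandas as pd

log = pd.read_csv("erp_event_log.csv", parse_dates=["timestamp"])
# assumed columns: case_id, activity, timestamp, user

def risky_case(events):
    acts = list(events.sort_values("timestamp")["activity"])
    if "approve_po" not in acts:                       # payment without any approval
        return "payment" in acts
    if "payment" in acts:                              # payment recorded before approval
        return acts.index("payment") < acts.index("approve_po")
    return False

flags = log.groupby("case_id").apply(risky_case)
print(f"{int(flags.sum())} of {len(flags)} cases look risky (missing or late approval)")
```

A full process mining treatment would instead discover the process model from the log and run conformance checking against it; the snippet only conveys the flavour of log-based risk signals.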
Procedia PDF Downloads 335
3460 Applied Complement of Probability and Information Entropy for Prediction in Student Learning
Authors: Kennedy Efosa Ehimwenma, Sujatha Krishnamoorthy, Safiya Al‑Sharji
Abstract:
The probability of an event lies in the interval [0, 1], with values determined by the number of outcomes of events in a sample space S. The probability Pr(A) that an event A will never occur is 0. The probability Pr(B) that event B will certainly occur is 1. This makes both events A and B a certainty. Furthermore, the sum of probabilities Pr(E₁) + Pr(E₂) + … + Pr(Eₙ) of a finite set of events in a given sample space S equals 1. Conversely, the difference of the sum of two probabilities that will certainly occur is 0. This paper first discusses Bayes’ rule, the complement of probability, and the difference of probabilities for occurrences of learning events before applying them to the prediction of learning objects in student learning. Given that the probabilities sum to 1, to make a recommendation for student learning, this paper proposes that the difference between argMaxPr(S) and the probability of student performance quantifies the weight of learning objects for students. Using a skill-set dataset, the computational procedure demonstrates i) the probability of skill-set events that have occurred and would lead to higher-level learning; ii) the probability of events that have not occurred and require subject-matter relearning; iii) the accuracy of the decision tree in predicting student performance into class labels; and iv) the information entropy of the skill-set data and its implications for student cognitive performance and learning recommendations.
Keywords: complement of probability, Bayes’ rule, prediction, pre-assessments, computational education, information theory
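A small worked sketch of the quantities discussed, using made-up probabilities for two hypothetical skill-set events:

```python
# Complement of probability, sum-to-one check, Shannon entropy, and an illustrative weight.
import math

p = {"prerequisite_passed": 0.7, "prerequisite_failed": 0.3}   # hypothetical skill-set events

assert abs(sum(p.values()) - 1.0) < 1e-9          # Pr(E1) + ... + Pr(En) = 1
complement = 1 - p["prerequisite_passed"]         # Pr(not A) = 1 - Pr(A) = 0.3

entropy = -sum(q * math.log2(q) for q in p.values() if q > 0)   # ~0.881 bits for (0.7, 0.3)
weight = max(p.values()) - p["prerequisite_failed"]             # illustrative difference-based weight
print(complement, round(entropy, 3), weight)
```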
Procedia PDF Downloads 161
3459 A Fuzzy TOPSIS Based Model for Safety Risk Assessment of Operational Flight Data
Authors: N. Borjalilu, P. Rabiei, A. Enjoo
Abstract:
A Flight Data Monitoring (FDM) program assists an operator in the aviation industry to identify, quantify, assess and address operational safety risks in order to improve the safety of flight operations. FDM is a powerful tool for an aircraft operator when integrated into the operator’s Safety Management System (SMS), allowing it to detect, confirm, and assess safety issues associated with human errors and to check the effectiveness of corrective actions. This article proposes a model for assessing the safety risk level of flight data for different categories of event focus based on fuzzy set values. It permits evaluation of the operational safety level from the point of view of flight activities. The main advantage of this method is the proposed qualitative safety analysis of flight data. This research applies the opinions of aviation experts, gathered through a number of questionnaires related to flight data, in four categories of occurrence that can take place during an accident or an incident: Runway Excursion (RE), Controlled Flight Into Terrain (CFIT), Mid-Air Collision (MAC), and Loss of Control in Flight (LOC-I). By weighting each one (by F-TOPSIS) and applying it to the number of risks of the event, the safety risk of each related event can be obtained.
Keywords: F-TOPSIS, fuzzy set, flight data monitoring (FDM), flight safety
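For illustration, the closeness-coefficient step of TOPSIS is sketched below in its crisp form (the paper uses a fuzzy extension); the decision matrix, criteria and weights are invented:

```python
# Crisp TOPSIS ranking of the four event categories against three hypothetical criteria.
import numpy as np

X = np.array([[7, 5, 8],    # RE
              [9, 6, 4],    # CFIT
              [6, 8, 5],    # MAC
              [8, 9, 7]])   # LOC-I
w = np.array([0.5, 0.3, 0.2])          # assumed criterion weights (e.g. severity, likelihood, exposure)

R = X / np.sqrt((X ** 2).sum(axis=0))  # vector-normalise each criterion column
V = R * w                              # weighted normalised matrix
ideal, anti = V.max(axis=0), V.min(axis=0)     # assume all criteria are "benefit" type

d_plus = np.sqrt(((V - ideal) ** 2).sum(axis=1))
d_minus = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_minus / (d_plus + d_minus)       # higher = closer to the ideal (riskier here)
print(dict(zip(["RE", "CFIT", "MAC", "LOC-I"], closeness.round(3))))
```

In the fuzzy version, the matrix entries and weights would be triangular fuzzy numbers derived from the expert questionnaires before the same distance-to-ideal ranking is applied.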
Procedia PDF Downloads 168
3458 Forecasting Model for Rainfall in Thailand: Case Study Nakhon Ratchasima Province
Authors: N. Sopipan
Abstract:
In this paper, we study the rainfall time series of weather stations in Nakhon Ratchasima province, Thailand, using various statistical methods that enable us to analyse the behaviour of rainfall in the study areas. Time-series analysis is an important tool in modelling and forecasting rainfall. ARIMA models and Holt-Winters models based on exponential smoothing were built, and all the models proved to be adequate. They could therefore give information that can help decision makers establish strategies for the proper planning of agriculture, drainage systems and other water resource applications in Nakhon Ratchasima province. We found that the best performing model for forecasting is ARIMA(1,0,1)(1,0,1)12.
Keywords: ARIMA models, exponential smoothing, Holt-Winters model
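A minimal sketch of a seasonal Holt-Winters fit of the kind described, using statsmodels; the file and column names are placeholders, not the study's data:

```python
# Additive Holt-Winters (seasonal exponential smoothing) on a monthly rainfall series.
import pandas as pd
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rain = pd.read_csv("nakhon_ratchasima_rainfall.csv",
                   index_col="month", parse_dates=True)["rainfall"]

hw = ExponentialSmoothing(rain, trend="add", seasonal="add", seasonal_periods=12).fit()
print(hw.forecast(12))   # forecast the next 12 months
```

A SARIMA fit such as ARIMA(1,0,1)(1,0,1)12 could be produced with statsmodels' SARIMAX in the same way as in the dengue example earlier in this list.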
Procedia PDF Downloads 300
3457 Approximation of Periodic Functions Belonging to Lipschitz Classes by Product Matrix Means of Fourier Series
Authors: Smita Sonker, Uaday Singh
Abstract:
Various investigators have determined the degree of approximation of functions belonging to the classes W(L^r, ξ(t)), Lip(ξ(t), r), Lip(α, r), and Lipα using different summability methods under monotonicity conditions. Recently, Lal determined the degree of approximation of functions belonging to the Lipα and W(L^r, ξ(t)) classes by using Cesàro-Nörlund (C^1·N_p) summability with non-increasing weights {p_n}. In this paper, we determine the degree of approximation of 2π-periodic functions f belonging to the function classes Lipα and W(L^r, ξ(t)) by C^1·T means of the Fourier series of f. Our theorems generalize the results of Lal, and we also improve these results in the light of f. From our results, we also derive some corollaries.
Keywords: Lipschitz classes, product matrix operator, signals, trigonometric Fourier approximation
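For readability, the function classes named above are commonly defined as follows (one standard form used in this literature; the paper's normalisation may differ slightly, and β ≥ 0, r ≥ 1, 0 < α ≤ 1, ξ(t) a positive increasing function):

```latex
\begin{align*}
f \in \operatorname{Lip}\alpha
  &\iff |f(x+t)-f(x)| = O\!\left(|t|^{\alpha}\right),\\
f \in \operatorname{Lip}(\alpha, r)
  &\iff \Bigl(\int_0^{2\pi} |f(x+t)-f(x)|^{r}\,dx\Bigr)^{1/r} = O\!\left(|t|^{\alpha}\right),\\
f \in \operatorname{Lip}(\xi(t), r)
  &\iff \Bigl(\int_0^{2\pi} |f(x+t)-f(x)|^{r}\,dx\Bigr)^{1/r} = O\!\left(\xi(t)\right),\\
f \in W(L^{r}, \xi(t))
  &\iff \Bigl(\int_0^{2\pi} |f(x+t)-f(x)|^{r}\sin^{\beta}\!x\,dx\Bigr)^{1/r} = O\!\left(\xi(t)\right).
\end{align*}
```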
Procedia PDF Downloads 477
3456 Determining the Number of Single Models in a Combined Forecast
Authors: Serkan Aras, Emrah Gulay
Abstract:
Combining various forecasting models is an important tool for researchers to attain more accurate forecasts. A great number of papers have shown that selecting single models that are as dissimilar as possible, or methods based on as different information as possible, leads to better forecasting performance. However, there is no established rule regarding the number of single models to be used in any combining method. This study focuses on determining the optimal or near-optimal number of single models with the help of statistical tests. An extensive experiment is carried out by utilizing some well-known time series data sets from diverse fields. Furthermore, many rival forecasting methods and some of the commonly used combining methods are employed. The obtained results indicate that some statistically significant performance differences can be found regarding the number of single models in the combining methods under investigation.
Keywords: combined forecast, forecasting, M-competition, time series
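As a toy illustration of forecast combination (not the paper's procedure), equal-weight and inverse-MSE-weighted averages of three hypothetical single-model forecasts can be compared as follows:

```python
# Simple and inverse-MSE-weighted combinations of single-model forecasts; all numbers invented.
import numpy as np

actual = np.array([100, 104, 98, 110, 107])
forecasts = {                                   # three hypothetical single models
    "arima": np.array([101, 103, 99, 108, 109]),
    "ets":   np.array([ 98, 106, 97, 112, 105]),
    "nn":    np.array([105, 100, 95, 111, 110]),
}

F = np.vstack(list(forecasts.values()))
simple = F.mean(axis=0)                                   # equal-weight combination
mses = np.array([np.mean((actual - f) ** 2) for f in F])
w = (1 / mses) / (1 / mses).sum()                         # inverse-MSE weights
weighted = w @ F

for name, f in [("simple", simple), ("weighted", weighted)]:
    print(name, round(float(np.mean((actual - f) ** 2)), 2))
```

Deciding how many single models to include would then amount to repeating such comparisons while adding or removing models and testing whether the accuracy differences are statistically significant.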
Procedia PDF Downloads 355
3455 The Structural Pattern: An Event-Related Potential Study on Tang Poetry
Authors: ShuHui Yang, ChingChing Lu
Abstract:
Measuring event-related potentials (ERPs) has been fundamental to our understanding of how people process language. One specific ERP component, the P600, has been hypothesized to be associated with syntactic reanalysis processes. We, however, propose that the P600 is not restricted to reanalysis processes, but is an index of structural pattern processing. To investigate structural pattern processing, we utilized the effects of stimulus degradation in structural priming. To put it another way, there was no P600 effect if the structure of the prime was the same as the structure of the target; there would be a P600 effect, however, if the structure differed between the prime and the target. In the experiment, twenty-two participants were presented with four sentences of Tang poetry. The first two sentences, serving as primes, were constructed as SVO+VP. The last two sentences, serving as targets, were divided into three types: type one of the targets was SVO+VP, type two was SVO+VPVP, and type three was VP+VP. The results showed that both targets, SVO+VPVP and VP+VP, elicited a positive-going brainwave, a P600 effect, in the 600~900 ms time window. Furthermore, the P600 component was larger for the target ‘VP+VP’ than for the target ‘SVO+VPVP’. That meant the more dissimilar the structure was, the larger the P600 effect we obtained. These results indicate that the P600 is an index of structural processing, and that its intensity varies with the degree of structural heterogeneity.
Keywords: ERPs, P600, structural pattern, structural priming, Tang poetry
Procedia PDF Downloads 140
3454 Preliminary Analysis on Land Use-Land Cover Assessment of Post-Earthquake Geohazard: A Case Study in Kundasang, Sabah
Authors: Nur Afiqah Mohd Kamal, Khamarrul Azahari Razak
Abstract:
The earthquake aftermath has become a major concern, especially in high-seismicity regions. In Kundasang, Sabah, the earthquake on 5th June 2015 resulted in several catastrophes: landslides, rockfalls, mudflows and affected major slopes, quite apart from the series of aftershocks. The consequences of an earthquake generate and induce episodic disasters that are not only life-threatening but also affect infrastructure and economic development. Therefore, investigating the change in land use and land cover (LULC) caused by post-earthquake geohazards is essential for identifying the extent of the disastrous effects on development in Kundasang. With the advancement of remote sensing technology, post-earthquake geohazards (landslides, mudflows, rockfalls, debris flows) can be assessed by employing object-based image analysis to investigate LULC change covering settlements, public infrastructure and vegetation cover. This paper therefore discusses preliminary results on the post-earthquake geohazard distribution in Kundasang and evaluates the LULC classification in relation to the occurrence of geohazard events. The result of this preliminary analysis will provide an overview to determine the extent of geohazard impact on LULC. This research also provides beneficial input to the local authority in Kundasang about the risk of future structural development in geohazard areas.
Keywords: geohazard, land use land cover, object-based image analysis, remote sensing
Procedia PDF Downloads 245
3453 Performance Evaluation of Production Schedules Based on Process Mining
Authors: Kwan Hee Han
Abstract:
The external environment of enterprises is changing rapidly, driven mainly by global competition, cost reduction pressures, and new technology. In this situation, the production scheduling function plays a critical role in meeting customer requirements and attaining the goal of operational efficiency. It deals with short-term decision making in the production process of the whole supply chain. The major task of production scheduling is to seek a balance between customer orders and limited resources. In manufacturing companies, this task is difficult because it must efficiently utilize resource capacity under careful consideration of many interacting constraints. At present, many computerized software solutions are utilized in enterprises to generate a realistic production schedule and overcome the complexity of schedule generation. However, most production scheduling systems do not provide sufficient information about the validity of the generated schedule beyond limited statistics. Process mining only recently emerged as a sub-discipline of both data mining and business process management. Process mining techniques enable the useful analysis of a wide variety of processes, such as process discovery, conformance checking, and bottleneck analysis. In this study, the performance of a generated production schedule is evaluated by mining the event log data of a production scheduling software system using process mining techniques, since every software system generates event logs for further use such as security investigation, auditing and debugging. An application of the process mining approach is proposed in this study for validating the goodness of production schedules generated by scheduling software systems. Using process mining techniques, major evaluation criteria such as workstation utilization, the existence of bottleneck workstations, critical process route patterns, and the workload balance of each machine over time are measured, and finally, the goodness of the production schedule is evaluated. By using the proposed process mining approach to evaluate the performance of generated production schedules, the quality of production schedules in manufacturing enterprises can be improved.
Keywords: data mining, event log, process mining, production scheduling
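As an illustrative fragment (assumed log schema, not the study's system), one of the criteria above, workstation utilisation, can be derived from start/end events in a scheduling event log:

```python
# Per-workstation utilisation over the scheduling horizon, computed from an event log.
import pandas as pd

log = pd.read_csv("schedule_event_log.csv", parse_dates=["timestamp"])
# assumed columns: job_id, workstation, event ('start' or 'end'), timestamp

wide = log.pivot_table(index=["job_id", "workstation"], columns="event",
                       values="timestamp", aggfunc="first").reset_index()
wide["busy_h"] = (wide["end"] - wide["start"]).dt.total_seconds() / 3600

horizon_h = (log["timestamp"].max() - log["timestamp"].min()).total_seconds() / 3600
utilisation = wide.groupby("workstation")["busy_h"].sum() / horizon_h
print(utilisation.sort_values(ascending=False))   # the busiest stations are bottleneck candidates
```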
Procedia PDF Downloads 279
3452 Socioeconomic Burden of Life Long Disease: A Case of Diabetes Care in Bangladesh
Authors: Samira Humaira Habib
Abstract:
Diabetes has profound effects on individuals and their families. If diabetes is not well monitored and managed, it leads to long-term complications and a large and growing cost to the health care system. The prevalence and socioeconomic burden of diabetes, and the relative return on investment for eliminating or reducing that burden, are therefore much more important with regard to its cost burden. Studies of the socioeconomic cost burden of diabetes are well established in developed countries but almost absent in developing countries like Bangladesh. The main objective of the study is to estimate the total socioeconomic burden of diabetes. It is a prospective longitudinal follow-up study, analytical in nature. Primary and secondary data are collected from patients who are undergoing treatment for diabetes at the out-patient department of the Bangladesh Institute of Research & Rehabilitation in Diabetes, Endocrine & Metabolic Disorders (BIRDEM). Of the 2115 diabetic subjects, females constitute around 50.35% and the rest are male (49.65%). Among the subjects, 1323 have controlled and 792 have uncontrolled diabetes. Cost analysis of the 2115 diabetic patients shows that the total cost of diabetes management and treatment is US$ 903018, with an average of US$ 426.95 per patient. Among direct costs, investigations and medical treatment at hospital constitute most of the cost of diabetes. The average hospital cost is US$ 311.79, which is alarming for diabetic patients. Among indirect costs, the cost of productivity loss (US$ 51110.1) is the highest item; all items together constitute a total indirect cost of US$ 69215.7. The incremental cost of intensive management of uncontrolled diabetes is US$ 101.54 per patient, the event-free time gained in this group is 0.55 years, and the life years gained are 1.19 years. The incremental cost per event-free year gained is US$ 198.12. The incremental cost of intensive management of the controlled group is US$ 89.54 per patient, the event-free time gained is 0.68 years, and the life years gained are 1.12 years. The incremental cost per event-free year gained is US$ 223.34. The EuroQoL difference between the groups is found to be 64.04. The cost-effectiveness ratio is found to be US$ 1.64 per unit of effect in the case of controlled diabetes and US$ 1.69 per unit of effect in the case of uncontrolled diabetes, so management of diabetes is much more cost-effective. The cost profile of young type 1 diabetic patients corresponded to the upper socioeconomic class, and the cost also increased with the duration of diabetes. The dietary pattern showed that macronutrient intake and cost are significantly higher in the uncontrolled group than in their counterparts. Proper management and control of diabetes can decrease the cost of care in the long term.
Keywords: cost, cost-effective, chronic diseases, diabetes care, burden, Bangladesh
Procedia PDF Downloads 147
3451 Causes and Impacts of Marine Heatwaves in the Bay of Bengal Region in the Recent Period
Authors: Sudhanshu Kumar, Raghvendra Chandrakar, Arun Chakraborty
Abstract:
In the ocean, temperature extremes have the potential to devastate marine habitats and ecosystems, with ensuing socioeconomic consequences. In recent years, these extreme events have become more frequent and intense globally, and the increasing trend is expected to continue in the upcoming decades. This has recently attracted the interest of the public as well as scientific researchers, which motivated us to analyze recent marine heatwave (MHW) events in the Bay of Bengal region. We isolated 107 MHW events (above the 90th percentile threshold) in this region of the Indian Ocean and investigated the variation in duration, intensity, and frequency of MHW events during our test period (1982-2021). Our study reveals an average of three MHW events per year in the study region, with an increasing linear trend of 1.11 MHW events per decade. In the analysis, the longest MHW event lasted about 99 days, far longer than the average MHW event duration. The maximum intensity was 5.29°C (above the climatological mean), while the mean intensity was 2.03°C. In addition, we observed net heat flux accompanied by anticyclonic eddies to be the primary cause of these events. Moreover, we concluded that these events affect sea surface height and oceanic productivity, highlighting the adverse impact of MHWs on marine ecosystems.
Keywords: marine heatwaves, global warming, climate change, sea surface temperature, marine ecosystem
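A simplified sketch of the detection rule implied above: flagging days whose SST exceeds the day-of-year 90th-percentile climatology and grouping runs of consecutive exceedance days into events (the five-day minimum follows the common Hobday et al. convention and is an assumption here, as are the file and column names):

```python
# Minimal MHW detection: daily SST vs. a day-of-year 90th-percentile climatology.
import pandas as pd

sst = pd.read_csv("bob_sst_daily.csv", parse_dates=["date"], index_col="date")["sst"]

doy = sst.index.dayofyear
clim90 = sst.groupby(doy).quantile(0.90)            # 90th-percentile threshold per calendar day
exceed = sst > clim90.reindex(doy).to_numpy()       # True on days above the threshold

# group consecutive exceedance days into runs and keep runs lasting at least 5 days
run_id = (exceed != exceed.shift()).cumsum()
runs = exceed.groupby(run_id).agg(["sum", "all"])
mhw_days = runs.loc[runs["all"] & (runs["sum"] >= 5), "sum"]
print(f"{len(mhw_days)} MHW events, longest {int(mhw_days.max())} days")
```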
Procedia PDF Downloads 123
3450 Experimental Characterization of Flowable Cement Pastes Made with Marble Waste
Authors: F. Messaoudi, O. Haddad, R. Bouras, S. Kaci
Abstract:
The development of self-compacting concrete (SCC) marks a huge step towards improved efficiency and working conditions on construction sites and in the precast industry. SCC flows easily into complex shapes and through reinforcement bars and reduces the manpower required for placement; no vibration is required to ensure correct compaction of the concrete. This concrete contains a high volume of binder, which is controlled by its rheological behavior. The paste consists of binders (Portland cement with or without supplementary cementitious materials), water, chemical admixtures and fillers. In this study, two series of tests were performed on self-compacting cement pastes made with marble waste as the mineral addition. The first series of this investigation was used to determine the flow time of the paste using the Marsh cone; the second series was used to determine the rheological parameters of the same paste, namely yield stress and plastic viscosity, using a Haake RheoStress 1 rheometer. The results of this investigation allowed us to study the evolution of the yield stress, viscosity and Marsh cone flow time of the paste as a function of its composition. A correlation between the results obtained from the Marsh cone flow test and those of the plastic viscosity for the different cement pastes studied is proposed.
Keywords: adjuvant, rheological parameter, self-compacting cement pastes, waste marble
Procedia PDF Downloads 276
3449 Development of Trigger Tool to Identify Adverse Drug Events From Warfarin Administered to Patient Admitted in Medical Wards of Chumphae Hospital
Authors: Puntarikorn Rungrattanakasin
Abstract:
Objectives: To develop a trigger tool to warn of the risk of bleeding as an adverse event from warfarin use during admission to the medical wards of Chumphae Hospital. Methods: A retrospective study was performed by reviewing the medical records of patients admitted between June 1st, 2020 and May 31st, 2021. ADEs were evaluated by Naranjo’s algorithm. The international normalized ratio (INR) and bleeding events during admissions were collected. Statistical analyses, including the Chi-square test and the Receiver Operating Characteristic (ROC) curve for the optimal INR threshold, were used for the study. Results: Among the 139 admissions, the INR varied between 0.86 and 14.91; there was a total of 15 bleeding events, of which 9 were mild and 6 were severe. Bleeding occurred whenever the INR was greater than 2.5 and reached statistical significance (p < 0.05), which was in concordance with the ROC curve and yielded 100% sensitivity and 60% specificity in the detection of a bleeding event. In this regard, an INR greater than 2.5 was considered the optimal threshold for promptly alerting to a bleeding tendency. Conclusions: An INR value greater than 2.5 would be an appropriate trigger to warn of the risk of bleeding for patients taking warfarin in Chumphae Hospital.
Keywords: trigger tool, warfarin, risk of bleeding, medical wards
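A hedged sketch of the threshold analysis with scikit-learn's ROC curve, using invented data rather than the hospital records:

```python
# Picking an INR cut-off from a ROC curve by maximising Youden's J (sensitivity + specificity - 1).
import numpy as np
from sklearn.metrics import roc_curve

inr = np.array([1.8, 2.1, 2.4, 2.6, 2.9, 3.4, 4.2, 5.0, 1.5, 2.2])   # made-up INR values
bleed = np.array([0,   0,   0,   1,   0,   1,   1,   1,   0,   0])   # 1 = bleeding event

fpr, tpr, thresholds = roc_curve(bleed, inr)
j = tpr - fpr                                   # Youden's J statistic
best = np.argmax(j)
print(f"optimal INR threshold ~ {thresholds[best]}, "
      f"sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f}")
```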
Procedia PDF Downloads 148
3448 Significance of Square Non-Spiral Microcoils for Biomedical Applications
Authors: Himanshu Chandrakar, Krishnapriya S., Rama Komaragiri, Suja K. J.
Abstract:
Microcoils are significant components of micro magnetic sensors and actuators, especially in biomedical devices. Non-spiral planar microcoils of square, hexagonal and octagonal shapes are introduced for the first time in this paper. A comparison between different planar spiral and non-spiral coils is also discussed. The fabrication advantages and low power dissipation of non-spiral structures make them a strong alternative to conventional spiral planar coils. The series resistance of non-spiral coils is lower than that of spiral coils, though the magnetic field is slightly lower for non-spiral coils. The comparison of different planar microcoils shows that the proposed square non-spiral coil gives better performance than the other structures.
Keywords: non-spiral planar microcoil, power dissipation, series resistance, spiral
Procedia PDF Downloads 168
3447 One Period Loops of Memristive Circuits with Mixed-Mode Oscillations
Authors: Wieslaw Marszalek, Zdzislaw Trzaska
Abstract:
Interesting properties of various one-period loops of singularly perturbed memristive circuits with mixed-mode oscillations (MMOs) are analyzed in this paper. The analysis is both analytical and numerical and focuses on the properties of the pinched hysteresis of the memristive element and other one-period loops formed by pairs of time-series solutions for various circuit variables. The memristive element is the only nonlinear element in the two circuits. A theorem on the periods of mixed-mode oscillations of the circuits is formulated and proved. The replacement of memristors by parallel G-C or series R-L circuits for an MMO response with equivalent RMS values is also discussed.
Keywords: mixed-mode oscillations, memristive circuits, pinched hysteresis, one-period loops, singularly perturbed circuits
Procedia PDF Downloads 470
3446 Simulation of Photovoltaic Array for Specified Ratings of Converter
Authors: Smita Pareek, Ratna Dahiya
Abstract:
The power generated by a solar photovoltaic (PV) module depends on the surrounding irradiance, temperature, shading conditions, and shading pattern. This paper presents a simulation of a photovoltaic module using Matlab/Simulink. A PV array is also simulated by series and parallel connections of modules, and their characteristic curves are given. Further, PV module topologies/configurations are proposed for a 5.5 kW inverter available in the literature. Shading of a PV array, either complete or partial, can have a significant impact on its power output and energy yield; therefore, the simulated model characteristic curves (I-V and P-V) are drawn for uniform shading conditions (USC), and the output power, voltage and current are then calculated for variations in insolation under shading conditions. Additionally, the characteristic curves are also given for a predetermined shading condition.
Keywords: array, series, parallel, photovoltaic, partial shading
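A toy sketch (illustrative single-diode constants, not the paper's Simulink model) of how a module I-V curve scales when modules are connected in series and strings in parallel:

```python
# Series connection adds voltages; parallel strings add currents; constants are placeholders.
import numpy as np

def module_iv(v, isc=8.5, voc=37.0, n=1.3, cells=60, t=298.15):
    """Very simplified single-diode I-V curve for one module (illustrative parameters)."""
    k, q = 1.380649e-23, 1.602176634e-19
    vt = n * cells * k * t / q
    i0 = isc / (np.exp(voc / vt) - 1.0)
    return isc - i0 * (np.exp(v / vt) - 1.0)

ns, npar = 10, 2                            # e.g. 10 modules per string, 2 strings in parallel
v_mod = np.linspace(0, 37.0, 200)
i_mod = np.clip(module_iv(v_mod), 0, None)

v_array = ns * v_mod                        # array voltage
i_array = npar * i_mod                      # array current
p_array = v_array * i_array
print(f"approx. array peak power: {p_array.max() / 1e3:.2f} kW")
```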
Procedia PDF Downloads 566
3445 Initial Palaeotsunami and Historical Tsunami in the Makran Subduction Zone of the Northwest Indian Ocean
Authors: Mohammad Mokhtari, Mehdi Masoodi, Parvaneh Faridi
Abstract:
The history of tsunami-generating earthquakes along the Makran Subduction Zone provides evidence of the potential tsunami hazard for the whole coastal area. In comparison with other subduction zones in the world, the Makran region of southern Pakistan and southeastern Iran shows low seismicity. It is also one of the least studied areas in the northwest of the Indian Ocean with regard to tsunami studies. We present a review of studies dealing with historical tsunamis and the ongoing palaeotsunami work supported by the IGCP of UNESCO in the Makran Subduction Zone. The historical record presented here includes about nine tsunamis in the Makran Subduction Zone, of which over 7 occurred in the eastern Makran. Tsunamis are not as common in the western Makran as in the eastern Makran, where a database of historical events exists. The best historically documented event is the 1945 earthquake, with a moment magnitude of 8.1, and its tsunami in the western and eastern Makran. There are no details as to whether a tsunami was generated by a seismic event before 1945 off the western Makran, but several potentially large tsunamigenic events in the MSZ before 1945 occurred in 325 B.C., 1008, 1483, 1524, 1765, 1851, 1864, and 1897. Here we present new findings from a historical point of view; we would like to emphasize at once that the area needs to be given greater research attention. As mentioned above, a palaeotsunami study (geological evidence) is now being planned, and here we present the first-phase results. From a risk point of view, the study shows, as a preliminary result, that within 20 minutes the wave reaches the Iranian as well as the Pakistani and Omani coastal zones, with highly destructive tsunami waves capable of causing damaging inundation. It is important to note that the coastal areas of all states surrounding the MSZ are being developed very rapidly, so any event would have a devastating effect on this region. Although several papers have been published on modelling, seismology and tsunami deposits in recent decades, the Makran remains a forgotten subduction zone, and more data, such as the main crustal structure, fault locations, and related parameters, are required.
Keywords: historical tsunami, Indian Ocean, Makran Subduction Zone, palaeotsunami
Procedia PDF Downloads 131
3444 Research of Interaction between Layers of Compressed Composite Columns
Authors: Daumantas Zidanavicius
Abstract:
In order to investigate the bond between concrete and steel in circular steel tube columns filled with concrete, 7 series of specimens were tested with the same geometrical parameters but different concrete properties. Two types of specimens were chosen: for the first type, expansive additives were added to the concrete mixture to increase the internal forces, and for the second type, mechanical components were used. All 7 series of short columns were modeled by FEM and tested experimentally. In this work, particular attention was paid to the bond-slip models between steel and concrete. Results show that concrete additives increase the bond strength by up to two times, and mechanical anchorage by up to 6 times, compared to control specimens without additives or anchorage.
Keywords: concrete filled steel tube, push-out test, bond slip relationship, bond stress distribution
Procedia PDF Downloads 124
3443 A Long Short-Term Memory Based Deep Learning Model for Corporate Bond Price Predictions
Authors: Vikrant Gupta, Amrit Goswami
Abstract:
The fixed income market forms the basis of the modern financial market. All other assets in financial markets derive their value from the bond market. Owing to their over-the-counter nature, corporate bonds have relatively little publicly available data and are thus researched far less than equities. Bond price prediction is a complex financial time series forecasting problem and is considered very crucial in the domain of finance. Bond prices are highly volatile and noisy, which makes it very difficult for traditional statistical time-series models to capture the complexity of the series patterns and leads to inefficient forecasts. To overcome the inefficiencies of statistical models, various machine learning techniques were initially used in the literature for more accurate forecasting of time series. However, simple machine learning methods such as linear regression, support vector machines and random forests fail to provide efficient results when tested on highly complex sequences such as stock prices and bond prices. Hence, to capture these intricate sequence patterns, various deep learning-based methodologies have been discussed in the literature. In this study, a recurrent neural network-based deep learning model using long short-term memory networks for the prediction of corporate bond prices is discussed. Long short-term memory (LSTM) networks have been widely used in the literature for sequence learning tasks in domains such as machine translation, speech recognition, etc. In recent years, various studies have discussed the effectiveness of LSTMs in forecasting complex time-series sequences and have shown promising results when compared to other methodologies. LSTMs are a special kind of recurrent neural network capable of learning long-term dependencies, due to a memory function that traditional neural networks lack. In this study, a simple LSTM, a stacked LSTM and a masked LSTM based model are discussed with respect to varying input sequences (three days, seven days and 14 days). In order to facilitate faster learning and gradually decompose the complexity of the bond price sequence, Empirical Mode Decomposition (EMD) has been used, which has improved the accuracy of the standalone LSTM model. With a variety of technical indicators and the EMD-decomposed time series, the masked LSTM outperformed its two counterparts in terms of prediction accuracy. To benchmark the proposed model, the results have been compared with traditional time series models (ARIMA), shallow neural networks and the three LSTM models discussed above. In summary, our results show that LSTM models provide more accurate results and should be explored further within the asset management industry.
Keywords: bond prices, long short-term memory, time series forecasting, empirical mode decomposition
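A compact sketch of the plain LSTM variant (not the paper's exact architecture or data): a sliding window of past daily prices predicts the next one, using TensorFlow/Keras with placeholder hyper-parameters and a synthetic price series:

```python
# One-step-ahead price prediction with a small LSTM on a 7-day sliding window.
import numpy as np
import tensorflow as tf

def make_windows(series, window=7):
    X = np.array([series[i:i + window] for i in range(len(series) - window)])
    y = np.array(series[window:])
    return X[..., np.newaxis], y          # shape (samples, window, 1)

prices = np.sin(np.linspace(0, 20, 500)) + np.linspace(100, 105, 500)   # synthetic "bond" series
X, y = make_windows(prices, window=7)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(7, 1)),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print(model.predict(X[-1:], verbose=0))   # prediction for the day after the last window
```

In an EMD-augmented setup, the decomposed intrinsic mode functions (and any technical indicators) would be stacked as extra input channels in place of the single price channel shown here.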
Procedia PDF Downloads 136
3442 Pharmacovigilance: An Empowerment in Safe Utilization of Pharmaceuticals
Authors: Pankaj Prashar, Bimlesh Kumar, Ankita Sood, Anamika Gautam
Abstract:
Pharmacovigilance (PV) has grown rapidly in the pharmaceutical industry over the past few decades as an integral part of clinical research and drug development. PV has a broad scope, from drug manufacturing to regulation and safer utilization. The fundamental steps of PV include not only data collection and verification, coding of drugs with adverse drug reactions, causality assessment and timely reporting to the authorities, but also monitoring of drug manufacturing, safety issues and product quality, and the conduct of due diligence. Standardization of adverse event information, collaboration of multiple departments across different companies, and preparation of documents in accordance with both governmental and non-governmental organizations (FDA, EMA, GVP, ICH) are advancements in the discipline of PV. De-harmonization, the lack of predictive drug safety models, inadequate government funding, non-reporting, non-acceptance of ADRs by developing countries, and the scarcity of reports made directly by patients to the monitoring centres are the major roadblocks of PV. Mandatory pharmacovigilance reporting, frequent inspections, government funding, and the education and training of medical students, pharmacists and nurses in this segment can bring about empowerment in PV. This area needs to be addressed with a sense of urgency for the safe utilization of pharmaceuticals.
Keywords: pharmacovigilance, regulatory, adverse event, drug safety
Procedia PDF Downloads 124
3441 Performance Comparison of Thread-Based and Event-Based Web Servers
Authors: Aikaterini Kentroti, Theodore H. Kaskalis
Abstract:
Today, web servers are expected to serve thousands of client requests concurrently within stringent response time limits. In this paper, we experimentally evaluate and compare the performance as well as the resource utilization of popular web servers, which differ in their approach to handling concurrency. More specifically, Central Processing Unit (CPU)- and I/O-intensive tests were conducted against the thread-based Apache and Go as well as the event-based Nginx and Node.js under increasing concurrent load. The tests involved concurrent users requesting a term of the Fibonacci sequence (the 10th, 20th, 30th) and the content of a table from the database. The results show that Go achieved the best performance in all benchmark tests. For example, Go reached two times higher throughput than Node.js and five times higher than Apache and Nginx in the 20th Fibonacci term test. In addition, Go had the smallest memory footprint and demonstrated the most efficient resource utilization in terms of CPU usage. In contrast, Node.js had by far the largest memory footprint, consuming up to 90% more memory than Nginx and Apache. Regarding the performance of Apache and Nginx, our findings indicate that Hypertext Preprocessor (PHP) becomes a bottleneck when the servers are requested to respond by performing CPU-intensive tasks under increasing concurrent load.
Keywords: Apache, Go, Nginx, Node.js, web server benchmarking
Procedia PDF Downloads 97
3440 Proactive Pure Handoff Model with SAW-TOPSIS Selection and Time Series Predict
Authors: Harold Vásquez, Cesar Hernández, Ingrid Páez
Abstract:
This paper considers cognitive radio techniques and applies a pure proactive handoff model to decrease interference between the primary user (PU) and the secondary user (SU), comparing it with a reactive handoff model. The multivariate decision models SAW and TOPSIS are studied and combined with three dynamic prediction techniques: AR, MA, and ARMA. Four metrics are used to evaluate the best model: number of failed handoffs, number of handoffs, number of predictions, and number of interferences. The results show the advantages of using this type of pure proactive model to predict changes in the PU on the selected channel and reduce interference. The model that showed the best performance was TOPSIS-MA; although TOPSIS-AR had a higher predictive ability, this was not reflected in the interference reduction.
Keywords: cognitive radio, spectrum handoff, decision making, time series, wireless networks
Procedia PDF Downloads 487
3439 QSAR, Docking and E-pharmacophore Approach on Novel Series of HDAC Inhibitors with Thiophene Linker as Anticancer Agents
Authors: Harish Rajak, Preeti Patel
Abstract:
HDAC inhibitors can reactivate gene expression and inhibit the growth and survival of cancer cells. The 3D-QSAR and pharmacophore modeling studies were performed to identify important pharmacophoric features and correlate 3D chemical structure with biological activity. The pharmacophore hypotheses were developed using the e-pharmacophore script and the Phase module. A pharmacophore hypothesis represents the 3D arrangement of molecular features necessary for activity. A series of 55 compounds with well-assigned HDAC inhibitory activity was used for 3D-QSAR model development. The best 3D-QSAR model, a five-PLS-factor model with good statistics and predictive ability, achieved Q² (0.7293), R² (0.9811) and a standard deviation of 0.0952. Molecular docking was performed using the histone deacetylase protein (PDB ID: 1t69) and the prepared series of hydroxamic acid based HDAC inhibitors. The docking study of compound 43 shows significant binding interactions between Ser 276 and the oxygen atom of the dioxine cap region, between Gly 151 and the amino group, and between Asp 267 and the carboxyl group of CONHOH, which are essential for anticancer activity. On docking, most of the compounds exhibited good Glide score values between -8 and -10.5. We have established a structure-activity correlation using docking, energetics-based pharmacophore modelling, and pharmacophore- and atom-based 3D-QSAR models. The results of these studies were further used for the design and testing of new HDAC analogs.
Keywords: docking, e-pharmacophore, HDACIs, QSAR, suberoylanilide hydroxamic acid
Procedia PDF Downloads 301