Search results for: raw data utilization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 25391

25181 Exploring the Relationship between Computerization and Marketing Performance, Case Study: Snowa Company

Authors: Mojtaba Molaahmadi, Morteza Raei Dehaghi, Abdolrahim Arghavan

Abstract:

The present study aims to explore the effect of computerization on marketing performance in Snowa Company; in other words, it asks whether there is a relationship between the utilization of computerization in marketing activities and marketing performance. The statistical population included 60 marketing managers of Snowa Company. Pearson's correlation coefficient was employed to test the research hypotheses; the reliability was 96.8%. In this study, computerization was the independent variable, and marketing performance, characterized by market share, competitive position, and sales volume, was the dependent variable. The results of testing the hypotheses revealed a significant relationship between the utilization of computerization and market share, sales volume, and improvement of the competitive position.
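Since the analysis rests on Pearson's correlation coefficient, a minimal sketch of how such a coefficient is computed may be useful. The scores below are hypothetical stand-ins, not the study's data:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores: computerization use vs. a marketing-performance index
comp = [3, 5, 2, 8, 7, 6, 4, 9]
perf = [2, 6, 1, 9, 6, 7, 3, 8]
r = pearson_r(comp, perf)
```

A value of r near +1 would indicate the kind of positive association the study reports between computerization and the performance measures.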

Keywords: computerization, e-marketing information, information technology, marketing performance

Procedia PDF Downloads 300
25180 A Literature Review on Emotion Recognition Using Wireless Body Area Network

Authors: Christodoulou Christos, Politis Anastasios

Abstract:

The utilization of Wireless Body Area Network (WBAN) is experiencing a notable surge in popularity as a result of its widespread implementation in the field of smart health. WBANs utilize small sensors implanted within the human body to monitor and record physiological indicators. These sensors transmit the collected data to hospitals and healthcare facilities through designated access points. Bio-sensors exhibit a diverse array of shapes and sizes, and their deployment can be tailored to the condition of the individual. Multiple sensors may be strategically placed within, on, or around the human body to effectively observe, record, and transmit essential physiological indicators. These measurements serve as a basis for subsequent analysis, evaluation, and therapeutic interventions. In conjunction with physical health concerns, numerous smartwatches are engineered to employ artificial intelligence techniques for the purpose of detecting mental health conditions such as depression and anxiety. The utilization of smartwatches serves as a secure and cost-effective solution for monitoring mental health. Physiological signals are widely regarded as a highly dependable method for the recognition of emotions due to the inherent inability of individuals to deliberately influence them over extended periods of time. The techniques that WBANs employ to recognize emotions are thoroughly examined in this article.

Keywords: emotion recognition, wireless body area network, WBAN, ERC, wearable devices, physiological signals, emotion, smartwatch, prediction

Procedia PDF Downloads 21
25179 Fatty Acid Translocase (Cd36), Energy Substrate Utilization, and Insulin Signaling in Brown Adipose Tissue in Spontaneously Hypertensive Rats

Authors: Michal Pravenec, Miroslava Simakova, Jan Silhavy

Abstract:

Brown adipose tissue (BAT) plays an important role in lipid and glucose metabolism in rodents and possibly also in humans. Recently, using a systems genetics approach in the BAT from BXH/HXB recombinant inbred strains, derived from the SHR (spontaneously hypertensive rat) and BN (Brown Norway) progenitors, we identified Cd36 (fatty acid translocase) as the hub gene of a co-expression module associated with BAT relative weight and function. An important aspect of BAT biology is to better understand the mechanisms regulating the uptake and utilization of fatty acids and glucose. Accordingly, BAT function in the SHR that harbors a mutant nonfunctional Cd36 variant (hereafter referred to as SHR-Cd36⁻/⁻) was compared with an SHR transgenic line expressing wild-type Cd36 under the control of a universal promoter (hereafter referred to as SHR-Cd36⁺/⁺). BAT was incubated in media containing insulin and ¹⁴C-U-glucose alone or ¹⁴C-U-glucose together with palmitate. Incorporation of glucose into BAT lipids was significantly higher in SHR-Cd36⁺/⁺ versus SHR-Cd36⁻/⁻ rats when the incubation media contained glucose alone (SHR-Cd36⁻/⁻ 591 ± 75 vs. SHR-Cd36⁺/⁺ 1036 ± 135 nmol/gl./2h; P < 0.005). Adding palmitate to the incubation media had no effect in SHR-Cd36⁻/⁻ rats but significantly reduced glucose incorporation into BAT lipids in SHR-Cd36⁺/⁺ rats (SHR-Cd36⁻/⁻ 543 ± 55 vs. SHR-Cd36⁺/⁺ 766 ± 75 nmol/gl./2h; P < 0.05 denotes a significant Cd36 × palmitate interaction determined by two-way ANOVA). This Cd36-dependent reduction in glucose uptake in SHR-Cd36⁺/⁺ BAT was likely secondary to increased palmitate incorporation and utilization due to the presence of the wild-type Cd36 fatty acid translocase in transgenic rats. This possibility is supported by the increased incorporation of ¹⁴C-U-palmitate into BAT lipids in the presence of both palmitate and glucose in the incubation media (palmitate alone: SHR-Cd36⁻/⁻ 870 ± 21 vs. SHR-Cd36⁺/⁺ 899 ± 42; glucose + palmitate: SHR-Cd36⁻/⁻ 899 ± 47 vs. SHR-Cd36⁺/⁺ 1460 ± 111 nmol/palm./2h; P < 0.05 denotes a significant Cd36 × glucose interaction determined by two-way ANOVA). It is possible that the addition of glucose to the incubation media increased palmitate incorporation into BAT lipids in SHR-Cd36⁺/⁺ rats because of glucose availability for glycerol phosphate production and increased triglyceride synthesis. These changes in glucose and palmitate incorporation into BAT lipids were associated with significant differential expression of the Irs1, Irs2, Slc2a4, and Foxo1 genes involved in insulin signaling and glucose metabolism only in SHR-Cd36⁺/⁺ rats, which suggests Cd36-dependent effects on insulin action. In conclusion, these results provide compelling evidence that Cd36 plays an important role in BAT insulin signaling and energy substrate utilization.
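The group comparisons above rest on two-way ANOVA interaction tests. As an illustration only (the replicate values below are hypothetical, loosely scaled to the reported means, and not the study's data), the interaction F statistic for a balanced 2 × 2 design can be computed as:

```python
def two_way_anova_interaction(cells):
    """Interaction F statistic for a balanced 2x2 design.

    cells[i][j] holds replicate measurements for factor A level i and
    factor B level j (equal n per cell). Interaction df = 1.
    """
    n = len(cells[0][0])
    cell_means = [[sum(c) / n for c in row] for row in cells]
    row_means = [sum(r) / 2 for r in cell_means]
    col_means = [(cell_means[0][j] + cell_means[1][j]) / 2 for j in range(2)]
    grand = sum(row_means) / 2
    # Interaction sum of squares (df = 1 for a 2x2 design)
    ss_int = n * sum(
        (cell_means[i][j] - row_means[i] - col_means[j] + grand) ** 2
        for i in range(2) for j in range(2)
    )
    # Within-cell (error) sum of squares, df = 4 * (n - 1)
    ss_err = sum(
        (y - cell_means[i][j]) ** 2
        for i in range(2) for j in range(2) for y in cells[i][j]
    )
    df_err = 4 * (n - 1)
    return (ss_int / 1) / (ss_err / df_err)

# Hypothetical replicates: genotype (Cd36-/- vs. Cd36+/+) x medium
# (glucose alone vs. glucose + palmitate), scaled to the reported means
cells = [
    [[591, 640, 560], [543, 520, 570]],    # SHR-Cd36-/-
    [[1036, 990, 1080], [766, 740, 790]],  # SHR-Cd36+/+
]
f_interaction = two_way_anova_interaction(cells)
```

A large F here corresponds to the significant genotype × substrate interaction the abstract reports; the real analysis would of course use the measured replicates.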

Keywords: brown adipose tissue, Cd36, energy substrate utilization, insulin signaling, spontaneously hypertensive rat

Procedia PDF Downloads 117
25178 The Utilization of Tea Residues for Activated Carbon Preparation

Authors: Jiazhen Zhou, Youcai Zhao

Abstract:

Waste tea is commonly generated in certain areas of China, and its utilization has drawn considerable attention in recent years. In this paper, highly microporous and mesoporous activated carbons were produced from waste tea by physical activation in the presence of water vapor in a tubular furnace. The effect of activation temperature on the yield and pore properties of the produced activated carbon was studied. The yield decreased with increasing activation temperature. According to the nitrogen adsorption isotherms, both micropores and mesopores developed in the activated carbon. The specific surface area and the mesopore volume fraction of the activated carbon increased with activation temperature, with a maximum specific surface area of 756 m²/g obtained at an activation temperature of 900 °C. The results showed that the activation temperature had a significant effect on the micropore and mesopore volumes as well as the specific surface area.
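Specific surface areas like the 756 m²/g reported here are typically derived from nitrogen adsorption isotherms via a BET analysis; the abstract does not name the method, so the sketch below is offered under that assumption, with entirely hypothetical isotherm points:

```python
# Linearized BET equation: p/(v*(p0 - p)) = 1/(vm*c) + ((c - 1)/(vm*c)) * (p/p0)
# A straight-line fit yields vm (monolayer capacity), which gives the area.

def bet_surface_area(p_rel, v_ads):
    """Least-squares fit of the linearized BET equation.

    p_rel: relative pressures p/p0 (ideally within 0.05-0.30)
    v_ads: adsorbed N2 volumes at STP, cm3/g
    """
    y = [pr / (v * (1.0 - pr)) for pr, v in zip(p_rel, v_ads)]
    n = len(p_rel)
    mx, my = sum(p_rel) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(p_rel, y))
             / sum((a - mx) ** 2 for a in p_rel))
    intercept = my - slope * mx
    vm = 1.0 / (slope + intercept)  # monolayer capacity, cm3 STP/g
    # 1 cm3 N2 at STP covers ~4.35 m2 (N_A * 0.162 nm2 / 22414 cm3/mol)
    return vm * 4.35

# Hypothetical isotherm points for an activated-carbon sample
p_rel = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30]
v_ads = [115.0, 132.5, 144.7, 156.3, 168.3, 181.5]
area = bet_surface_area(p_rel, v_ads)  # m2/g
```

With these invented points the fit returns an area in the mid-500s of m²/g, the same order of magnitude as the carbons described above.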

Keywords: activated carbon, nitrogen adsorption isotherm, physical activation, waste tea

Procedia PDF Downloads 301
25177 Restored CO₂ from Flue Gas and Utilization by Converting to Methanol by 3 Step Processes: Steam Reforming, Reverse Water Gas Shift and Hydrogenation

Authors: Rujira Jitrwung, Kuntima Krekkeitsakul, Weerawat Patthaveekongka, Chiraphat Kumpidet, Jarukit Tepkeaw, Krissana Jaikengdee, Anantachai Wannajampa

Abstract:

Flue gas discharged from a coal- or gas-fired power plant contains around 12% carbon dioxide (CO₂), 6% oxygen (O₂), and 82% nitrogen (N₂). CO₂ is a greenhouse gas implicated in global warming, and Carbon Capture, Utilization, and Storage (CCUS) is one tool for dealing with it. Flue gas is drawn down from the chimney, filtered, and then compressed to 8 bar. This compressed flue gas is sent to a three-stage Pressure Swing Adsorption (PSA) unit filled with activated carbon. Experiments showed an optimum adsorption pressure of 7 bar, at which CO₂ was adsorbed stage by stage, reaching concentrations of 29.8, 66.4, and 96.7% in the 1st, 2nd, and 3rd stages, respectively. The mixed gas from the last stage was composed of 96.7% CO₂, 2.7% N₂, and 0.6% O₂. This mixed CO₂ product obtained from the three-stage PSA contained high-concentration CO₂, ready for use in methanol synthesis. The mixed CO₂ was tested in a 5 liter/day methanol synthesis reactor skid using three process steps: steam reforming, reverse water gas shift, and then hydrogenation. The results showed that mixed CO₂/CH₄ proportions of 70/30, 50/50, 30/70, and 10/90% (v/v) yielded 2.4, 4.3, 5.6, and 6.0 liter/day of methanol and saved 40, 30, 20, and 5% of CO₂, respectively. The optimum condition for both methanol yield and CO₂ consumption was a CO₂/CH₄ ratio of 43/57% (v/v), which yielded 4.8 liter/day of methanol and saved 27% of CO₂, compared with traditional methanol production from methane steam reforming (5 liter/day) with no CO₂ consumption.

Keywords: carbon capture utilization and storage, pressure swing adsorption, reforming, reverse water gas shift, methanol

Procedia PDF Downloads 151
25176 From Theory to Practice: Harnessing Mathematical and Statistical Sciences in Data Analytics

Authors: Zahid Ullah, Atlas Khan

Abstract:

The rapid growth of data in diverse domains has created an urgent need for effective utilization of mathematical and statistical sciences in data analytics. This abstract explores the journey from theory to practice, emphasizing the importance of harnessing mathematical and statistical innovations to unlock the full potential of data analytics. Drawing on a comprehensive review of existing literature and research, this study investigates the fundamental theories and principles underpinning mathematical and statistical sciences in the context of data analytics. It delves into key mathematical concepts such as optimization, probability theory, statistical modeling, and machine learning algorithms, highlighting their significance in analyzing and extracting insights from complex datasets. Moreover, this abstract sheds light on the practical applications of mathematical and statistical sciences in real-world data analytics scenarios. Through case studies and examples, it showcases how mathematical and statistical innovations are being applied to tackle challenges in various fields such as finance, healthcare, marketing, and social sciences. These applications demonstrate the transformative power of mathematical and statistical sciences in data-driven decision-making. The abstract also emphasizes the importance of interdisciplinary collaboration, as it recognizes the synergy between mathematical and statistical sciences and other domains such as computer science, information technology, and domain-specific knowledge. Collaborative efforts enable the development of innovative methodologies and tools that bridge the gap between theory and practice, ultimately enhancing the effectiveness of data analytics. Furthermore, ethical considerations surrounding data analytics, including privacy, bias, and fairness, are addressed within the abstract. 
It underscores the need for responsible and transparent practices in data analytics, and highlights the role of mathematical and statistical sciences in ensuring ethical data handling and analysis. In conclusion, this abstract highlights the journey from theory to practice in harnessing mathematical and statistical sciences in data analytics. It showcases the practical applications of these sciences, the importance of interdisciplinary collaboration, and the need for ethical considerations. By bridging the gap between theory and practice, mathematical and statistical sciences contribute to unlocking the full potential of data analytics, empowering organizations and decision-makers with valuable insights for informed decision-making.

Keywords: data analytics, mathematical sciences, optimization, machine learning, interdisciplinary collaboration, practical applications

Procedia PDF Downloads 64
25175 Applications of Big Data in Education

Authors: Faisal Kalota

Abstract:

Big Data and analytics have gained huge momentum in recent years. Big Data feeds into the field of Learning Analytics (LA), which may allow academic institutions to better understand learners’ needs and proactively address them. Hence, it is important to have an understanding of Big Data and its applications. The purpose of this descriptive paper is to provide an overview of Big Data, the technologies used in Big Data, and some of the applications of Big Data in education. Additionally, it discusses some of the concerns related to Big Data and current research trends. While Big Data can provide big benefits, it is important that institutions understand their own needs, infrastructure, resources, and limitations before jumping on the Big Data bandwagon.

Keywords: big data, learning analytics, analytics, big data in education, Hadoop

Procedia PDF Downloads 382
25174 A Bottleneck-Aware Power Management Scheme in Heterogeneous Processors for Web Apps

Authors: Inyoung Park, Youngjoo Woo, Euiseong Seo

Abstract:

With the advent of WebGL, Web apps are now able to provide high-quality graphics by utilizing the underlying graphic processing units (GPUs). Although Web apps are becoming common and popular, the current power management schemes, which were devised for conventional native applications, are suboptimal for Web apps because of the additional layer, the Web browser, between the OS and the application. The Web browser, running on a CPU, issues GL commands, which render the images displayed by the currently running Web app, to the GPU, and the GPU processes them. The size and number of issued GL commands determine the processing load of the GPU. While the GPU is processing the GL commands, the CPU simultaneously executes other compute-intensive threads. The actual user experience is determined by either CPU processing or GPU processing, depending on which of the two is the more demanded resource. For example, when the GPU work queue is saturated by outstanding commands, lowering the performance level of the CPU does not affect the user experience because it is already deteriorated by the delayed execution of GPU commands. Consequently, it is desirable to lower the CPU or GPU performance level to save energy when the other resource is saturated and becomes a bottleneck in the execution flow. Based on this observation, we propose a power management scheme that is specialized for the Web app runtime environment. This approach incurs two technical challenges: identification of the bottleneck resource and determination of the appropriate performance level for the unsaturated resource. The proposed power management scheme uses the CPU utilization level of the Window Manager to tell which one is the bottleneck, if one exists. The Window Manager draws the final screen using the processed results delivered from the GPU. Thus, the Window Manager is on the critical path that determines the quality of the user experience and is executed purely by the CPU. 
The proposed scheme uses a weighted average of the Window Manager utilization to prevent excessive sensitivity and fluctuation. We classified Web apps into three categories using analysis results that measure frames-per-second (FPS) changes under diverse CPU/GPU clock combinations. The results showed that the capability of the CPU decides the user experience when the Window Manager utilization is above 90%, and consequently, the proposed scheme decreases the performance level of the CPU by one step. On the contrary, when its utilization is less than 60%, the bottleneck usually lies in the GPU, and it is desirable to decrease the performance of the GPU. Even for the processing unit that is not on the critical path, an excessive performance drop can occur and may adversely affect the user experience. Therefore, our scheme lowers the frequency gradually until it finds an appropriate level, by periodically checking the CPU utilization. The proposed scheme reduced energy consumption by 10.34% on average in comparison to the conventional Linux kernel, and worsened FPS by only 1.07% on average.
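The classification step above can be sketched in a few lines: smooth the Window Manager utilization with a weighted moving average, then apply the 90% / 60% bounds. The EWMA weight, step logic, and utilization trace below are all hypothetical, and because the abstract's wording on which unit is stepped down reads ambiguously, this sketch follows the rationale stated earlier in the abstract (throttle the unsaturated unit when the other is the bottleneck):

```python
ALPHA = 0.3  # EWMA weight for the newest sample (hypothetical value)

def update_ewma(prev, sample, alpha=ALPHA):
    """Weighted average of Window Manager utilization, to damp fluctuation."""
    return alpha * sample + (1 - alpha) * prev

def pick_bottleneck(wm_util):
    """Classify using the 90% / 60% WM-utilization bounds from the abstract."""
    if wm_util > 0.90:
        return "cpu"   # CPU-bound: user experience tracks CPU capability
    if wm_util < 0.60:
        return "gpu"   # GPU-bound: the GL command queue is the constraint
    return "balanced"

def next_freq_step(bottleneck, cpu_step, gpu_step, max_step):
    """Gradually lower the non-bottleneck unit's frequency, one step at a time."""
    if bottleneck == "cpu" and gpu_step < max_step:
        return cpu_step, gpu_step + 1   # GPU work is slack; slow the GPU
    if bottleneck == "gpu" and cpu_step < max_step:
        return cpu_step + 1, gpu_step   # CPU work is slack; slow the CPU
    return cpu_step, gpu_step

# Hypothetical trace of periodic WM utilization samples
ewma = 0.5
for sample in [0.97] * 8:
    ewma = update_ewma(ewma, sample)
state = pick_bottleneck(ewma)
```

The gradual one-step changes mirror the paper's cautious frequency lowering, which avoids the excessive performance drops noted above.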

Keywords: interactive applications, power management, QoS, Web apps, WebGL

Procedia PDF Downloads 165
25173 Thermodynamic Analyses of Information Dissipation along the Passive Dendritic Trees and Active Action Potential

Authors: Bahar Hazal Yalçınkaya, Bayram Yılmaz, Mustafa Özilgen

Abstract:

Brain information transmission in the neuronal network occurs in the form of electrical signals. The neural network transmits information between neurons, or between neurons and target cells, by moving charged particles in a voltage field; a fraction of the energy utilized in this process is dissipated via entropy generation. Exergy loss and entropy generation models demonstrate the inefficiencies of communication along the dendritic trees. In this study, neurons of 4 different animals were analyzed with a one-dimensional cable model with N = 6 identical dendritic trees and M = 3 orders of symmetrical branching. Each branch bifurcates symmetrically in accordance with the 3/2 power law in an infinitely long cylinder with the usual core conductor assumptions, where membrane potential is conserved in the core conductor at all branching points. In the model, exergy loss and entropy generation rates are calculated for each branch of equivalent cylinders of electrotonic length (L) ranging from 0.1 to 1.5 for four different dendritic branches: the input branch (BI), the sister branch (BS), and two cousin branches (BC-1 and BC-2). Thermodynamic analysis with data from two different cat motoneuron studies shows that in both experiments nearly the same amount of exergy is lost while nearly the same amount of entropy is generated. The guinea pig vagal motoneuron loses twofold more exergy compared with the cat models, and the exergy loss and entropy generation of the squid were nearly tenfold those of the guinea pig vagal motoneuron model. Thermodynamic analysis shows that the energy dissipated in the dendritic trees is directly proportional to the electrotonic length, exergy loss, and entropy generation. Entropy generation and exergy loss show variability not only between vertebrates and invertebrates but also within the same class. 
In addition, the Na⁺ ion load of a single action potential and the metabolic energy utilization, together with their thermodynamic aspects, were evaluated for the squid giant axon and a mammalian motoneuron model. Energy is supplied to the neurons in the form of adenosine triphosphate (ATP). Exergy destruction and entropy generation upon ATP hydrolysis were calculated. ATP utilization, exergy destruction, and entropy generation differed in each model depending on the variations in ion transport along the channels.
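The 3/2 power law invoked in the cable model above (Rall's criterion: at each branch point, the parent diameter raised to 3/2 equals the sum of the daughter diameters raised to 3/2, which is what permits the equivalent-cylinder reduction) can be checked numerically. A minimal sketch, with hypothetical diameters:

```python
def obeys_rall_criterion(d_parent, d_children, tol=0.01):
    """Check Rall's 3/2 power rule at a branch point (relative tolerance)."""
    lhs = d_parent ** 1.5
    rhs = sum(d ** 1.5 for d in d_children)
    return abs(lhs - rhs) / lhs < tol

# A symmetric bifurcation: each daughter has d = d_parent / 2^(2/3),
# so that 2 * d_child^(3/2) == d_parent^(3/2) exactly.
d_parent = 4.0
d_child = d_parent / 2 ** (2 / 3)
ok = obeys_rall_criterion(d_parent, [d_child, d_child])
```

When every branch point satisfies this rule, the whole symmetrically branching tree collapses to a single equivalent cylinder of a given electrotonic length L, the quantity varied from 0.1 to 1.5 in the study.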

Keywords: ATP utilization, entropy generation, exergy loss, neuronal information transmittance

Procedia PDF Downloads 360
25172 Exploring the Chemical Composition of Drinking Water in Residential Area of Kuwait by Applying GIS Technology

Authors: H. Aljabli

Abstract:

The research on the presence of heavy metals and bromate in drinking water is of significant scientific importance. These substances have the potential to pose risks to public health and are subject to regulatory limits outlined by the National Primary Drinking Water Regulations. Through a comprehensive analysis that includes the compilation of existing data and the collection of new data via water sampling in residential areas of Kuwait, this study aims to generate maps that depict the spatial distribution of these substances. Moreover, the investigation involves the utilization of GRAPHER software to explore correlations among different chemical parameters. By employing rigorous scientific methodologies, the research will provide valuable insights for the Ministry of Electricity and Water and the Ministry of Health. These insights can inform evidence-based decision-making, facilitate the implementation of corrective measures, and support strategic planning for future infrastructure activities.

Keywords: heavy metals, bromate, ozonation, GIS

Procedia PDF Downloads 26
25171 The Implementation of a Nurse-Driven Palliative Care Trigger Tool

Authors: Sawyer Spurry

Abstract:

Problem: Palliative care providers at an academic medical center in Maryland stated that medical intensive care unit (MICU) patients are often referred late in their hospital stay. The MICU has performed well below the hospital quality performance metric that 80% of patients who expire with expected outcomes should have received a palliative care consult within 48 hours of admission. Purpose: The purpose of this quality improvement (QI) project is to increase palliative care utilization in the MICU through the implementation of a Nurse-Driven Palliative Trigger Tool to prompt the need for specialty palliative care consults. Methods: MICU nursing staff and providers received education concerning the implications of underused palliative care services and the literature supporting the use of nurse-driven palliative care tools as a means of increasing utilization of palliative care. MICU-population-specific criteria of palliative triggers (the Palliative Care Trigger Tool) were formulated by the QI implementation team, the palliative care team, and the patient care services department. Nursing staff were asked to assess patients daily for the presence of palliative triggers using the Palliative Care Trigger Tool and to present findings during bedside rounds. MICU providers were asked to consult palliative medicine, given the presence of palliative triggers, following interdisciplinary rounds. Rates of palliative consults, given the presence of triggers, were collected via an electronic medical record e-data pull, de-identified, and recorded in the data collection tool. Preliminary Results: Over 140 MICU registered nurses were educated on the palliative trigger initiative, along with 8 nurse practitioners, 4 intensivists, 2 pulmonary critical care fellows, and 2 palliative medicine physicians. Over 200 patients were admitted to the MICU and screened for palliative triggers during the 15-week implementation period. 
Primary outcomes showed an increase in palliative care consult rates for patients presenting with triggers, a decreased mean time from admission to palliative consult, and increased recognition of unmet palliative care needs by MICU nurses and providers. Conclusions: The anticipated findings of this QI project suggest a positive correlation between utilizing palliative care trigger criteria and decreased time to palliative care consult. The direct outcomes of effective palliative care include decreased length of stay, healthcare costs, and moral distress, as well as improved symptom management and quality of life (QOL).

Keywords: palliative care, nursing, quality improvement, trigger tool

Procedia PDF Downloads 159
25170 Efficient Utilization of Commodity Computers in Academic Institutes: A Cloud Computing Approach

Authors: Jasraj Meena, Malay Kumar, Manu Vardhan

Abstract:

Cloud computing is a new technology in industry and academia. The technology has grown and matured in the last half decade and has proven its significant role in the changing environment of IT infrastructure, where cloud services and resources are offered over the network. Cloud technology enables users to use services and resources without being concerned about the technical implications of the technology. Substantial research work has been performed on the usage of cloud computing in educational institutes, and the majority of it provides cloud services over high-end blade servers or other high-end CPUs. However, this paper proposes a new stack called “CiCKAStack”, which provides cloud services over unutilized computing resources, referred to as commodity computers. “CiCKAStack” provides IaaS and PaaS using underlying commodity computers. This will not only increase the utilization of existing computing resources but also provide an organized file system, on-demand computing resources, and a design and development environment.

Keywords: commodity computers, cloud-computing, KVM, CloudStack, AppScale

Procedia PDF Downloads 239
25169 Assessment of Forage Utilization for Pasture-Based Livestock Production in Udubo Grazing Reserve, Bauchi State

Authors: Mustapha Saidu, Bilyaminu Mohammed

Abstract:

The study was conducted in Udubo Grazing Reserve between July 2019 and October 2019 to assess forage utilization for pasture-based livestock production in the reserve. The grazing land was cross-divided into grids at one-kilometer intervals, and 15 coordinates were selected as sample points; the grids were selected systematically, one grid after every seven grids. A 1 × 1-meter quadrat was made at the coordinate of each selected grid for measurement, estimation, and sample collection. The results of the study indicated that Zornia glochidiata has the highest percentage of species composition (42%), while Mitracarpus hirtus has the lowest (0.1%). Urochloa mosambicensis has 48 percent of height removed and 27 percent used by weight, Zornia glochidiata 60 percent of height removed and 57 percent used by weight, Alysicarpus vaginalis 55 percent of height removed and 40 percent used by weight, and Cenchrus biflorus 40 percent of height removed and 28 percent used by weight. The target is 50 percent utilization of forage by weight during a grazing period as well as at the end of the grazing season. The study found that Urochloa mosambicensis, Alysicarpus vaginalis, and Cenchrus biflorus had lower percentages used by weight, which is normal, while Zornia glochidiata had a higher percentage used by weight, which is an indication of danger. The study recommends that the identification of key plant species in pasture and rangeland is critical to implementing a successful grazing management plan. There should be collective action and promotion of historically generated grazing knowledge through public and private advocacy.

Keywords: forage, grazing reserve, livestock, pasture, plant species

Procedia PDF Downloads 51
25168 Utilization of Juncus acutus as Alternative Feed Resource in Ruminants

Authors: Nurcan Cetinkaya

Abstract:

The aim of this paper is to present the utilization of Juncus acutus as an alternative roughage resource in ruminant nutrition. In Turkey, J. acutus is the prevailing plant of the natural grassland in the Kizilirmak Delta, Samsun. Crude nutrient values such as crude protein (CP), ether extract (EE), organic matter (OM), neutral detergent fiber (NDF), acid detergent fiber (ADF), and acid detergent lignin (ADL), as well as antioxidant activity, total phenolic and flavonoid compounds, total organic matter digestibility (OMD), and metabolisable energy (ME) values of J. acutus stem and seed, and of its mixture with maize silage, were estimated and published. Furthermore, the effects of J. acutus on rumen cellulolytic bacteria were studied. The results obtained from the different studies conducted on J. acutus by our team show that it may be a new roughage source in ruminant nutrition.

Keywords: antioxidant activity, cellulolytic bacteria, Juncus acutus, organic matter digestibility

Procedia PDF Downloads 247
25167 Optimization of Dez Dam Reservoir Operation Using Genetic Algorithm

Authors: Alireza Nikbakht Shahbazi, Emadeddin Shirali

Abstract:

Since optimization issues of water resources are complicated due to the variety of decision-making criteria and objective functions, it is sometimes impossible to resolve them through regular optimization methods, or doing so is time- or money-consuming. Therefore, the use of modern tools and methods is inevitable in resolving such problems. An accurate and essential utilization policy has to be determined in order to use natural resources such as water reservoirs optimally. Water reservoir programming studies aim to determine the final cultivated land area based on predefined agricultural models and water requirements; the dam utilization rule curve is also provided in such studies. The basic information applied in water reservoir programming studies generally includes meteorological, hydrological, agricultural, and water reservoir related data, and the geometric characteristics of the reservoir. The system of Dez dam water resources was simulated applying this basic information in order to determine the capability of its reservoir to meet the objectives of the performed plan. As a metaheuristic method, a genetic algorithm was applied in order to provide utilization rule curves (intersecting the reservoir volume). MATLAB software was used to solve the aforesaid model. Rule curves were first obtained through the genetic algorithm. Then the significance of using rule curves and the decrease in decision-making variables in the system was determined through system simulation and by comparing the results with optimization results (Standard Operating Procedure). One of the most essential issues in the optimization of a complicated water resource system is the increasing number of variables; therefore, a lot of time is required to find an optimum answer and, in some cases, no desirable result is obtained. In this research, intersecting the reservoir volume has been applied as a modern model in order to reduce the number of variables. 
Water reservoir programming studies have been performed based on basic information, general hypotheses, and standards, applying a monthly simulation technique for a statistical period of 30 years. Results indicated that the application of the rule curve prevents extreme shortages and decreases the monthly shortages.
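The rule-curve search can be illustrated with a toy genetic algorithm. Everything below is hypothetical (a 12-month curve scored against an invented demand profile, not the Dez dam model); it only shows the selection-crossover-mutation loop this kind of study applies:

```python
import random

random.seed(42)

MONTHS = 12
DEMAND = [60, 55, 50, 45, 40, 35, 35, 40, 45, 50, 55, 60]  # hypothetical

def fitness(curve):
    """Negative squared deviation between rule curve and monthly demand."""
    return -sum((c - d) ** 2 for c, d in zip(curve, DEMAND))

def crossover(a, b):
    """One-point crossover of two parent curves."""
    cut = random.randrange(1, MONTHS)
    return a[:cut] + b[cut:]

def mutate(curve, rate=0.1, step=5.0):
    """Perturb each gene with probability `rate`."""
    return [c + random.uniform(-step, step) if random.random() < rate else c
            for c in curve]

def run_ga(pop_size=30, generations=200):
    pop = [[random.uniform(0, 100) for _ in range(MONTHS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # truncation selection keeps the best half
        children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return max(pop, key=fitness)

best = run_ga()
```

In the actual study, the fitness would instead come from a monthly reservoir simulation over the 30-year record, but the evolutionary loop is the same.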

Keywords: optimization, rule curve, genetic algorithm method, Dez dam reservoir

Procedia PDF Downloads 240
25166 Mathematics Bridging Theory and Applications for a Data-Driven World

Authors: Zahid Ullah, Atlas Khan

Abstract:

In today's data-driven world, the role of mathematics in bridging the gap between theory and applications is becoming increasingly vital. This abstract highlights the significance of mathematics as a powerful tool for analyzing, interpreting, and extracting meaningful insights from vast amounts of data. By integrating mathematical principles with real-world applications, researchers can unlock the full potential of data-driven decision-making. It examines the various ways mathematics acts as a bridge connecting theoretical frameworks to practical applications, exploring the use of mathematical models, algorithms, and statistical techniques to uncover hidden patterns, trends, and correlations within complex datasets. It further investigates the role of mathematics in enhancing predictive modeling, optimization, and risk assessment for improved decision-making in fields as diverse as finance, healthcare, engineering, and the social sciences. The abstract also emphasizes the need for interdisciplinary collaboration between mathematicians, statisticians, computer scientists, and domain experts to tackle the challenges posed by the data-driven landscape; by fostering synergies between these disciplines, novel approaches can be developed to address complex problems and make data-driven insights accessible and actionable. Moreover, it underscores the importance of robust mathematical foundations for ensuring the reliability and validity of data analysis: rigorous mathematical frameworks not only provide a solid basis for understanding and interpreting results but also contribute to the development of innovative methodologies and techniques. In summary, this abstract advocates for the pivotal role of mathematics in bridging theory and applications in a data-driven world. By harnessing mathematical principles, researchers can unlock the transformative potential of data analysis, paving the way for evidence-based decision-making, optimized processes, and innovative solutions to the challenges of our rapidly evolving society.

Keywords: mathematics, bridging theory and applications, data-driven world, mathematical models

Procedia PDF Downloads 45
25165 Factors Affecting Attitude of Community Pharmacists Towards Locally Manufactured Pharmaceutical Products in Addisababa: A Cross-sectional Study

Authors: Gelila Tamyalew, Asres Abitie

Abstract:

Community pharmacists (CPs) play a significant part in guiding consumer choice toward the rational use of locally manufactured pharmaceutical products (LMPPs). Pharmacists' opinions regarding branded and generic medications can offer insight into the potential obstacles that may have to be overcome to advance generic medicine utilization. Many factors negatively affect CPs' attitudes toward LMPPs; therefore, the current study assessed the factors that can affect those attitudes. In the regression analysis, three variables were associated with CPs' attitudes toward LMPPs: highest educational status, professional status, and years of experience in community pharmacy practice. Moreover, lack of belief in LMPPs, substitution agreements with the prescriber, the cost-effectiveness of LMPPs, and consumer preference/demand were the most influential reasons for the selection of LMPPs. In conclusion, the attitude of CPs appears suboptimal and requires intervention to optimize LMPP utilization.

Keywords: locally manufactured pharmaceutical products, attitude, community pharmacist, Ethiopia

Procedia PDF Downloads 53
25164 Scheduling in a Single-Stage, Multi-Item Compatible Process Using Multiple Arc Network Model

Authors: Bokkasam Sasidhar, Ibrahim Aljasser

Abstract:

The problem of finding optimal schedules for each piece of equipment in a production process is considered. The process consists of a single manufacturing stage that can handle different types of products, where changeover from one product type to another incurs certain costs. The machine capacity is determined by the upper limit on the quantity of each product that can be processed in a set-up. Changeover costs increase with the number of set-ups; hence, to minimize these costs, planning should ensure that similar product types are processed successively so that the total number of changeovers, and in turn the associated set-up costs, is minimized. The problem of cost minimization is equivalent to minimizing the number of set-ups or, equivalently, maximizing the capacity utilization between set-ups, i.e., maximizing total capacity utilization. Further, production is usually planned against customers’ orders, and each order is generally assigned one of two priorities: “normal” or “priority”. The production planning problem in such a situation can be formulated as a Multiple Arc Network (MAN) model and solved sequentially using the algorithm for maximizing flow along a MAN and the algorithm for maximizing flow along a MAN with priority arcs. The model aims to provide an optimal production schedule that maximizes capacity utilization while fulfilling customer-wise delivery schedules in view of customer priorities. Algorithms are presented for solving the MAN formulation of production planning with customer priorities, and the application of the model is demonstrated through numerical examples.
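While the paper's MAN algorithms also handle priority arcs, the underlying maximum-flow computation can be sketched in a few lines. The instance below is hypothetical (set-up capacities, order compatibilities, and quantities are invented for illustration): a source feeds product set-ups, products feed the orders they can serve, and orders feed a sink.

```python
from collections import deque

def max_flow(capacity, source, sink):
    """Edmonds-Karp maximum flow; capacity is {node: {neighbour: cap}}."""
    residual = {u: dict(vs) for u, vs in capacity.items()}
    for u in list(residual):                      # add zero-capacity reverse edges
        for v in list(residual[u]):
            residual.setdefault(v, {}).setdefault(u, 0)
    flow = 0
    while True:
        parent = {source: None}                   # BFS for a shortest augmenting path
        queue = deque([source])
        while queue and sink not in parent:
            u = queue.popleft()
            for v, cap in residual[u].items():
                if cap > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if sink not in parent:
            return flow
        path, v = [], sink                        # walk back to find the bottleneck
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        push = min(residual[u][v] for u, v in path)
        for u, v in path:
            residual[u][v] -= push
            residual[v][u] += push
        flow += push

# Hypothetical instance: per-set-up capacities for products A and B,
# the orders each product can serve, and order quantities.
capacity = {
    "s":  {"A": 50, "B": 40},
    "A":  {"o1": 30, "o2": 30},
    "B":  {"o2": 20, "o3": 40},
    "o1": {"t": 25}, "o2": {"t": 35}, "o3": {"t": 30},
    "t":  {},
}
print(max_flow(capacity, "s", "t"))   # → 90 (total quantity schedulable)
```

The maximal flow here equals the total order quantity, meaning all deliveries can be scheduled within the set-up capacities of this toy instance.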

Keywords: scheduling, maximal flow problem, multiple arc network model, optimization

Procedia PDF Downloads 379
25163 Causes of Terrorism: Perceptions of University Students of Teacher Training Institutions

Authors: Saghir Ahmad, Abid Hussain Ch, Misbah Malik, Ayesha Batool

Abstract:

Terrorism is a phenomenon in which a dreadful situation is created by a group of people who view themselves as oppressed by society. It is the unlawful use of force or violence by a person or an organized group against people or property, with the intention of intimidating or coercing societies or governments, often for ideological or political reasons. Terrorism is as old as humanity. The main aim of the study was to identify the causes of terrorism as perceived by university students in teacher training institutions. The study was quantitative in nature, and the survey method was used to collect data. A sample of two hundred and sixty-seven students was selected from public universities, and a five-point Likert scale was used to collect data. Means, standard deviations, independent-samples t-tests, and one-way ANOVA were applied to analyze the data. The major findings of the study indicated that students perceived the main causes of terrorism to be poverty, foreign interference, a wrong concept of Islamization, and social injustice. Most students also think that drone attacks promote terrorist activities. Education is key to eliminating terrorism; there is a need to educate people, especially young people, to bring peace to the world.
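The group-comparison step of the analysis above (independent-samples t-test on Likert ratings) can be sketched with a Welch t statistic; the ratings and group labels below are invented for illustration only.

```python
from statistics import mean, stdev

def welch_t(a, b):
    """Welch's independent-samples t statistic (unequal variances assumed)."""
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / (va / len(a) + vb / len(b)) ** 0.5

# Hypothetical 5-point Likert ratings of "poverty is a cause of terrorism"
group_a = [4, 5, 4, 3, 5, 4, 4]
group_b = [3, 4, 3, 3, 4, 3, 4]
t_stat = welch_t(group_a, group_b)   # positive: group_a rates the cause higher
```

In practice the statistic would be compared against a t distribution (e.g. via `scipy.stats.ttest_ind(..., equal_var=False)`) to obtain a p-value.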

Keywords: dreadful circumstance, governments, power, students, terrorism

Procedia PDF Downloads 515
25162 Appraisal of Parents' Views and Supervision of Their Children's Use of Information Communication Technology

Authors: Olabisi Adedigba

Abstract:

It is a fundamental truth that Information Communication Technology (ICT) lies at the very heart of today’s society and determines its development. The use of ICT has boosted the educational and mental development of the average pupil of this age far beyond that of counterparts who lived centuries ago. Nevertheless, today’s children risk the scourge of this technology if proactive measures are not taken urgently to arrest the damage of its negative use. One such measure is supervision of children’s use of ICT. This research therefore investigated parents’ views and supervision of their children’s use of Information Communication Technology. A descriptive design was adopted, and 300 parents were randomly selected. The instrument “Parents’ Views and Supervision of Children’s Use of ICT” was used to collect data, which were analyzed using percentages, means, standard deviations and t-tests. The results revealed that parents’ views of their children’s use of ICT are negative, while supervision of their children’s use of ICT is low. It is therefore recommended that schools and other stakeholders educate parents on children’s proper utilization of ICT, and that parents maintain adequate supervision of their children’s use of ICT.

Keywords: appraisal of parents’ views and supervision, children’s use, information communication technology, t-test

Procedia PDF Downloads 473
25161 Visualization Tool for EEG Signal Segmentation

Authors: Sweeti, Anoop Kant Godiyal, Neha Singh, Sneh Anand, B. K. Panigrahi, Jayasree Santhosh

Abstract:

This work develops a tool for visualization and segmentation of electroencephalograph (EEG) signals based on frequency-domain features. Changes in frequency-domain characteristics are correlated with changes in the mental state of the subject under study. The proposed algorithm represents changes in mental state using the powers of the different frequency bands, in the form of a segmented EEG signal. Many segmentation algorithms with applications in brain-computer interfaces, epilepsy, and cognition studies have been suggested in the literature and used for data classification, but the proposed method focuses mainly on better presentation of the signal, which makes it a useful visualization tool for clinicians. The algorithm performs basic filtering using band-pass and notch filters in the range of 0.1-45 Hz. Advanced filtering is then performed using principal component analysis and a wavelet-transform-based de-noising method. Frequency-domain features are used for segmentation, given that the spectral power of different frequency bands describes the mental state of the subject. Two sliding windows are used for segmentation: one provides the time scale and the other assigns the segmentation rule. The segmented data are displayed second by second, successively, with different color codes, and the segment length can be selected to suit the objective. The proposed algorithm has been tested on an EEG data set obtained from the University of California, San Diego’s online data repository. The tool gives a better visualization of the signal in the form of segmented epochs of desired length, representing the power spectrum variation in the data. The algorithm takes the data points with respect to the sampling frequency for each time frame, so it can be extended to real-time visualization with a desired epoch length.
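As a minimal sketch of the band-power idea behind the segmentation (not the full tool, which also applies band-pass/notch filtering, PCA, and wavelet de-noising), the snippet below labels a synthetic two-second segment by its dominant frequency band; the 128 Hz sampling rate and signal composition are assumptions made for illustration.

```python
import math

FS = 128                      # assumed sampling rate (Hz)

def band_power(x, lo_hz, hi_hz, fs=FS):
    """Naive DFT power summed over integer frequencies in [lo_hz, hi_hz]."""
    n, total = len(x), 0.0
    for f in range(lo_hz, hi_hz + 1):
        re = sum(x[t] * math.cos(2 * math.pi * f * t / fs) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * f * t / fs) for t in range(n))
        total += (re * re + im * im) / n
    return total

# Synthetic 2 s "EEG" segment: strong 10 Hz (alpha) plus weak 30 Hz (beta)
segment = [math.sin(2 * math.pi * 10 * t / FS)
           + 0.3 * math.sin(2 * math.pi * 30 * t / FS)
           for t in range(2 * FS)]

alpha = band_power(segment, 8, 13)    # the dominant band labels the segment
beta  = band_power(segment, 14, 30)
label = "alpha" if alpha > beta else "beta"
```

A segmentation tool would slide this window along the recording, compute the band powers per window, and color-code each epoch by its dominant band; a real implementation would use an FFT (e.g. `numpy.fft`) rather than the naive DFT shown here.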

Keywords: de-noising, multi-channel data, PCA, power spectra, segmentation

Procedia PDF Downloads 367
25160 Protein Isolates from Chickpea (Cicer arietinum L.) and Its Application in Cake

Authors: Mohamed Abdullah Ahmed

Abstract:

In a study of chickpea protein isolate (CPI) preparation, wet alkaline extraction was carried out. The objectives were to determine the optimal extraction conditions for CPI and to incorporate CPI into a sponge cake recipe to replace egg while producing an acceptable product. A central composite design was used for the extraction, and response surface methodology was applied to graphically express the relationship between extraction time and pH and the output variables of percent yield and protein content of CPI. The optimal extraction conditions were 60 min and pH 10.5, resulting in 90.07% protein content and 89.15% yield of CPI. The protein isolate could be incorporated into cake at up to 20% without adversely affecting physical properties such as cake hardness and sensory attributes. The protein content of the cake corresponded to the amount of CPI added; therefore, adding CPI can significantly (p<0.05) increase the protein content of cake. However, sensory evaluation showed that adding more than 20% CPI decreased overall acceptability. The results of this investigation could serve as basic knowledge for CPI utilization in other food products.
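The run layout of a two-factor central composite design like the one used here can be sketched as follows; the coded-to-actual factor ranges (time 60 ± 30 min, pH 10.5 ± 2) are assumptions chosen around the reported optimum, not the study's actual coding.

```python
import itertools, math

def ccd_points(alpha=math.sqrt(2), n_center=3):
    """Coded runs of a two-factor central composite design."""
    factorial = list(itertools.product([-1, 1], repeat=2))       # corner runs
    axial = [(-alpha, 0), (alpha, 0), (0, -alpha), (0, alpha)]   # star runs
    center = [(0.0, 0.0)] * n_center                             # replicated centre
    return factorial + axial + center

def decode(coded, centre, half_range):
    """Map a coded level (-1..+1) back to a real factor value."""
    return centre + coded * half_range

# Assumed coding around the reported optimum: time 60 ± 30 min, pH 10.5 ± 2
runs = [(decode(t, 60, 30), decode(p, 10.5, 2)) for t, p in ccd_points()]
```

A quadratic response surface fitted to the yields measured at these runs is what produces the optimum (60 min, pH 10.5) reported in the abstract.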

Keywords: chickpea protein isolate, sponge cake, utilization

Procedia PDF Downloads 337
25159 Designing Metal Organic Frameworks for Sustainable CO₂ Utilization

Authors: Matthew E. Potter, Daniel J. Stewart, Lindsay M. Armstrong, Pier J. A. Sazio, Robert R. Raja

Abstract:

Rising CO₂ levels in the atmosphere mean that CO₂ is a highly desirable feedstock. This requires specific catalysts to be designed to activate this inert molecule, combining a catalytic site tailored for CO₂ transformations with a support that can readily adsorb CO₂. Metal organic frameworks (MOFs) are regularly used as CO₂ sorbents. The organic nature of the linker molecules connecting the metal nodes allows many post-synthesis modifications to introduce catalytic active sites into the frameworks, while coordinatively unsaturated metal nodes can bind organic moieties. Imidazoles have shown promise in catalyzing the formation of cyclic carbonates from epoxides and CO₂. Typically, this synthesis route employs toxic reagents such as phosgene, liberating HCl, so an alternative route using CO₂ is highly appealing. In this work we design active sites for CO₂ activation by tethering substituted-imidazole organocatalytic species to the available Cr³⁺ metal nodes of a Cr-MIL-101 MOF, for the first time, to create a tailored species for carbon capture and utilization applications. Our design strategy, combining a CO₂ sorbent (Cr-MIL-101) with an anchored imidazole, results in a highly active and selective multifunctional catalyst, achieving turnover frequencies of over 750 h⁻¹. These findings demonstrate the synergy between the MOF framework and imidazoles for CO₂ utilization. Further, the effect of substrate variation has been explored, yielding mechanistic insights into the process. Through characterization, we show that the structural and compositional integrity of Cr-MIL-101 is preserved on functionalizing with the imidazoles, and that the imidazoles bind to the Cr³⁺ metal nodes. Our EPR study shows distortion of the Cr³⁺ on binding to the imidazole, indicating that the CO₂ binding site is close to the active imidazole; this has a synergistic effect, improving catalytic performance. We believe the combination of MOF support and organocatalyst opens many possibilities for generating new multifunctional catalysts for CO₂ utilization. In conclusion, we have validated our design procedure, combining a known CO₂ sorbent with an active imidazole species to create a unique tailored multifunctional catalyst for CO₂ utilization. This species achieves high activity and selectivity for the formation of cyclic carbonates and offers a sustainable alternative to traditional synthesis methods. This work represents a unique design strategy for CO₂ utilization while offering exciting possibilities for further work in characterization, computational modelling, and post-synthesis modification.

Keywords: carbonate, catalysis, MOF, utilisation

Procedia PDF Downloads 149
25158 Leveraging Artificial Intelligence to Analyze the Interplay between Social Vulnerability Index and Mobility Dynamics in Pandemics

Authors: Joshua Harrell, Gideon Osei Bonsu, Susan Garza, Clarence Conner, Da’Neisha Harris, Emma Bukoswki, Zohreh Safari

Abstract:

The Social Vulnerability Index (SVI) stands as a pivotal tool for gauging community resilience amidst diverse stressors, including pandemics like COVID-19. This paper synthesizes recent research and underscores the significance of SVI in elucidating the differential impacts of crises on communities. Drawing on studies by Fox et al. (2023) and Mah et al. (2023), we delve into the application of SVI alongside emerging data sources to uncover nuanced insights into community vulnerability. Specifically, we explore the utilization of SVI in conjunction with mobility data from platforms like SafeGraph to probe the intricate relationship between social vulnerability and mobility dynamics during the COVID-19 pandemic. By leveraging 16 community variables derived from the American Community Survey, including socioeconomic status and demographic characteristics, SVI offers actionable intelligence for guiding targeted interventions and resource allocation. Building upon recent advancements, this paper contributes to the discourse on harnessing AI techniques to mitigate health disparities and fortify public health resilience in the face of pandemics and other crises.
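A minimal sketch of joining SVI scores with mobility data and splitting counties at the median SVI, as described above, might look like the following; the FIPS codes, SVI values, and mobility changes are invented for illustration.

```python
from statistics import mean

# Hypothetical county records: CDC SVI score and % change in daily visits
svi      = {"06037": 0.82, "06075": 0.35, "48201": 0.67, "36061": 0.28}
mobility = {"06037": -41, "06075": -63, "48201": -38, "36061": -58}

# Join on the FIPS code, then split counties at the median SVI
joined = [(fips, svi[fips], mobility[fips]) for fips in svi if fips in mobility]
scores = sorted(s for _, s, _ in joined)
cutoff = scores[len(scores) // 2]                     # upper-median split
high_vuln = [m for _, s, m in joined if s >= cutoff]
low_vuln  = [m for _, s, m in joined if s < cutoff]

# In this toy data, higher-vulnerability counties reduced mobility less,
# the kind of disparity the SVI/mobility analyses above are designed to surface.
```

Real analyses would perform the same join at scale (e.g. with pandas on SafeGraph point-of-interest visits and the 16 ACS-derived SVI variables) before modeling.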

Keywords: social vulnerability index, mobility dynamics, data analytics, health equity, pandemic preparedness, targeted interventions, data integration

Procedia PDF Downloads 33
25157 Bacillus licheniformis sp. nov. PS-6, an Arsenic Tolerance Bacterium with Biotransforming Potential Isolated from Sediments of Pichavaram Mangroves of South India

Authors: Padmanabhan D, Kavitha S

Abstract:

The purpose of the study was to investigate the arsenic-resistance ability of indigenous microflora and their ability to remove arsenic species from arsenic-containing water sources. PS-6, a potential arsenic-tolerant bacterium, was screened from thirty isolates from the Pichavaram mangroves of India; it tolerated up to 1000 mg/l of As(V) and 800 mg/l of As(III), and utilized 98% of As(V) and 97% of As(III) at initial concentrations of 3-5 mg/l within 48 h. The optimum pH and temperature were found to be ~7-7.4 and 37°C. Active growth of PS-6 in minimal salt media (MSB) allows cost-effective biomass production. Dry-weight analysis of PS-6 showed a significant difference in biomass when exposed to As(III) and As(V). Protein-level studies of PS-6 after exposure to As(V) and As(III) showed changes in total protein concentration and variation in the SDS-PAGE pattern. PS-6 was identified as Bacillus licheniformis based on partial sequencing of the 16S rRNA gene using NCBI BLAST. Further investigation will help in applying this potential bacterium as a reliable resource for urgent arsenic-remediation needs.

Keywords: arsenite, arsenate, Bacillus licheniformis, utilization

Procedia PDF Downloads 373
25156 Analysis of Big Data

Authors: Sandeep Sharma, Sarabjit Singh

Abstract:

With growing user demand and the growth trends of large free data, storage solutions are becoming more challenging in protecting, storing, and retrieving data. The day is not far off when storage companies and organizations will start saying 'no' to storing our valuable data, or will start charging a huge amount for its storage and protection. On the other hand, given environmental conditions, it is becoming challenging to maintain and establish new data warehouses and data centers in the face of global warming threats. The challenge of small data is over; the big challenge now is how to manage the exponential growth of data. In this paper we have analyzed the growth trend of big data and its future implications. We have also focused on the impact of unstructured data on various concerns, and we have suggested some possible remedies to streamline big data.

Keywords: big data, unstructured data, volume, variety, velocity

Procedia PDF Downloads 512
25155 Optimization of Bills Assignment to Different Skill-Levels of Data Entry Operators in a Business Process Outsourcing Industry

Authors: M. S. Maglasang, S. O. Palacio, L. P. Ogdoc

Abstract:

Business process outsourcing (BPO) has been one of the fastest growing and emerging industries in the Philippines today. Unlike most contact service centers, popularly known as "call centers", the company's primary outsourced service is performing audits of its global clients' logistics. As a service industry, manpower is considered the most important yet most expensive resource in the company; because of this, there is a need to maximize human resources so that people are effectively and efficiently utilized. The main purpose of the study is to optimize the current manpower resources through effective distribution and assignment of different types of bills to the different skill levels of data entry operators. The assignment model parameters include the average observed time matrix, gathered through a time study that incorporates the learning curve concept. Subsequently, a simulation model was built to replicate the arrival rate of demand, including the different batches and types of bill per day. Next, a mathematical linear programming model was formulated, whose objective is to minimize direct labor cost per bill by allocating the different types of bills to the different skill levels of operators. Finally, a hypothesis test was done to validate the model, comparing the actual and simulated results. The analysis revealed low utilization of effective capacity because the current practice fails to treat product mix, skill mix, and simulated demand as model parameters; moreover, failure to consider the effects of the learning curve leads to overestimation of labor needs. From the current 107 operators, the proposed model gives a result of 79 operators, increasing utilization of effective capacity by 14.94%. It is recommended that the excess 28 operators be reallocated to other areas of the department. Finally, a manpower capacity planning model is also recommended to support management's decisions on what to do when current capacity reaches its limit under the expected increase in demand.
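A much-simplified sketch of the cost-minimising core (one bill type matched per skill level, with hypothetical per-bill costs; the study's actual linear program also handles demand volumes and learning-curve effects) could look like this:

```python
import itertools

# Hypothetical direct labor cost (USD per bill): rows = bill types, cols = skill levels
cost = [
    [0.42, 0.35, 0.30],   # simple bills
    [0.80, 0.55, 0.45],   # standard bills
    [1.50, 1.10, 0.70],   # complex bills
]

def best_assignment(cost):
    """Brute-force one-to-one assignment minimising total cost (fine for small n)."""
    n = len(cost)
    best = min(itertools.permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return best, sum(cost[i][best[i]] for i in range(n))

assignment, total = best_assignment(cost)
```

For the volume-based version in the paper, each cell would carry a decision variable for the number of bills routed to that skill level, solved with an LP solver such as `scipy.optimize.linprog` subject to demand and operator-hour constraints.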

Keywords: optimization modelling, linear programming, simulation, time and motion study, capacity planning

Procedia PDF Downloads 486
25154 Predicting Shortage of Hospital Beds during COVID-19 Pandemic in United States

Authors: Saba Ebrahimi, Saeed Ahmadian, Hedie Ashrafi

Abstract:

The worldwide spread of coronavirus has heightened concern about planning for excess demand for hospital services in response to the COVID-19 pandemic. A surge in demand for hospital services beyond current capacity leads to shortages of ICU beds and ventilators in some parts of the US. In this study, we forecast the required number of hospital beds and the possible shortage of beds in the US during the COVID-19 pandemic, to be used in planning the hospitalization of new cases. We used data on COVID-19 deaths and patient hospitalizations, together with data on hospital capacities and utilization in the US, from publicly available sources and national government websites. We used a novel ensemble of deep learning networks, based on stacking different linear and non-linear layers, to predict the shortage in hospital beds. The results showed that the proposed approach predicts excess hospital-bed demand very well, which can be helpful in developing strategies and plans to mitigate this gap.
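A toy stand-in for the stacking idea above (the study stacks linear and non-linear deep-learning layers; here two trivial base forecasters are combined by a grid-searched meta-weight, and all demand and capacity figures are invented):

```python
# Daily hospital-bed demand (hypothetical counts)
demand = [100, 108, 115, 127, 140, 151, 166, 180]

def naive_last(series):                 # base model 1: persistence
    return series[-1]

def linear_trend(series):               # base model 2: extrapolate last difference
    return series[-1] + (series[-1] - series[-2])

def stacked(series, w):                 # meta-model: convex combination
    return w * naive_last(series) + (1 - w) * linear_trend(series)

# Grid-search the meta-weight on one-step-ahead backtests
best_w = min((i / 10 for i in range(11)),
             key=lambda w: sum(abs(stacked(demand[:t], w) - demand[t])
                               for t in range(4, len(demand))))

capacity = 170                          # hypothetical available beds
forecast = stacked(demand, best_w)      # next-day bed-demand forecast
shortage = max(0, forecast - capacity)  # predicted bed shortage
```

The same pattern, with real base learners and a trained meta-layer, is what lets a stacked ensemble weight each model by how well it has recently forecast demand.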

Keywords: COVID-19, deep learning, ensembled models, hospital capacity planning

Procedia PDF Downloads 129
25153 Real-world Characterization of Treatment Intensified (Add-on to Metformin) Adults with Type 2 Diabetes in Pakistan: A Multi-center Retrospective Study (Converge)

Authors: Muhammad Qamar Masood, Syed Abbas Raza, Umar Yousaf Raja, Imran Hassan, Bilal Afzal, Muhammad Aleem Zahir, Atika Shaheer

Abstract:

Background: Cardiovascular disease (CVD) is a major burden among people with type 2 diabetes (T2D), with 1 in 3 reported to have CVD. Therefore, understanding real-world clinical characteristics and prescribing patterns could help improve care. Objective: The CONVERGE (Cardiovascular Outcomes and Value in the Real world with GLP-1RAs) study characterized demographics and medication usage patterns in the overall treatment-intensified (add-on to metformin) T2D population. The data were further divided into subgroups {dipeptidyl peptidase-4 inhibitors (DPP-4is), sulfonylureas (SUs), insulins, glucagon-like peptide-1 receptor agonists (GLP-1RAs), and sodium-glucose cotransporter-2 inhibitors (SGLT-2is)} according to the latest prescribed antidiabetic agent (ADA) in India/Pakistan/Thailand. Here, we report findings from Pakistan. Methods: This multi-center retrospective study utilized data from medical records between 13-Sep-2008 (post-market approval of GLP-1RAs) and 31-Dec-2017 in adults (≥18 years old). Data were collected from five centers/institutes located in major cities of Pakistan, including Karachi, Lahore, Islamabad, and Multan: National Hospital, Aga Khan University Hospital, Diabetes Endocrine Clinic Lahore, Shifa International Hospital, and Mukhtar A Sheikh Hospital Multan. Data were collected at the start of the medical record and at 6 or 12 months prior to baseline, depending on variable type, and analyzed descriptively. Results: Overall, 1,010 patients were eligible. At baseline, the overall mean (SD) age was 51.6 (11.3) years, T2D duration was 2.4 (2.6) years, HbA1c was 8.3% (1.9), and 35% had received ≥1 CVD medication in the year before baseline. The most frequently prescribed ADAs post-metformin were DPP-4is and SUs (~63%). Only 6.5% received GLP-1RAs, and SGLT-2is were not available in Pakistan during the study period. Overall, it took a mean of 4.4 years and 5 years to initiate GLP-1RAs and SGLT-2is, respectively. Compared with other subgroups, more patients in the GLP-1RA subgroup received ≥3 types of ADA (58%) and ≥1 CVD medication (64%), and they had a higher body mass index (37 kg/m²). Conclusions: Utilization of GLP-1RAs and SGLT-2is was low and slow to be initiated, and occurred only after multiple ADAs had been tried. This may be due to a lack of evidence for the CV benefits of these agents during the study period. The planned phase 2 of the CONVERGE study can provide more insights into utilization of, and barriers to prescribing, GLP-1RAs and SGLT-2is in Pakistan after 2018.

Keywords: type 2 diabetes, GLP-1RA, treatment intensification, cardiovascular disease

Procedia PDF Downloads 22
25152 Evaluation of Deformation for Deep Excavations in the Greater Vancouver Area Through Case Studies

Authors: Boris Kolev, Matt Kokan, Mohammad Deriszadeh, Farshid Bateni

Abstract:

Due to the increasing demand for real estate and the need for efficient land utilization in Greater Vancouver, developers have increasingly been considering the construction of high-rise structures with multiple below-grade parking levels. The temporary excavations required for the construction of underground levels have recently reached depths of up to 40 meters. One of the challenges with deep excavations is the prediction of wall displacements and ground settlements, owing to their effect on the integrity of city utilities, infrastructure, and adjacent buildings. A large database of survey monitoring data has been collected for deep excavations in various soil conditions and shoring systems; the majority of the data are for tie-back anchor and shotcrete lagging systems. The data were categorized and analyzed, and the results were evaluated to find a relationship between the most dominant parameters controlling displacement, such as depth of excavation, soil properties, and tie-back anchor loading and arrangement. For a select number of deep excavations, finite element modeling was used for analysis, and the lateral displacements from the simulation results were compared to the recorded survey monitoring data. The study concludes with a discussion and comparison of the available empirical and numerical modeling methodologies for evaluating lateral displacements in deep excavations.

Keywords: deep excavations, lateral displacements, numerical modeling, shoring walls, tieback anchors

Procedia PDF Downloads 150