Search results for: Network Time Protocol
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22334

19934 Maximizing Profit Using Optimal Control by Exploiting the Flexibility in Thermal Power Plants

Authors: Daud Mustafa Minhas, Raja Rehan Khalid, Georg Frey

Abstract:

Next-generation power systems are equipped with abundant, effectively free renewable energy sources (RES). During periods of high RES output, the price of electricity drops significantly and sometimes turns negative. The obvious response is to shut down conventional plants (e.g., coal-fired plants) to avoid losses. In practice this is not cost-effective: these plants incur shutdown and startup costs, require time to shut down, and need a minimum pause before restarting, which reduces efficiency across the whole power network. There is therefore always a trade-off between avoiding negative electricity prices and incurring the startup costs of power plants. To exploit this trade-off and increase a plant's profit, two main contributions are made: 1) introducing retrofit technology for a state-of-the-art coal power plant; 2) proposing an optimal control strategy that exploits different flexibility features, namely an improved ramp rate, a reduced startup time, and a lower minimum load. The control problem is solved as a mixed-integer linear program (MILP), guaranteeing an optimal solution of the profit-maximization problem. Extensive comparisons are made between the pre- and post-retrofit coal plant, assumed to have the same efficiency, under different electricity price scenarios. The study concludes that if the plant must remain in the market (providing services), greater flexibility translates directly into an economic advantage for the plant operator.
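The trade-off the abstract describes can be illustrated with a toy, brute-force version of the unit-commitment decision. The paper itself solves a full MILP; the prices, costs, and constraints below are invented for illustration only:

```python
from itertools import product

def best_schedule(prices, p_min, p_max, startup_cost, run_cost):
    """Brute-force a toy single-plant unit-commitment problem.

    Each hour the plant is either off (0) or on (1). When on, it
    produces p_max if the price covers marginal cost, else p_min
    (a crude stand-in for the minimum-load constraint). Turning the
    plant on after an off hour incurs startup_cost. Returns the most
    profitable on/off schedule and its profit.
    """
    best = (None, float("-inf"))
    for sched in product([0, 1], repeat=len(prices)):
        profit, prev = 0.0, 0
        for on, price in zip(sched, prices):
            if on:
                if not prev:
                    profit -= startup_cost  # restart penalty
                output = p_max if price >= run_cost else p_min
                profit += (price - run_cost) * output
            prev = on
        if profit > best[1]:
            best = (sched, profit)
    return best

# Illustrative prices (EUR/MWh), including a negative-price hour.
prices = [40, 35, -5, 10, 55, 60]
sched, profit = best_schedule(prices, p_min=20, p_max=100,
                              startup_cost=500, run_cost=25)
```

With these invented numbers the optimum shuts the plant down through the negative-price hours despite the restart cost; lowering the minimum load or the startup cost shifts that balance, which is exactly the flexibility lever the abstract discusses.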

Keywords: discrete optimization, power plant flexibility, profit maximization, unit commitment model

Procedia PDF Downloads 145
19933 Computational Team Dynamics in Student New Product Development Teams

Authors: Shankaran Sitarama

Abstract:

Teamwork is an extremely effective pedagogical tool in engineering education. New Product Development (NPD) is an effective strategy companies use to streamline operations and bring innovative products and solutions to customers. Accordingly, engineering curricula in many schools, some in collaboration with business schools, include NPD at the graduate level. Teamwork is invariably used during instruction, with students working in teams to develop new products and solutions. A significant share of the grade rests on the semester-long teamwork so that students take it seriously. As students work in teams through this process to develop new product prototypes, their effectiveness and learning depend to a great extent on how they function as a team, move through the creative process, come together, and work toward the common goal. A core attribute of a successful NPD team is its creativity and innovation. The team needs to be creative as a group, generating a breadth of ideas and innovative solutions that address the targeted problem and meet users' needs. It also needs to work efficiently through the various stages of developing these ideas into a proof-of-concept (POC) implementation or product prototype. The simultaneous requirement that teams be creative and at the same time converge and work together imposes different kinds of tension on their interactions. These ideational tensions or conflicts, and sometimes relational ones, are inevitable. Effective teams must manage these team dynamics, remaining resilient while staying creative.
This paper provides a computational analysis of team communication as a reflection of team dynamics: by superimposing latent semantic analysis on social network analysis, it offers a computational methodology for deriving visual patterns of team interaction. These interaction patterns correlate clearly with the team dynamics and provide insight into how the teams function and, in turn, how effective they are. Twenty-three student NPD teams from two years of a course on managing NPD, with a blend of engineering and business school students, are considered, and the results are presented. The findings are also correlated with the teams' detailed, tailored individual and group feedback and their self-reflection and evaluation questionnaires.

Keywords: team dynamics, social network analysis, team interaction patterns, new product development teamwork, NPD teams

Procedia PDF Downloads 119
19932 Improving Cost and Time Control of Construction Projects Management Practices in Nigeria

Authors: Mustapha Yakubu, Ahmed Usman, Hashim Ambursa

Abstract:

This paper presents the findings of research into techniques for improving cost and time control in construction project management practice in Nigeria, an area where research on the practical usage of such techniques is limited. Data were collected through a questionnaire distributed to construction experts in a survey of 100 construction organisations and 50 construction consultancy firms in Nigeria, aimed at identifying common project cost and time control practices and the factors inhibiting effective project control in practice. The study reveals that, despite the wide application of control techniques, a high proportion of respondents still experienced cost and time overruns on a significant share of their projects. Analysis of the survey results leads to the conclusion that more effort should be directed at managing the identified top project-control-inhibiting factors. The paper outlines measures for mitigating these factors so that project time and cost control can be improved in practice.

Keywords: construction project, cost control, Nigeria, time control

Procedia PDF Downloads 318
19931 Intrusion Detection Using Dual Artificial Techniques

Authors: Rana I. Abdulghani, Amera I. Melhum

Abstract:

The abnormal growth in the use of computers over networks, together with the view shared by most computer security experts that the goal of building a fully secure system is never achieved in practice, has motivated the design of intrusion detection systems (IDS). This research compares two techniques for network intrusion detection. The first uses Particle Swarm Optimization (PSO), from the field of swarm intelligence; the algorithm is enhanced to obtain a minimum error rate by amending the cluster centers whenever a better fitness value is found during the training stages. Results show that this modification explores the search space more efficiently than the original algorithm. The second technique uses a back-propagation (BP) neural network. The two methods are compared on the NSL-KDD data sets for the construction and evaluation of intrusion detection systems. The study is concerned only with clustering the given connection records into two categories, normal and abnormal. Experiments yield an intrusion detection rate of 99.183818% for the enhanced PSO (EPSO) and 69.446416% for the BP neural network.
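The PSO side of the comparison can be sketched as follows. This is a generic, minimal PSO in pure Python, not the authors' enhanced variant, and the objective is a toy stand-in for the clustering error the paper minimizes:

```python
import random

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """Minimal Particle Swarm Optimization: each particle tracks its
    personal best, the swarm tracks a global best, and velocities blend
    inertia, personal attraction, and social attraction."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for the clustering error: squared distance of one
# "cluster center" to four data points (optimum is their mean, 2.5).
data = [1.0, 2.0, 3.0, 4.0]
err = lambda c: sum((x - c[0]) ** 2 for x in data)
center, val = pso(err, dim=1)
```

The authors' enhancement, re-amending the cluster centers when a better fitness is found, would plug into the update of `pbest`/`gbest` above.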

Keywords: IDS, SI, BP, NSL_KDD, PSO

Procedia PDF Downloads 387
19930 Traffic Light Detection Using Image Segmentation

Authors: Vaishnavi Shivde, Shrishti Sinha, Trapti Mishra

Abstract:

Traffic light detection from a moving vehicle is an important technology both for driver-assistance safety functions and for autonomous driving in cities. This paper proposes a deep-learning-based traffic light recognition method consisting of a pixel-wise image segmentation technique and a fully convolutional network, the UNET architecture. A method for detecting the position and recognizing the state of traffic lights in video sequences is presented and evaluated on the Traffic Light Dataset, which contains masked traffic light image data. The first stage is detection, accomplished through image processing (image segmentation) techniques such as image cropping, color transformation, and segmentation of candidate traffic lights. The second stage is recognition, i.e., identifying the color, and hence the state, of the traffic light, which is achieved using a convolutional neural network (the UNET architecture).
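The color-recognition step can be illustrated with a simple hue-based classifier applied to the average color of a detected light region. This is only a sketch of the idea (the paper itself uses a UNET), and the hue thresholds below are assumptions:

```python
import colorsys

def classify_light(r, g, b):
    """Classify an average pixel colour (0-255 RGB) from a detected
    traffic-light region as 'red', 'yellow' or 'green' via its hue.
    Threshold angles (in degrees) are illustrative, not calibrated."""
    h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    deg = h * 360.0
    if deg < 30 or deg >= 330:
        return "red"
    if deg < 75:
        return "yellow"
    if deg < 180:
        return "green"
    return "unknown"
```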

Keywords: traffic light detection, image segmentation, machine learning, classification, convolutional neural networks

Procedia PDF Downloads 182
19929 H2 Production and Treatment of Cake Wastewater Industry via Up-Flow Anaerobic Staged Reactor

Authors: Manal A. Mohsen, Ahmed Tawfik

Abstract:

Hydrogen production from cake-industry wastewater by anaerobic dark fermentation in an up-flow anaerobic staged reactor (UASR) was investigated in this study. The reactor was operated continuously for four months at a constant hydraulic retention time (HRT) of 21.57 h, a pH of 6 ± 0.6, a temperature of 21.1°C, and an organic loading rate of 2.43 g COD/l·d. Hydrogen production was 5.7 l H2/d and the hydrogen yield was 134.8 ml H2/g COD removed. Over the long-term operation period the system showed overall removal efficiencies for TCOD, TBOD, TSS, TKN, and carbohydrates of 40 ± 13%, 59 ± 18%, 84 ± 17%, 28 ± 27%, and 85 ± 15%, respectively. Based on these results, the system alone is not sufficient for effective treatment of cake wastewater, and the UASR effluent does not comply with standards for discharge into the sewerage network; a post-treatment step is therefore needed (not covered in this study).
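The reported hydrogen yield is simply the produced gas volume divided by the COD removed, so the abstract's figures can be cross-checked with a few lines of arithmetic. The implied daily COD removal below is derived from the reported numbers, not stated in the study:

```python
def hydrogen_yield(h2_volume_ml_per_day, cod_removed_g_per_day):
    """Hydrogen yield in ml H2 per gram of COD removed per day."""
    return h2_volume_ml_per_day / cod_removed_g_per_day

daily_h2_ml = 5.7 * 1000        # 5.7 l H2/d from the study, in ml
yield_ml_per_g = 134.8          # reported yield, ml H2 / g COD removed

# Back-calculate the COD removal implied by the two reported figures.
cod_removed = daily_h2_ml / yield_ml_per_g   # about 42.3 g COD/d
```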

Keywords: cake wastewater industry, chemical oxygen demand (COD), hydrogen production, up-flow anaerobic staged reactor (UASR)

Procedia PDF Downloads 383
19928 Audit on Antibiotic Prophylaxis and Post-Procedure Complication Rate for Patients Undergoing Transperineal Template Biopsies of the Prostate

Authors: W. Hajuthman, R. Warner, S. Rahman, M. Abraham, H. Helliwell, D. Bodiwala

Abstract:

Context: Prostate cancer is a prevalent cancer in males in Europe and the US, with diagnosis primarily relying on PSA testing, mpMRI, and subsequent biopsies. However, this diagnostic strategy may lead to complications for patients. Research Aim: The aim of this study is to assess compliance with trust guidelines for antibiotic prophylaxis in patients undergoing transperineal template biopsies of the prostate and evaluate the rate of post-procedure complications. Methodology: This study is conducted retrospectively over an 8-month period. Data collection includes patient demographics, compliance with trust guidelines, associated risk factors, and post-procedure complications such as infection, haematuria, and urinary retention. Findings: The audit includes 100 patients with a median age of 66.11. The compliance with pre-procedure antibiotics was 98%, while compliance with antibiotic prophylaxis recommended by trust guidelines was 68%. Among the patients, 3% developed post-procedure sepsis, with 2 requiring admission for intravenous antibiotics. No evident risk factors were identified in these cases. Additionally, post-procedure urinary retention occurred in 3% of patients and post-procedure haematuria in 2%. Theoretical Importance: This study highlights the increasing use of transperineal template biopsies across UK centres and suggests that having a standardized protocol and compliance with guidelines can reduce confusion, ensure appropriate administration of antibiotics, and mitigate post-procedure complications. Data Collection and Analysis Procedures: Data for this study is collected retrospectively, involving the extraction and analysis of relevant information from patient records over the specified 8-month period. Question Addressed: This study addresses the following research questions: (1) What is the compliance rate with trust guidelines for antibiotic prophylaxis in transperineal template biopsies of the prostate? 
(2) What is the rate of post-procedure complications, such as infection, haematuria, and urinary retention? Conclusion: Transperineal template biopsies are becoming increasingly prevalent in the UK. Implementing a standardized protocol and ensuring compliance with guidelines can reduce confusion, ensure proper administration of antibiotics, and potentially minimize post-procedure complications. Additionally, considering that studies show no difference in outcomes when prophylactic antibiotics are not used, the reminder to follow trust guidelines may prompt a re-evaluation of antibiotic prescribing practices.

Keywords: prostate, transperineal template biopsies of prostate, antibiotics, complications, microbiology, guidelines

Procedia PDF Downloads 84
19927 Astronomical Panels of Measuring and Dividing Time in Ancient Egypt

Authors: Mohamed Saeed Ahmed Salman

Abstract:

The ancient Egyptians used the stars to measure time or, in a more precise sense, as one of their astronomical means of measuring time, and these methods differed throughout the historical ages. They began with simple observations of astronomical phenomena, such as watching the movements of the stars across the sky over the year to determine the days and nights, supplemented by other means of keeping time when the sky was overcast. Through archaeological evidence, the researcher seeks to demonstrate the ancient Egyptians' knowledge of the stars of the sky and of their movements from earliest prehistory. The astronomical information the Egyptians possessed was neither limited nor simple; it reached a nearly optimal level for its purpose, helping them to know the time and the passage of time, and culminating in an attempt to devise a system of timekeeping and the calculation of time. There are indications that the stellar creed was known and flourishing from predynastic times, as is evident in the inscriptions dating from that period. The Egyptians realized that some stars remain visible at night and were familiar with the daily journey of the stars; this is reflected in many passages of the Pyramid Texts and their references to the ascent of the deceased king to the heavenly world among the stars of the eternal sky. The ancient Egyptians also linked the calendar to the stellar doctrine: the lunar calendar was known to them, as was the stellar-solar calendar based on the appearance of the star Sirius, the first means used to measure time and fix the calendar.

Keywords: ancient Egyptian, astronomical panels, Egyptian, astronomical

Procedia PDF Downloads 28
19926 Increasing the Apparent Time Resolution of Tc-99m Diethylenetriamine Pentaacetic Acid Galactosyl Human Serum Albumin Dynamic SPECT by Use of an 180-Degree Interpolation Method

Authors: Yasuyuki Takahashi, Maya Yamashita, Kyoko Saito

Abstract:

In general, dynamic SPECT data acquisition requires a few minutes per rotation, so the time-activity curve (TAC) derived from dynamic SPECT is relatively coarse. To effectively shorten the interval between data points, we adopted a 180-degree interpolation method, which is already used in the reconstruction of X-ray CT data. In this study, we applied this method to SPECT and investigated its effectiveness. To briefly describe the 180-degree interpolation method: the 180-degree data from the second half of one rotation are combined with the 180-degree data from the first half of the next rotation to generate a 360-degree data set appropriate for the time halfway between the two rotations. In both a phantom study and a patient study, the data points from the interpolated images agreed well with the data points tracking the accumulation of 99mTc activity over time for the appropriate regions of interest. We conclude that data derived from interpolated images improve the apparent time resolution of dynamic SPECT.
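The halfway data set described above can be sketched directly in code. Here projections are just placeholder labels, and an even number of equally spaced angles per rotation is assumed:

```python
def interpolate_rotations(rot_a, rot_b):
    """Given two consecutive 360-degree acquisitions (lists of
    projections ordered by angle), build an interpolated 360-degree
    data set for the time halfway between them. Ordered by angle, the
    0-180 degree half comes from the later rotation (rot_b) and the
    180-360 degree half from the earlier one (rot_a)."""
    n = len(rot_a)
    assert len(rot_b) == n and n % 2 == 0
    half = n // 2
    return rot_b[:half] + rot_a[half:]

# Toy example with 8 projection angles per rotation.
rot1 = [f"A{i}" for i in range(8)]   # rotation k
rot2 = [f"B{i}" for i in range(8)]   # rotation k+1
mid = interpolate_rotations(rot1, rot2)
```

Applied to every pair of consecutive rotations, this doubles the number of reconstructable time points, which is the claimed gain in apparent time resolution.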

Keywords: dynamic SPECT, time resolution, 180-degree interpolation method, 99mTc-GSA

Procedia PDF Downloads 495
19925 Human-Centric Sensor Networks for Comfort and Productivity in Offices: Integrating Environmental, Body Area Network, and Participatory Sensing

Authors: Chenlu Zhang, Wanni Zhang, Florian Schaule

Abstract:

The indoor environment in office buildings directly affects the comfort, productivity, health, and well-being of building occupants. Wireless environmental sensor networks have been deployed in many modern offices to monitor and control indoor environments. However, indoor environmental variables alone are not strong predictors of every occupant's comfort and productivity levels, owing to personal differences that are both physiological and psychological. This study proposes human-centric sensor networks that integrate wireless environmental sensors, body area network sensors, and participatory sensing technologies to collect data from both the environment and its occupants and to support building operations. The sensor networks were tested in one small and one medium-size office room with 22 participants over five months. Indoor environmental data (e.g., air temperature and relative humidity), physiological data (e.g., skin temperature and galvanic skin response), and psychological responses (e.g., comfort and self-reported productivity levels) were obtained from each participant and his or her workplace. The results show that: (1) participants have different physiological and psychological responses under the same environmental conditions; (2) physiological variables are more effective predictors of comfort and productivity levels than environmental variables. These results indicate that human-centric sensor networks can support human-centric building control and improve comfort and productivity in offices.

Keywords: body area network, comfort and productivity, human-centric sensors, internet of things, participatory sensing

Procedia PDF Downloads 144
19924 Contribution to the Study of Automatic Epileptiform Pattern Recognition in Long Term EEG Signals

Authors: Christine F. Boos, Fernando M. Azevedo

Abstract:

The electroencephalogram (EEG) is a record of the electrical activity of the brain with many applications: monitoring alertness, coma, and brain death; locating damaged areas of the brain after head injury, stroke, or tumor; monitoring anesthesia depth; researching physiology and sleep disorders; and researching epilepsy and localizing the seizure focus. Epilepsy is a chronic condition, or rather a group of diseases, of high prevalence, still poorly explained by science, and its diagnosis remains predominantly clinical. The EEG recording is considered an important test in epilepsy investigation, and its visual analysis is very often applied for clinical confirmation of the diagnosis. This analysis can also help define the type of epileptic syndrome, determine the epileptiform zone, assist in planning drug treatment, and provide additional information about the feasibility of surgical intervention. For diagnosis confirmation, the analysis uses long-term EEG recordings at least 24 hours long, acquired with a minimum of 24 electrodes, in which neurophysiologists perform a thorough visual evaluation of EEG screens in search of specific electrographic patterns called epileptiform discharges. Since an EEG screen usually displays 10 seconds of the recording, the neurophysiologist has to evaluate 360 screens per hour of EEG, or a minimum of 8,640 screens per long-term recording. Analyzing thousands of EEG screens in search of patterns with a maximum duration of 200 ms is a very time-consuming, complex, and exhausting task. For this reason, over the years many studies have proposed automated methodologies to facilitate the neurophysiologists' task of identifying epileptiform discharges, and a large number of these methodologies used neural networks for pattern classification.
One of the differences among these methodologies is the type of input stimulus presented to the network, i.e., how the EEG signal is introduced into it. Five types of input stimuli are commonly found in the literature: the raw EEG signal, morphological descriptors (parameters related to the signal's morphology), the Fast Fourier Transform (FFT) spectrum, Short-Time Fourier Transform (STFT) spectrograms, and Wavelet Transform features. This study evaluates these five types of input stimuli and compares the classification results of neural networks implemented with each of them. The performance using the raw signal varied between 43 and 84% efficiency. The results for the FFT spectrum and STFT spectrograms were quite similar, with average efficiencies of 73 and 77%, respectively. The efficiency of Wavelet Transform features varied between 57 and 81%, while the morphological descriptors yielded efficiencies between 62 and 93%. From these simulations we observed that the best results were achieved when either morphological descriptors or Wavelet features were used as the input stimulus.
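As one example of such an input stimulus, the FFT spectrum can be computed from a raw epoch. Below is a naive pure-Python DFT (a real system would use an optimized FFT library) applied to a toy sinusoid standing in for an EEG segment:

```python
import cmath
import math

def dft_magnitudes(signal):
    """Naive discrete Fourier transform returning the magnitude
    spectrum: one way to turn a raw epoch into a frequency-domain
    input stimulus for a neural-network classifier."""
    n = len(signal)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t, x in enumerate(signal)))
            for k in range(n)]

# A pure 2-cycle sinusoid over 8 samples concentrates its energy
# in frequency bin 2 (magnitude N/2 = 4 for unit amplitude).
sig = [math.sin(2 * math.pi * 2 * t / 8) for t in range(8)]
mags = dft_magnitudes(sig)
```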

Keywords: artificial neural network, electroencephalogram signal, pattern recognition, signal processing

Procedia PDF Downloads 531
19923 A Bacterial Foraging Optimization Algorithm Applied to the Synthesis of Polyacrylamide Hydrogels

Authors: Florin Leon, Silvia Curteanu

Abstract:

The Bacterial Foraging Optimization (BFO) algorithm is inspired by the foraging behavior of bacteria such as Escherichia coli or Myxococcus xanthus, more precisely by their chemotaxis: bacteria perceive chemical gradients in the environment, such as nutrients, as well as other individual bacteria, and move toward or away from those signals. The case study considered here consists of establishing the dependency between the reaction yield of polyacrylamide-based hydrogels and the working conditions, namely time, temperature, monomer, initiator, crosslinking agent and inclusion polymer concentrations, and the type of polymer added. The process is modeled with a neural network, which is embedded in a BFO-based optimization procedure. An experimental study of the BFO parameters is performed. The results show that the algorithm is quite robust and obtains good results for diverse combinations of parameter values.
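The chemotaxis behavior at the heart of BFO can be sketched as a tumble-and-swim loop. This is a bare-bones illustration on an invented two-variable surrogate objective, not the authors' full algorithm, which also includes reproduction and elimination-dispersal steps:

```python
import random

def bfo_minimize(objective, dim, n_bacteria=20, chem_steps=50,
                 swim_len=4, step_size=0.1, bounds=(-2.0, 2.0), seed=1):
    """Minimal chemotaxis loop of Bacterial Foraging Optimization:
    each bacterium tumbles to a random direction, then keeps swimming
    in that direction as long as its cost keeps improving."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_bacteria)]
    cost = [objective(b) for b in pop]
    for _ in range(chem_steps):
        for i in range(n_bacteria):
            # Tumble: pick a random unit direction.
            d = [rng.uniform(-1, 1) for _ in range(dim)]
            norm = sum(x * x for x in d) ** 0.5 or 1.0
            d = [x / norm for x in d]
            for _ in range(swim_len):  # swim while improving
                trial = [p + step_size * x for p, x in zip(pop[i], d)]
                c = objective(trial)
                if c < cost[i]:
                    pop[i], cost[i] = trial, c
                else:
                    break
    best = min(range(n_bacteria), key=lambda i: cost[i])
    return pop[best], cost[best]

# Invented surrogate for "maximize reaction yield": minimize the
# distance of two process variables from a fictitious optimum (1, 0.5).
sol, c = bfo_minimize(lambda v: (v[0] - 1) ** 2 + (v[1] - 0.5) ** 2, dim=2)
```

In the paper's setup, `objective` would be the trained neural-network model of the hydrogel yield evaluated at a candidate set of working conditions.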

Keywords: bacterial foraging, hydrogels, modeling and optimization, neural networks

Procedia PDF Downloads 158
19922 Application of Combined Cluster and Discriminant Analysis to Make the Operation of Monitoring Networks More Economical

Authors: Norbert Magyar, Jozsef Kovacs, Peter Tanos, Balazs Trasy, Tamas Garamhegyi, Istvan Gabor Hatvani

Abstract:

Water is one of the most important common resources, and as a result of urbanization, agriculture, and industry it is becoming increasingly exposed to potential pollutants. Preventing the deterioration of water quality is a crucial task for environmental scientists, and it requires the operation of monitoring networks. In general, these networks have to meet many important requirements, such as representativeness and cost efficiency; however, existing monitoring networks often include unnecessary sampling sites. By eliminating these sites the monitoring network can be optimized and operated more economically. The aim of this study is to illustrate the applicability of Combined Cluster and Discriminant Analysis (CCDA) to water quality monitoring and to optimize the monitoring networks of a river (the Danube), a wetland-lake system (Kis-Balaton and Lake Balaton), and two surface-subsurface water systems, on the watershed of Lake Neusiedl/Lake Fertő and in the Szigetköz area, over a period of approximately two decades. CCDA combines two multivariate data analysis methods: hierarchical cluster analysis and linear discriminant analysis. Its goal is to determine homogeneous groups of observations, in our case sampling sites, by comparing the goodness of preconceived classifications obtained from hierarchical cluster analysis with that of random classifications. The main idea behind CCDA is that if the ratio of correctly classified cases for a grouping is higher than at least 95% of the ratios for the random classifications, then at the significance level α = 0.05 the given sampling sites do not form a homogeneous group. Because sampling on Lake Neusiedl/Lake Fertő was conducted at the same time at all sampling sites, it was possible to visualize the differences between sampling sites belonging to the same or different groups on scatterplots.
Based on the results, the monitoring network of the Danube yields redundant information over certain sections: of 12 sampling sites, 3 could be eliminated without loss of information. In the case of the wetland (Kis-Balaton), one pair of sampling sites out of 12 could be discarded, and in the case of Lake Balaton, 5 out of 10. For the groundwater system of the catchment area of Lake Neusiedl/Lake Fertő all 50 monitoring wells are necessary; there is no redundant information in the system. The number of sampling sites on Lake Neusiedl/Lake Fertő itself can be reduced to approximately half the original number. Furthermore, neighbouring sampling sites were compared pairwise using CCDA, and the results were plotted on diagrams or isoline maps showing the locations of the greatest differences; these results can help researchers decide where to place new sampling sites. The application of CCDA proved to be a useful tool in optimizing the monitoring networks of different types of water bodies, allowing them to be operated more economically.
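The core CCDA decision rule, comparing a grouping's correct-classification ratio against random classifications at the 95% level, can be sketched for one-dimensional data. The nearest-centroid classifier and the toy data below are simplified stand-ins for the full method, which uses hierarchical clustering and linear discriminant analysis on multivariate water-quality data:

```python
import random

def correct_ratio(data, labels):
    """Fraction of observations assigned, by a nearest-group-mean
    rule, to the group they were labeled with."""
    groups = {}
    for x, g in zip(data, labels):
        groups.setdefault(g, []).append(x)
    means = {g: sum(v) / len(v) for g, v in groups.items()}
    hits = sum(1 for x, g in zip(data, labels)
               if min(means, key=lambda m: abs(x - means[m])) == g)
    return hits / len(data)

def is_homogeneous(data, labels, n_random=199, seed=0):
    """CCDA-style test: the grouping is rejected as homogeneous when
    its correct-classification ratio matches or beats at least 95%
    of random relabelings. Returns True if the sites appear to form
    one homogeneous group (i.e. the grouping is no better than chance)."""
    rng = random.Random(seed)
    base = correct_ratio(data, labels)
    shuffled = labels[:]
    beaten = 0
    for _ in range(n_random):
        rng.shuffle(shuffled)
        if base >= correct_ratio(data, shuffled):
            beaten += 1
    return beaten / n_random < 0.95

# Two clearly separated "sampling sites": the preconceived grouping
# classifies perfectly, so the sites do NOT form a homogeneous group.
data = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
labels = ["A", "A", "A", "B", "B", "B"]
```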

Keywords: combined cluster and discriminant analysis, cost efficiency, monitoring network optimization, water quality

Procedia PDF Downloads 353
19921 Implementation of Enhanced Recovery after Cesarean Section at Koidu Government Hospital, Sierra Leone 2024. A Quality Improvement Project

Authors: Hailemariam Getachew, John Sandi, Isata Dumbuya, Patricia Efe.Azikiwe, Evaline Nginge, Moses Mugisha, Eseoghene Dase, Foday Mandaray, Grace Moore

Abstract:

Enhanced recovery after cesarean section (ERAC) is a standardized peri-operative care program in which a multidisciplinary team works collaboratively throughout the peri-operative period with the principal goals of improving the quality of surgical care, decreasing surgery-related complications, and increasing patient satisfaction. Objective: The main objective of this project is to improve the implementation of ERAC at Koidu Government Hospital. Identified gap: Although the hospital provides comprehensive maternal and child care services, there are gaps in the implementation of ERAC. Our survey found low (13.3%) utilization of the WHO surgical safety checklist; only a limited share (15.9%) of patients received opioid-free analgesia; pain was not recorded as a vital sign; and there was no standardized checklist for handover to and from the post-anesthesia care unit (PACU). Furthermore, evidence-based post-operative care was inconsistent, and no local consensus protocol or guideline existed. Implementation plan: We aimed to design a standardized protocol, checklist, and guideline; provide training and build staff capacity; document pain as a vital sign; perform regional analgesia; provide evidence-based post-operative care; and carry out monitoring and evaluation. Result: Data from 389 cesarean mothers showed that utilization of the WHO surgical safety checklist reached 95%, and pain assessment and documentation were done for all surgical patients. Oral feeding, ambulation, and catheter removal were performed per the ERAC standard for all patients. Post-operative complications decreased markedly from 13.6% to 8.1%, while the readmission rate was kept below 1%. Furthermore, the duration of hospital stay decreased from 4.64 days to 3.12 days.
Conclusion: This quality improvement project demonstrates, through the successful implementation of ERAC protocols, their effectiveness in improving recovery and patient outcomes following cesarean section.

Keywords: cesarean delivery, enhanced recovery, quality improvement, patient outcome

Procedia PDF Downloads 21
19920 Selection of Potential Starter Using Their Transcription Level

Authors: Elif Coskun Daggecen, Seyma Dokucu, Yekta Gezginc, Ismail Akyol

Abstract:

The quality of fermented dairy foods is mainly determined by sensory perception and is influenced by many factors; today, starter cultures are developed to give these foods a consistent quality. Streptococcus thermophilus is one of the main species in most starter cultures for yogurt fermentation. It produces lactate from pyruvate via lactose fermentation; alternatively, a small amount of pyruvate can be converted into various typical yogurt flavor compounds such as diacetyl, acetoin, acetaldehyde, or acetic acid, for which the activity of three genes, ldh, nox, and als, is shown to be especially important. To date, commercially produced yogurts have not met the aromatic properties that Turkish consumers find in traditional homemade yogurts, so it is important to select starters carrying favorable metabolic characteristics from natural isolates. In this study, 30 strains of S. thermophilus were isolated from traditional Turkish yogurts obtained from different regions of the country. The transcription levels of the ldh, nox, and als genes in these strains were determined via a newly developed qPCR protocol, a more reliable and precise method than conventional analytical methods for the quantitative and qualitative analysis of the expression of specific genes under different experimental conditions or in different organisms. Additionally, the metabolite production potential of the isolates was measured. Of all the strains examined, 60% were found to have metabolite production potentials and gene activities suitable for use as a starter culture, and probable starter cultures were identified according to the real-time PCR results.

Keywords: gene expression, RT-PCR, starter culture, Streptococcus thermophilus

Procedia PDF Downloads 191
19919 Design of Compact Dual-Band Planar Antenna for WLAN Systems

Authors: Anil Kumar Pandey

Abstract:

A compact planar monopole antenna with dual-band operation suitable for wireless local area network (WLAN) applications is presented in this paper. The antenna occupies an overall area of 18 × 12 mm². It is fed by a coplanar waveguide (CPW) transmission line and combines two folded strips, which radiate at 2.4 and 5.2 GHz. By optimally selecting the antenna dimensions, dual-band resonant modes with much wider impedance matching in the higher band can be produced. The optimized design was simulated using an EM solver. The simulated results show good dual-band operation with -10 dB impedance bandwidths of 50 MHz and 2400 MHz in the 2.4 and 5.2 GHz bands, respectively, covering the 2.4/5.2/5.8 GHz WLAN operating bands. Good antenna performance, including radiation patterns and antenna gain over the operating bands, has also been observed. The antenna, with a compact size of 18 × 12 × 1.6 mm³, is designed on an FR4 substrate with a dielectric constant of 4.4.

Keywords: CPW antenna, dual-band, electromagnetic simulation, wireless local area network (WLAN)

Procedia PDF Downloads 216
19918 Energy Usage in Isolated Areas of Honduras

Authors: Bryan Jefry Sabillon, Arlex Molina Cedillo

Abstract:

Currently, the rise in demand for electrical energy, a consequence of technological development and population growth, together with projections made by 'La Agencia Internacional de la Energía' (AIE) and research institutes, reveals alarming figures for the expected growth over the next few decades. Because of this, awareness should be raised about the rational and efficient use of this resource. Out of the global concern for providing electrical energy to isolated areas, projects based on generation from renewable resources are commonly carried out. From a socioeconomic and cultural point of view, access to this resource can be expected to have a positive impact on society. This article focuses on the great potential that Honduras shows as a country looking to produce renewable energy in response to the crisis it is experiencing today. We therefore present detailed research into the main needs that rural communities face, in order to mitigate the negative effects of the scarcity of electrical energy. We also discuss which type of electrical generation method should be used according to the disposition, geography, climate, and, of course, accessibility of each area. Honduras is currently developing new methods for energy generation; therefore, it is of our concern to discuss renewable energy, the exploitation of which is a global trend. At present, the country's main generation methods are hydroelectric, thermal, wind, biomass, and photovoltaic (one of the main sources of clean electrical generation).
The use of these resources was made possible in part by studies from organizations focused on electrical energy and its demand, such as 'La Cooperación Alemana' (GIZ), 'La Secretaría de Energía y Recursos Naturales' (SERNA), and 'El Banco Centroamericano de Integración Económica' (BCIE), which provided the complete guide for the protocol to be followed in the three stages of such projects: 1) licenses and permits, 2) financial aspects, and 3) registration under the Kyoto Protocol. This article aims to guide the reader through the necessary information (according to the difficulty of access each zone might present) on the best option for electrical generation in zones that are totally isolated from the grid, using renewable resources to generate electrical energy. We conclude that the use of hybrid energy generation systems in small remote communities has a positive impact, not only by providing electrical energy but also through improvements in education, health, sustainable agriculture and livestock, and, of course, advances in energy generation, which is the main concern of this article.

Keywords: energy, isolated, renewable, accessibility

Procedia PDF Downloads 232
19917 Time Series Forecasting (TSF) Using Various Deep Learning Models

Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan

Abstract:

Time Series Forecasting (TSF) is used to predict target variables at a future time point based on learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the performance of the recent attention-based Transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (RNN, LSTM, GRU, and Transformer) along with a baseline method. The dataset we used is the Beijing Air Quality Dataset from the UCI repository, a multivariate time series of many factors measured hourly over a period of 5 years (2010-14). For each model, we also report on the relationship between performance and both the look-back window size and the number of time points predicted into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131) for most of our single-step and multi-step predictions. The best look-back window size for predicting 1 hour into the future appears to be one day, while windows of 2 or 4 days perform best for predicting 3 hours into the future.
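
The fixed-length look-back window described above is usually implemented by slicing the series into (input window, target) pairs before training. A minimal sketch, with a hypothetical `make_windows` helper and a toy hourly series standing in for the Beijing dataset:

```python
def make_windows(series, look_back, horizon):
    """Split a time series into (input window, forecast target) pairs.

    Each sample uses `look_back` past points as input and the value
    `horizon` steps ahead of the window as the target, mirroring the
    fixed-length look-back windows described above.
    """
    xs, ys = [], []
    for i in range(len(series) - look_back - horizon + 1):
        xs.append(series[i:i + look_back])
        ys.append(series[i + look_back + horizon - 1])
    return xs, ys

hourly = list(range(48))  # two days of hourly readings as a stand-in
# A one-day look-back window, predicting 3 hours into the future:
x, y = make_windows(hourly, look_back=24, horizon=3)
```

Each (x, y) pair then feeds the RNN/LSTM/GRU/Transformer models identically, so the window size and horizon can be varied independently, as in the experiments above.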

Keywords: air quality prediction, deep learning algorithms, time series forecasting, look-back window

Procedia PDF Downloads 159
19916 Associations among Fetuin A, Cortisol and Thyroid Hormones in Children with Morbid Obesity and Metabolic Syndrome

Authors: Mustafa Metin Donma, Orkide Donma

Abstract:

Obesity is a disease with an ever-increasing prevalence throughout the world. The metabolic network associated with obesity is very complicated, and in metabolic syndrome (MetS) it becomes even more difficult to understand. Within this context, hormones, cytokines, and many other factors participate in this complex matrix, and the collaboration among all of these parameters is a matter of great wonder. Cortisol, as a stress hormone, is closely associated with obesity. Thyroid hormones are involved in the regulation of energy as well as glucose metabolism. Fetuin A has been known for years; however, its involvement in obesity discussions is rather new, and it has recently been defined as one of the new-generation markers of obesity. In this study, the aim was to examine these complex interactions in order to make clear comparisons, at least for a part of this complicated matter. Morbidly obese (MO) children participated in the study. Two groups were constituted: 46 MO children and 43 MO children who also had MetS components. All children included in the study were above the 99th age- and sex-adjusted body mass index (BMI) percentile according to World Health Organization criteria. Informed consent forms were completed by the parents of the participants, and the institutional ethics committee approved the study protocol. The data and findings of the study were evaluated statistically. The two groups were matched for age and gender composition. Significantly higher BMI, waist circumference, thyrotropin, and insulin values were observed in the MetS group. Triiodothyronine concentrations did not differ between the groups. Elevated levels of thyroxine, cortisol, and fetuin A were detected in the MetS group compared to the first group (p > 0.05). In the MO MetS− group, cortisol was correlated with thyroxine and fetuin A (p < 0.05).
In the MO MetS+ group, none of these correlations was present; instead, a correlation between cortisol and thyrotropin was found (p < 0.05). In conclusion, the findings showed that cortisol was the key player in severely obese children. The association of this hormone with the participants of thyroid hormone metabolism was quite important. The lack of association with fetuin A in the MO MetS+ group suggests possible interference of MetS components in the behavior of this new-generation obesity marker. The most remarkable finding of the study was the unique correlation between cortisol and thyrotropin in the MO MetS+ group, suggesting that thyrotropin may serve as a target along with cortisol in this group. This association may deserve specific attention during the development of remedies against MetS in the pediatric population.

Keywords: children, cortisol, fetuin A, morbid obesity, thyrotropin

Procedia PDF Downloads 183
19915 Application of Artificial Neural Network in Initiating Cleaning of Photovoltaic Solar Panels

Authors: Mohamed Mokhtar, Mostafa F. Shaaban

Abstract:

Among the challenges facing solar photovoltaic (PV) systems in the United Arab Emirates (UAE), dust accumulation on solar panels is considered the most severe problem facing the growth of solar power plants. The accumulation of dust significantly degrades the output of these panels; hence, solar PV panels have to be cleaned manually or using costly automated cleaning methods. This paper focuses on initiating cleaning actions only when required, to reduce maintenance costs: cleaning is triggered only when the dust level exceeds a threshold value. The amount of dust accumulated on the PV panels is estimated using an artificial neural network (ANN). Experiments were conducted to collect the data used to train the ANN model. The trained model is then fed the output power from the solar panels, the ambient temperature, and the solar irradiance, from which it estimates the amount of dust accumulated on the panels under those conditions. The model was tested on different case studies to confirm its accuracy.
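
The threshold-triggered cleaning logic can be sketched independently of the ANN itself: the trained network supplies the expected clean-panel output for the current irradiance and temperature, and cleaning is requested when measured output falls too far below it. The 0.85 threshold and the helper names below are illustrative assumptions, not values from the paper:

```python
def soiling_ratio(measured_w, expected_w):
    """Fraction of the expected clean-panel output actually produced."""
    return measured_w / expected_w

def needs_cleaning(measured_w, expected_w, threshold=0.85):
    """Trigger a cleaning action only when estimated dust losses push
    output below `threshold` times the clean-panel estimate.

    The 0.85 default is an assumed example value, not taken from the
    paper; in practice it would be tuned against cleaning costs.
    """
    return soiling_ratio(measured_w, expected_w) < threshold

# expected_w would come from the trained ANN given irradiance and
# ambient temperature; here it is a stand-in number.
clean_now = needs_cleaning(150.0, 200.0)  # only 75% of the clean estimate
```

Keeping the trigger separate from the estimator means the ANN can be retrained without touching the maintenance policy.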

Keywords: machine learning, dust, PV panels, renewable energy

Procedia PDF Downloads 148
19914 Detection and Quantification of Ochratoxin A in Food by Aptasensor

Authors: Moez Elsaadani, Noel Durand, Brice Sorli, Didier Montet

Abstract:

Governments and international bodies are trying to improve food safety systems to prevent, reduce, or avoid the increase of foodborne diseases. This food risk is one of the major concerns for humanity. Contamination by mycotoxins is a threat to the health and life of humans and animals. One of the most common mycotoxins contaminating feed and foodstuffs is Ochratoxin A (OTA), a secondary metabolite produced by Aspergillus and Penicillium strains. OTA has a chronic toxic effect and has proved to be mutagenic, nephrotoxic, teratogenic, immunosuppressive, and carcinogenic. On the other hand, because of their high stability, specificity, affinity, and easy chemical synthesis, aptamer-based methods are applied to OTA biosensing as an alternative to traditional analytical techniques. In this work, five aptamers were tested to confirm, qualitatively and quantitatively, their binding to OTA. At the same time, three different analytical methods were tested and compared based on their ability to detect and quantify OTA. The best protocol established to separate free OTA from aptamer-bound OTA involved an ultrafiltration method in a green coffee solution. OTA was quantified by HPLC-FLD to calculate the binding percentage of all five aptamers. The most effective aptamer (87% binding with OTA) was selected as our biorecognition element, in order to study its electrical response (variation of electrical properties) in the presence of OTA and eventually pair it with a radio frequency identification (RFID) device. This device, characterized by its low cost, speed, and simple wireless information transmission, will combine the knowledge on mycotoxin molecular sensors (aptamers) with an electronic device that links the information, performs the quantification, and makes it available to operators.
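
The binding percentage used to rank the five aptamers follows directly from the HPLC-FLD quantification of free OTA in the ultrafiltrate versus the total OTA added. A minimal sketch, with invented concentrations chosen to reproduce the reported 87% figure:

```python
def binding_percent(total_ota, free_ota):
    """Share of OTA bound by the aptamer, computed from the free OTA
    measured by HPLC-FLD in the ultrafiltrate and the total OTA
    initially added (same units, e.g. ng/mL)."""
    return 100.0 * (total_ota - free_ota) / total_ota

# An aptamer leaving 1.3 ng/mL free out of 10 ng/mL added binds 87%,
# matching the best aptamer reported above (the concentrations here
# are illustrative, not from the paper).
pct = binding_percent(10.0, 1.3)
```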

Keywords: aptamer, aptasensor, detection, Ochratoxin A

Procedia PDF Downloads 187
19913 Acceptability of the Carers-ID Intervention for Family Carers of People with Intellectual Disabilities

Authors: Mark Linden, Michael Brown, Lynne Marsh, Maria Truesdale, Stuart Todd, Nathan Hughes, Trisha Forbes, Rachel Leonard

Abstract:

Background: Family carers of people with intellectual disabilities (ID) face ongoing challenges in accessing services and often experience poor mental health. Online support programmes may prove effective in addressing the mental health and well-being needs of family carers. This study sought to test the acceptability of Carers-ID, a newly developed online support programme for carers of people with intellectual disabilities. Methods: A sequential explanatory mixed-methods design was utilised. An adapted version of the Acceptability of Health Apps among Adolescents (AHAA) Scale was distributed to family carers who had viewed the Carers-ID.com intervention. Participants were then invited to take part in an online interview. Interview questions focused on participants' experiences of using the programme and its acceptability. Qualitative and quantitative data were analysed separately and then brought together through the triangulation protocol developed by Farmer et al. (2006). Findings: Seventy family carers responded to the acceptability survey, and 10 took part in interviews. Six themes were generated from the interviews with family carers. Based on our triangulation, four areas of convergence were identified: programme usability and ease of use, attitudes towards the programme, perceptions of effectiveness, and programme relatability. Conclusions: To be acceptable, online interventions for carers of people with ID need to be accessible, understandable, and easy to use, as carers' time is precious. Further research is needed to investigate the effectiveness of online interventions for family carers, specifically considering which carers the intervention works for and for whom it may not.

Keywords: intellectual disability, family carer, acceptability study, online intervention

Procedia PDF Downloads 95
19912 Calibration of Residential Buildings Energy Simulations Using Real Data from an Extensive in situ Sensor Network – A Study of Energy Performance Gap

Authors: Mathieu Bourdeau, Philippe Basset, Julien Waeytens, Elyes Nefzaoui

Abstract:

As residential buildings account for a third of the overall energy consumption and greenhouse gas emissions in Europe, building energy modeling is an essential tool for reaching energy efficiency goals. In the energy modeling process, calibration is a mandatory step to obtain accurate and reliable energy simulations. Nevertheless, the comparison between simulation results and the actual energy behavior of a building often highlights a significant performance gap. The literature discusses different origins of energy performance gaps, from building design to building operation. The description of building operation in energy models, especially energy usages and users' behavior, plays an important role in the reliability of simulations, and it is also the most accessible target for post-occupancy energy management and optimization. Therefore, the present study discusses results on the calibration of residential building energy models using real operation data. Data are collected through a network of more than 180 sensors and advanced energy meters deployed in three collective residential buildings undergoing major retrofit actions. The sensor network is implemented at building scale and in an eight-apartment sample. Data are collected for over a year and a half and cover building energy behavior (thermal and electricity), indoor environment, inhabitants' comfort, occupancy, occupants' behavior and energy uses, and local weather. Building energy simulations are performed using physics-based building energy modeling software (Pleiades), where the buildings' features are implemented according to the buildings' thermal regulation code compliance study and the retrofit project technical files. Sensitivity analyses are performed to highlight the building features that most drive each end-use. These features are then compared with the collected post-occupancy data.
Energy-driving features are progressively replaced with field data for a step-by-step calibration of the energy model. The results of this study provide an analysis of the energy performance gap for an existing residential case study under deep retrofit. They highlight the impact of the different building features on energy behavior and on the performance gap in this context, such as temperature setpoints, indoor occupancy, and building envelope properties, but also domestic hot water usage and heat gains from electric appliances. The benefits of inputting field data from an extensive instrumentation campaign instead of standardized scenarios are also described. Finally, the exhaustive instrumentation solution provides useful insights on the needs, advantages, and shortcomings of the implemented sensor network for its replicability on a larger scale and for different use cases.
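
Step-by-step calibration of this kind is commonly judged against statistical criteria such as NMBE and CV(RMSE), as specified for instance in ASHRAE Guideline 14. The abstract does not state which criteria were applied, so the following is only an illustrative sketch with made-up monthly values:

```python
import math

def nmbe(measured, simulated):
    """Normalised mean bias error, in percent (systematic over/under-prediction)."""
    n = len(measured)
    mean_m = sum(measured) / n
    return 100.0 * sum(s - m for m, s in zip(measured, simulated)) / (n * mean_m)

def cv_rmse(measured, simulated):
    """Coefficient of variation of the RMSE, in percent (scatter of the errors)."""
    n = len(measured)
    mean_m = sum(measured) / n
    rmse = math.sqrt(sum((s - m) ** 2 for m, s in zip(measured, simulated)) / n)
    return 100.0 * rmse / mean_m

meas = [10.0, 12.0, 11.0, 13.0]  # metered monthly energy use (stand-in values)
sim = [11.0, 11.0, 12.0, 12.0]   # model output after a calibration step
```

Each time an energy-driving feature is replaced with field data, recomputing these two indices shows whether the model has moved closer to the metered behavior.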

Keywords: calibration, building energy modeling, performance gap, sensor network

Procedia PDF Downloads 166
19911 A Social Decision Support Mechanism for Group Purchasing

Authors: Lien-Fa Lin, Yung-Ming Li, Fu-Shun Hsieh

Abstract:

With the advancement of information technology and the development of group commerce, people's lifestyles have changed markedly. However, group commerce faces some challenging problems. The products or services provided by vendors do not satisfactorily reflect customers' opinions, so the sales and revenue of group commerce gradually decline. On the other hand, the process by which a formed customer group reaches a group-purchasing consensus is time-consuming, and the final decision is not the best choice for every group member. In this paper, we design a social decision support mechanism that uses group discussion messages to recommend suitable options to group members, and we consider social influence and personal preference to generate an option ranking list. The proposed mechanism can make group purchasing decisions more efficient and effective, and vendors can provide group products or services according to the group option ranking list.
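
Combining social influence with personal preference into an option ranking can be sketched as a simple weighted score. The linear blend and the 0.5 weight below are illustrative assumptions, since the paper does not specify its scoring function:

```python
def rank_options(preference, influence, weight=0.5):
    """Rank group-purchase options by a blend of a member's own
    preference score and a socially derived influence score.

    `weight` balances the two terms; both it and the linear form are
    assumed here for illustration, not taken from the paper.
    """
    combined = {
        opt: weight * preference[opt] + (1.0 - weight) * influence[opt]
        for opt in preference
    }
    return sorted(combined, key=combined.get, reverse=True)

# Hypothetical scores in [0, 1], e.g. mined from discussion messages.
prefs = {"restaurant_a": 0.9, "restaurant_b": 0.6, "restaurant_c": 0.4}
social = {"restaurant_a": 0.3, "restaurant_b": 0.8, "restaurant_c": 0.5}
ranking = rank_options(prefs, social)
```

With these numbers, the socially backed option overtakes the individually preferred one, which is exactly the trade-off the mechanism is meant to surface.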

Keywords: social network, group decision, text mining, group commerce

Procedia PDF Downloads 490
19910 Relay Mining: Verifiable Multi-Tenant Distributed Rate Limiting

Authors: Daniel Olshansky, Ramiro Rodríguez Colmeiro

Abstract:

Relay Mining presents a scalable solution that employs probabilistic mechanisms and crypto-economic incentives to estimate RPC volume usage, facilitating decentralized multi-tenant rate limiting. Network traffic from individual applications can be concurrently serviced by multiple RPC service providers, with costs, rewards, and rate limiting governed by a native cryptocurrency on a distributed ledger. Building upon established research on token bucket algorithms and distributed rate-limiting penalty models, our approach harnesses a feedback control loop to adjust the difficulty of mining relay rewards, scaling dynamically with network usage growth. By leveraging crypto-economic incentives, we reduce coordination overhead costs and introduce a mechanism for providing RPC services that are both geopolitically and geographically distributed.
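
The token bucket primitive the approach builds on can be sketched as follows. This is a generic single-node illustration of the rate-limiting building block, not the paper's on-chain, probabilistic, multi-provider mechanism:

```python
class TokenBucket:
    """Minimal token bucket: an application accrues `rate` relay tokens
    per second up to a burst of `capacity`; a relay is serviced only if
    a token can be taken. In Relay Mining the effective allowance would
    be governed by the ledger; this sketch shows only the primitive."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity  # start full, allowing an initial burst
        self.last = 0.0

    def allow(self, now):
        # Refill in proportion to elapsed time, clamped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=1.0, capacity=2)
burst = [bucket.allow(0.0), bucket.allow(0.0), bucket.allow(0.0)]  # burst of 2
later = bucket.allow(1.0)  # one second later a token has been refilled
```

A feedback loop of the kind described above would then tune the effective `rate` (via mining difficulty) as aggregate network usage grows.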

Keywords: remote procedure call, crypto-economic, commit-reveal, decentralization, scalability, blockchain, rate limiting, token bucket

Procedia PDF Downloads 58
19909 The Role of Mobile Applications on Consumerism Case Study: Snappfood Application

Authors: Vajihe Fasihi

Abstract:

With the advancement of technology and the expansion of the Internet, a significant change in lifestyle and consumption can be seen in societies. The growing number of mobile applications (such as SnappFood) has expanded the use of apps to give citizens wider access to services, meeting the needs of a large number of citizens in the shortest time and with reasonable quality. This article first seeks to understand the concept and function of Internet-based distribution networks in Iranian society, investigated in a smaller sample (students of the Faculty of Social Sciences of the University of Tehran) using the semi-structured interview method, and then explores the concept of consumerism. The main issue of this research is the effect of mobile apps, especially SnappFood, on increasing consumption and the difference between real needs and false needs among consumers. The findings of this research show that the use of the aforementioned application has been effective in increasing the false needs of the sample community and has led to the phenomenon of consumerism.

Keywords: consumerism economics, false needs, mobile applications, real needs

Procedia PDF Downloads 61
19908 A Hybrid Distributed Algorithm for Solving Job Shop Scheduling Problem

Authors: Aydin Teymourifar, Gurkan Ozturk

Abstract:

In this paper, a distributed hybrid algorithm is proposed for solving the job shop scheduling problem. The suggested method executes different artificial neural networks, heuristics, and meta-heuristics simultaneously on more than one machine. The neural networks are used to handle the constraints of the problem, the meta-heuristics search the global space, and the heuristics are used to prevent premature convergence. To obtain an efficient distributed intelligent method for solving large and distributed job shop scheduling problems, the Apache Spark and Hadoop frameworks are used. New approaches are applied in the design and implementation of the algorithm. Comparison between the proposed algorithm and other efficient algorithms from the literature shows its efficiency: it is able to solve large problem instances in a short time.
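
Priority-rule heuristics of the kind a hybrid method can embed are easy to sketch. The following greedy dispatcher applies the shortest-processing-time (SPT) rule to a two-job, two-machine toy instance; it is a generic illustration of such a heuristic, not the algorithm proposed in the paper:

```python
def spt_schedule(jobs):
    """Greedy job-shop dispatcher using the shortest-processing-time
    (SPT) rule, one of the simple priority heuristics a hybrid method
    can use to seed or repair solutions.

    `jobs` maps a job id to an ordered list of (machine, duration)
    operations; operations of a job must run in order, and each
    machine processes one operation at a time.
    """
    next_op = {j: 0 for j in jobs}
    job_free = {j: 0 for j in jobs}
    machine_free = {}
    while any(next_op[j] < len(jobs[j]) for j in jobs):
        # Among jobs with a pending operation, dispatch the shortest one.
        ready = [j for j in jobs if next_op[j] < len(jobs[j])]
        j = min(ready, key=lambda j: jobs[j][next_op[j]][1])
        machine, dur = jobs[j][next_op[j]]
        start = max(job_free[j], machine_free.get(machine, 0))
        job_free[j] = machine_free[machine] = start + dur
        next_op[j] += 1
    return max(job_free.values())  # makespan of the resulting schedule

jobs = {"j1": [("m1", 3), ("m2", 2)], "j2": [("m2", 2), ("m1", 4)]}
makespan = spt_schedule(jobs)
```

In a hybrid scheme, schedules like this one would serve as cheap starting points that the meta-heuristics then improve.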

Keywords: distributed algorithms, Apache Spark, Hadoop, job shop scheduling, neural network

Procedia PDF Downloads 393
19907 Smartphone Addiction and Reaction Time in Geriatric Population

Authors: Anjali N. Shete, G. D. Mahajan, Nanda Somwanshi

Abstract:

Context: Smartphones are the new generation of mobile phones; they have emerged over the last few years. Technology has developed so much that it has become part of our lives, and mobile phones are one example. Smartphones are equipped with the capabilities to display photos, play games, watch videos, navigate, and more. These advances have a huge impact on many walks of life. Adopting new technology has been challenging for the elderly, but the older population is also moving towards digitally connected lives. As age advances, the motor and cognitive functions of the brain decline, and hence reaction time is affected. The study was undertaken to assess the usefulness of smartphones with respect to cognitive function. Aims and Objectives: The aim of the study was to observe the effects of smartphone addiction on reaction time in the elderly population. Material and Methods: This is an experimental study. One hundred elderly subjects were enrolled randomly from urban areas. All were using smartphones for several hours a day. They were divided into two groups according to their scores on the Mobile Phone Addiction Scale (MPAS). Simple reaction time was estimated by the ruler drop method and calculated for each subject in both groups. The data were analyzed using the mean, standard deviation, and Pearson correlation test. Results: The mean reaction time is 0.27 ± 0.040 s in Group A and 0.20 ± 0.032 s in Group B, a statistically significant difference. Conclusion: Group A, with high MPAS scores, shows a longer reaction time than Group B, with low MPAS scores. Hence, it can be concluded that moderate use of smartphones in the elderly may be useful in delaying neurological decline and sharpening the brain.
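
The ruler drop method converts the catch distance into a reaction time via the free-fall relation d = ½gt², so t = √(2d/g). A short sketch:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def reaction_time_s(drop_cm):
    """Reaction time from the ruler drop test: the ruler falls freely
    until caught, so d = g * t**2 / 2 and t = sqrt(2 * d / g)."""
    return math.sqrt(2.0 * (drop_cm / 100.0) / G)

# Catching the ruler after ~20 cm of free fall corresponds to roughly
# the 0.20 s mean reaction time reported for Group B.
t = reaction_time_s(20.0)
```

The quadratic relation means small differences in catch distance near the top of the ruler translate into quite fine time resolution, which is why the method works with nothing more than a ruler.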

Keywords: smartphones, MPAS, reaction time, elderly population

Procedia PDF Downloads 182
19906 Poultry in Motion: Text Mining Social Media Data for Avian Influenza Surveillance in the UK

Authors: Samuel Munaf, Kevin Swingler, Franz Brülisauer, Anthony O’Hare, George Gunn, Aaron Reeves

Abstract:

Background: Avian influenza, more commonly known as bird flu, is a viral zoonotic respiratory disease affecting various species of poultry, as well as pet and migratory birds. Researchers have argued that the accessibility of health information online, in addition to the low-cost data collection methods the internet provides, has revolutionized the ways in which epidemiological and disease surveillance data are utilized. This paper examines the feasibility of using internet data sources, such as Twitter and livestock forums, for the early detection of avian flu outbreaks, through the use of text mining algorithms and social network analysis. Methods: Social media mining was conducted on Twitter between 01/01/2021 and 31/12/2021 via the Twitter API in Python. The results were filtered first by hashtags (#avianflu, #birdflu) and word occurrences (avian flu, bird flu, H5N1), and then refined further by location to include only results from within the UK. The text was analysed in a time-series manner to determine keyword frequencies, and topic modeling was applied to uncover insights in the text prior to a confirmed outbreak. Further analysis examined mentions of clinical signs (e.g., swollen head, blue comb, dullness) within the time series prior to the avian flu outbreaks confirmed by the Animal and Plant Health Agency (APHA). Results: Increases in Google search results and avian flu-related tweets correlated in time with the confirmed cases. Topic modeling uncovered clusters of word occurrences relating to livestock biosecurity, disposal of dead birds, and prevention measures. Conclusions: Text mining social media data can prove useful for analysing discussed topics for epidemiological surveillance purposes, especially given the lack of applied research in the veterinary domain.
However, the small sample size of tweets in certain weekly time periods makes it difficult to provide statistically robust results, and the data contain a great amount of textual noise.
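
The keyword filtering and time-series counting step can be sketched as follows, with invented tweets standing in for API results (the real study additionally applied hashtag and location filtering plus topic modeling):

```python
from collections import Counter
from datetime import date

KEYWORDS = ("avian flu", "bird flu", "h5n1")  # terms used in the study

def weekly_keyword_counts(tweets):
    """Count keyword-matching tweets per ISO week.

    `tweets` is a list of (date, text) pairs such as a Twitter API
    client might return; the examples below are invented stand-ins.
    """
    counts = Counter()
    for day, text in tweets:
        if any(k in text.lower() for k in KEYWORDS):
            counts[day.isocalendar()[:2]] += 1  # key: (ISO year, ISO week)
    return counts

tweets = [
    (date(2021, 11, 1), "Suspected bird flu near the farm, swollen heads seen"),
    (date(2021, 11, 2), "H5N1 precautions announced for local keepers"),
    (date(2021, 11, 9), "Weather is lovely today"),  # no keyword match
]
counts = weekly_keyword_counts(tweets)
```

Spikes in these weekly counts are what get compared in time against APHA-confirmed outbreak dates.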

Keywords: veterinary epidemiology, disease surveillance, infodemiology, infoveillance, avian influenza, social media

Procedia PDF Downloads 110
19905 Decision Support System for Fetus Status Evaluation Using Cardiotocograms

Authors: Oyebade K. Oyedotun

Abstract:

The cardiotocogram is a technical recording of the heart rate and uterine contractions associated with a fetus during pregnancy. During pregnancy, several complications can occur to both the mother and the fetus; hence, it is crucial that medical experts have technical means to check the health of the mother and especially the fetus. It is very important that the fetus develops as expected through the stages of pregnancy; however, monitoring the health status of the fetus is not easily achieved, as the fetus is not wholly physically available to medical experts for inspection. Hence, doctors resort to other tests that can give an indication of the status of the fetus. One such diagnostic test is to obtain cardiotocograms of the fetus. From the analysis of the cardiotocograms, medical experts can determine the status of the fetus and, therefore, the necessary medical interventions. Generally, medical experts classify examined cardiotocograms as 'normal', 'suspect', or 'pathological'. This work presents an artificial neural network based decision support system that can process cardiotocogram data and produce the corresponding status of each fetus. The capability of artificial neural networks to explore cardiotocogram data and learn features that distinguish one class from the others has been exploited in this research: feedforward and radial basis neural networks were trained on a publicly available database to classify the processed cardiotocogram data into one of the three classes. Classification accuracies of 87.8% and 89.2% were achieved during the test phase for the feedforward and radial basis neural networks, respectively.
It is hoped that, while the system described in this work may not be a complete replacement for a medical expert in fetus status evaluation, it can significantly reinforce confidence in the diagnoses reached by experts.
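
As an illustration of distance-based classification on cardiotocogram-style features, the sketch below uses a nearest-centroid classifier on toy data. The actual models in this work are feedforward and radial basis function networks trained on the full cardiotocography feature set; the feature names and values here are invented:

```python
import math

def centroids(samples):
    """Mean feature vector per class from labelled training samples."""
    sums, counts = {}, {}
    for features, label in samples:
        counts[label] = counts.get(label, 0) + 1
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
    return {c: [v / counts[c] for v in acc] for c, acc in sums.items()}

def classify(features, cents):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    return min(cents, key=lambda c: math.dist(features, cents[c]))

# Toy (baseline heart rate in bpm, accelerations per minute) pairs;
# a real model would use the full set of cardiotocogram features and
# include the 'suspect' class as well.
train = [
    ([135.0, 3.0], "normal"), ([140.0, 4.0], "normal"),
    ([170.0, 0.0], "pathological"), ([165.0, 1.0], "pathological"),
]
cents = centroids(train)
label = classify([138.0, 3.5], cents)
```

A radial basis function network refines this same intuition: instead of one centroid per class, it places many prototype centers and learns a weighted combination of distances to them.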

Keywords: decision support, cardiotocogram, classification, neural networks

Procedia PDF Downloads 339