Search results for: time constraints
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 18696

17526 Internet Media and Public: A Report of a Mutual Deception

Authors: Safet Zejnullahu

Abstract:

The relationship between the media and the public has been debated since the birth of the media itself. The 'magic box' called radio adapted and transformed itself by following the tastes and interests of its audience. Television went a step further by adding images to the magic of sound. Newspapers informed readers and, from time to time, gave them room to express their own opinions. In traditional media, the media-public relationship was one of mutual respect. Today, that relationship needs to be redefined. The goal of this paper is to analyze the history of the media-public relationship, with special emphasis on this relationship in the media of the internet era. The paper argues that internet media have created a completely new and thus far unknown relationship between the media and the public. Through research that includes an analysis of the media in Kosovo and Albania, it shows that internet-era media have taken this relationship to a new level, with many unknowns regarding the functioning and role of the media. The findings lead to the conclusion that a relationship in which the roles of the media and the public were clearly defined has moved beyond the familiar principles and rules and is now better described as a relationship of mutual deception: the media overstep the deference owed to the public, while the public seeks to direct the content of the media. In internet media, the media-public relationship rests on a mutual attempt at deception.

Keywords: media, public, report, humility, direction

Procedia PDF Downloads 161
17525 Investigating Vehicle-Bicyclist Conflicts Using LiDAR Sensor Technology at Signalized Intersections

Authors: Alireza Ansariyar, Mansoureh Jeihani

Abstract:

Light Detection and Ranging (LiDAR) sensors are capable of recording traffic data, including the number of passing vehicles and bicyclists, their speeds, and the number of conflicts between the two road user groups. In order to collect real-time traffic data and investigate the safety of different road users, a LiDAR sensor was installed at the Cold Spring Ln – Hillen Rd intersection in Baltimore City. The frequency and severity of the collected real-time conflicts were analyzed, and the results highlighted that 122 conflicts were recorded over a 10-month interval from May 2022 to February 2023. Using an innovative image-processing algorithm, a new safety Measure of Effectiveness (MOE) was proposed to identify the zones of the intersection that are most critical for bicyclists. Considering the trajectories of the conflicts, the analysis demonstrated that conflicts in the northern approach (zone N) are more frequent and severe. Additionally, sunny weather is more likely to be associated with severe vehicle-bike conflicts.
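
The keywords refer to a post-encroachment time (PET) threshold; the sketch below illustrates, under assumed timestamps and an assumed threshold value, how PET could be computed from time-stamped passages through a conflict zone. It is not the authors' LiDAR image-processing pipeline.

```python
# Minimal sketch (not the authors' pipeline): post-encroachment time (PET)
# between a vehicle and a bicyclist at a shared conflict zone.
# A conflict is flagged when PET falls below a chosen threshold.

def post_encroachment_time(first_user_exit_t, second_user_entry_t):
    """PET = time gap between the first road user leaving the conflict
    zone and the second road user entering it (seconds)."""
    return second_user_entry_t - first_user_exit_t

# Hypothetical LiDAR-derived timestamps (seconds since start of recording)
vehicle_exit = 1012.4      # vehicle leaves the conflict zone
bicycle_entry = 1015.1     # bicyclist enters the same zone

PET_THRESHOLD = 5.0        # assumed threshold; tune per study design
pet = post_encroachment_time(vehicle_exit, bicycle_entry)
if 0 <= pet < PET_THRESHOLD:
    print(f"Conflict recorded: PET = {pet:.1f} s")
```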

Keywords: LiDAR sensor, post-encroachment time threshold (PET), vehicle-bike conflicts, measure of effectiveness (MOE), weather condition

Procedia PDF Downloads 216
17524 A Multilevel Approach to Reproductive Preferences and Subsequent Behavior in India

Authors: Anjali Bansal

Abstract:

Reproductive preferences mainly deal with two questions: when a couple wants children and how many they want. Questions related to these desires are often included in fertility surveys, as they can provide relevant information on subsequent behavior. The aim of this study is to observe whether respondents' answers to these questions change over time. We also tried to identify the socio-economic and demographic factors associated with the stability (or instability) of fertility preferences. For this purpose, we used IHDS1 (2004-05) and the follow-up survey IHDS2 (2011-12) and applied bivariate, multivariate and multilevel repeated-measures analyses to examine the consistency between responses. We found that women's preferences change over time: the bivariate analysis showed that 52% of women are not consistent in their desired family size, and large inconsistencies were found in the desire to continue childbearing. To better understand these inconsistencies, we computed the Intraclass Correlation (ICC), which measures the consistency of individuals' fertility responses across the two time periods. We also found that the husband's desire for an additional child, specifically a male offspring, contributes to these variations. Our findings lead to the conclusion that in India individuals' fertility preferences change over a seven-year period, as the intraclass correlation is very small, reflecting large variation among individuals. Concerted efforts should therefore be made to educate people and to conduct motivational programs that promote family planning for family welfare.
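
As a minimal illustration of the consistency measure mentioned above, the sketch below computes a one-way ICC from two repeated responses per woman; the data are hypothetical and the paper's actual multilevel repeated-measures specification is richer than this.

```python
# Minimal sketch: one-way random-effects ICC(1,1) for two repeated
# responses (e.g., desired family size at IHDS-1 and IHDS-2) per woman.
# Illustration only, not the authors' multilevel model.
import numpy as np

def icc_oneway(data):
    """data: (n_subjects, k_repeats) array of responses."""
    n, k = data.shape
    grand_mean = data.mean()
    subj_means = data.mean(axis=1)
    # Between-subject and within-subject mean squares (one-way ANOVA)
    ms_between = k * np.sum((subj_means - grand_mean) ** 2) / (n - 1)
    ms_within = np.sum((data - subj_means[:, None]) ** 2) / (n * (k - 1))
    return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

# Hypothetical responses: desired family size at wave 1 and wave 2
responses = np.array([[2, 2], [3, 2], [2, 4], [1, 1], [3, 3], [2, 1]])
print(f"ICC(1,1) = {icc_oneway(responses):.2f}")  # low ICC => unstable preferences
```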

Keywords: change, consistency, preferences, over time

Procedia PDF Downloads 156
17523 Recursive Doubly Complementary Filter Design Using Particle Swarm Optimization

Authors: Ju-Hong Lee, Ding-Chen Chung

Abstract:

This paper deals with the optimal design of recursive doubly complementary (DC) digital filters using a metaheuristic-based optimization technique. Based on the theory of DC digital filters using two recursive digital all-pass filters (DAFs), the design problem is formulated so that the objective function is a weighted sum of the phase response errors of the designed DAFs. To ensure the stability of the recursive DC filters during the design process, necessary constraints are imposed on the phases of the recursive DAFs. Through frequency sampling and a weighted least squares approach, the optimization problem is solved by a population-based stochastic optimization approach. The resulting DC digital filters possess satisfactory frequency responses. Simulation results are presented for illustration and comparison.
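
The PSO-based phase-error fitting itself is not reproduced here; the sketch below only illustrates the underlying DC structure the abstract relies on: two stable recursive all-pass filters whose half-sum and half-difference form a doubly complementary pair. The example all-pass coefficients are illustrative, not optimized values.

```python
# Minimal sketch of the doubly complementary (DC) structure: two stable
# recursive digital all-pass filters A0(z), A1(z) combined as
#   H_LP(z) = (A0(z) + A1(z)) / 2,   H_HP(z) = (A0(z) - A1(z)) / 2.
# The pair is power complementary (|H_LP|^2 + |H_HP|^2 = 1) and
# all-pass complementary (H_LP + H_HP is all-pass).
import numpy as np
from scipy.signal import freqz

def allpass(denominator):
    """All-pass section with real coefficients: numerator = reversed denominator."""
    a = np.atleast_1d(denominator)
    return a[::-1], a          # numerator, denominator

b0, a0 = allpass([1.0, -0.5])            # A0(z): first-order all-pass (illustrative)
b1, a1 = allpass([1.0, -0.3, 0.2])       # A1(z): second-order all-pass (illustrative)

w, A0 = freqz(b0, a0, worN=512)
_, A1 = freqz(b1, a1, worN=512)
H_lp, H_hp = (A0 + A1) / 2, (A0 - A1) / 2
print("max deviation from power complementarity:",
      np.max(np.abs(np.abs(H_lp)**2 + np.abs(H_hp)**2 - 1)))  # ~0
```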

Keywords: doubly complementary, digital all-pass filter, weighted least squares algorithm, particle swarm optimization

Procedia PDF Downloads 675
17522 Working Women and Leave in India

Authors: Ankita Verma

Abstract:

Women transform a group of people into a family and a house into a home. When a woman embraces motherhood, she undergoes several stresses, both physical and mental. Supporting women during this critical stage is therefore a societal responsibility. India is in the league of many developed nations in formulating women-friendly policies. One such initiative is the Maternity Benefit Act, first passed in 1961 and amended from time to time, most recently in 2017. This review paper critically analyzes the provisions of the Act, its implementation, and the legal issues arising out of its implementation. The review suggests that the Act has made a positive impact and that the judiciary has also played its role in streamlining the implementation of the Act. At the same time, however, it is also felt that employers often hesitate to hire a mother or an expectant mother.

Keywords: maternity benefits, maternity benefits act 1961 & 2017, motherhood, maternity and paternity leave, medical bonus, work environment

Procedia PDF Downloads 166
17521 Modeling and Simulation Frameworks for Cloud Computing Environment: A Critical Evaluation

Authors: Abul Bashar

Abstract:

The recent surge in the adoption of cloud computing systems by various organizations has brought forth the challenge of evaluating their performance. One of the major issues faced by cloud service providers and customers is assessing the ability of cloud computing systems to provide the desired services in accordance with QoS and SLA constraints. To this end, an opportunity exists to develop means to ensure that the desired performance levels of such systems are met under simulated environments. This will eventually minimize service disruptions and performance degradation during the commissioning and operational phases of cloud computing infrastructure. Several simulators and modelers are available for simulating cloud computing systems. Therefore, this paper presents a critical evaluation of the state-of-the-art modeling and simulation frameworks applicable to cloud computing systems. It compares the prominent simulation frameworks in terms of API features, programming flexibility, operating system requirements, supported services, licensing needs and popularity. Subsequently, it provides recommendations regarding the choice of the most appropriate framework for researchers, administrators and managers of cloud computing systems.

Keywords: cloud computing, modeling framework, performance evaluation, simulation tools

Procedia PDF Downloads 487
17520 Effect of Fast and Slow Tempo Music on Muscle Endurance Time

Authors: Rohit Kamal, Devaki Perumal Rajaram, Rajam Krishna, Sai Kumar Pindagiri, Silas Danielraj

Abstract:

Introduction: According to the WHO Global Health Observatory, at least 2.8 million people die each year as a result of being overweight or obese. This is mainly because of the adverse metabolic effects of obesity and overweight on blood pressure, lipid profile (especially cholesterol) and insulin resistance. To achieve optimum health, the WHO has set the desirable BMI range at 18.5 to 24.9 kg/m². Due to the modernization of lifestyles, physical exercise in the form of work is no longer a possibility, and hence an effective way to burn calories and achieve the optimum BMI is the need of the hour. Studies have shown that exercising for more than 60 minutes per day helps maintain weight, while weight reduction requires about 90 minutes of exercise per day. Moderate exercise for about 30 minutes is essential for burning calories. People with low endurance fail to sustain even low-intensity exercise for a minimal time. Hence, it is necessary to find an effective method to increase endurance time. Methodology: This study was approved by the Institutional Ethics Committee of our college. After obtaining written informed consent, 25 apparently healthy males aged 18-20 years were selected. Subjects with muscular disorders, hypertension or diabetes, smokers, alcoholics, and those taking drugs affecting muscle strength were excluded. To determine the endurance time, maximum voluntary contraction (MVC) was measured by asking the participants to squeeze a hand grip dynamometer as hard as possible and hold it for 3 seconds. This procedure was repeated thrice, and the average of the three readings was taken as the maximum voluntary contraction. Each participant was then asked to squeeze the dynamometer and hold it at 70% of the MVC while listening to fast tempo music played for about ten minutes; after a ten-minute rest, the participant held the dynamometer at 70% of the MVC while listening to slow tempo music. To avoid the bias of habituation to the procedure, the order of the fast and slow tempo music was varied. The time for which participants could hold the grip at 70% of MVC, measured with a stopwatch, was taken as the endurance time. Results: The mean endurance times during fast and slow tempo music were compared across all subjects. The mean MVC was 34.92 N. The mean endurance time was 21.8 (16.3) seconds with slow tempo music, which was longer than with fast tempo music, for which the mean endurance time was 20.6 (11.7) seconds. Participants preferred slow tempo music to fast tempo music. Conclusion: Music played during exercise appears to increase endurance time, possibly by alleviating the symptoms of lactic acid accumulation, through a mechanism that remains unknown.

Keywords: endurance time, fast tempo music, maximum voluntary contraction, slow tempo music

Procedia PDF Downloads 294
17519 Evaluation of the Effect of Learning Disabilities and Accommodations on the Prediction of the Exam Performance: Ordinal Decision-Tree Algorithm

Authors: G. Singer, M. Golan

Abstract:

Providing students with learning disabilities (LD) with extra time to grant them equal access to the exam is a necessary but insufficient condition to compensate for their LD; there should also be a clear indication that the additional time was actually used. For example, if students with LD use more time than students without LD and yet receive lower grades, this may indicate that a different accommodation is required. If they achieve higher grades but use the same amount of time, then the effectiveness of the accommodation has not been demonstrated. The main goal of this study is to evaluate the effect of including parameters related to LD and extended exam time, along with other commonly-used characteristics (e.g., student background and ability measures such as high-school grades), on the ability of ordinal decision-tree algorithms to predict exam performance. We use naturally-occurring data collected from hundreds of undergraduate engineering students. The sub-goals are i) to examine the improvement in prediction accuracy when the indicator of exam performance includes 'actual time used' in addition to the conventional indicator (exam grade) employed in most research; ii) to explore the effectiveness of extended exam time on exam performance for different courses and for LD students with different profiles (i.e., sets of characteristics). This is achieved by using the patterns (i.e., subgroups) generated by the algorithms to identify pairs of subgroups that differ in just one characteristic (e.g., course or type of LD) but have different outcomes in terms of exam performance (grade and time used). Since grade and time used exhibit an ordinal form, we propose a method based on ordinal decision-trees, which applies a weighted information-gain ratio (WIGR) measure for selecting the classifying attributes. Unlike other known ordinal algorithms, our method does not assume monotonicity in the data. The proposed WIGR is an extension of an information-theoretic measure, in the sense that it adjusts to the case of an ordinal target and takes into account the error severity between two different target classes. Specifically, we use ordinal C4.5, random-forest, and AdaBoost algorithms, as well as an ensemble technique composed of ordinal and non-ordinal classifiers. Firstly, we find that the inclusion of LD and extended exam-time parameters improves prediction of exam performance (compared to specifications of the algorithms that do not include these variables). Secondly, when the indicator of exam performance includes 'actual time used' together with grade (as opposed to grade only), the prediction accuracy improves. Thirdly, our subgroup analyses show clear differences in the effect of extended exam time on exam performance among different courses and different student profiles. From a methodological perspective, we find that the ordinal decision-tree based algorithms outperform their conventional, non-ordinal counterparts. Further, we demonstrate that the ensemble-based approach leverages the strengths of each type of classifier (ordinal and non-ordinal) and yields better performance than each classifier individually.
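
The WIGR measure with ordinal error-severity weighting is the authors' own contribution and is not reproduced here; the sketch below only shows the standard C4.5-style information-gain ratio that it extends, on hypothetical class labels, to make the attribute-selection criterion concrete.

```python
# Minimal sketch of the standard information-gain ratio (C4.5-style) that the
# paper's WIGR measure extends; the ordinal, error-severity-weighted variant
# itself is the authors' contribution and is not reproduced here.
import numpy as np
from collections import Counter

def entropy(labels):
    counts = np.array(list(Counter(labels).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gain_ratio(attribute_values, labels):
    """Information gain of splitting `labels` by `attribute_values`,
    normalized by the split information."""
    total_h = entropy(labels)
    n = len(labels)
    cond_h, split_info = 0.0, 0.0
    for v in set(attribute_values):
        subset = [l for a, l in zip(attribute_values, labels) if a == v]
        w = len(subset) / n
        cond_h += w * entropy(subset)
        split_info -= w * np.log2(w)
    return (total_h - cond_h) / split_info if split_info > 0 else 0.0

# Hypothetical data: extended-time usage vs. ordinal grade band
used_extra_time = ["yes", "yes", "no", "no", "yes", "no"]
grade_band      = ["high", "high", "low", "mid", "mid", "low"]
print(f"gain ratio = {gain_ratio(used_extra_time, grade_band):.3f}")
```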

Keywords: actual exam time usage, ensemble learning, learning disabilities, ordinal classification, time extension

Procedia PDF Downloads 96
17518 Seismic Analysis of Adjacent Buildings Connected with Dampers

Authors: Devyani D. Samarth, Sachin V. Bakre, Ratnesh Kumar

Abstract:

This work deals with two adjacent buildings connected with dampers. The Imperial Valley earthquake ("El Centro", May 18, 1940) time history is used for dynamic analysis of the system in the time domain. The effectiveness of the fluid dampers joining the buildings is then investigated in terms of the reduction of the displacement, acceleration and base shear responses of the adjacent buildings. Finally, an extensive parametric study is carried out to find the optimum damper properties, namely stiffness (Kd) and damping coefficient (Cd), for the adjacent buildings. Results show that using fluid dampers to connect adjacent buildings of different fundamental frequencies can effectively reduce the earthquake-induced responses of either building if optimum damper properties are selected.

Keywords: energy dissipation devices, time history analysis, viscous damper, optimum parameters

Procedia PDF Downloads 482
17517 Compressed Suffix Arrays to Self-Indexes Based on Partitioned Elias-Fano

Authors: Guo Wenyu, Qu Youli

Abstract:

A practical and simple self-indexing data structure, the Partitioned Elias-Fano (PEF) Compressed Suffix Array (CSA), is built in linear time for the CSA based on PEF indexes. The PEF-CSA is compared with two classical compressed indexing methods, the Ferragina and Manzini implementation (FMI) and Sad-CSA, on files of different types and sizes from the Pizza & Chili corpus. The PEF-CSA performs better on the tested data in terms of compression ratio and count and locate times, except for evenly distributed data such as the proteins dataset. The experiments show that the distribution of φ is more important for the compression ratio than the alphabet size: unevenly distributed φ values compress better, and the larger the number of hits, the longer the count and locate times.
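
For orientation, the sketch below encodes a sorted integer sequence with plain (non-partitioned) Elias-Fano, the building block that partitioned Elias-Fano refines by choosing parameters per chunk. It is a toy illustration with an assumed universe size, not the PEF-CSA implementation.

```python
# Minimal sketch of plain Elias-Fano encoding of a sorted integer sequence
# (the building block behind Partitioned Elias-Fano). Illustrative only.
import math

def elias_fano_encode(values, universe):
    n = len(values)
    low_bits = max(0, int(math.floor(math.log2(universe / n))))
    lows = [v & ((1 << low_bits) - 1) for v in values]   # fixed-width low parts
    highs = [v >> low_bits for v in values]
    upper = [0] * (n + highs[-1] + 1)                    # unary-coded high parts
    for i, h in enumerate(highs):
        upper[h + i] = 1
    return low_bits, lows, upper

def elias_fano_access(i, low_bits, lows, upper):
    """Reconstruct the i-th value: position of the i-th 1 in `upper` gives the high part."""
    ones_seen, pos = -1, -1
    for pos, bit in enumerate(upper):
        ones_seen += bit
        if ones_seen == i:
            break
    return ((pos - i) << low_bits) | lows[i]

values = [3, 4, 7, 13, 14, 15, 21, 43]          # e.g. sampled suffix-array entries
lb, lows, upper = elias_fano_encode(values, universe=64)
print([elias_fano_access(i, lb, lows, upper) for i in range(len(values))])
```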

Keywords: compressed suffix array, self-indexing, partitioned Elias-Fano, PEF-CSA

Procedia PDF Downloads 241
17516 Impact of Climate Change on Energy Consumption of the Residential Building Stock in Turkey

Authors: Sadik Yigit

Abstract:

The energy consumed in buildings constitutes a large portion of total energy consumption in the world. This study aimed to measure the impact of climate change on the energy consumption of the residential building stock by analyzing a typical mid-rise residential building in four different climate regions of Turkey. An integrated system was developed using the DEAP (Distributed Evolutionary Algorithms in Python) tool and EnergyPlus. Using the developed integrated system, the energy performance of the typical residential building was analyzed under different climate change scenarios. The results indicate that overheating is predicted in the future, which will significantly increase the cooling energy loads of the buildings. In addition, design solutions to improve the future energy performance of the buildings were proposed, considering budget constraints. The results of the study will guide researchers working in this area and designers in the sector in finding climate-change-resilient design solutions.
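
A minimal sketch of how a DEAP-based genetic algorithm could search over envelope-design variables is given below; the design variables, bounds, and the analytic cost surrogate standing in for an EnergyPlus simulation are all illustrative assumptions, not the authors' setup.

```python
# Minimal sketch: a genetic algorithm over building-envelope variables using
# DEAP, with a placeholder objective standing in for an EnergyPlus run.
# Design variables, bounds, and the cost surrogate are illustrative only.
import random
from deap import base, creator, tools, algorithms

creator.create("FitnessMin", base.Fitness, weights=(-1.0,))   # minimize energy use
creator.create("Individual", list, fitness=creator.FitnessMin)

def random_design():
    # [insulation thickness (m), window U-value (W/m2K), window-to-wall ratio]
    return [random.uniform(0.05, 0.30), random.uniform(0.8, 2.8), random.uniform(0.1, 0.6)]

def evaluate(ind):
    # Placeholder for an EnergyPlus simulation under a climate scenario.
    insulation, u_value, wwr = ind
    heating = 120.0 / (1.0 + 10.0 * insulation) + 15.0 * u_value
    cooling = 40.0 + 60.0 * wwr                 # overheating penalty grows with glazing
    return (heating + cooling,)                 # DEAP expects a tuple

toolbox = base.Toolbox()
toolbox.register("individual", tools.initIterate, creator.Individual, random_design)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", evaluate)
toolbox.register("mate", tools.cxBlend, alpha=0.5)
toolbox.register("mutate", tools.mutGaussian, mu=0.0, sigma=0.05, indpb=0.3)
toolbox.register("select", tools.selTournament, tournsize=3)

pop = toolbox.population(n=30)
algorithms.eaSimple(pop, toolbox, cxpb=0.6, mutpb=0.3, ngen=25, verbose=False)
print(tools.selBest(pop, 1)[0])
```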

Keywords: energy efficiency, residential buildings, climate change, EnergyPlus

Procedia PDF Downloads 96
17515 Assessment of the High-Speed Ice Friction of Bob Skeleton Runners

Authors: Agata Tomaszewska, Timothy Kamps, Stephan R. Turnock, Nicola Symonds

Abstract:

Bob skeleton is a highly competitive sport in which an athlete reaches speeds of up to 40 m/s sliding, head first, down an ice track. It is believed that the friction between the runners and the ice contributes significantly to the total energy loss during a bob skeleton descent. There are only limited experimental data available regarding the friction of bob skeleton runners, or indeed of steel on ice, at high sliding speeds (> 20 m/s). The testing methods used to investigate the friction of steel on ice in winter sports are outlined, and their accuracy and repeatability discussed. A systems thinking approach was used to investigate the runner-ice interaction during sliding and to create concept designs of three ice tribometers. The operational envelope of the bob skeleton system has been defined through mathematical modelling. Designs of a drum, a linear and an inertia pin-on-disk tribometer were developed specifically for bob skeleton runner testing, with the requirement of reaching speeds of up to 40 m/s and facilitating fresh-ice sliding. The design constraints have been outlined, and the proposed solutions are compared based on ease of operation, accuracy and development cost.

Keywords: bob skeleton, ice friction, high-speed tribometers, sliding friction

Procedia PDF Downloads 251
17514 PVMODREL© Development Based on Reliability Evaluation of a PV Module Using Accelerated Degradation Testing

Authors: Abderafi Charki, David Bigaud

Abstract:

The aim of this oral presentation is to present PVMODREL© (PhotoVoltaic MODule RELiability), a new software tool developed at the University of Angers. This new tool permits the evaluation of the lifetime and reliability of a PV module whatever its geographical location and environmental conditions. The electrical power output of a PV module decreases with time, mainly as a result of the effects of corrosion, encapsulation discoloration, and solder bond failure. The failure of a PV module is defined as the point where the electrical power degradation reaches a given threshold value. Accelerated life tests (ALTs) are commonly used to assess the reliability of a PV module. However, ALTs provide limited data on the failure of a module, and these tests are expensive to carry out. One possible solution is to conduct accelerated degradation tests. The Wiener process, in conjunction with the accelerated failure time model, makes it possible to carry out numerous simulations and thus to determine the failure time distribution based on the aforementioned threshold value. By this means, the failure time distribution and the lifetime (mean and uncertainty) can be evaluated. An example using the damp heat test is shown to demonstrate the usefulness of PVMODREL©.
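
The sketch below is a minimal Monte Carlo illustration of the idea described: degradation paths simulated as a Wiener process with drift, with failure defined as the first crossing of a power-loss threshold. The drift, diffusion and threshold values are illustrative assumptions, not PVMODREL©'s calibrated parameters.

```python
# Minimal sketch of the underlying idea: power degradation modeled as a
# Wiener process with drift; failure = first crossing of a threshold loss.
# Drift/diffusion/threshold values are illustrative, not PVMODREL's.
import numpy as np

rng = np.random.default_rng(0)
drift = 0.8          # mean power loss per year (% of initial power)
sigma = 0.5          # diffusion (volatility) of the degradation path
threshold = 20.0     # failure when cumulative loss reaches 20 %
dt, horizon = 0.05, 60.0                      # years
n_paths, n_steps = 10_000, int(horizon / dt)

loss = np.zeros(n_paths)
failure_time = np.full(n_paths, np.inf)
for k in range(1, n_steps + 1):
    loss += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    newly_failed = (failure_time == np.inf) & (loss >= threshold)
    failure_time[newly_failed] = k * dt

lifetimes = failure_time[np.isfinite(failure_time)]
print(f"mean lifetime ~ {lifetimes.mean():.1f} years, "
      f"5th percentile ~ {np.percentile(lifetimes, 5):.1f} years")
```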

Keywords: lifetime, reliability, PV Module, accelerated life testing, accelerated degradation testing

Procedia PDF Downloads 564
17513 Analyzing the Empirical Link between Islamic Finance and Growth of Real Output: A Time Series Application to Pakistan

Authors: Nazima Ellahi, Danish Ramzan

Abstract:

There is a growing trend among development economists regarding the importance of the financial sector for economic development and growth. The development thus introduced helps to promote welfare effects and poverty alleviation. This study is an attempt to determine the nature of the link between Islamic banking finance and the growth of real output in Pakistan. A time series data set covering the period 1990 to 2010 is utilized. Following the Phillips-Perron (PP) and Augmented Dickey-Fuller (ADF) unit root tests, the study applies the Ordinary Least Squares (OLS) method of estimation and finds encouraging results in favor of promoting Islamic banking practices in Pakistan.
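
A minimal sketch of the estimation sequence described (unit-root checks followed by OLS) is shown below with synthetic placeholder series; it is not the authors' data or specification, and the ADF test stands in for both unit-root checks (a Phillips-Perron test is available in, e.g., the arch package).

```python
# Minimal sketch of the estimation sequence: ADF unit-root checks followed by
# an OLS regression of real output on an Islamic-financing measure.
# The series below are synthetic placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(1)
years = pd.period_range("1990", "2010", freq="Y")
islamic_financing = pd.Series(np.cumsum(rng.normal(5, 2, len(years))), index=years)
real_output = pd.Series(50 + 0.8 * islamic_financing + rng.normal(0, 3, len(years)),
                        index=years)

for name, series in [("financing", islamic_financing), ("output", real_output)]:
    stat, pvalue, *_ = adfuller(series)
    print(f"ADF {name}: stat={stat:.2f}, p={pvalue:.3f}")

X = sm.add_constant(islamic_financing.rename("financing"))
ols = sm.OLS(real_output, X).fit()
print(ols.params)
```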

Keywords: Islamic finance, poverty alleviation, economic growth, finance, commerce

Procedia PDF Downloads 336
17512 Application of Harris Hawks Optimization Metaheuristic Algorithm and Random Forest Machine Learning Method for Long-Term Production Scheduling Problem under Uncertainty in Open-Pit Mines

Authors: Kamyar Tolouei, Ehsan Moosavi

Abstract:

In open-pit mines, the long-term production scheduling optimization problem (LTPSOP) is a complicated problem that involves constraints, large datasets, and uncertainties. Uncertainty in the output is caused by several geological, economic, or technical factors. Due to its dimensions and NP-hard nature, it is usually difficult to find an ideal solution to the LTPSOP. The optimal schedule generally restricts the ore, metal, and waste tonnages, average grades, and cash flows of each period. Past decades have witnessed important advances in long-term production scheduling and optimization algorithms as researchers have become highly cognizant of the issue; even so, the LTPSOP cannot be considered a well-solved problem. Traditional production scheduling methods in open-pit mines apply an estimated orebody model to produce optimal schedules. The smoothing effect of some geostatistical estimation procedures causes most mine schedules and production predictions to be unrealistic and imperfect. With the expansion of simulation procedures, the risks from grade uncertainty in ore reserves can be evaluated and organized through a set of equally probable orebody realizations. In this paper, to incorporate grade uncertainty into the strategic mine schedule, a stochastic integer programming framework is presented for the LTPSOP. The objective function of the model is to maximize the net present value and minimize the risk of deviation from the production targets, considering grade uncertainty, while satisfying all technical constraints and operational requirements. Instead of applying one estimated orebody model as input to optimize the production schedule, a set of equally probable orebody realizations is applied to incorporate grade uncertainty into the strategic mine schedule and to produce a more profitable and risk-based production schedule. A mixture of metaheuristic procedures and mathematical methods paves the way to an appropriate solution. This paper introduces a hybrid model combining the augmented Lagrangian relaxation (ALR) method and a metaheuristic algorithm, the Harris Hawks optimization (HHO), to solve the LTPSOP under grade uncertainty. In this study, the HHO is employed to update the Lagrange coefficients. In addition, a machine learning method called Random Forest is applied to estimate the gold grade in a mineral deposit. The Monte Carlo method is used as the simulation method, with 20 realizations. The results indicate that the proposed versions are considerably improved in comparison with the traditional methods. The outcomes were also compared with the ALR-genetic algorithm and ALR-sub-gradient approaches. To demonstrate the applicability of the model, a case study of an open-pit gold mining operation is implemented. The framework displays the capability to minimize risk and to improve the expected net present value and financial profitability for the LTPSOP, and it controls geological risk more effectively than the traditional procedure when grade uncertainty is considered.
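
Only the Random Forest grade-estimation step is sketched below, on synthetic drill-hole data with hypothetical features; the ALR-HHO scheduling optimization itself is not reproduced.

```python
# Minimal sketch of the Random Forest grade-estimation step only (the
# ALR-HHO scheduling optimization is not reproduced). Drill-hole features
# and grades below are synthetic and hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500
X = rng.uniform(0, 1000, size=(n, 3))                 # easting, northing, depth (m)
gold_grade = (2.0 * np.exp(-((X[:, 0] - 400) ** 2 + (X[:, 1] - 600) ** 2) / 1e5)
              + rng.normal(0, 0.1, n))                # g/t, synthetic ore body + noise

X_train, X_test, y_train, y_test = train_test_split(X, gold_grade, random_state=0)
model = RandomForestRegressor(n_estimators=300, random_state=0)
model.fit(X_train, y_train)
print(f"R^2 on held-out holes: {model.score(X_test, y_test):.2f}")

# Predicted block grades would then feed the stochastic scheduling model
block = np.array([[410.0, 590.0, 250.0]])
print(f"predicted grade at block: {model.predict(block)[0]:.2f} g/t")
```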

Keywords: grade uncertainty, metaheuristic algorithms, open-pit mine, production scheduling optimization

Procedia PDF Downloads 93
17511 Clustering Color Space, Time Interest Points for Moving Objects

Authors: Insaf Bellamine, Hamid Tairi

Abstract:

Detecting moving objects in video sequences is an essential step for video analysis. This paper mainly contributes to the extraction and detection of Color Space-Time Interest Points (CSTIP). We propose a new method for the detection of moving objects, composed of two main steps. First, we apply the CSTIP detection algorithm to both components of a color structure-texture image decomposition based on a partial differential equation (PDE): a color geometric structure component and a color texture component. A descriptor is associated with each of these points. In a second stage, we address the problem of grouping the points (CSTIP) into clusters. Experiments and comparisons with other motion detection methods on challenging sequences show the performance of the proposed method and its utility for video analysis. Experimental results are obtained from very different types of videos, namely sports videos and animation movies.
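
Only the second stage (grouping detected points into clusters) is sketched below, using k-means on assumed per-point features; the CSTIP detector and the structure-texture decomposition are not reproduced, and the feature layout is an assumption for illustration.

```python
# Minimal sketch of the grouping stage only: clustering detected space-time
# interest points by position, time, and descriptor value. The CSTIP detector
# and the color structure-texture decomposition are not reproduced here.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical interest points: columns = x, y, frame index, descriptor value
points = np.array([
    [12, 40, 3, 0.8], [14, 42, 4, 0.7], [13, 41, 5, 0.9],      # moving object A
    [200, 150, 3, 0.2], [202, 149, 4, 0.3], [204, 151, 5, 0.2]  # moving object B
], dtype=float)

features = StandardScaler().fit_transform(points)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)   # each cluster corresponds to one moving object
```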

Keywords: Color Space-Time Interest Points (CSTIP), Color Structure-Texture Image Decomposition, Motion Detection, clustering

Procedia PDF Downloads 371
17510 Application of Global Predictive Real Time Control Strategy to Improve Flooding Prevention Performance of Urban Stormwater Basins

Authors: Shadab Shishegar, Sophie Duchesne, Genevieve Pelletier

Abstract:

Sustainability, as one of the key elements of smart cities, can be realized by employing real-time control strategies for a city's infrastructure. Nowadays, stormwater management systems play an important role in mitigating the impacts of urbanization on the natural hydrological cycle. These systems can be managed in such a way that they meet smart city standards. In fact, there is huge potential for the sustainable management of urban stormwater and for its adaptation to global challenges like climate change. Hence, a dynamically managed system that can adapt itself to unstable environmental conditions is desirable. A global predictive real-time control approach is proposed in this paper to optimize the performance of stormwater management basins in terms of flooding prevention. To do so, a mathematical optimization model is developed and then solved using a Genetic Algorithm (GA). Results show an improved performance at the system level for the stormwater basins in comparison to a static strategy.

Keywords: environmental sustainability, optimization, real time control, storm water management

Procedia PDF Downloads 168
17509 Queueing Modeling of M/G/1 Fault Tolerant System with Threshold Recovery and Imperfect Coverage

Authors: Madhu Jain, Rakesh Kumar Meena

Abstract:

This paper investigates a finite M/G/1 fault tolerant multi-component machining system. The system incorporates features such as standby support, threshold recovery and imperfect coverage, which make the study closer to real-time systems. The performance prediction of the M/G/1 fault tolerant system is carried out using a recursive approach, treating the remaining service time as a supplementary variable. Numerical results are presented to illustrate the computational tractability of the analytical results for three different service time distributions, viz. exponential, 3-stage Erlang and deterministic. Moreover, a cost function is constructed to determine the optimal choice of system descriptors for upgrading the system.
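
The finite-source machine-repair model above requires the supplementary-variable recursion itself; as a simpler illustration of how the service-time distribution enters M/G/1 performance, the sketch below evaluates the classical (infinite-source) Pollaczek-Khinchine mean waiting time for the same three service distributions, with assumed arrival and service rates.

```python
# Simpler illustration (not the paper's finite-source recursion): the classical
# M/G/1 Pollaczek-Khinchine mean waiting time, evaluated for the same three
# service-time distributions used in the paper. Arrival/service rates assumed.
lam, mu = 0.8, 1.0            # arrival rate, service rate (E[S] = 1/mu)
rho = lam / mu

def second_moment(dist, mu, k=3):
    if dist == "exponential":
        return 2.0 / mu**2
    if dist == "deterministic":
        return 1.0 / mu**2
    if dist == "erlang":          # k-stage Erlang with mean 1/mu
        return (k + 1) / (k * mu**2)
    raise ValueError(dist)

for dist in ("exponential", "erlang", "deterministic"):
    wq = lam * second_moment(dist, mu) / (2 * (1 - rho))   # P-K formula
    print(f"{dist:13s}: mean wait in queue = {wq:.3f}")
```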

Keywords: fault tolerant, machine repair, threshold recovery policy, imperfect coverage, supplementary variable technique

Procedia PDF Downloads 283
17508 Of an 80 Gbps Passive Optical Network Using Time and Wavelength Division Multiplexing

Authors: Malik Muhammad Arslan, Muneeb Ullah, Dai Shihan, Faizan Khan, Xiaodong Yang

Abstract:

Internet service providers face endless demands for higher bandwidth and data throughput as new services and applications require ever more capacity, and users want immediate and accurate data delivery. This article focuses on converting conventional access networks into passive optical networks based on time division and wavelength division multiplexing. The main focus of this research is to use a hybrid of time-division multiplexing and wavelength-division multiplexing to improve network efficiency and performance. In this paper, we design an 80 Gbps Passive Optical Network (PON) that meets the requirements of Next-Generation PON Stage 2 (NG-PON2). According to the Full Service Access Network (FSAN) group, hybrid time and wavelength division multiplexing (TWDM) is considered the best solution for implementing NG-PON2. To co-exist with or replace current PON technologies, many TWDM wavelengths can be operated simultaneously. Eight pairs of wavelengths are multiplexed, transmitted over 40 km of optical fiber, and distributed among 256 users at the receiving side, which shows that the solution is reliable for implementation at an acceptable data rate. From the results, it can be concluded that the overall performance, quality factor, and bandwidth of the network are increased, and the bit error rate is minimized, by the integration of this approach.
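
A quick back-of-the-envelope check of the capacity figures in the abstract follows; the 10 Gbps per wavelength pair is inferred from 80 Gbps divided by 8 pairs, not stated explicitly in the abstract.

```python
# Back-of-the-envelope check of the capacity figures in the abstract:
# eight wavelength pairs at an (implied) 10 Gbps each, shared by 256 users.
wavelength_pairs = 8
rate_per_pair_gbps = 10          # inferred: 80 Gbps total / 8 pairs
users = 256

aggregate_gbps = wavelength_pairs * rate_per_pair_gbps
per_user_mbps = aggregate_gbps * 1000 / users
print(f"aggregate capacity: {aggregate_gbps} Gbps")
print(f"average share per user: {per_user_mbps:.1f} Mbps")   # ~312.5 Mbps
```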

Keywords: bit error rate, fiber to the home, passive optical network, time and wavelength division multiplexing

Procedia PDF Downloads 63
17507 Two-Dimensional Symmetric Half-Plane Recursive Doubly Complementary Digital Lattice Filters

Authors: Ju-Hong Lee, Chong-Jia Ciou, Yuan-Hau Yang

Abstract:

This paper deals with the problem of two-dimensional (2-D) recursive doubly complementary (DC) digital filter design. We present a structure of 2-D recursive DC filters by using 2-D symmetric half-plane (SHP) recursive digital all-pass lattice filters (DALFs). The novelty of using 2-D SHP recursive DALFs to construct a 2-D recursive DC digital lattice filter is that the resulting 2-D SHP recursive DC digital lattice filter provides better performance than the existing 2-D SHP recursive DC digital filter. Moreover, the proposed structure possesses a favorable 2-D DC half-band (DC-HB) property that allows about half of the 2-D SHP recursive DALF’s coefficients to be zero. This leads to considerable savings in computational burden for implementation. To ensure the stability of a designed 2-D SHP recursive DC digital lattice filter, some necessary constraints on the phase of the 2-D SHP recursive DALF during the design process are presented. Design of a 2-D diamond-shape decimation/interpolation filter is presented for illustration and comparison.

Keywords: all-pass digital filter, doubly complementary, lattice structure, symmetric half-plane digital filter, sampling rate conversion

Procedia PDF Downloads 427
17506 Real Time Lidar and Radar High-Level Fusion for Obstacle Detection and Tracking with Evaluation on a Ground Truth

Authors: Hatem Hajri, Mohamed-Cherif Rahal

Abstract:

Both Lidars and Radars are sensors for obstacle detection. While Lidars are very accurate on obstacle positions and less accurate on their velocities, Radars are more precise on obstacle velocities and less precise on their positions. Sensor fusion between Lidar and Radar aims at improving obstacle detection by exploiting the advantages of the two sensors. The present paper proposes a real-time Lidar/Radar data fusion algorithm for obstacle detection and tracking based on the global nearest neighbour standard filter (GNN). This algorithm is implemented and embedded in an automotive vehicle as a component generated by real-time multisensor software. The benefits of data fusion compared with the use of a single sensor are illustrated through several tracking scenarios (on a highway and on a bend), using real-time kinematic sensors mounted on the ego and tracked vehicles as ground truth.
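
A minimal sketch of the association step named in the keywords is shown below: Radar detections are assigned to Lidar tracks by minimizing total distance with the Hungarian algorithm, then gated as in a global nearest neighbour filter. The positions and gating distance are illustrative assumptions, not the paper's data or tuning.

```python
# Minimal sketch of the association step: assign Radar detections to Lidar
# tracks with the Hungarian algorithm, then gate the matches (global nearest
# neighbour style). Positions and the gating distance are illustrative.
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

lidar_tracks = np.array([[10.2, 3.1], [25.0, -1.5], [40.7, 0.8]])   # x, y (m)
radar_detections = np.array([[10.5, 3.0], [41.0, 1.1], [60.0, 5.0]])

cost = cdist(lidar_tracks, radar_detections)          # Euclidean distances
row, col = linear_sum_assignment(cost)                # Hungarian algorithm
GATE = 2.0                                            # max association distance (m)
for t, d in zip(row, col):
    if cost[t, d] <= GATE:
        print(f"track {t} fused with detection {d} (dist {cost[t, d]:.2f} m)")
    else:
        print(f"track {t} left unmatched (nearest detection too far)")
```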

Keywords: ground truth, Hungarian algorithm, lidar Radar data fusion, global nearest neighbor filter

Procedia PDF Downloads 157
17505 Effect of Feed Supplement Optipartum C+ 200 (Alfa- Amylase and Beta-Glucanase) in In-Line Rumination Parameters

Authors: Ramūnas Antanaitis, Lina Anskienė, Robertas Stoškus

Abstract:

This study was conducted from 1 May 2021 to 31 August 2021 at the Lithuanian University of Health Sciences and on one Lithuanian dairy farm with 500 dairy cows (55.911381565736, 21.881321760608195), with an average of 50 calvings per month. Cows (n=20) in the treatment group (TG) were fed the feed supplement Optipartum C+ 200 (enzymes: alfa-amylase 57 units; beta-glucanase 107 units) from 21 days before calving until 30 days after calving, at a feeding rate of 200 g/cow/day. Cows in the control group (CG) were fed a ration without the feed supplement. Measurements started 6 days before calving and continued until 21 days after calving. The following indicators were registered with the RumiWatch system: rumination time, eating time, drinking time, rumination chews, eating chews, drinking gulps, boluses, chews per minute, and chews per bolus; and with the SmaXtec system: reticulorumen temperature and pH, and cow activity. According to our results, feeding cows from 21 days before calving to 30 days after calving with a feed supplement containing alfa-amylase and beta-glucanase (Optipartum C+ 200, 200 g/cow/day) can produce increases of 9% in rumination time and eating time, 19% in drinking time, 11% in rumination chews, 16% in eating chews, 13% in the number of boluses per rumination, 5% in chews per minute, and 16% in chews per bolus. We found a 1.28% lower reticulorumen pH and a 0.64% lower reticulorumen temperature in cows fed the supplement compared with control group cows. Cows fed the enzymes were also 8.80% more active.

Keywords: Alfa-Amylase, Beta-Glucanase, cows, in-line, sensors

Procedia PDF Downloads 311
17504 Theoretical Appraisal of Satisfactory Decision: Uncertainty, Evolutionary Ideas and Beliefs, Satisfactory Time Use

Authors: Okay Gunes

Abstract:

Unsatisfactory experiences due to an information shortage regarding the future pay-offs of actual choices yield satisficing decision-making. This research examines, for the first time in the literature, the motivation behind suboptimal decisions under uncertainty by scrutinizing Adam Smith's and Jeremy Bentham's assumptions about the nature of the actions that lead to satisficing behavior, in order to clarify the theoretical background of a 'consumption-based satisfactory time' concept. The contribution of this paper with respect to the existing literature is threefold. Firstly, it is shown that Adam Smith's uncertainty is related to the problem of the constancy of ideas and not directly to beliefs. Secondly, possessions, as in Jeremy Bentham's oeuvre, are assumed to be pleasing insofar as they protect and improve the actual or expected quality of life and reduce any displeasure due to the undesired outcomes of uncertainty. Finally, each consumption decision incurs its own satisfactory time period, owing to not feeling hungry, being healthy, not having transportation, etc. This reveals that the level of satisfaction is indeed a behavioral phenomenon whose value depends on the simultaneous satisfaction derived from all activities.

Keywords: decision-making, idea and belief, satisficing, uncertainty

Procedia PDF Downloads 275
17503 Urban Refugees and Education in Developing Countries

Authors: Sheraz Akhtar

Abstract:

In recent years, a massive influx of refugees into developing countries has placed significant constraints on host governments' capacities to provide social services, including education, to all. As a result, refugee communities often find themselves deprived of their right to education in these host countries, particularly those who live outside camps in urban locations. While previous research has examined the educational experiences of refugees who have resettled in developed nations, there remains a dearth of research on the educational experiences of urban refugees in developing nations. This study examines this issue through a case study of Pakistani Christian refugees living in urban settings in Thailand. Using a combination of observations within community learning centres set up by international non-government organisations (INGOs) working with these communities, and interviews with young Pakistani Christian refugees and their families, the research aims to give greater voice to the Pakistani Christian refugee community living in Thailand and to better understand their educational aspirations.

Keywords: education, developing countries, INGOs, urban refugees

Procedia PDF Downloads 116
17502 New Hardy Type Inequalities of Two-Dimensional on Time Scales via Steklov Operator

Authors: Wedad Albalawi

Abstract:

Mathematical inequalities are at the core of mathematical analysis and are used in almost all branches of mathematics, as well as in various areas of science and engineering. The book Inequalities by Hardy, Littlewood and Pólya (1934) was the first significant work devoted to the subject; it presents fundamental ideas, results and techniques, and it has had much influence on research in various branches of analysis. Since 1934, numerous inequalities have been produced and studied in the literature. Furthermore, some inequalities have been formulated for particular operators; in 1989, weighted Hardy inequalities were obtained for integration operators, followed by weighted estimates for Steklov operators, which were used in the solution of the Cauchy problem for the wave equation. These were improved upon in 2011 to include the boundedness of integral operators from the weighted Sobolev space to the weighted Lebesgue space. Some inequalities have been demonstrated and improved using the Hardy-Steklov operator. Recently, many integral inequalities have been improved by differential operators. The Hardy inequality has been one of the tools used to study the integrability of solutions of differential equations. Dynamic inequalities of Hardy and Copson type have been extended and improved by various integral operators, and such inequalities are interesting to apply in different fields of mathematics (function spaces, partial differential equations, mathematical modeling). Inequalities involving the Copson and Hardy inequalities have appeared on time scales, yielding new special versions of them. A time scale is an arbitrary nonempty closed subset of the real numbers. Dynamic inequalities on time scales have received a lot of attention in the literature and have become a major field in pure and applied mathematics, with many applications of dynamic equations on time scales to quantum mechanics, electrical engineering, neural networks, heat transfer, combinatorics, and population dynamics. This study focuses on Hardy and Copson inequalities, using the Steklov operator on time scales in double integrals to obtain special cases of time-scale Hardy and Copson inequalities in higher dimensions. The advantage of this study is that it uses the one-dimensional classical Hardy inequality to obtain higher-dimensional time-scale versions that can be applied to the solution of the Cauchy problem for the wave equation. In addition, the obtained inequalities have various applications involving discontinuous domains, such as bug populations, phytoremediation of metals, wound healing, and maximization problems. The proofs are obtained by introducing restrictions on the operator in several cases and use concepts of time-scale calculus, which allows many problems from the theories of differential and difference equations to be unified and extended, together with the chain rule, properties of multiple integrals on time scales, Fubini's theorem, and Hölder's inequality.
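
For reference, the classical one-dimensional (continuous) Hardy inequality that serves as the starting point for the higher-dimensional time-scale versions discussed above is:

```latex
% Classical Hardy inequality, the one-dimensional starting point
% that the study extends to time scales and higher dimensions:
\[
  \int_{0}^{\infty} \left( \frac{1}{x} \int_{0}^{x} f(t)\, dt \right)^{p} dx
  \;\le\; \left( \frac{p}{p-1} \right)^{p} \int_{0}^{\infty} f^{p}(x)\, dx,
  \qquad p > 1,\ f \ge 0 .
\]
```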

Keywords: time scales, Hardy inequality, Copson inequality, Steklov operator

Procedia PDF Downloads 84
17501 Part Performance Improvement through Design Optimisation of Cooling Channels in the Injection Moulding Process

Authors: M. A. Alhubail, A. I. Alateyah, D. Alenezi, B. Aldousiri

Abstract:

In this study, conformal cooling channels (CCC) were employed to dissipate heat from polypropylene (PP) parts injected into a stereolithography (SLA) insert to form tensile and flexural test specimens. The direct metal laser sintering (DMLS) process was used to fabricate a mould with optimised CCC, while the optimum injection moulding parameters were obtained using Optimal-D. The results show that optimising the cooling channel layout using a DMLS mould significantly shortened the cycle time without sacrificing the parts' mechanical properties. By applying conformal cooling channels, the cooling phase was reduced by 20 seconds, and defective parts were eliminated.

Keywords: optimum parameters, injection moulding, conformal cooling channels, cycle time

Procedia PDF Downloads 219
17500 Analysis and Simulation of TM Fields in Waveguides with Arbitrary Cross-Section Shapes by Means of Evolutionary Equations of Time-Domain Electromagnetic Theory

Authors: Ömer Aktaş, Olga A. Suvorova, Oleg Tretyakov

Abstract:

The boundary value problem on a non-canonical, arbitrarily shaped contour is solved with a numerically effective method, the Analytical Regularization Method (ARM), to calculate the propagation parameters. As a result of regularization, the equation of the first kind is reduced to an infinite system of linear algebraic equations of the second kind in the space L2. This system can be solved numerically to the desired accuracy using the truncation method. The resulting parameters, such as the cut-off wavenumber and cut-off frequency, are used in the waveguide evolutionary equations of time-domain electromagnetic theory to illustrate the real-valued TM fields in lossy and lossless media.

Keywords: analytical regularization method, electromagnetic theory evolutionary equations of time-domain, TM Field

Procedia PDF Downloads 489
17499 Communication of Expected Survival Time to Cancer Patients: How It Is Done and How It Should Be Done

Authors: Geir Kirkebøen

Abstract:

Most patients with serious diagnoses want to know their prognosis, in particular their expected survival time. As part of the informed consent process, physicians are legally obligated to communicate such information to patients. However, there is no established (evidence based) ‘best practice’ for how to do this. The two questions explored in this study are: How do physicians communicate expected survival time to patients, and how should it be done? We explored the first, descriptive question in a study with Norwegian oncologists as participants. The study had a scenario and a survey part. In the scenario part, the doctors should imagine that a patient, recently diagnosed with a serious cancer diagnosis, has asked them: ‘How long can I expect to live with such a diagnosis? I want an honest answer from you!’ The doctors should assume that the diagnosis is certain, and that from an extensive recent study they had optimal statistical knowledge, described in detail as a right-skewed survival curve, about how long such patients with this kind of diagnosis could be expected to live. The main finding was that very few of the oncologists would explain to the patient the variation in survival time as described by the survival curve. The majority would not give the patient an answer at all. Of those who gave an answer, the typical answer was that survival time varies a lot, that it is hard to say in a specific case, that we will come back to it later etc. The survey part of the study clearly indicates that the main reason why the oncologists would not deliver the mortality prognosis was discomfort with its uncertainty. The scenario part of the study confirmed this finding. The majority of the oncologists explicitly used the uncertainty, the variation in survival time, as a reason to not give the patient an answer. Many studies show that patients want realistic information about their mortality prognosis, and that they should be given hope. The question then is how to communicate the uncertainty of the prognosis in a realistic and optimistic – hopeful – way. Based on psychological research, our hypothesis is that the best way to do this is by explicitly describing the variation in survival time, the (usually) right skewed survival curve of the prognosis, and emphasize to the patient the (small) possibility of being a ‘lucky outlier’. We tested this hypothesis in two scenario studies with lay people as participants. The data clearly show that people prefer to receive expected survival time as a median value together with explicit information about the survival curve’s right skewedness (e.g., concrete examples of ‘positive outliers’), and that communicating expected survival time this way not only provides people with hope, but also gives them a more realistic understanding compared with the typical way expected survival time is communicated. Our data indicate that it is not the existence of the uncertainty regarding the mortality prognosis that is the problem for patients, but how this uncertainty is, or is not, communicated and explained.

Keywords: cancer patients, decision psychology, doctor-patient communication, mortality prognosis

Procedia PDF Downloads 316
17498 A Simple Recursive Framework to Generate Gray Codes for Weak Orders in Constant Amortized Time

Authors: Marsden Jacques, Dennis Wong

Abstract:

A weak order is a way to rank n objects where ties are allowed. In this talk, we present a recursive framework to generate Gray codes for weak orders. We then describe a simple algorithm based on the framework that generates 2-Gray codes for weak orders in constant amortized time per string. This framework can easily be modified to generate other Gray codes for weak orders. We provide an example of using the framework to generate the first shift Gray code for weak orders, also in constant amortized time, where consecutive strings differ by a shift or a symbol change.
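
The sketch below is not the constant-amortized-time Gray code framework of the talk; it is only a brute-force enumeration of weak orders (as rank strings, i.e. Cayley permutations) to make the objects being listed concrete. The counts match the Fubini numbers 1, 3, 13, 75, ….

```python
# Brute-force sketch (not the paper's Gray code): enumerate weak orders on n
# objects as rank strings in which every value from 1 up to the maximum rank
# occurs at least once (Cayley permutations).
from itertools import product

def weak_orders(n):
    for ranks in product(range(1, n + 1), repeat=n):
        if set(ranks) == set(range(1, max(ranks) + 1)):
            yield ranks

for n in range(1, 5):
    print(n, sum(1 for _ in weak_orders(n)))   # 1, 3, 13, 75 (Fubini numbers)

print(list(weak_orders(2)))   # [(1, 1), (1, 2), (2, 1)]: tie, first ahead, second ahead
```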

Keywords: weak order, Cayley permutation, Gray code, shift Gray code

Procedia PDF Downloads 163
17497 Time-Domain Expressions for Bridge Self-Excited Aerodynamic Forces by Modified Particle Swarm Optimizer

Authors: Hao-Su Liu, Jun-Qing Lei

Abstract:

This study introduces the theory of the modified particle swarm optimizer and its application to time-domain expressions for bridge self-excited aerodynamic forces. Based on the indicial function expression and the rational function expression of the time-domain forces, the characteristics of the two methods, i.e. the modified particle swarm optimizer and the conventional search method, are compared in the flutter-derivative fitting process. Theoretical analysis and numerical results indicate that, whether the indicial function expression or the rational function expression is adopted, the flutter derivatives fitted by the modified particle swarm optimizer show better goodness of fit with those obtained from experiments. For the flutter derivatives with higher nonlinearity, the self-excited aerodynamic forces computed using the flutter derivatives obtained through the modified particle swarm optimizer fitting process are much closer to the experimental ones. The modified particle swarm optimizer was used to identify the parameters of the time-domain expressions for the flutter derivatives of an actual long-span highway-railway truss bridge with double decks at wind attack angles of 0°, -3° and +3°. It was found that this method effectively overcomes the bounds on the attenuation coefficients inherent in the conventional search method and is able to search in an unbounded region. Accordingly, this study provides a method for the engineering industry to efficiently obtain time-domain expressions for bridge self-excited aerodynamic forces.
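
The paper's modified PSO is its own variant; the sketch below only shows a standard particle swarm optimizer minimizing a weighted least-squares fitting error, with a toy polynomial target standing in for the flutter-derivative data and assumed swarm parameters.

```python
# Minimal sketch of a standard PSO minimizing a weighted least-squares fitting
# error (the paper's *modified* PSO and the actual flutter-derivative
# expressions are not reproduced; the target curve is a toy model).
import numpy as np

rng = np.random.default_rng(3)
K = np.linspace(0.1, 2.0, 40)                       # reduced frequencies
true_params = np.array([1.5, -0.8, 0.3])
data = true_params[0] + true_params[1] * K + true_params[2] * K**2   # "measured" curve
weights = np.ones_like(K)

def cost(p):
    model = p[0] + p[1] * K + p[2] * K**2
    return np.sum(weights * (model - data) ** 2)    # weighted least-squares error

n_particles, dim, iters = 30, 3, 200
x = rng.uniform(-3, 3, (n_particles, dim))          # particle positions
v = np.zeros_like(x)                                # particle velocities
pbest, pbest_val = x.copy(), np.array([cost(p) for p in x])
gbest = pbest[pbest_val.argmin()].copy()

w, c1, c2 = 0.72, 1.49, 1.49                        # inertia, cognitive, social weights
for _ in range(iters):
    r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = x + v
    vals = np.array([cost(p) for p in x])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = x[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("recovered parameters:", np.round(gbest, 3))  # ~ [1.5, -0.8, 0.3]
```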

Keywords: time-domain expressions, bridge self-excited aerodynamic forces, modified particle swarm optimizer, long-span highway-railway truss bridge

Procedia PDF Downloads 306