Search results for: iterative hard thresholding
791 Assessment of Lipid Lowering Effect of Shilajit in Adult Male Rats
Authors: U. P. Rathnakar, Sejpal Jaykumar, Shenoy K. Ashok
Abstract:
The effect of Shilajit on lipid levels and weight gain was investigated in Wistar albino rats. Shilajit, a semi-hard brownish-black resin formed through long-term humification of several plant types, mainly bryophytes, can be obtained from steep rocks of the Himalayas at altitudes between 1,000 and 5,000 meters. Hyperlipidemia was produced by feeding the rats a cholesterol-rich high-fat diet (HFD) for 2 months. This diet contained deoxycholic acid, cholesterol, and warm coconut oil in a powdered rat chow diet. At the end of the study, Shilajit-treated rats showed a significant decrease in serum LDL, triglyceride, and total cholesterol levels, as well as an increase in serum HDL level, in comparison to rats fed the high-fat diet with no treatment. During the study period, the weight gain in the Shilajit-treated group was also significantly less than in the untreated high-fat diet group. Thus, Shilajit significantly controlled the development of hyperlipidemia and weight gain in high-fat-diet-fed rats in the present study.
Keywords: Shilajit, hyperlipidemia, weight control, cholesterol-rich high-fat diet
Procedia PDF Downloads 189
790 Knowledge Management Best Practice Model in Higher Learning Institution: A Systematic Literature Review
Authors: Ismail Halijah, Abdullah Rusli
Abstract:
Introduction: This systematic literature review aims to identify the Knowledge Management Best Practice components in the Knowledge Management Model for the Higher Learning Institution environment. Study design: Systematic literature review. Methods: A systematic literature review of Knowledge Management Best Practice was conducted to identify and define the components of Best Practice from the Knowledge Management models. Results: This review of published conference and journal articles shows that the components of Best Practice in Knowledge Management are basically divided into two aspects: the soft aspect and the hard aspect. The lack of a combination of these two aspects into an integrated model prevents Knowledge Management Best Practice from operating at full throttle. Evidence from the literature shows that the lack of integration of these two aspects leads to the immaturity of Higher Learning Institutions (HLIs) with respect to the implementation of Knowledge Management Systems. Conclusion: The first step of identifying the attributes to measure the Knowledge Management Best Practice components from the models in the literature will lead to the definition of the Knowledge Management Best Practice components for the higher learning environment.
Keywords: knowledge management, knowledge management system, knowledge management best practice, knowledge management higher learning institution
789 Nurse Schedule Problem in Mubarak Al Kabeer Hospital
Authors: Khaled Al-Mansour, Nawaf Esmael, Abdulaziz Al-Zaid, Mohammed Al Ateeqi, Ali Al-Yousfi, Sayed Al-Zalzalah
Abstract:
In this project, we create a new nurse schedule according to the nurses' preferences. The project was carried out in Mubarak Al Kabeer Hospital (Kuwait) and aims to optimize the nurses' schedule there. The existing schedule was studied and understood well before any modification, so that the nurses would be as comfortable as possible. First, the constraints were identified to determine what could and could not be changed: the hard constraints are the hospital and ministry policies, which cannot be changed, and the soft constraints are the things that make nurses more comfortable. Data were collected and nurses were interviewed to learn what suits them better. All these constraints and data were then formulated as mathematical equations. This report first contains an introduction to the topic, including the problem definition. It also contains information regarding the optimization of a nurse schedule, its contents, and its importance; furthermore, the report describes the data needed to solve the problem and how it was collected. The problem formulation is also shown, and the methodology is explained. We used the LINGO software to find the best schedule for the nurses. The schedule was made according to what the nurses prefer, while also taking hospital policy into consideration.
Keywords: nurse schedule problem, Kuwait, hospital policy, optimization of schedules
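The preference-based formulation above can be illustrated with a deliberately tiny assignment sketch. The nurse names, shifts, and preference scores below are invented, and a brute-force search over assignments stands in for the LINGO model used in the study; hospital-policy hard constraints are reduced here to "one nurse per shift".

```python
from itertools import permutations

# Hypothetical preference scores (higher = more preferred). The real model
# also encodes ministry/hospital policies as hard constraints.
preferences = {
    "Amal":  {"day": 3, "evening": 1, "night": 0},
    "Badr":  {"day": 1, "evening": 3, "night": 2},
    "Calla": {"day": 2, "evening": 0, "night": 3},
}

def best_schedule(prefs):
    """Assign one nurse per shift, maximizing total preference score."""
    nurses = list(prefs)
    shifts = ["day", "evening", "night"]
    best, best_score = None, -1
    for order in permutations(nurses):
        score = sum(prefs[n][s] for n, s in zip(order, shifts))
        if score > best_score:
            best, best_score = dict(zip(shifts, order)), score
    return best, best_score

schedule, score = best_schedule(preferences)
```

A real instance would replace the exhaustive search with an integer-programming solver, since the number of assignments grows factorially.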
788 Molecular Detection of Crimean-Congo Hemorrhagic Fever in Ticks of Golestan Province, Iran
Authors: Nariman Shahhosseini, Sadegh Chinikar
Abstract:
Introduction: Crimean-Congo hemorrhagic fever virus (CCHFV) causes severe disease with fatality rates of 30%. The virus is transmitted to humans through the bite of an infected tick, through direct contact with the products of infected livestock, and nosocomially. The disease occurs sporadically throughout many African, Asian, and European countries. Different species of ticks serve either as vectors or reservoirs for CCHFV. Materials and Methods: A molecular survey was conducted on hard ticks (Ixodidae) in Golestan province, north of Iran, during 2014-2015. Samples were sent to the National Reference Laboratory of Arboviruses (Pasteur Institute of Iran), and viral RNA was detected by RT-PCR. Results: Results revealed the presence of CCHFV in 5.3% of the selected ticks. The infected ticks belonged to Hy. dromedarii, Hy. anatolicum, Hy. marginatum, and Rh. sanguineus. Conclusions: These data demonstrate that Hyalomma ticks are the main vectors of CCHFV in Golestan province. Thus, preventive strategies such as using acaricides and repellents in order to avoid contact with Hyalomma ticks are proposed. Also, personal protective equipment (PPE) must be utilized at abattoirs.
Keywords: tick, CCHFV, surveillance, vector diversity
787 Improving 99mTc-tetrofosmin Myocardial Perfusion Images by Time Subtraction Technique
Authors: Yasuyuki Takahashi, Hayato Ishimura, Masao Miyagawa, Teruhito Mochizuki
Abstract:
Quantitative measurement of myocardial perfusion is possible with single photon emission computed tomography (SPECT) using a semiconductor detector. However, accumulation of 99mTc-tetrofosmin in the liver may make it difficult to assess perfusion accurately in the inferior myocardium. Our idea is to reduce the high accumulation in the liver by using dynamic SPECT imaging and a technique called time subtraction. We evaluated the performance of a new SPECT system with a cadmium-zinc-telluride solid-state semiconductor detector (Discovery NM 530c; GE Healthcare). Our system acquired list-mode raw data over 10 minutes for a typical patient. From the data, ten SPECT images were reconstructed, one for every minute of acquired data. Reconstruction with the semiconductor detector was based on an implementation of a 3-D iterative Bayesian reconstruction algorithm. We studied 20 patients with coronary artery disease (mean age 75.4 ± 12.1 years; range 42-86; 16 males and 4 females). In each subject, 259 MBq of 99mTc-tetrofosmin was injected intravenously. We performed both a phantom and a clinical study using dynamic SPECT. An approximation to a liver-only image is obtained by reconstructing an image from the early projections, during which the liver accumulation dominates (0.5-2.5 minutes SPECT image minus 5-10 minutes SPECT image). The extracted liver-only image is then subtracted from a later SPECT image that shows both the liver and the myocardial uptake (5-10 minutes SPECT image minus liver-only image). The time subtraction of the liver was possible in both the phantom and the clinical study, and visualization of the inferior myocardium was improved. In past reports, the inferior myocardium was considered un-diagnosable when overlapped by high liver accumulation. Using our time subtraction method, the image quality of the 99mTc-tetrofosmin myocardial SPECT image is considerably improved.
Keywords: 99mTc-tetrofosmin, dynamic SPECT, time subtraction, semiconductor detector
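The arithmetic of the time subtraction step can be sketched on toy arrays: subtracting a later frame from a liver-dominant early frame yields an approximate liver-only image, which is then subtracted from the later frame. The 1-D arrays below are invented stand-ins for reconstructed SPECT frames, not study data.

```python
import numpy as np

# Toy 1-D "frames": the first two pixels represent liver, the last two
# myocardium. Liver dominates early; myocardial uptake grows later.
early = np.array([10.0, 9.0, 1.0, 0.5])   # 0.5-2.5 min frame
late  = np.array([ 6.0, 5.5, 5.0, 4.5])   # 5-10 min frame (liver + myocardium)

liver_only = np.clip(early - late, 0, None)   # crude liver-only estimate
corrected  = np.clip(late - liver_only, 0, None)  # liver suppressed
```

Clipping at zero mirrors the physical constraint that reconstructed activity cannot be negative.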
786 Statistical Shape Analysis of the Human Upper Airway
Authors: Ramkumar Gunasekaran, John Cater, Vinod Suresh, Haribalan Kumar
Abstract:
The main objective of this project is to develop a statistical shape model, using principal component analysis (PCA), that can be used to analyze the shape of the human airway. The ultimate goal is to identify geometric risk factors for the diagnosis and management of Obstructive Sleep Apnoea (OSA). Anonymized CBCT scans of 25 individuals were obtained from the Otago Radiology Group. The airways were segmented between the hard palate and the aryepiglottic fold using snake active contour segmentation. The point cloud of the segmented images was then fitted with a bi-cubic mesh, and pseudo-landmarks were placed to perform PCA on the segmented airway, in order to analyze its shape and to find the relationship between the shape and OSA risk factors. From the PCA results, the first four modes of variation were found to be significant. Mode 1 was interpreted as the overall length of the airway, Mode 2 related to the anterior-posterior width of the retroglossal region, Mode 3 to the lateral dimension of the oropharyngeal region, and Mode 4 to the anterior-posterior width of the oropharyngeal region. All of these regions are associated with OSA risk factors.
Keywords: medical imaging, image processing, FEM/BEM, statistical modelling
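The point-distribution modelling step can be sketched as PCA over flattened pseudo-landmark coordinates. The subject count below matches the study, but the landmark count and the random data are placeholders for the real CBCT-derived airway surfaces.

```python
import numpy as np

# Each row is one subject's shape: flattened (x, y, z) pseudo-landmarks.
rng = np.random.default_rng(0)
n_subjects, n_landmarks = 25, 40
shapes = rng.normal(size=(n_subjects, n_landmarks * 3))

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape

# PCA via SVD: rows of Vt are the modes of variation, ordered by the
# singular values s; (s**2) / sum(s**2) is the explained-variance fraction.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)

# Any airway is mean_shape + sum_k b_k * mode_k; with all modes retained
# the reconstruction is exact.
b = centered @ Vt.T          # per-subject mode coefficients
recon = mean_shape + b @ Vt
```

Interpreting a mode (e.g. "overall airway length") is done by visualizing mean_shape plus/minus a few standard deviations along that mode.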
785 Sensitivity Analysis of Prestressed Post-Tensioned I-Girder and Deck System
Authors: Tahsin A. H. Nishat, Raquib Ahsan
Abstract:
Sensitivity analysis of the design parameters of an optimization procedure can be a significant factor in designing any structural system. The objectives of this study are to analyze the sensitivity of the deck slab thickness parameter obtained from both the conventional and the optimum design methodology of a pre-stressed post-tensioned I-girder and deck system, and to compare the relative significance of slab thickness. For the analysis of the conventional method, the values of 14 design parameters obtained by the conventional iterative design method for a real-life I-girder bridge project were considered. For the analysis of the optimization method, cost optimization of this system was performed using the global optimization methodology 'Evolutionary Operation (EVOP)'. The problem, from which optimum values of the 14 design parameters were obtained, contains 14 explicit constraints and 46 implicit constraints. For both design methods, a sensitivity analysis was conducted on the deck slab thickness parameter, which can be highly sensitive around the obtained optimum solution. Deviations of slab thickness on both the upper and lower sides of its optimum value were considered, reflecting its realistic possible range of variation during construction; the remaining parameters were kept unchanged. For small deviations from the optimum value, compliance with the explicit and implicit constraints was examined, and variations in the cost were estimated. It was found that, without violating any constraint, the deck slab thickness obtained by the conventional method can be increased by up to 25 mm, whereas the slab thickness obtained by cost optimization can be increased by only 0.3 mm. This result suggests that slab thickness is less sensitive in the conventional design method. Therefore, for realistic design purposes, a sensitivity analysis should be conducted for either design procedure of the girder and deck system.
Keywords: sensitivity analysis, optimum design, evolutionary operations, PC I-girder, deck system
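The one-at-a-time sensitivity check described above can be sketched as follows; the cost and constraint functions are invented placeholders standing in for the bridge design model, not the study's formulation.

```python
# Perturb a single design parameter (slab thickness, mm) around its optimum,
# keep the rest fixed, and record feasibility and cost change.

def cost(t_slab):
    # hypothetical cost model: thicker slab -> more material cost
    return 1000 + 12.0 * t_slab

def feasible(t_slab):
    # hypothetical explicit bounds on slab thickness (mm)
    return 150.0 <= t_slab <= 250.0

def sensitivity(t_opt, deltas):
    """Return (delta, feasible?, cost change) for each perturbation."""
    return [(d, feasible(t_opt + d), cost(t_opt + d) - cost(t_opt))
            for d in deltas]

report = sensitivity(200.0, [-10.0, -1.0, 1.0, 10.0, 60.0])
```

In the real study the feasibility check would evaluate all 14 explicit and 46 implicit constraints rather than a simple bound.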
784 Mannequin Evaluation of 3D-Printed Intermittent Oro-Esophageal Tube Guide for Dysphagia
Authors: Yujin Jeong, Youkyung Son, Myounghwan Choi, Sanghyub Lee, Sangyeol Lee, Changho Hwang, Kyo-in Koo
Abstract:
Dysphagia is difficulty in swallowing food because of oral cavity impairments induced by stroke, muscle damage, or tumor. Intermittent oro-esophageal (IOE) tube feeding is one of the well-known feeding methods for dysphagia patients. However, it is hard to insert the tube at the proper position in the esophagus. In this study, we design and fabricate an IOE tube guide using a 3-dimensional (3D) printer. The printed IOE tube guide is tested in a mannequin (Airway Management Trainer, Co., Ltd., Copenhagen, Denmark) mimicking the human esophagus. The gag reflex point is measured as the design point in the mannequin. To avoid the gag reflex, we designed various shapes of IOE tube guide. Each guide is separated into three parts: a biting part, a part passing through the oral cavity, and a part connecting to the oro-esophagus. We designed 6 types of IOE tube guide by adjusting the lengths and angles of these three parts. To evaluate the IOE tube guide, it was inserted in the mannequin, and through the inserted guide, an endoscopic camera successfully arrived at the oro-esophagus. We plan to apply this mannequin-based design experience to patients in the near future.
Keywords: dysphagia, feeding method, IOE tube guide, 3-D printer
783 Virtual Experiments on Coarse-Grained Soil Using X-Ray CT and Finite Element Analysis
Authors: Mohamed Ali Abdennadher
Abstract:
Digital rock physics, an emerging field leveraging advanced imaging and numerical techniques, offers a promising approach to investigating the mechanical properties of granular materials without extensive physical experiments. This study focuses on using X-Ray Computed Tomography (CT) to capture the three-dimensional (3D) structure of coarse-grained soil at the particle level, combined with finite element analysis (FEA) to simulate the soil's behavior under compression. The primary goal is to establish a reliable virtual testing framework that can replicate laboratory results and offer deeper insights into soil mechanics. The methodology involves acquiring high-resolution CT scans of coarse-grained soil samples to visualize internal particle morphology. These CT images undergo processing through noise reduction, thresholding, and watershed segmentation techniques to isolate individual particles, preparing the data for subsequent analysis. A custom Python script is employed to extract particle shapes and conduct a statistical analysis of particle size distribution. The processed particle data then serves as the basis for creating a finite element model comprising approximately 500 particles subjected to one-dimensional compression. The FEA simulations explore the effects of mesh refinement and friction coefficient on stress distribution at grain contacts. A multi-layer meshing strategy is applied, featuring finer meshes at inter-particle contacts to accurately capture mechanical interactions and coarser meshes within particle interiors to optimize computational efficiency. Despite the known challenges in parallelizing FEA to high core counts, this study demonstrates that an appropriate domain-level parallelization strategy can achieve significant scalability, allowing simulations to extend to very high core counts. 
The results show a strong correlation between the finite element simulations and laboratory compression test data, validating the effectiveness of the virtual experiment approach. Detailed stress distribution patterns reveal that soil compression behavior is significantly influenced by frictional interactions, with frictional sliding, rotation, and rolling at inter-particle contacts being the primary deformation modes under low to intermediate confining pressures. These findings highlight that CT data analysis combined with numerical simulations offers a robust method for approximating soil behavior, potentially reducing the need for physical laboratory experiments.
Keywords: X-Ray computed tomography, finite element analysis, soil compression behavior, particle morphology
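The particle-statistics step (the "custom Python script" mentioned above) can be sketched as follows: given a labeled CT volume, compute each grain's voxel volume and equivalent spherical diameter. The tiny array, voxel size, and labels below are invented for illustration.

```python
import numpy as np

# Labeled CT volume: 0 = background, 1..N = segmented grains (after
# thresholding and watershed segmentation in the real pipeline).
voxel_size = 0.1  # mm per voxel edge (assumed)
labels = np.zeros((4, 4, 4), dtype=int)
labels[0:2, 0:2, 0:2] = 1          # grain 1: 8 voxels
labels[2:4, 2:4, 2:4] = 2          # grain 2: 8 voxels

ids, counts = np.unique(labels[labels > 0], return_counts=True)
volumes = counts * voxel_size ** 3
# diameter of a sphere with the same volume as each grain
diameters = (6.0 * volumes / np.pi) ** (1.0 / 3.0)
```

A particle size distribution is then just a histogram of these equivalent diameters across all grains.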
782 A Genetic Algorithm Approach to Solve a Weaving Job Scheduling Problem, Aiming Tardiness Minimization
Authors: Carolina Silva, João Nuno Oliveira, Rui Sousa, João Paulo Silva
Abstract:
This study uses genetic algorithms to solve a job scheduling problem in a weaving factory. The underlying problem is NP-hard, concerning unrelated parallel machines with sequence-dependent setup times. This research uses real data from a weaving industry located in the north of Portugal, with a capacity of 96 looms and a production, on average, of 440,000 meters of fabric per month. The study involves a high level of complexity, since most of the real production constraints are applied and several real data instances are tested. Topics such as data analysis and algorithm performance are addressed and tested, in order to offer a solution that generates reliable results with respect to due dates. All the approaches will be tested in the operational environment and the KPIs monitored, to understand the solution's impact on production, with a particular focus on the total number of weeks of late deliveries to clients. Thus, the main goal of this research is to develop a solution that allows the automatic generation of optimized production plans, aiming to minimize tardiness.
Keywords: genetic algorithms, textile industry, job scheduling, optimization
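The GA idea can be sketched on a deliberately tiny instance: a permutation chromosome, order crossover, swap mutation, and a tardiness fitness with sequence-dependent setups. The real study schedules 96 unrelated parallel looms; the single machine, jobs, setups, and due dates here are invented, and the demo population is seeded with all permutations so the run is deterministic.

```python
import random
from itertools import permutations

jobs = {"A": (4, 10), "B": (3, 6), "C": (5, 14)}      # job: (processing, due)
setup = {(i, j): 0 if i == j else 1 for i in jobs for j in jobs}

def total_tardiness(order):
    t, tard, prev = 0, 0, None
    for j in order:
        if prev is not None:
            t += setup[(prev, j)]                     # sequence-dependent setup
        t += jobs[j][0]
        tard += max(0, t - jobs[j][1])
        prev = j
    return tard

def crossover(p1, p2):
    """Order crossover: keep a slice of p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    keep = p1[a:b + 1]
    rest = [j for j in p2 if j not in keep]
    return rest[:a] + keep + rest[a:]

def ga(pop_size=20, generations=40):
    random.seed(1)
    pop = [list(p) for p in permutations(jobs)]       # deterministic seeding
    pop += [random.sample(list(jobs), len(jobs))
            for _ in range(pop_size - len(pop))]
    for _ in range(generations):
        pop.sort(key=total_tardiness)
        survivors = pop[:pop_size // 2]               # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            child = crossover(*random.sample(survivors, 2))
            if random.random() < 0.2:                 # swap mutation
                i, k = random.sample(range(len(child)), 2)
                child[i], child[k] = child[k], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=total_tardiness)

best = ga()
```

A production-scale version would encode machine assignment alongside job order and evaluate tardiness in weeks of late delivery, the KPI named above.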
781 Inclusion of Students with Disabilities (SWD) in Higher Education Institutions (HEIs): Self-Advocacy and Engagement as Central
Authors: Tadesse Abera
Abstract:
This study aimed to investigate the contribution of self-advocacy and engagement to the inclusion of students with disabilities (SWDs) in higher education institutions (HEIs). A convergent parallel mixed-methods design was employed; this article reports the quantitative strand. A total of 246 SWDs were selected through a stratified proportionate random sampling technique from five public HEIs in Ethiopia. Data were collected through a self-advocacy questionnaire, a student engagement scale, and a college student experience questionnaire, and analyzed through frequency, percentage, mean, standard deviation, correlation, one-sample t-test, and multiple regression. Both self-advocacy and engagement were found to have predictive power for the inclusion of respondents in the HEIs, with engagement being the stronger predictor. Among the components of self-advocacy, knowledge of self and leadership, and among the engagement dimensions, sense of belonging, cognitive engagement, and valuing, in their respective orders, were found to have the strongest predictive power for the inclusion of respondents in the institutions. Based on the findings, it was concluded that if students with disabilities work hard to be self-determined, strive to realize social justice, exert quality effort, and seek active involvement, their inclusion in the institutions will be ensured.
Keywords: self-advocacy, engagement, inclusion, students with disabilities, higher education institution
780 Feasibility of Implementing Digital Healthcare Technologies to Prevent Disease: A Mixed-Methods Evaluation of a Digital Intervention Piloted in the National Health Service
Authors: Rosie Cooper, Tracey Chantler, Ellen Pringle, Sadie Bell, Emily Edmundson, Heidi Nielsen, Sheila Roberts, Michael Edelstein, Sandra Mounier Jack
Abstract:
Introduction: In line with the National Health Service's (NHS) long-term plan, the NHS is looking to implement more digital health interventions. This study explores a case study in this area: a digital intervention used by NHS Trusts in London to consent adolescents for Human Papilloma Virus (HPV) immunisation. Methods: The electronic consent intervention was implemented in 14 secondary schools in inner-city London. These schools were statistically matched with 14 schools from the same area that were consenting using paper forms. Schools were matched on deprivation and English as an additional language. Consent form return rates and HPV vaccine uptake were compared quantitatively between intervention and matched schools. Data from observations of immunisation sessions and school feedback forms were analysed thematically. Individual and group interviews were undertaken with implementers, parents, and adolescents, and a focus group with adolescents was undertaken and analysed thematically. Results: Twenty-eight schools (14 e-consent schools and 14 paper consent schools) comprising 3219 girls (1733 in paper consent schools and 1486 in e-consent schools) were included in the study. The proportion of pupils eligible for free school meals, the proportion with English as an additional language, and the students' ethnicity profile were similar between the e-consent and paper consent schools. Return of consent forms was not increased by the implementation of the e-consent intervention. There was no difference in the proportion of pupils that were vaccinated at the scheduled vaccination session between the paper (n=14) and e-consent (n=14) schools (80.6% vs. 81.3%, p=0.93). The transition to using the system was not straightforward: whilst schools and staff understood the potential benefits, they found it difficult to adapt to new ways of working which removed some level of control from schools.
Part of the reason for the lower consent form return in e-consent schools was that some parents found the intervention difficult to use, due to limited access to the internet, finding it hard to open the weblink, language barriers, and, in some cases, the system closing a few days prior to sessions. Adolescents also highlighted the potential for e-consent interventions to bypass their information needs. Discussion: We would advise caution against dismissing the e-consent intervention because it did not achieve its goal of increasing the return of consent forms. Given the problems of embedding a new service, it was encouraging that HPV vaccine uptake remained stable. Introducing change requires stakeholders to understand, buy in, and work together with others. Schools and staff understood the potential benefits of using e-consent but found that the new ways of working removed some level of control from schools, which they found hard to adapt to, suggesting that implementing digital technology requires an embedding process. Conclusion: The future direction of the NHS will require the implementation of digital technology. Obtaining electronic consent from parents could help streamline school-based adolescent immunisation programmes. Findings from this study suggest that when implementing new digital technologies, it is important to allow for a period of embedding to enable them to become incorporated into everyday practice.
Keywords: consent, digital, immunisation, prevention
779 Computational Fluid Dynamics Simulations of Thermal and Flow Fields inside a Desktop Personal Computer Cabin
Authors: Mohammad Salehi, Mohammad Erfan Doraki
Abstract:
In this paper, the airflow inside a desktop computer case is analyzed using computational fluid dynamics simulation. The purpose is to investigate the cooling of a central processing unit (CPU) with thermal capacities of 80 and 130 watts. The computer enclosure, based on the microATX form factor, contains the main heat-producing components: CPU, hard disk drive, CD drive, floppy drive, memory card, and power supply unit. According to the amount of thermal power produced by the CPU (80 or 130 watts), two different heat sink geometries, direct and radial, were used. First, mesh independence and solution validation were established; after ensuring the correctness of the numerical solution, the results were analyzed. The simulation results showed that the temperatures of the CPU and the other components increased linearly with increasing CPU heat output. Also, the ambient air temperature has a significant effect on the maximum processor temperature.
Keywords: computational fluid dynamics, CPU cooling, computer case simulation, heat sink
778 Definition and Core Components of the Role-Partner Allocation Problem in Collaborative Networks
Authors: J. Andrade-Garda, A. Anguera, J. Ares-Casal, M. Hidalgo-Lorenzo, J.-A. Lara, D. Lizcano, S. Suárez-Garaboa
Abstract:
In the current, constantly changing economic context, collaborative networks allow partners to undertake projects that would not be possible if attempted by them individually. These projects usually involve the performance of a group of tasks (named roles) that have to be distributed among the partners. Thus, an allocation/matching problem arises that will be referred to as the Role-Partner Allocation problem. In real life, this situation is addressed by negotiation between partners in order to reach ad hoc agreements. Besides taking a long time and being hard work, both historical evidence and economic analysis show that such an approach is not recommended. Instead, the allocation process should be automated by means of a centralized matching scheme. However, as a preliminary step in the search for such a matching mechanism (or even the development of a new one), the problem and its core components must be specified. To this end, this paper establishes (i) the definition of the problem and its constraints, (ii) the key features of the involved elements (i.e., roles and partners), and (iii) how to create preference lists for both roles and partners. Only in this way will it be possible to conduct subsequent methodological research on the solution method.
Keywords: collaborative network, matching, partner, preference list, role
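A classic starting point for centralized matching over two-sided preference lists is deferred acceptance (Gale-Shapley); whether it suits the Role-Partner Allocation problem is exactly the kind of question the specification above enables. The roles, partners, and preference lists below are invented for illustration.

```python
def stable_match(role_prefs, partner_prefs):
    """Role-proposing deferred acceptance; returns {role: partner}."""
    free = list(role_prefs)                        # roles still unmatched
    next_choice = {r: 0 for r in role_prefs}       # next list position to try
    engaged = {}                                   # partner -> role
    rank = {p: {r: i for i, r in enumerate(prefs)}
            for p, prefs in partner_prefs.items()}
    while free:
        r = free.pop()
        p = role_prefs[r][next_choice[r]]          # r proposes to best untried
        next_choice[r] += 1
        if p not in engaged:
            engaged[p] = r
        elif rank[p][r] < rank[p][engaged[p]]:     # p prefers the new role
            free.append(engaged[p])
            engaged[p] = r
        else:
            free.append(r)                         # p rejects r
    return {r: p for p, r in engaged.items()}

roles = {"design": ["P1", "P2"], "build": ["P1", "P2"]}
partners = {"P1": ["build", "design"], "P2": ["design", "build"]}
match = stable_match(roles, partners)
```

The resulting matching is stable: no role and partner would both prefer each other over their assigned match.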
777 Finite Element Analysis for Earing Prediction Incorporating the BBC2003 Material Model with Fully Implicit Integration Method: Derivation and Numerical Algorithm
Authors: Sajjad Izadpanah, Seyed Hadi Ghaderi, Morteza Sayah Irani, Mahdi Gerdooei
Abstract:
In this research work, a sophisticated yield criterion known as BBC2003, capable of describing planar anisotropic behaviors of aluminum alloy sheets, was integrated into the commercial finite element code ABAQUS/Standard via a user subroutine. The complete formulation of the implementation process using a fully implicit integration scheme, i.e., the classic backward Euler method, is presented, and relevant aspects of the yield criterion are introduced. In order to solve nonlinear differential and algebraic equations, the line-search algorithm was adopted in the user-defined material subroutine (UMAT) to expand the convergence domain of the iterative Newton-Raphson method. The developed subroutine was used to simulate a challenging computational problem with complex stress states, i.e., deep drawing of an anisotropic aluminum alloy AA3105. The accuracy and stability of the developed subroutine were confirmed by comparing the numerically predicted earing and thickness variation profiles with the experimental results, which showed an excellent agreement between numerical and experimental earing and thickness profiles. The integration of the BBC2003 yield criterion into ABAQUS/Standard represents a significant contribution to the field of computational mechanics and provides a useful tool for analyzing the mechanical behavior of anisotropic materials subjected to complex loading conditions.
Keywords: BBC2003 yield function, plastic anisotropy, fully implicit integration scheme, line search algorithm, explicit and implicit integration schemes
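The Newton-Raphson-with-line-search idea used in the UMAT can be sketched in isolation: damp each Newton step until the residual norm actually decreases, which widens the convergence domain. This is shown below on a scalar residual, not on the full BBC2003 return-mapping system; the test function is invented.

```python
def newton_line_search(f, df, x0, tol=1e-10, max_iter=50):
    """Solve f(x) = 0 by Newton's method with backtracking line search."""
    x = x0
    for _ in range(max_iter):
        r = f(x)
        if abs(r) < tol:
            return x
        step = -r / df(x)                 # full Newton step
        alpha = 1.0
        # backtrack until the residual magnitude decreases
        while abs(f(x + alpha * step)) >= abs(r) and alpha > 1e-8:
            alpha *= 0.5
        x += alpha * step
    return x

# Example: cube root of 2 via the residual x^3 - 2 = 0
root = newton_line_search(lambda x: x ** 3 - 2.0,
                          lambda x: 3.0 * x ** 2, x0=3.0)
```

In the actual UMAT, f would be the vector of return-mapping residuals and df the consistent Jacobian, but the damping logic is the same.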
776 A Polynomial Approach for a Graphical-based Integrated Production and Transport Scheduling with Capacity Restrictions
Authors: M. Ndeley
Abstract:
The performance of global manufacturing supply chains depends on the interaction of production and transport processes. Currently, the scheduling of these processes is done separately, without considering mutual requirements, which leads to suboptimal solutions. An integrated scheduling of both processes enables the improvement of supply chain performance. The integrated production and transport scheduling problem (PTSP) is NP-hard, so heuristic methods are necessary to efficiently solve large problem instances, as in the case of global manufacturing supply chains. This paper presents a heuristic scheduling approach which handles the integration of flexible production processes with intermodal transport, incorporating flexible land transport. The method is based on a graph that allows a reformulation of the PTSP as a shortest path problem for each job, which can be solved in polynomial time. The proposed method is applied to a supply chain scenario with a manufacturing facility in South Africa and shipments of finished product to customers within the country. The obtained results show that the approach is suitable for the scheduling of large-scale problems and can be flexibly adapted to different scenarios.
Keywords: production and transport scheduling problem, graph-based scheduling, integrated scheduling
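The per-job shortest path query at the heart of the approach can be sketched with plain Dijkstra on a made-up stage graph (job, plants, transport modes, customer), with edge weights standing in for combined time/cost. The graph and weights below are invented, not from the paper.

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest path cost from source to target; O((V+E) log V)."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue                       # stale heap entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return float("inf")

# One job's options: produce at plant A or B, then ship by road or rail.
graph = {
    "job":    [("plantA", 3.0), ("plantB", 2.0)],
    "plantA": [("road", 1.0)],
    "plantB": [("road", 4.0), ("rail", 1.5)],
    "road":   [("customer", 2.0)],
    "rail":   [("customer", 2.5)],
}
best = dijkstra(graph, "job", "customer")
```

In the full heuristic, capacity restrictions would be handled by updating edge weights or availabilities between successive per-job queries.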
775 South Asia’s Political Landscape: Precipitating Terrorism
Authors: Saroj Kumar Rath
Abstract:
India's Muslims represent 15 percent of the nation's population, the world's third-largest Muslim population after Indonesia and Pakistan. Extremist groups like the Islamic State, Al Qaeda, the Taliban, and the Haqqani network increasingly view India as a target. Several trends explain the rise. Terrorism threats in South Asia are linked and mobile: if one source is shut down, jihadists relocate to find another Islamic cause. As NATO withdraws from Afghanistan, some jihadists will eye India. Pakistan regards India as a top enemy, and some officials even encourage terrorists to target areas like Kashmir or Mumbai. Meanwhile, a stream of Wahhabi preachers has visited India, offering hard-line messages; extremist groups like Al Qaeda and the Islamic State compete for influence, and militants even pay jihadists. Muslims, as a minority population in India, could offer fertile ground for extremist recruiters. This paper argues that there is an urgent need for the Indian government to profile militants and examine social media sites to counter Wahhabi indoctrination, while supporting education and entrepreneurship for all of India's citizens.
Keywords: Al Qaeda, terrorism, Islamic state, India, Haqqani network, Pakistan, Taliban
774 A Study of the Trade-off Energy Consumption-Performance-Schedulability for DVFS Multicore Systems
Authors: Jalil Boudjadar
Abstract:
Dynamic Voltage and Frequency Scaling (DVFS) multicore platforms are promising execution platforms that enable high computational performance, lower energy consumption, and flexibility in scheduling the system processes. However, the resulting interleaving and memory interference, together with per-core frequency tuning, make real-time guarantees hard to deliver. Besides, energy consumption represents a strong constraint for the deployment of such systems in energy-limited settings. Identifying the system configurations that achieve high performance and consume less energy, while guaranteeing system schedulability, is a complex task in the design of modern embedded systems. This work studies the trade-off between energy consumption, core utilization, and memory bottlenecks, and their impact on the schedulability of DVFS multicore time-critical systems with a hierarchy of shared memories. We build a model-based framework using the parametrized timed automata of UPPAAL to analyze the mutual impact of performance, energy consumption, and schedulability of DVFS multicore systems, and demonstrate the trade-off on an actual case study.
Keywords: time-critical systems, multicore systems, schedulability analysis, energy consumption, performance analysis
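The basic energy-versus-deadline tension behind DVFS can be sketched with a back-of-the-envelope model: dynamic power scales roughly with C·V²·f while execution time scales with 1/f, so lowering frequency (with a matching voltage drop) saves energy but risks missing the deadline. All constants below are illustrative, not taken from the paper's UPPAAL models.

```python
def energy_and_time(cycles, freq_hz, volt, cap=1e-9):
    """Return (energy in J, time in s) for a simple CMOS dynamic-power model."""
    time_s = cycles / freq_hz
    power_w = cap * volt ** 2 * freq_hz     # P ~ C * V^2 * f
    return power_w * time_s, time_s

def schedulable(cycles, freq_hz, volt, deadline_s):
    """Check the deadline and report the energy of that configuration."""
    energy, t = energy_and_time(cycles, freq_hz, volt)
    return t <= deadline_s, energy

# Same 1e9-cycle task at two operating points with a 1 s deadline.
ok_fast, e_fast = schedulable(1e9, 2e9, 1.2, deadline_s=1.0)
ok_slow, e_slow = schedulable(1e9, 1e9, 0.9, deadline_s=1.0)
```

Note that for a fixed cycle count the energy reduction comes from the lower voltage (E ~ C·V²·cycles), which is why DVFS pairs frequency and voltage scaling; the paper's framework additionally models memory interference, which this sketch ignores.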
Procedia PDF Downloads 108
773 A Neural Network System for Predicting the Hardness of Titanium Aluminum Nitride (TiAlN) Coatings
Authors: Omar M. Elmabrouk
Abstract:
The cutting tool, in the high-speed machining process, consistently deals with high localized stress at the tool tip, tip temperatures exceeding 800°C, and the chip sliding along the rake face. These conditions affect the tool wear, the cutting tool performance, the quality of the produced parts and the tool life. Therefore, a thin film coating on the cutting tool should be considered to improve the tool's surface properties while maintaining its bulk properties. One of the common processes for applying thin films for hard coating purposes is PVD magnetron sputtering. In this paper, the effects of the PVD magnetron sputtering coating process parameters (sputter power in the range of 4.81-7.19 kW, bias voltage in the range of 50.00-300.00 V, and substrate temperature in the range of 281.08-600.00 °C) on coating hardness were predicted using an artificial neural network (ANN). The results were compared with previously published results using an RSM model. It was found that the ANN is more accurate in predicting coating hardness, and hence it can not only improve the tool life but also significantly enhance the efficiency of machining processes.
Keywords: artificial neural network, hardness, prediction, titanium aluminium nitride coating
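As a rough illustration of the prediction step, the sketch below runs a forward pass of a tiny feed-forward network mapping the three process parameters (normalized to the ranges quoted above) to a hardness value. All weights, the network shape and the output scaling are made-up placeholders; the paper's actual ANN was trained on experimental coating data.

```python
import math

# Minimal sketch of an ANN forward pass for hardness prediction.
# The weights and the output rescaling are hypothetical placeholders;
# a real model would be trained on the measured coating data.

RANGES = {"power_kW": (4.81, 7.19), "bias_V": (50.0, 300.0), "temp_C": (281.08, 600.0)}

def normalize(x, lo, hi):
    return (x - lo) / (hi - lo)  # scale each input to [0, 1]

def predict_hardness(power_kW, bias_V, temp_C):
    inputs = [normalize(power_kW, *RANGES["power_kW"]),
              normalize(bias_V, *RANGES["bias_V"]),
              normalize(temp_C, *RANGES["temp_C"])]
    # one hidden layer with two tanh neurons (hypothetical weights)
    hidden_w = [[0.8, -0.4, 0.6], [-0.3, 0.9, 0.2]]
    hidden = [math.tanh(sum(w * x for w, x in zip(ws, inputs))) for ws in hidden_w]
    out_w = [1.2, 0.7]
    # linear output neuron, rescaled to a plausible hardness range (GPa)
    return 20.0 + 15.0 * sum(w * h for w, h in zip(out_w, hidden))

print(round(predict_hardness(6.0, 150.0, 450.0), 2))
```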
Procedia PDF Downloads 554
772 Public Wi-Fi Security Threat Evil Twin Attack Detection Based on Signal Variant and Hop Count
Authors: Said Abdul Ahad Ahadi, Elyas Baray, Nitin Rakesh, Sudeep Varshney
Abstract:
Wi-Fi is a widely used technology for providing internet access in many areas such as stores, cafes, university campuses and restaurants. This technology has brought great convenience in communication and networking. On the other hand, because data is transmitted over the air, the network is vulnerable and prone to various threats, such as the Evil Twin attack. An Evil Twin is an adversary that impersonates a legitimate access point (LAP) by spoofing its name (SSID) and MAC address (BSSID). This attack can enable further threats such as man-in-the-middle (MITM) attacks, service interruption and access-point service blocking. Various Evil Twin attack detection techniques have been proposed, but they require additional hardware or protocol modifications. In this paper, we propose a new technique based on two access-point fingerprints, the Received Signal Strength Indicator (RSSI) and the hop count, which are hard for an adversary to forge. We implemented the technique in a system called "ETDetector," which can detect and prevent the attack.
Keywords: evil twin, LAP, SSID, Wi-Fi security, signal variation, ETAD, Kali Linux, Scapy, Python
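The two-fingerprint idea can be sketched offline as follows. The profile values, thresholds and the helper name `looks_like_evil_twin` are hypothetical illustrations; a live detector such as the paper's ETDetector would capture beacon frames and probe the route itself (e.g. with Scapy) rather than take readings as arguments.

```python
from statistics import mean

# Simplified sketch of the two-fingerprint idea: compare a candidate
# AP's RSSI behaviour and hop count against a stored profile of the
# legitimate AP. Profile values and the tolerance are hypothetical.

def looks_like_evil_twin(profile, observed_rssi, observed_hops,
                         rssi_tolerance_db=8):
    """Flag an AP whose mean RSSI deviates strongly from the stored
    profile, or whose hop count to the gateway differs."""
    rssi_shift = abs(mean(observed_rssi) - profile["mean_rssi_db"])
    hop_mismatch = observed_hops != profile["hop_count"]
    return rssi_shift > rssi_tolerance_db or hop_mismatch

legit_profile = {"ssid": "Campus-WiFi", "mean_rssi_db": -48, "hop_count": 2}

# Same SSID, but a much stronger signal and one hop to the gateway,
# typical of a rogue AP running on an attacker's nearby laptop:
print(looks_like_evil_twin(legit_profile, [-30, -28, -31], observed_hops=1))  # prints: True
```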
Procedia PDF Downloads 144
771 A Segmentation Method for Grayscale Images Based on the Firefly Algorithm and the Gaussian Mixture Model
Authors: Donatella Giuliani
Abstract:
In this research, we propose an unsupervised grayscale image segmentation method based on a combination of the Firefly Algorithm and the Gaussian Mixture Model. Firstly, the Firefly Algorithm is applied to a histogram-based search for cluster means. The Firefly Algorithm is a stochastic global optimization technique centered on the flashing characteristics of fireflies; in this context, it is used to determine the number of clusters and the related cluster means in a histogram-based segmentation approach. These means are then used in the initialization step of the parameter estimation of a Gaussian Mixture Model. The parametric probability density function of a Gaussian Mixture Model is represented as a weighted sum of Gaussian component densities, whose parameters are evaluated by applying the iterative Expectation-Maximization technique. The coefficients of the linear superposition of Gaussians can be thought of as the prior probabilities of each component. Applying the Bayes rule, the posterior probabilities of the grayscale intensities are evaluated, and their maxima are used to assign each pixel to a cluster according to its gray-level value. The proposed approach appears fairly solid and reliable even when applied to complex grayscale images. The validation has been performed using different standard measures, namely the Root Mean Square Error (RMSE), the Structural Content (SC), the Normalized Correlation Coefficient (NK) and the Davies-Bouldin (DB) index. The achieved results strongly confirm the robustness of this grayscale segmentation method based on a metaheuristic algorithm. Another noteworthy advantage of this methodology is the use of the maxima of the responsibilities for pixel assignment, which implies a consistent reduction in computational cost.
Keywords: clustering images, firefly algorithm, Gaussian mixture model, metaheuristic algorithm, image segmentation
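The final assignment step, maximizing the posterior probability of each gray level under Bayes' rule, can be sketched as follows. The component weights, means and variances here are illustrative stand-ins, not parameters fitted by the firefly/EM pipeline described above.

```python
import math

# Sketch of the posterior-based assignment step: given fitted Gaussian
# components (means, variances, mixing weights), assign each gray level
# to the cluster with maximum posterior probability. The component
# parameters below are illustrative, not fitted by EM.

def gaussian_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def assign_pixels(intensities, weights, means, variances):
    labels = []
    for x in intensities:
        # unnormalized posteriors: prior weight times component density
        posteriors = [w * gaussian_pdf(x, m, v)
                      for w, m, v in zip(weights, means, variances)]
        labels.append(max(range(len(posteriors)), key=posteriors.__getitem__))
    return labels

# Two illustrative components: dark (mean 60) and bright (mean 190)
print(assign_pixels([40, 70, 120, 180, 250],
                    weights=[0.5, 0.5], means=[60, 190], variances=[400, 900]))
# prints: [0, 0, 1, 1, 1]
```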
Procedia PDF Downloads 217
770 Inter-Communication-Management in Cases with Disabled Children (ICDC)
Authors: Dena A. Hussain
Abstract:
The objective of this project is to design an Information and Communication Technologies (ICT) tool, based on a standardized platform, to assist the work-integrated learning process of caretakers of disabled children. The tool should assist the intercommunication between caretakers and improve the learning process through knowledge bridging between all caretakers involved. Some children are born with disabilities, while others have special needs after an illness or accident. Special-needs children often need help in their learning process and require tools and services in a different way. In some cases the child has multiple disabilities that affect several capabilities in different ways. These needs have to be translated into different learning techniques that the personnel caring for the child (called caretakers in this project) need to learn and adapt. The caretakers involved are also required to learn new learning or training techniques and utilities specialized for the child's needs. In many cases the number of people caring for the child's development is rather large: the parents, specialist pedagogues, teachers, therapists, psychologists, personal assistants, etc. Each group of specialists has different objectives, and in some cases the merging of these specializations is unique. This makes synchronization between different caretakers difficult, often resulting in a low level of cooperation. Better intercommunication between professions could improve not only the child's development but also the caretakers' methods and their knowledge of each other's work processes and of their own profession. This introduces a unique work-integrated learning environment for all personnel involved, merging learning and knowledge in the work environment while assisting the children's development process. Creating an iterative process generates a unique learning experience for all involved.
Using a work-integrated platform will help encourage and support the process of all the teams involved. We believe that working with children who have special needs is a continuous learning/working process that is always integrated to achieve one main goal, which is to make a better future for all children.
Keywords: information and communication technologies (ICT), work integrated learning (WIL), sustainable learning, special needs children
Procedia PDF Downloads 295
769 Optimization of Reliability and Communicability of a Random Two-Dimensional Point Patterns Using Delaunay Triangulation
Authors: Sopheak Sorn, Kwok Yip Szeto
Abstract:
Reliability is one of the important measures of how well a system meets its design objective; mathematically, it is the probability that a complex system will perform satisfactorily. When the system is described by a network of N components (nodes) and their L connections (links), the reliability of the system becomes a network design problem, which is an NP-hard combinatorial optimization problem. In this paper, we address the network design problem for a random point pattern in two dimensions. We make use of a Voronoi construction, with each cell containing exactly one point of the point pattern, and compute the reliability of the Voronoi diagram's dual, i.e. the Delaunay graph. We further investigate the communicability of the Delaunay network. We find that the homogeneity of a Delaunay network's degree distribution is positively correlated with its reliability and negatively correlated with its communicability. Based on these correlations, we alter the communicability and the reliability by performing random edge flips, which preserve the number of links and nodes in the network but can increase the communicability of a Delaunay network at the cost of its reliability. This transformation is later used to optimize a Delaunay network toward the optimum geometric mean of communicability and reliability. We also discuss the importance of edge flips in the evolution of real soap froth in two dimensions.
Keywords: communicability, Delaunay triangulation, edge flip, reliability, two-dimensional network, Voronoi
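For intuition about the reliability measure, the sketch below computes the exact all-terminal reliability of a tiny graph by enumerating edge states. The triangle graph and the edge survival probability p are illustrative choices; exhaustive enumeration is only feasible for very small networks, since the general problem is NP-hard, as noted above.

```python
from itertools import product

# Exact all-terminal reliability of a small graph: the probability that
# the surviving edges keep all nodes connected when each edge fails
# independently with probability 1 - p. Graph and p are illustrative.

def connected(nodes, edges):
    if not nodes:
        return True
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        stack.extend(v for a, b in edges for v in (a, b) if u in (a, b) and v != u)
    return seen == set(nodes)

def all_terminal_reliability(nodes, edges, p):
    total = 0.0
    for states in product([True, False], repeat=len(edges)):
        alive = [e for e, up in zip(edges, states) if up]
        prob = 1.0
        for up in states:
            prob *= p if up else (1 - p)
        if connected(nodes, alive):
            total += prob
    return total

# Triangle: survives if all three edges survive, or any two do:
# p^3 + 3 p^2 (1 - p) = 0.972 for p = 0.9
triangle = [(0, 1), (1, 2), (0, 2)]
print(round(all_terminal_reliability([0, 1, 2], triangle, p=0.9), 3))  # prints: 0.972
```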
Procedia PDF Downloads 420
768 The Impact of Experiential Learning on the Success of Upper Division Mechanical Engineering Students
Authors: Seyedali Seyedkavoosi, Mohammad Obadat, Seantorrion Boyle
Abstract:
The purpose of this study is to assess the effectiveness of a nontraditional experiential learning strategy in improving the success and interest of mechanical engineering students, using the Kinematics/Dynamics of Machines course as a case study. This upper-division technical course covers a wide range of topics, including mechanism and machine system analysis and synthesis, yet the complexities of ideas like acceleration, motion, and machine component relationships are hard to explain using standard teaching techniques. To address this problem, a comprehensive design project was created that gave students hands-on experience developing, manufacturing, and testing their inventions. The main goals of the project were to improve students' grasp of machine design and kinematics, to develop problem-solving and presentation skills, and to familiarize them with professional software. A questionnaire survey was conducted to evaluate the effect of this technique on students' performance and interest in mechanical engineering. The outcomes of the study shed light on the usefulness of nontraditional experiential learning approaches in engineering education.
Keywords: experiential learning, nontraditional teaching, hands-on design project, engineering education
Procedia PDF Downloads 98
767 The Effect of Heating-Liquid Nitrogen Cooling on Fracture Toughness of Anisotropic Rock
Authors: A. Kavandi, K. Goshtasbi, M. R. Hadei, H. Nejati
Abstract:
In geothermal energy production, liquid nitrogen (LN₂) fracturing of hot, dry rock is one of the most effective methods for increasing the permeability of the reservoir. Geothermal reservoirs mainly consist of hard rocks, such as granites, and metamorphic rocks, like gneiss, at high temperatures. Gneiss, as a metamorphic rock, exhibits a high level of inherent anisotropy. This anisotropy is intrinsic to the rock and affects its mechanical behavior. The aim of this study is to investigate the effects of heating-liquid nitrogen (LN₂) cooling treatment and rock anisotropy on the fracture toughness of gneiss. To this end, a series of semi-circular bend (SCB) tests were carried out on gneiss specimens with different anisotropy plane angles (0°, 30°, 60°, and 90°). In this study, gneiss specimens were exposed to a heating-cooling treatment consisting of gradual heating to 100°C followed by LN₂ cooling. Results indicate that the fracture toughness of treated samples is lower than that of untreated samples, and that the fracture toughness increases with the anisotropy plane angle. The scanning electron microscope (SEM) technique is also employed to evaluate the fracture process zone (FPZ) ahead of the crack tip.
Keywords: heating-cooling, anisotropic rock, fracture toughness, liquid nitrogen
Procedia PDF Downloads 58
766 Teaching Continuities in the Great Books Tradition and Contemporary Popular Culture
Authors: Alex Kizuk
Abstract:
This paper studies the trope, or meme, of the Siren in terms of the long-standing cultural continuities that can be found in college classrooms today. Those who have raised children may remember reading from Hans Christian Andersen's 'The Little Mermaid' (1836), not to mention regaling them with colorful Disneyesque versions when they were younger. Though Andersen tempered the darker first ending of the story to give the little mermaid more agency in her salvation - a development extended in the Disney adaptations - the tale nonetheless pivots on an image of a 'heavenly realm' that the mermaid may eventually come to know, or comprehend, as a beloved woman on dry land. Only after 300 years, however, may she hope to see that 'which lives forever' and 'rises through thin air, up to the shining stars. Just as [sea-people] rise through the water to see the lands on earth.' What students today can see in this example is a trope of the agonistic soul in a hard-won disembarkation at a harbour of knowledge, where the seeker after truth may come to know, through persistence (300 years), all that is good and true concerning human life. This paper discusses several such examples from the Great Books and popular culture to suggest that teaching in the world of the 21st century could do worse than accede to some such perennial seeking.
Keywords: the Great Books, tradition, popular culture, 21st century directions in teaching
Procedia PDF Downloads 158
765 Fuzzy Population-Based Meta-Heuristic Approaches for Attribute Reduction in Rough Set Theory
Authors: Mafarja Majdi, Salwani Abdullah, Najmeh S. Jaddi
Abstract:
Feature selection is one of the global combinatorial optimization problems in machine learning. It is concerned with removing irrelevant, noisy, and redundant data while preserving the meaning of the original data. Attribute reduction in rough set theory is an important feature selection method. Since attribute reduction is an NP-hard problem, it is necessary to investigate fast and effective approximate algorithms. In this paper, we propose two feature selection mechanisms based on memetic algorithms (MAs), which combine a genetic algorithm with a fuzzy record-to-record travel algorithm and a fuzzy-controlled great deluge algorithm, to strike a good balance between local search and genetic search. To verify the proposed approaches, numerical experiments were carried out on thirteen datasets. The results show that the MA approaches are efficient in solving attribute reduction problems when compared with other meta-heuristic approaches.
Keywords: rough set theory, attribute reduction, fuzzy logic, memetic algorithms, record to record algorithm, great deluge algorithm
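The record-to-record travel acceptance rule at the heart of one of the proposed mechanisms can be sketched on a toy decision table. The dataset, cost function and parameter values below are invented for illustration; the paper's approach additionally embeds the local search inside a genetic algorithm and controls the deviation with fuzzy logic.

```python
import random

# Toy sketch of attribute reduction with a record-to-record travel (RRT)
# acceptance rule: flip one attribute in/out of the subset and accept
# the move if its cost stays within a fixed deviation of the best
# (record) cost found so far. Decision table and cost are illustrative.

ROWS = [  # (condition attribute values, decision class)
    ((0, 0, 1, 0), "a"), ((0, 1, 1, 0), "b"),
    ((1, 0, 0, 1), "a"), ((1, 1, 0, 1), "b"),
]

def consistent(subset):
    """A subset is consistent if rows identical on the chosen attributes
    never carry different decision classes."""
    seen = {}
    for values, cls in ROWS:
        key = tuple(values[i] for i in subset)
        if seen.setdefault(key, cls) != cls:
            return False
    return True

def cost(subset):
    # penalize inconsistency heavily, otherwise prefer fewer attributes
    return len(subset) + (100 if not consistent(subset) else 0)

def rrt_reduce(n_attrs=4, deviation=1, iters=200, seed=1):
    rng = random.Random(seed)
    current = set(range(n_attrs))
    best = set(current)
    for _ in range(iters):
        cand = set(current)
        cand.symmetric_difference_update({rng.randrange(n_attrs)})  # flip one attribute
        if cost(cand) <= cost(best) + deviation:  # RRT acceptance rule
            current = cand
            if cost(cand) < cost(best):
                best = set(cand)
    return best

print(sorted(rrt_reduce()))
```

On this table, attribute 1 alone already determines the decision class, so the search typically shrinks the full set toward a small consistent reduct.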
Procedia PDF Downloads 456
764 Study of the Tribological Behavior of Coated Cutting Tools
Authors: A. Achour L. Chekour, A. Mekroud
Abstract:
Tribology, the science of lubrication, friction and wear, plays an important role at the 'crossroads' of sciences opened up by recent developments in industry. Its multidisciplinary nature reinforces its scientific interest. It covers all the sciences that deal with the contact between two loaded solids in relative motion. It thus lies at the intersection of more clearly established disciplines such as solid and fluid mechanics, rheology, thermal science, materials science and chemistry. Its experimental approach is based on physics and on the processing of signals and images. The optimization of cutting-tool operating conditions must contribute significantly to the development and productivity of advanced automated machining techniques, because their implementation requires sufficient knowledge of how the process evolves, in particular the evolution of tool wear. In addition, technological advances have promoted the use of very hard, refractory materials of difficult machinability, requiring highly resistant tool materials. In this study, we present the wear behavior of a machining tool during a roughing operation as a function of the cutting parameters. The interpretation of the experimental results is based mainly on observations and analyses of the tool's cutting edges using the latest techniques: scanning electron microscopy (SEM) and laser-beam optical roughness measurement.
Keywords: friction, wear, tool, cutting
Procedia PDF Downloads 331
763 Reconsidering the Legitimacy of Capital Punishment in the Interpretation of the Human Right to Life in the Two Traditional Approaches
Authors: Yujie Zhang
Abstract:
There are debates around the legitimacy of capital punishment, i.e., whether death can serve as a proper execution in our legal system or not. Different arguments have been raised, but none of them seems able to provide a determined answer to the issue, which results in a lack of guidance for legal practice. This article therefore devotes itself to the effort of finding such an answer. It takes the perspective of rights, interpreting the concept of the right to life, with which capital punishment appears to conflict in the two traditional approaches, to reveal a possibly best account of the right and its implications for capital punishment. However, this effort is not a normative one focused on what ought to be. The article does not try to work out which argument we should choose, or to settle the heated debate on whether capital punishment should be allowed. Nor does it propose which perspective we should take to approach this issue, or which account of rights must in general be better; rather, it is more of a thought experiment. It attempts to raise a new perspective on the issue of the legitimacy of capital punishment. Both its perspective and its conclusion are therefore tentative: what if we view this issue in a way we have never tried before, for example through the different accounts of the right to life? In this sense, the perspective could be defied, and the conclusion could be rejected. Other perspectives and conclusions are also possible. Nevertheless, this tentative perspective and account of the right cannot be denied a place as a potential approach, since it does provide a determined attitude toward capital punishment that is hard to achieve through existing arguments.
Keywords: capital punishment, right to life, theories of rights, the choice theory
Procedia PDF Downloads 196
762 Biostabilisation of Sediments for the Protection of Marine Infrastructure from Scour
Authors: Rob Schindler
Abstract:
Industry-standard methods of mitigating the erosion of seabed sediments rely on 'hard engineering' approaches, which have numerous environmental shortcomings: (1) direct loss of habitat through the smothering of benthic species, (2) disruption of sediment transport processes, damaging geomorphic and ecosystem functionality, (3) generation of secondary erosion problems, (4) introduction of material that may propagate non-local species, and (5) provision of pathways for the spread of invasive species. Recent studies have also revealed the importance of biological cohesion, the result of naturally occurring extra-cellular polymeric substances (EPS), in stabilizing natural sediments. Mimicking these strong bonding kinetics through the deliberate addition of EPS to sediments, henceforth termed 'biostabilisation', offers a means to mitigate erosion induced by structures or by episodic increases in hydrodynamic forcing (e.g. storms and floods) while avoiding, or reducing, hard engineering. Here we present unique experiments that systematically examine how biostabilisation reduces scour around a monopile in a current, a first step to realizing the potential of this new method of scour reduction for a wide range of engineering purposes in aquatic substrates. Experiments were performed in Plymouth University's recirculating sediment flume, which includes a recessed scour pit. The model monopile was 0.048 m in diameter, D. Assuming a prototype monopile diameter of 2.0 m yields a geometric ratio of 41.67, which, when applied to a 10 m prototype water depth, yields a model depth, d, of 0.24 m. The sediment pit containing the monopile was filled with different biostabilised substrata prepared using a mixture of fine sand (D50 = 230 μm) and EPS (xanthan gum). Nine sand-EPS mixtures were examined, spanning EPS contents of 0.0% < b0 < 0.50%. Scour development was measured using a laser point gauge along a 530 mm centreline at 10 mm increments at regular intervals over 5 h.
Maximum scour depth and excavated area were determined at different time steps and plotted against time to yield equilibrium values. After 5 hours the current was stopped, and a detailed scan of the final scour morphology was taken. Results show that increasing EPS content causes a progressive reduction in the equilibrium depth and lateral extent of the scour, and hence in the excavated material. Very small amounts, equivalent to natural communities (< 0.1% by mass), reduce the rate, depth and extent of scour around monopiles. Furthermore, the strong linear relationships between EPS content, equilibrium scour depth, excavation area and the timescales of scouring offer a simple index on which to modify existing scour prediction methods. We conclude that the biostabilisation of sediments with EPS may offer a simple, cost-effective and ecologically sensitive means of reducing scour in a range of contexts including offshore wind farms (OWFs), bridge piers, pipeline installation, and void filling in rock armour. Biostabilisation may also reduce economic costs through (1) use of existing site sediments or waste dredged sediments, (2) reduced fabrication of materials, (3) lower transport costs, and (4) less dependence on specialist vessels and precise sub-sea assembly. Further, its potential environmental credentials may allow sensitive use of the seabed in marine protection zones across the globe.
Keywords: biostabilisation, EPS, marine, scour
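The "simple index" idea can be sketched numerically: fit a linear trend of equilibrium scour depth against EPS content and use it to scale a baseline (EPS-free) scour estimate. The data points and coefficients below are hypothetical, invented for illustration, and are not the flume measurements reported in the study.

```python
# Hedged sketch of a linear EPS-content correction for scour prediction.
# The EPS/scour-depth pairs are hypothetical illustration data, not the
# study's measurements.

def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

eps_content = [0.0, 0.1, 0.2, 0.3, 0.4, 0.5]           # % EPS by mass (hypothetical)
scour_depth = [0.10, 0.085, 0.07, 0.055, 0.04, 0.025]  # equilibrium depth, m (hypothetical)

slope, intercept = linear_fit(eps_content, scour_depth)

def corrected_scour(baseline_depth, eps_pct):
    """Scale a conventional (EPS-free) scour estimate by the fitted trend."""
    reduction = 1 + (slope / intercept) * eps_pct
    return baseline_depth * reduction

print(round(corrected_scour(0.10, 0.3), 3))
```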
Procedia PDF Downloads 167