Search results for: scientific computing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2970

2010 Impact of Normative Institutional Factors on Sustainability Reporting

Authors: Lina Dagilienė

Abstract:

The article explores the impact of normative institutional factors on the development of sustainability reporting. The vast majority of research in the scientific literature focuses on mandatory institutional factors, i.e. how public institutions and market regulators affect sustainability reporting. Meanwhile, there is a lack of empirical data on the impact of normative institutional factors. The effect of normative factors in this paper is framed by the role of non-governmental organizations (NGOs) and institutional theory. The case of a Global Compact Local Network in a developing country was examined. The research results revealed that, in the absence of regulatory factors, companies were not active with regard to social disclosures; they presented non-systematized social information of a descriptive nature. Only 10% of sustainability reports were prepared using the GRI methodology. None of the reports were assured by third parties.

Keywords: institutional theory, normative, sustainability reporting, Global Compact Local Network

Procedia PDF Downloads 377
2009 Effect of Ionized Plasma Medium on the Radiation of a Rectangular Microstrip Antenna on Ferrite Substrate

Authors: Ayman Al Sawalha

Abstract:

This paper presents theoretical investigations of the radiation of a rectangular microstrip antenna printed on a magnetized ferrite substrate (Ni0.62Co0.02Fe1.948O4) in the presence of an ionized plasma medium. The theoretical study of the rectangular microstrip antenna in free space is carried out by applying the transmission line model combined with potential function techniques, while hydrodynamic theory is used for its analysis in the plasma medium. By considering both biased and unbiased ferrite cases, far-field radiation patterns in free space and in the plasma medium are obtained, which in turn are used to compute the radiated power, directivity, quality factor, and bandwidth of the antenna. It is found that the presence of the plasma medium significantly affects the performance of the rectangular microstrip antenna structure.

Keywords: ferrite, microstrip antenna, plasma, radiation

Procedia PDF Downloads 315
2008 Nano Gold and Silver for Control of Mosquitoes Manipulating Nanogeometries

Authors: Soam Prakash, Namita Soni

Abstract:

The synthesis of metallic nanoparticles is an active area of academic and, more significantly, applied research in nanotechnology, and nanoparticle research is currently an area of intense scientific interest. Silver (Ag) and gold (Au) nanoparticles (NPs), typically between 1 nm and 100 nm in size, have been the focus of fungus- and plant-based syntheses. Silver and gold have been used in a wide variety of potential applications in the biomedical, optical, and electronic fields, and in the treatment of burns, wounds, and several bacterial infections. There is a crucial need to produce new insecticides that are more environmentally friendly, safe, and target-specific, owing to resistance to and the high cost of organic insecticides. Synthesizing nanoparticles using plants and microorganisms can address this problem by making the nanoparticles more biocompatible. Here we review the mosquitocidal and antimicrobial activity of silver and gold nanoparticles synthesized using fungi and plants as well as bacteria.

Keywords: nano gold, nano silver, malaria, chikungunya, dengue control

Procedia PDF Downloads 424
2007 Fast and Non-Invasive Patient-Specific Optimization of Left Ventricle Assist Device Implantation

Authors: Huidan Yu, Anurag Deb, Rou Chen, I-Wen Wang

Abstract:

The use of left ventricle assist devices (LVADs) has been a proven and effective therapy for patients with severe end-stage heart failure. Due to the limited availability of suitable donor hearts, LVADs will probably become the alternative solution for patients with heart failure in the near future. While the LVAD is being continuously improved toward enhanced performance, increased device durability, and reduced size, a better understanding of implantation management becomes critical in order to achieve better long-term blood supply and fewer post-surgical complications such as thrombus generation. Important issues related to LVAD implantation include the location of the outflow grafting (OG), the angle of the OG, the combination of LVAD and native heart pumping, and uniform versus pulsatile flow at the OG. We have hypothesized that the optimal implantation of an LVAD is patient-specific. To test this hypothesis, we employ a novel in-house computational modeling technique, named InVascular, to conduct a systematic evaluation of cardiac output at the aortic arch, together with other pertinent hemodynamic quantities, for each patient under various implantation scenarios, aiming at an optimal implantation strategy. InVascular is a powerful computational modeling technique that integrates unified mesoscale modeling for both image segmentation and fluid dynamics with cutting-edge GPU parallel computing. It first segments the aortic artery from the patient's CT image, then seamlessly feeds the extracted morphology, together with the velocity waveform from an echocardiographic ultrasound image of the same patient, into the computational model to quantify 4-D (time + space) velocity and pressure fields. Using one NVIDIA Tesla K40 GPU card, InVascular completes a computation from CT image to 4-D hemodynamics within 30 minutes. Thus it has great potential for massive numerical simulation and analysis.
The systematic evaluation for one patient includes three OG anastomosis sites (ascending aorta, descending thoracic aorta, and subclavian artery), three combinations of LVAD and native heart pumping (1:1, 1:2, and 1:3), three angles of OG anastomosis (inclined upward, perpendicular, and inclined downward), and two LVAD inflow conditions (uniform and pulsatile). The optimal LVAD implantation is suggested through a comprehensive analysis of the cardiac output and related hemodynamics from the simulations over the resulting fifty-four scenarios. To confirm the hypothesis, five random patient cases will be evaluated.

Keywords: graphic processing unit (GPU) parallel computing, left ventricle assist device (LVAD), lumped-parameter model, patient-specific computational hemodynamics

Procedia PDF Downloads 129
2006 Diagnosis of Diabetes Using Computer Methods: Soft Computing Methods for Diabetes Detection Using Iris

Authors: Piyush Samant, Ravinder Agarwal

Abstract:

Complementary and Alternative Medicine (CAM) techniques are quite popular and effective for chronic diseases. Iridology is a CAM technique, more than 150 years old, that analyzes the patterns, tissue weakness, color, shape, and structure of the iris for disease diagnosis. The objective of this paper is to validate the use of iridology for the diagnosis of diabetes. The suggested model was applied to a systemic disease with ocular effects. Data from 200 subjects, 100 diabetic and 100 non-diabetic, were evaluated. The complete procedure was kept very simple and free from the involvement of any iridologist. From the normalized iris, the region of interest was cropped. In total, 63 features were extracted using statistical methods, texture analysis, and the two-dimensional discrete wavelet transform. A comparison of the accuracies of six different classifiers is presented. The results show 89.66% accuracy for the random forest classifier.
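As an illustrative sketch only (the paper's actual 63-feature pipeline, wavelet features, and data are not shown in the abstract), first-order statistical features of the kind mentioned can be computed from a cropped region like this; the `roi` pixel intensities below are hypothetical:

```python
import math

def statistical_features(region):
    """Compute simple first-order statistical features (a small subset of
    the 63 features mentioned) from a flattened grayscale region."""
    n = len(region)
    mean = sum(region) / n
    var = sum((p - mean) ** 2 for p in region) / n
    std = math.sqrt(var)
    # Skewness and kurtosis describe the asymmetry / peakedness of the
    # intensity histogram, both common texture descriptors.
    skew = sum((p - mean) ** 3 for p in region) / (n * std ** 3) if std else 0.0
    kurt = sum((p - mean) ** 4 for p in region) / (n * var ** 2) if var else 0.0
    return {"mean": mean, "variance": var, "skewness": skew, "kurtosis": kurt}

# Hypothetical 4x4 cropped iris region, flattened to pixel intensities.
roi = [12, 15, 14, 13, 40, 42, 41, 39, 12, 13, 15, 14, 40, 41, 43, 42]
features = statistical_features(roi)
```

Feature vectors of this kind would then be fed to a classifier such as a random forest; the classification stage itself is omitted here.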

Keywords: complementary and alternative medicine, classification, iridology, iris, feature extraction, disease prediction

Procedia PDF Downloads 394
2005 Fat-Tail Test of Regulatory DNA Sequences

Authors: Jian-Jun Shu

Abstract:

The statistical properties of cis-regulatory modules (CRMs) are explored by estimating the occurrence distribution of similar-word sets. It is observed that CRMs tend to have a fat-tailed distribution of similar-word set occurrence. Thus, a fat-tail test with two fatness coefficients is proposed to distinguish CRMs from non-CRMs, especially from exons. For the first fatness coefficient, the separation accuracy between CRMs and exons is increased compared with the existing content-based CRM prediction method, the fluffy-tail test. For the second fatness coefficient, the computing time is reduced compared with the fluffy-tail test, making it well suited to long sequences and large-database analysis in the post-genome era. Moreover, these indices may be used to predict CRMs which have not yet been observed experimentally. This can serve as a valuable filtering step for experiments.

Keywords: statistical approach, transcription factor binding sites, cis-regulatory modules, DNA sequences

Procedia PDF Downloads 285
2004 New Approach to Construct Phylogenetic Tree

Authors: Ouafae Baida, Najma Hamzaoui, Maha Akbib, Abdelfettah Sedqui, Abdelouahid Lyhyaoui

Abstract:

Numerous scientific works present various methods to analyze data in several domains, especially the comparison of classifications. In our recent work, we presented a new approach to help the user choose the best classification method from the results obtained by each method, based on the distances between the classification trees. The result of our approach was a dendrogram containing the methods as a succession of connections. This approach is much needed in phylogenetic analysis. This discipline analyzes the sequences of biological macromolecules for information on the evolutionary history of living beings, including their relationships. The product of phylogenetic analysis is a phylogenetic tree. In this paper, we recommend a new method of constructing the phylogenetic tree based on the comparison of the different classifications obtained from different molecular genes.

Keywords: hierarchical classification, classification methods, structure of tree, genes, phylogenetic analysis

Procedia PDF Downloads 501
2003 Exploration of Various Metrics for Partitioning of Cellular Automata Units for Efficient Reconfiguration of Field Programmable Gate Arrays (FPGAs)

Authors: Peter Tabatt, Christian Siemers

Abstract:

Using FPGA devices to improve the behavior of time-critical parts of embedded systems has been a proven concept for years. With reconfigurable FPGA devices, the logic blocks can be partitioned and grouped into static and dynamic parts. The dynamic parts can be reloaded 'on demand' at runtime. This work uses cellular automata, which are constructed through compilation from (partially restricted) ANSI C sources, to determine the suitability of various metrics for optimal partitioning. Significant metrics in this case are, for example, the area on the FPGA device for the partition, the pass count for loop constructs, and the communication characteristics to other partitions. With successful partitioning, it is possible to use smaller FPGA devices for the same requirements as with non-reconfigurable FPGA devices or, vice versa, to use the same FPGAs for larger programs.

Keywords: reconfigurable FPGA, cellular automata, partitioning, metrics, parallel computing

Procedia PDF Downloads 261
2002 Investigating the Form of the Generalised Equations of Motion of the N-Bob Pendulum and Computing Their Solution Using MATLAB

Authors: Divij Gupta

Abstract:

Pendular systems have a range of both mathematical and engineering applications, from modelling the behaviour of a continuous mass-density rope to use as tuned mass dampers (TMDs). Thus, it is of interest to study the differential equations governing the motion of such systems. Here we attempt to generalise these equations of motion for the plane compound pendulum with a finite number N of point masses. A Lagrangian approach is taken, and we attempt to find the generalised form of the Euler-Lagrange equations of motion for the i-th bob of the N-bob pendulum. The coordinates are parameterised as angular quantities to reduce the number of degrees of freedom from 2N to N and to simplify the form of the equations. We analyse the form of these equations up to N = 4 to determine the general form of the equation. We also develop a MATLAB program to compute a solution to the system for a given input value of N and a given set of initial conditions.
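The numerical-solution step can be illustrated for the simplest N = 1 case (the abstract's MATLAB program and generalised N-bob equations are not reproduced in the abstract): the single-pendulum Euler-Lagrange equation theta'' = -(g/L) sin(theta), integrated with classical fourth-order Runge-Kutta. The step size and initial conditions below are illustrative choices:

```python
import math

def pendulum_rhs(theta, omega, g=9.81, L=1.0):
    """Euler-Lagrange equation for the single (N = 1) pendulum,
    written as a first-order system: theta' = omega, omega' = -(g/L) sin(theta)."""
    return omega, -(g / L) * math.sin(theta)

def rk4_step(theta, omega, dt):
    """One classical Runge-Kutta (RK4) step for the coupled system."""
    k1t, k1w = pendulum_rhs(theta, omega)
    k2t, k2w = pendulum_rhs(theta + 0.5 * dt * k1t, omega + 0.5 * dt * k1w)
    k3t, k3w = pendulum_rhs(theta + 0.5 * dt * k2t, omega + 0.5 * dt * k2w)
    k4t, k4w = pendulum_rhs(theta + dt * k3t, omega + dt * k3w)
    theta += dt * (k1t + 2 * k2t + 2 * k3t + k4t) / 6
    omega += dt * (k1w + 2 * k2w + 2 * k3w + k4w) / 6
    return theta, omega

# Release from 0.1 rad at rest; with no damping the oscillation amplitude
# stays bounded by the initial angle, a quick sanity check on the integrator.
theta, omega = 0.1, 0.0
for _ in range(1000):
    theta, omega = rk4_step(theta, omega, dt=0.01)
```

For N > 1 the right-hand side becomes a coupled system of N second-order equations, but the same integration scheme applies.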

Keywords: classical mechanics, differential equation, Lagrangian analysis, pendulum

Procedia PDF Downloads 198
2001 Prosthetic Rehabilitation of Midfacial: Nasal Defects

Authors: Bilal Ahmed

Abstract:

Rehabilitation of congenital and acquired maxillofacial defects is always a challenging clinical scenario. These defects pose a major physiological and psychological threat not only to the patient but to the entire family. There have been enormous scientific developments in maxillofacial rehabilitation with the advent of CAD/CAM, 3-D scanning, osseointegrated implants, and improved restorative materials. There are also specialized centers with the latest diagnostic and treatment facilities in developed countries. However, in certain clinical scenarios, conventional prosthodontic principles are still the gold standard. Similarly, in the less developed world, financial and technical constraints are factors affecting treatment planning and final outcomes. Nevertheless, much can be done for affected patients, even with simple and cost-effective conventional prosthodontic techniques and materials. These treatment strategies may sometimes be considered intermediate or temporary options, but with regular follow-up and maintenance they can be used on a definitive basis.

Keywords: maxillofacial defects, obturators, prosthodontics, medical and health sciences

Procedia PDF Downloads 340
2000 A New Computational Package for Using in CFD and Other Problems (Third Edition)

Authors: Mohammad Reza Akhavan Khaleghi

Abstract:

This paper shows changes made to the Reduced Finite Element Method (RFEM) whose result, it is claimed, is the most powerful numerical method proposed so far (some forms of the method are said to approximate even the most complex equations as simply as the Laplace equation!). The Finite Element Method (FEM) is a powerful numerical method that has been used successfully to solve problems in various scientific and engineering fields, including CFD. Many algorithms have been formulated on the basis of FEM, but none are used in popular CFD software; in that domain, the Finite Volume Method (FVM) holds a near monopoly owing to its better efficiency and adaptability to the physics of the problems in comparison with FEM. It does not seem that FEM could compete with FVM unless it were fundamentally changed. This paper presents such changes, and the result is a method claimed to perform better than FVM and other computational methods in all respects; it is intended not to compete with the finite volume method but to replace it.

Keywords: reduced finite element method, new computational package, new finite element formulation, new higher-order form, new isogeometric analysis

Procedia PDF Downloads 108
1999 Low-Cost Fog Edge Computing for Smart Power Management and Home Automation

Authors: Belkacem Benadda, Adil Benabdellah, Boutheyna Souna

Abstract:

The Internet of Things (IoT) is an unprecedented creation. Electronic objects are now able to interact, share, respond, and adapt to their environment on a much larger scale. The spread of these modern means of connectivity, and of solutions with high data-volume exchange, is affecting our ways of life. Accommodation is becoming an intelligent living space, suited not only to the occupants' circumstances and desires but also to system constraints, making daily life simpler and cheaper, increasing possibilities, and achieving a higher level of services and comfort (such as Internet access, teleworking, consumption monitoring, and information search). This paper addresses the design and integration of a smart home and proposes an IoT solution that allows smart power consumption based on measurements from the power grid and deep learning analysis.

Keywords: array sensors, IoT, power grid, FPGA, embedded

Procedia PDF Downloads 110
1998 Challenges in Learning Legal English from the Students’ Perspective at Hanoi Law University

Authors: Nhac Thanh Huong

Abstract:

Legal English, also known as the Language of the Law (Mellinkoff, 2004), is an indispensable factor contributing to the development of the legal field. At Hanoi Law University, legal English is a compulsory subject in the syllabus of the legal English major, the International Trade Law program, and the fast-track law training program. The question of what obstacles students face when dealing with legal English, however, has not been answered at that institution. Therefore, the present research, which uses survey questionnaires as its main method, aims to study the challenges of learning legal English from the students' perspective, from which some useful solutions are drawn to overcome these difficulties and improve the effectiveness of learning legal English. The results indicate notable difficulties arising from the level of general English skills, the characteristics of legal English, and legal background knowledge. These findings lay a scientific foundation for suggesting solutions for practical application in the teaching and learning of legal English by both teachers and students.

Keywords: challenges, HLU, Legal English, students' perspective

Procedia PDF Downloads 187
1997 Some Discrepancies between Experimentally-Based Theory of Toxic Metals Combined Action and Actual Approaches to Occupational and Environmental Health Risk Assessment and Management

Authors: Ilzira A. Minigalieva

Abstract:

The assessment of cumulative health risks associated with the widely observed combined exposures to two or more metals and their compounds in industrial or general environments, as well as the respective regulatory and technical risk-management decision-making, presumably rest on the theoretical and experimental toxicology of mixtures as their scientific basis. Analysis of the relevant literature and our own experience proves, however, that there is no full match between these different practices. Moreover, some of the contradictions between them are of a fundamental nature. This unsatisfactory state of affairs may be explained not only by the unavoidable simplifications characteristic of the methodologies of risk assessment and permissible-exposure standard setting but also by the extreme intrinsic complexity of the theory of combined toxicity, the most essential issues of which are considered and briefly discussed in this paper.

Keywords: toxic metals, nanoparticles, typology of combined toxicity, mathematical modeling, health risk assessment and management

Procedia PDF Downloads 322
1996 Wearable Music: Generation of Costumes from Music and Generative Art and Wearing Them by 3-Way Projectors

Authors: Noriki Amano

Abstract:

The final goal of this study is to create another way for people to enjoy music, through the performance of 'Wearable Music'. Concretely speaking, we generate colorful costumes from music in real time and realize their 'dressing' by projecting them onto a person. For this purpose, we propose three methods in this study: first, a method of giving color to music in a three-dimensional way; second, a method of generating images of costumes from music; and third, a method of wearing the images of music. In particular, this study stands out from related work in that we generate images of unique costumes from music and make them wearable. We use generative-art techniques to generate the images of the unique costumes and project the images onto fog generated around a person from three directions using projectors. This study offers a way to enjoy music as something 'wearable', and points toward unconventional entertainment based on the fusion of music and costumes.

Keywords: entertainment computing, costumes, music, generative programming

Procedia PDF Downloads 167
1995 DAG Design and Tradeoff for Full Live Virtual Machine Migration over XIA Network

Authors: Dalu Zhang, Xiang Jin, Dejiang Zhou, Jianpeng Wang, Haiying Jiang

Abstract:

The traditional TCP/IP network is showing many shortcomings, and research on future networks is becoming a hotspot. FIA (Future Internet Architecture) and FIA-NP (Next Phase) are supported by the US NSF for future Internet design. Moreover, virtual machine migration is a significant technique in cloud computing. As a network application, it should also be supported in XIA (eXpressive Internet Architecture), which is part of both the FIA and FIA-NP projects. This paper is an experimental study that aims to verify the feasibility of VM migration over XIA. We present three ways to maintain VM connectivity and communication state, concerning DAG design and routing-table modification. VM migration experiments are conducted intra-AD and inter-AD with KVM instances. The procedure is achieved by a migration control protocol suited to the characteristics of XIA. Evaluation results show that our solutions can well support full live VM migration over an XIA network while keeping services seamless.

Keywords: DAG, downtime, virtual machine migration, XIA

Procedia PDF Downloads 849
1994 A Conceptual Framework of Digital Twin for Homecare

Authors: Raja Omman Zafar, Yves Rybarczyk, Johan Borg

Abstract:

This article proposes a conceptual framework for the application of digital twin technology in home care. The main goal is to bridge the gap between advanced digital twin concepts and their practical implementation in home care. This study uses a literature review and thematic analysis approach to synthesize existing knowledge and proposes a structured framework suitable for homecare applications. The proposed framework integrates key components such as IoT sensors, data-driven models, cloud computing, and user interface design, highlighting the importance of personalized and predictive homecare solutions. This framework can significantly improve the efficiency, accuracy, and reliability of homecare services. It paves the way for the implementation of digital twins in home care, promoting real-time monitoring, early intervention, and better outcomes.

Keywords: digital twin, homecare, older adults, healthcare, IoT, artificial intelligence

Procedia PDF Downloads 52
1993 Generation Y in Organizations: Distinctive Characteristics and Behavior at Work of Moroccan YERs

Authors: Fatima Ezzahra Siragi, Omar Benaini

Abstract:

For many years, Generation Y has been at the center of controversy. The topic has generated buzz in the media as well as in the scientific literature. Previous research has led to contradictory results; some scholars consider this population an asset for companies, while others believe it constitutes a young danger in need of proper management. The existing literature has almost exclusively studied Generation Y in developed countries; very few studies have been conducted in developing countries, and to our knowledge no published articles have treated Generation Y in Morocco. The purpose of this research is to examine the distinctive characteristics of Generation Y in Morocco as well as their behavior at work. Using a quantitative method, the study was conducted on a sample of 250 Moroccan employees who have a high educational level and belong to Generation Y. Our results show a strong resemblance between Moroccan and Western Yers (France, USA, Canada, etc.).

Keywords: behavior in organizations, Generation Y, key characteristics, Moroccan Yers, motivation

Procedia PDF Downloads 273
1992 High Efficiency Double-Band Printed Rectenna Model for Energy Harvesting

Authors: Rakelane A. Mendes, Sandro T. M. Goncalves, Raphaella L. R. Silva

Abstract:

The concepts of energy harvesting and wireless energy transfer have been widely discussed in recent times. There are several ways to create autonomous systems that collect ambient energy, such as solar, vibratory, thermal, electromagnetic, and radiofrequency (RF) energy, among others. In the case of RF, it is possible to collect up to 100 μW/cm². To collect and/or transfer energy in RF systems, a device called a rectenna is used, defined as the junction of an antenna and a rectifier circuit. The rectenna presented in this work is resonant at the frequencies of 1.8 GHz and 2.45 GHz. Frequencies in the 1.8 GHz band are part of the GSM/LTE band. GSM (Global System for Mobile Communications) is a mobile-telephony frequency band, also called second-generation (2G) mobile networks; it standardized mobile telephony worldwide and was originally developed for voice traffic. LTE (Long Term Evolution), or fourth generation (4G), emerged to meet the demand for wireless access to services such as Internet access, online games, VoIP, and video conferencing. The 2.45 GHz frequency is part of the ISM (Industrial, Scientific and Medical) band, which is internationally reserved for industrial, scientific, and medical development with no need for licensing; its only restrictions relate to maximum transferred power and bandwidth, which must be kept within certain limits (in Brazil the band is 2.4-2.4835 GHz). The rectenna presented in this work was designed to show efficiency above 50% for an input power of -15 dBm. It is known that for wireless energy-capture systems the signal power is very low and varies greatly; for this reason, this ultra-low input power was chosen. The rectenna was built using the low-cost FR4 (flame-retardant) substrate, and the selected antenna is a microstrip antenna consisting of a meandered dipole, optimized using the CST Studio software.
This antenna has high efficiency, high gain, and high directivity. Gain measures how efficiently an antenna captures the signals transmitted by another antenna and/or station. Directivity is the quality an antenna has of capturing energy better in a certain direction. The rectifier circuit used has a series topology and was optimized using Keysight's ADS software. The rectifier circuit is the most complex part of the rectenna, since it includes the diode, a non-linear component. The chosen diode is the Schottky diode SMS7630, which presents a low barrier voltage (between 135 and 240 mV) and a wider band compared to other types of diodes; these attributes make it well suited to this type of application. The rectifier circuit also uses an inductor and a capacitor, which form part of its input and output filters. The inductor reduces the effect of dispersion on the efficiency of the rectifier circuit. The capacitor removes the AC component of the rectified signal, smoothing the output.

Keywords: dipole antenna, double-band, high efficiency, rectenna

Procedia PDF Downloads 114
1991 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression

Authors: Anne M. Denton, Rahul Gomes, David W. Franzen

Abstract:

High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic to the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. 
The relevant length scale is taken to be half of the window size over which the minimum variance was achieved. The resulting process was evaluated for 1-meter DEM data and for artificial data constructed to have defined length scales with added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within the region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 windows in ESRI ArcMap for each resolution. In summary, the proposed approach extracts the slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than with existing techniques.
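The minimum-variance scale-selection idea can be sketched in a simplified one-dimensional form (the paper works on 2-D rasters with an additive aggregation scheme; the profile, the doubling rule, and the window placement below are illustrative assumptions):

```python
def fit_slope_and_variance(points):
    """Least-squares slope of elevation over index, and residual variance."""
    n = len(points)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(points) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, points))
    slope = sxy / sxx
    resid = [y - (my + slope * (x - mx)) for x, y in zip(xs, points)]
    return slope, sum(r * r for r in resid) / n

def scale_adaptive_slope(profile, center):
    """Double the window around `center` at each iteration and keep the
    slope at the scale with minimal residual variance, mimicking the
    paper's selection rule in one dimension."""
    best = None
    size = 2
    while size <= len(profile):
        lo = max(0, center - size // 2)
        slope, var = fit_slope_and_variance(profile[lo:lo + size])
        if best is None or var < best[1]:
            best = (slope, var, size)
        size *= 2
    return best  # (slope, variance at that scale, window size)

# A noiseless linear ramp: every scale fits perfectly and the true slope
# of 0.5 is recovered with (near-)zero residual variance.
ramp = [0.5 * i for i in range(64)]
slope, var, size = scale_adaptive_slope(ramp, center=32)
```

In the paper's 2-D setting the regression parameters are accumulated additively over non-overlapping 2x2 aggregations, which is what makes the doubling sweep a single pass per level.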

Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression

Procedia PDF Downloads 122
1990 Managers’ Mobile Information Behavior in an Openness Paradigm Era

Authors: Abd Latif Abdul Rahman, Zuraidah Arif, Muhammad Faizal Iylia, Mohd Ghazali, Asmadi Mohammed Ghazali

Abstract:

Mobile information is a significant access point for human information activities. Theories and models of human information behavior have developed over several decades but have not yet considered the role of the user's computing device in digital information interactions. This paper reviews the literature leading to the development of a conceptual framework for a study of managers' mobile information behavior. Based on the literature review, dimensions of mobile information behavior are identified, namely information needs, information access, information retrieval, and information use. The study is significant for understanding the nature of managers' behavior in searching, retrieving, and using information via mobile devices. Secondly, the study suggests various kinds of mobile applications that organizations can provide for their staff to improve their services.

Keywords: mobile information behavior, information behavior, mobile information, mobile devices

Procedia PDF Downloads 341
1989 Household Food Wastage Assessment: A Case Study in South Africa

Authors: Fhumulani R. Ramukhwatho, Roelien du Plessis, Suzan H. H. Oelofse

Abstract:

A growing number of scientific papers, journals, and reports address household food waste, because food waste has become a significant global issue costing billions of Rands in resources. Reducing food waste in a sustainable manner requires an understanding of how food waste is generated. This paper assesses household food wastage in the City of Tshwane Metropolitan Municipality (CTMM). A total of 210 participants were interviewed face-to-face using a structured questionnaire, and the food actually wasted by households was quantified using a kitchen weighing scale. Fifty-nine percent of respondents agreed that they wasted food, while 41% thought they did not waste food at all. Households wasted an average of 6 kg of food per week per household. The study concluded that households buy and prepare more food than needed, which ends up wasted.

Keywords: assessment, developing country, food waste, household

Procedia PDF Downloads 316
1988 Evaluating Service Trustworthiness for Service Selection in Cloud Environment

Authors: Maryam Amiri, Leyli Mohammad-Khanli

Abstract:

Cloud computing is becoming increasingly popular, and more business applications are moving to the cloud. In this regard, the number of services providing similar functional properties is increasing, so the ability to select the service with the best non-functional properties, corresponding to the user preference, is necessary for the user. This paper presents an Evaluation Framework of Service Trustworthiness (EFST) that evaluates the trustworthiness of functionally equivalent services without requiring additional invocations of them. EFST extracts the user preference automatically. It then assesses the trustworthiness of services along two dimensions, qualitative and quantitative metrics, based on past usage experiences of the services. Finally, EFST determines the overall trustworthiness of each service using a Fuzzy Inference System (FIS). The results of experiments and simulations show that EFST predicts missing Quality of Service (QoS) values better than competing approaches and guides users toward selecting the most appropriate services.
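The abstract does not spell out EFST's membership functions or rule base. A minimal sketch of the general pattern, Mamdani-style fuzzy inference aggregating QoS metrics into a trust score, is shown below; the two metrics (availability and a performance score), the triangular memberships, and the rule centroids are all illustrative assumptions, not the EFST design itself.

```python
# Minimal fuzzy-inference sketch for aggregating QoS metrics into a trust
# score. Metric names, membership shapes, and the rule base are assumptions
# for illustration, not the EFST design from the paper.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(x):
    """Degrees of membership in 'low', 'medium', 'high' on [0, 1]."""
    return {
        "low": tri(x, -0.5, 0.0, 0.5),
        "medium": tri(x, 0.0, 0.5, 1.0),
        "high": tri(x, 0.5, 1.0, 1.5),
    }

# Rule base: (availability level, performance level) -> trust-level centroid.
RULES = {
    ("high", "high"): 0.9, ("high", "medium"): 0.7, ("high", "low"): 0.5,
    ("medium", "high"): 0.7, ("medium", "medium"): 0.5, ("medium", "low"): 0.3,
    ("low", "high"): 0.4, ("low", "medium"): 0.3, ("low", "low"): 0.1,
}

def trust(availability, performance):
    """Mamdani-style inference with weighted-centroid defuzzification."""
    mu_a, mu_p = fuzzify(availability), fuzzify(performance)
    num = den = 0.0
    for (la, lp), centroid in RULES.items():
        w = min(mu_a[la], mu_p[lp])   # rule firing strength
        num += w * centroid
        den += w
    return num / den if den else 0.0

print(round(trust(0.95, 0.90), 3))  # reliable, fast service
print(round(trust(0.30, 0.40), 3))  # unreliable, slow service
```

A full EFST-style system would add more metrics and fuzzy sets, but the aggregation step follows this shape: fuzzify each metric, fire the rules, and defuzzify into one comparable score per service.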

Keywords: user preference, cloud service, trustworthiness, QoS metrics, prediction

Procedia PDF Downloads 283
1987 Multi-Scaled Non-Local Means Filter for Medical Images Denoising: Empirical Mode Decomposition vs. Wavelet Transform

Authors: Hana Rabbouch

Abstract:

In recent years, there has been considerable growth in denoising techniques devoted mainly to medical imaging. This important evolution is due not only to the progress of computing techniques, but also to the emergence of multi-resolution analysis (MRA) on both mathematical and algorithmic bases. In this paper, a comparative study is conducted between the two best-known MRA-based decomposition techniques: the Empirical Mode Decomposition (EMD) and the Discrete Wavelet Transform (DWT). The comparison is carried out in a framework of multi-scale denoising, where a Non-Local Means (NLM) filter is applied scale-by-scale to a sample of benchmark medical images. The results demonstrate the effectiveness of multi-scale denoising, especially when the NLM filtering is coupled with the EMD.
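The scale-by-scale idea can be sketched compactly in 1-D: decompose the signal into bands, run NLM on each band separately, and reconstruct. The sketch below uses a single-level Haar transform and a basic NLM filter with NumPy; the Haar choice, 1-D setting, and parameter values are illustrative assumptions, not the paper's EMD/DWT configuration on medical images.

```python
import numpy as np

def haar_dwt(x):
    """One-level 1-D Haar transform: approximation and detail bands."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)
    return a, d

def haar_idwt(a, d):
    """Inverse of haar_dwt (exact for even-length inputs)."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def nlm_1d(x, patch=5, search=15, h=0.4):
    """Basic non-local means: each sample becomes a weighted average of
    nearby samples whose surrounding patches look similar."""
    n, pad = len(x), patch // 2
    xp = np.pad(x, pad, mode="reflect")
    patches = np.stack([xp[i:i + patch] for i in range(n)])
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - search), min(n, i + search + 1)
        d2 = np.mean((patches[lo:hi] - patches[i]) ** 2, axis=1)
        w = np.exp(-d2 / (h * h))          # patch-similarity weights
        out[i] = np.sum(w * x[lo:hi]) / np.sum(w)
    return out

def multiscale_nlm(x):
    """Decompose, filter each band separately, reconstruct."""
    a, d = haar_dwt(x)
    return haar_idwt(nlm_1d(a), nlm_1d(d))

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 512)
clean = np.sin(t)
noisy = clean + 0.3 * rng.standard_normal(512)
denoised = multiscale_nlm(noisy)
print("noisy MSE:", np.mean((noisy - clean) ** 2))
print("denoised MSE:", np.mean((denoised - clean) ** 2))
```

Filtering per band lets the detail band, which is noise-dominated here, be smoothed aggressively while the approximation band keeps the signal's shape; swapping the Haar step for EMD sifting yields the EMD variant the paper favors.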

Keywords: medical imaging, non-local means, denoising, multiscale analysis, empirical mode decomposition, wavelets

Procedia PDF Downloads 134
1986 Cryptography and Cryptosystem a Panacea to Security Risk in Wireless Networking

Authors: Modesta E. Ezema, Chikwendu V. Alabekee, Victoria N. Ishiwu, Ifeyinwa NwosuArize, Chinedu I. Nwoye

Abstract:

The advent of wireless networking in computing technology cannot be overemphasized: it opened up easy access to information resources, made networking easier, and brought internet accessibility to our doorsteps. Despite all this, it also introduced problems that are causing havoc in today's information security. Cyber criminals will always compromise the integrity of a message that is not encrypted or that is encrypted with a weak algorithm. To address this, the study focuses on cryptosystems and cryptography, which ensure end-to-end encrypted messaging. Various cryptographic algorithms, as well as the techniques and applications of cryptography, were considered for efficiency, along with present and future applications of cryptography; quantum cryptography is presented as the current and future direction in the development of the field. An empirical study was conducted to collect data from network users.

Keywords: algorithm, cryptography, cryptosystem, network

Procedia PDF Downloads 339
1985 Using Teachers' Perceptions of Science Outreach Activities to Design an 'Optimum' Model of Science Outreach

Authors: Victoria Brennan, Andrea Mallaburn, Linda Seton

Abstract:

Science outreach programmes connect school pupils with external agencies to provide activities and experiences that enhance their exposure to science. It can be argued that these programmes not only aim to support teachers with curriculum engagement and promote scientific literacy but also provide pivotal opportunities to spark scientific interest in students. A further objective of these programmes is to increase awareness of career opportunities within this field. Although outreach work is often described as a fun and satisfying venture, many researchers express caution about how successful these processes are at increasing post-16 engagement in science. When researching the impact of outreach programmes, it is often student feedback on the activities, or enrolment numbers in particular post-16 science courses, that is generated and analysed. Although this is informative, the longevity of a programme's impact could be better informed by teachers' perceptions, the evidence for which is far more limited in the literature. In addition, there are strong suggestions that teachers can have an indirect impact on a student's own self-concept. These themes shape the focus and importance of this ongoing research project, as it presents the rationale that teachers are under-used resources when it comes to designing science outreach programmes. The end result of the research will be the presentation of an 'optimum' model of outreach, which should be of interest to wider stakeholders such as universities and private or government organisations that design science outreach programmes in the hope of recruiting future scientists. During phase one, questionnaires (n=52) and interviews (n=8) generated both quantitative and qualitative data.
These have been analysed using the Wilcoxon non-parametric test to compare teachers’ perceptions of science outreach interventions and thematic analysis for open-ended questions. Both of these research activities provide an opportunity for a cross-section of teacher opinions of science outreach to be obtained across all educational levels. Therefore, an early draft of the ‘optimum’ model of science outreach delivery was generated using both the wealth of literature and primary data. This final (ongoing) phase aims to refine this model using teacher focus groups to provide constructive feedback about the proposed model. The analysis uses principles of modified Grounded Theory to ensure that focus group data is used to further strengthen the model. Therefore, this research uses a pragmatist approach as it aims to focus on the strengths of the different paradigms encountered to ensure the data collected will provide the most suitable information to create an improved model of sustainable outreach. The results discussed will focus on this ‘optimum’ model and teachers’ perceptions of benefits and drawbacks when it comes to engaging with science outreach work. Although the model is still a ‘work in progress’, it provides both insight into how teachers feel outreach delivery can be a sustainable intervention tool within the classroom and what providers of such programmes should consider when designing science outreach activities.

Keywords: educational partnerships, science education, science outreach, teachers

Procedia PDF Downloads 117
1984 Variability of L-Band GPS Scintillation over Auroral Region, Maitri, Antarctica

Authors: Prakash Khatarkar, P. A. Khan, Shweta Mukherjee, Roshni Atulkar, P. K. Purohit, A. K. Gwal

Abstract:

We have investigated the occurrence characteristics of ionospheric scintillations using a dual-frequency GPS receiver installed and operated at the Indian scientific base station Maitri (71.45°S, 11.45°E), Antarctica, from December 2009 to December 2010. The scintillation morphology is described in terms of the S4 index. Scintillations are classified into four main categories, weak, moderate, strong, and saturated, according to S4 thresholds ranging from 0.2 up to values above 1.0. From the analysis we found that the percentages of weak, moderate, strong and saturated scintillations were 96%, 80%, 58% and 7%, respectively. The maximum occurrence of all types of scintillation was observed in the summer season, followed by equinox, with the least in the winter season. As 2010 was a period of low solar activity, the scintillations observed were predominantly weak and moderate, and only four cases of saturated scintillation were recorded.
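The classification step described above can be sketched as a simple threshold function over S4 values. The band boundaries below (0.2, 0.4, 0.6, 1.0) are assumed illustrative cut-offs of the kind commonly used in scintillation studies, not necessarily the exact values from this paper, and the sample data is invented.

```python
# Illustrative S4-index classifier with assumed category boundaries.

def classify_s4(s4):
    """Map an S4 value to a scintillation category, or None below threshold."""
    if s4 <= 0.2:
        return None            # below the scintillation threshold
    if s4 <= 0.4:
        return "weak"
    if s4 <= 0.6:
        return "moderate"
    if s4 <= 1.0:
        return "strong"
    return "saturated"

def occurrence_percentages(s4_values):
    """Percentage of scintillating epochs falling in each category."""
    events = [c for c in map(classify_s4, s4_values) if c]
    return {cat: 100.0 * sum(c == cat for c in events) / len(events)
            for cat in ("weak", "moderate", "strong", "saturated")}

# Hypothetical sample of S4 readings from one observation session.
sample = [0.1, 0.25, 0.3, 0.45, 0.55, 0.7, 0.9, 1.2]
print(occurrence_percentages(sample))
```

In a study like this one, the same tally would be run per month or per season to produce the seasonal occurrence statistics reported in the abstract.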

Keywords: L-band scintillation, GPS, auroral region, low solar activity

Procedia PDF Downloads 643
1983 Linguistics and Grammar Conceptions - An Honor to Ferdinand de Saussure

Authors: Adriana Aparecida Rodrigues Leite

Abstract:

Conceptions of linguistics and grammar are necessary to comprehend the structure of a language. On one hand, grammar depicts structural rules and instructions; on the other, linguistics is a science that seeks to understand the changes that occur in a language. Throughout his book Cours de linguistique générale (Course in General Linguistics), Ferdinand de Saussure developed theories which define linguistics as the true object of study of a language, in contrast to grammar, which Saussure saw as an element without a scientific basis. By these means, this research aims to determine whether Ferdinand de Saussure would be against the conceptions and rules proposed by grammar. The paper adopts an exploratory approach, posing concrete principles and facts to provide a response to the problem. The research is divided into the following sections: Introduction, Ferdinand de Saussure, Linguistics Conceptions, Linguistics for Saussure, Grammar Conceptions, and Grammar for Saussure. The result obtained from the analysis of the problem is highlighted in the Final Considerations section.

Keywords: linguistics, grammar, ferdinand de saussure, language

Procedia PDF Downloads 9
1982 Bee Products Development and Innovation

Authors: Hasan Vural

Abstract:

In this study, the concept of innovation is explained first; the basic concepts of innovation and new food product development in the marketing of bee products are then investigated, and examples of the application of research results are presented. The subject is discussed drawing on scientific studies identified through a literature review. Innovation is widely recognised as important to commercial success in the food industry, as both a major source of competitive advantage and the creation of a company's future. However, the new product development process is fraught with failures, with only approximately 10% of new products remaining on the market a year after commercialisation. In addition, for every new food product that does reach commercialisation, there are likely to be many concepts that are rejected during the new food product development process. No roadmap exactly describes a route to a goal: exhortations to follow '10 Steps to a Successful Product' or to use 'Smith's Method to Do Successful Products' are therefore all approximations. Roadmaps do not describe the actual journey, only the general direction.

Keywords: innovation, agrofood product development, beekeeping products, honey marketing

Procedia PDF Downloads 406
1981 Multi-Level Priority Based Task Scheduling Algorithm for Workflows in Cloud Environment

Authors: Anju Bala, Inderveer Chana

Abstract:

Task scheduling is a key concern for the execution of performance-driven workflow applications. As efficient scheduling can have a major impact on system performance, task scheduling is often used to assign requests to resources efficiently based on cloud resource characteristics. In this paper, a priority-based task scheduling algorithm is proposed that prioritizes tasks based on the length of their instructions. The proposed approach prioritizes the tasks of cloud applications according to limits set by six-sigma control charts based on dynamic threshold values. The proposed algorithm has been validated through the CloudSim toolkit. Experimental results demonstrate that the algorithm is effective for handling multiple task lists from workflows and considerably reduces makespan and execution time.
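The abstract does not give the exact control-chart rule, so the following is a minimal sketch of one plausible reading: compute dynamic thresholds (mean ± k·sigma of the current task list's instruction lengths, in the spirit of control-chart limits) and bucket tasks into priority classes. The instruction counts, the value of k, and the choice to favor short tasks are all assumptions for illustration.

```python
import statistics

def prioritize(task_lengths, k=1.0):
    """Split tasks into three priority lists using dynamic thresholds
    mean +/- k*sigma, recomputed from the current workflow's task list.
    Short tasks are given high priority here; this direction is an
    assumption, as the abstract does not state the ordering rule."""
    mu = statistics.fmean(task_lengths)
    sigma = statistics.pstdev(task_lengths)
    lo, hi = mu - k * sigma, mu + k * sigma
    high = sorted(t for t in task_lengths if t < lo)
    medium = sorted(t for t in task_lengths if lo <= t <= hi)
    low = sorted(t for t in task_lengths if t > hi)
    return high, medium, low

# Hypothetical instruction counts (e.g., millions of instructions per task).
lengths = [400, 420, 450, 480, 500, 2000, 2200]
high, medium, low = prioritize(lengths)
print("high:", high)
print("medium:", medium)
print("low:", low)
```

Because the limits are recomputed per task list, the thresholds adapt to each workflow's length distribution, which is what makes them "dynamic"; a CloudSim experiment would then dispatch the high-priority queue to VMs first.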

Keywords: cloud computing, priority based scheduling, task scheduling, VM allocation

Procedia PDF Downloads 512