Search results for: digital image watermarking
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2266

106 Respirator System For Total Liquid Ventilation

Authors: Miguel A. Gómez, Enrique Hilario, Francisco J. Alvarez, Elena Gastiasoro, Antonia Alvarez, Juan L. Larrabe

Abstract:

Total liquid ventilation can support gas exchange in animal models of lung injury. Clinical application awaits further technical improvements and performance verification. Our aim was to develop a liquid ventilator able to deliver accurate tidal volumes, and a computerized system for measuring lung mechanics. The computer-assisted, piston-driven respirator controlled ventilatory parameters that were displayed and modified on a real-time basis. Pressure and temperature transducers, along with a linear displacement controller, provided the necessary signals to calculate lung mechanics. Ten newborn lambs (<6 days old) with respiratory failure induced by lung lavage were monitored using the system. Electromechanical, hydraulic and data acquisition/analysis components of the ventilator were developed and tested in animals with respiratory failure. All pulmonary signals were collected synchronously in time, displayed in real time, and archived on digital media. The total mean error (due to transducers, A/D conversion, amplifiers, etc.) was less than 5% compared to calibrated signals. Improvements in gas exchange and lung mechanics were observed during liquid ventilation, without impairment of cardiovascular profiles. The total liquid ventilator maintained accurate control of tidal volumes and the sequencing of inspiration/expiration. The computerized system demonstrated its ability to monitor in vivo lung mechanics, providing valuable data for early decision-making.
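
As a minimal illustration of the kind of lung-mechanics quantities such a monitoring system computes from pressure, volume and flow signals, the sketch below evaluates the standard single-compartment formulas for compliance and resistance. It is a textbook approximation, not the authors' implementation; the function name and example values are hypothetical.

```python
# Hypothetical sketch: standard single-compartment estimates of respiratory
# mechanics from ventilator signals (not the authors' exact computation).
def lung_mechanics(tidal_volume_ml, p_peak, p_plateau, peep, insp_flow_l_s):
    """Return static compliance (mL/cmH2O) and airway resistance (cmH2O/L/s)."""
    compliance = tidal_volume_ml / (p_plateau - peep)      # volume per unit distending pressure
    resistance = (p_peak - p_plateau) / insp_flow_l_s      # resistive pressure drop over flow
    return compliance, resistance

print(lung_mechanics(tidal_volume_ml=60, p_peak=25, p_plateau=20, peep=5, insp_flow_l_s=0.5))
```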

Keywords: immature lamb, perfluorocarbon, pressure-limited, total liquid ventilation, ventilator, volume-controlled

105 RRNS-Convolutional Concatenated Code for OFDM based Wireless Communication with Direct Analog-to-Residue Converter

Authors: Shahana T. K., Babita R. Jose, K. Poulose Jacob, Sreela Sasi

Abstract:

The modern telecommunication industry demands higher capacity networks with high data rates. Orthogonal frequency division multiplexing (OFDM) is a promising technique for high data rate wireless communications at reasonable complexity in wireless channels. OFDM has been adopted for many types of wireless systems, such as wireless local area networks (e.g., IEEE 802.11a) and digital audio/video broadcasting (DAB/DVB). The proposed research focuses on a concatenated coding scheme that improves the performance of OFDM based wireless communications. It uses a Redundant Residue Number System (RRNS) code as the outer code and a convolutional code as the inner code. Here, a direct conversion of the analog signal to the residue domain is done to reduce the conversion complexity, using a sigma-delta based parallel analog-to-residue converter. The bit error rate (BER) performances of the proposed system under different channel conditions are investigated. These include the effect of additive white Gaussian noise (AWGN), multipath delay spread, peak power clipping and frame start synchronization error. The simulation results show that the proposed RRNS-Convolutional concatenated coding (RCCC) scheme provides significant improvement in system performance by exploiting the inherent properties of RRNS.
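
To make the outer-code idea concrete, the sketch below shows how an integer value is represented by its residues with respect to a set of information moduli plus redundant moduli, and how the value is recovered with the Chinese Remainder Theorem. The moduli values are arbitrary examples, and this sketch does not reproduce the paper's converter or decoder.

```python
# Minimal sketch of a Redundant Residue Number System (RRNS) encoding;
# moduli values below are illustrative only.
MODULI = [7, 11, 13]          # information moduli (pairwise coprime)
REDUNDANT = [17, 19]          # redundant moduli add error-detection capability

def to_rrns(x, moduli=MODULI + REDUNDANT):
    """Represent integer x by its residues modulo each modulus."""
    return [x % m for m in moduli]

def crt_decode(residues, moduli):
    """Chinese Remainder Theorem reconstruction from (residue, modulus) pairs."""
    M = 1
    for m in moduli:
        M *= m
    x = 0
    for r, m in zip(residues, moduli):
        Mi = M // m
        x += r * Mi * pow(Mi, -1, m)   # modular inverse (Python 3.8+)
    return x % M

codeword = to_rrns(1000)
# Decoding from the information moduli alone recovers the value when it is in range.
print(codeword, crt_decode(codeword[:3], MODULI))
```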

Keywords: Analog-to-residue converter, Concatenated codes, OFDM, Redundant Residue Number System, Sigma-delta modulator, Wireless communication

104 Time Series Simulation by Conditional Generative Adversarial Net

Authors: Rao Fu, Jie Chen, Shutian Zeng, Yiping Zhuang, Agus Sudjianto

Abstract:

Generative Adversarial Net (GAN) has proved to be a powerful machine learning tool in image data analysis and generation. In this paper, we propose to use Conditional Generative Adversarial Net (CGAN) to learn and simulate time series data. The conditions include both categorical and continuous variables with different auxiliary information. Our simulation studies show that CGAN has the capability to learn different types of normal and heavy-tailed distributions, as well as dependent structures of different time series. It also has the capability to generate conditional predictive distributions consistent with training data distributions. We also provide an in-depth discussion on the rationale behind GAN and the neural networks as hierarchical splines to establish a clear connection with existing statistical methods of distribution generation. In practice, CGAN has a wide range of applications in market risk and counterparty risk analysis: it can be applied to learn historical data and generate scenarios for the calculation of Value-at-Risk (VaR) and Expected Shortfall (ES), and it can also predict the movement of the market risk factors. We present a real data analysis including a backtesting to demonstrate that CGAN can outperform Historical Simulation (HS), a popular method in market risk analysis to calculate VaR. CGAN can also be applied in economic time series modeling and forecasting. In this regard, we have included an example of hypothetical shock analysis for economic models and the generation of potential CCAR scenarios by CGAN at the end of the paper.
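
The sketch below shows the core CGAN mechanism the abstract relies on: the condition vector is concatenated to the generator's noise input and to the discriminator's input, and the two networks are trained adversarially. It is an illustrative PyTorch toy, assuming PyTorch is available; the layer sizes, conditioning variables and training details of the paper are not reproduced here.

```python
# Minimal conditional GAN sketch for 1-D series (illustrative only).
import torch
import torch.nn as nn

NOISE_DIM, COND_DIM, SERIES_LEN = 16, 4, 32   # hypothetical sizes

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(NOISE_DIM + COND_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, SERIES_LEN))
    def forward(self, z, cond):
        return self.net(torch.cat([z, cond], dim=1))   # condition concatenated to the noise

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(SERIES_LEN + COND_DIM, 64), nn.ReLU(),
                                 nn.Linear(64, 1))
    def forward(self, x, cond):
        return self.net(torch.cat([x, cond], dim=1))

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

def train_step(real_series, cond):
    z = torch.randn(real_series.size(0), NOISE_DIM)
    fake = G(z, cond)
    # Discriminator update: real vs. generated samples under the same condition.
    d_loss = bce(D(real_series, cond), torch.ones(real_series.size(0), 1)) + \
             bce(D(fake.detach(), cond), torch.zeros(real_series.size(0), 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()
    # Generator update: try to fool the discriminator.
    g_loss = bce(D(fake, cond), torch.ones(real_series.size(0), 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```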

Keywords: Conditional Generative Adversarial Net, market and credit risk management, neural network, time series.

103 Infrared Lightbox and iPhone App for Improving Detection Limit of Phosphate Detecting Dip Strips

Authors: H. Heidari-Bafroui, B. Ribeiro, A. Charbaji, C. Anagnostopoulos, M. Faghri

Abstract:

In this paper, we report the development of a portable and inexpensive infrared lightbox for improving the detection limits of paper-based phosphate devices. Commercial paper-based devices utilize the molybdenum blue protocol to detect phosphate in the environment. Although these devices are easy to use and have a long shelf life, their main deficiency is their low sensitivity based on the qualitative results obtained via a color chart. To improve the results, we constructed a compact infrared lightbox that communicates wirelessly with a smartphone. The system measures the absorbance of radiation for the molybdenum blue reaction in the infrared region of the spectrum. It consists of a lightbox illuminated by four infrared light-emitting diodes, an infrared digital camera, a Raspberry Pi microcontroller, a mini-router, and an iPhone to control the microcontroller. An iPhone application was also developed to analyze images captured by the infrared camera in order to quantify phosphate concentrations. Additionally, the app connects to an online data center to present a highly scalable worldwide system for tracking and analyzing field measurements. In this study, the detection limits for two popular commercial devices were improved by a factor of 4 for the Quantofix devices (from 1.3 ppm using visible light to 300 ppb using infrared illumination) and a factor of 6 for the Indigo units (from 9.2 ppm to 1.4 ppm) with repeatability of less than or equal to 1.2% relative standard deviation (RSD). The system also provides more granular concentration information compared to the discrete color chart used by commercial devices and it can be easily adapted for use in other applications.
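
A minimal sketch of the colorimetric quantification step described above is given below: the mean pixel intensity of the reaction zone is converted to absorbance via the Beer-Lambert relation and mapped to a concentration through a calibration curve. The calibration numbers are made up for illustration and are not the paper's data.

```python
# Hypothetical sketch of image-based colorimetric quantification.
import numpy as np

def absorbance(sample_intensity, blank_intensity):
    return -np.log10(sample_intensity / blank_intensity)

# Calibration standards (ppm) and their measured absorbances (example numbers only).
cal_conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
cal_abs = np.array([0.01, 0.10, 0.21, 0.40, 0.83])
slope, intercept = np.polyfit(cal_abs, cal_conc, 1)     # linear fit: concentration vs. absorbance

A = absorbance(sample_intensity=151.0, blank_intensity=230.0)
print("estimated phosphate concentration (ppm):", slope * A + intercept)
```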

Keywords: Infrared lightbox, paper-based device, phosphate detection, smartphone colorimetric analyzer.

102 Q-Map: Clinical Concept Mining from Clinical Documents

Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala

Abstract:

Over the past decade, there has been a steep rise in data-driven analysis in major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, and image analytics. Much of the data in the field is well structured and available in numerical or categorical formats that can be used for experiments directly. At the opposite end of the spectrum, however, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature: discharge summaries, clinical notes and procedural notes are written in human narrative form and have neither a relational model nor any standard grammatical structure. An important step in utilizing these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data using information retrieval and data mining techniques. To address this problem, the authors present Q-Map, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string matching algorithm indexed on curated knowledge sources, which is both fast and configurable. The authors also briefly examine its comparative performance with MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.
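
The sketch below illustrates the general idea of dictionary-based concept matching against a curated lexicon. It is a deliberately simplified toy, not the Q-Map implementation; the tiny lexicon, the concept identifiers and the sample note are hypothetical placeholders.

```python
# Simplified sketch of dictionary-based concept extraction from clinical text.
import re

LEXICON = {                       # surface form -> (example concept id, semantic type)
    "myocardial infarction": ("C0027051", "Disease"),
    "aspirin": ("C0004057", "Drug"),
    "shortness of breath": ("C0013404", "Sign/Symptom"),
}

def extract_concepts(note):
    text = note.lower()
    hits = []
    for term, (cui, sem_type) in LEXICON.items():
        for m in re.finditer(r"\b" + re.escape(term) + r"\b", text):
            hits.append({"term": term, "cui": cui, "type": sem_type, "span": m.span()})
    return hits

note = "Patient with shortness of breath; history of myocardial infarction, on aspirin."
print(extract_concepts(note))
```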

Keywords: Information retrieval (IR), unified medical language system (UMLS), Syntax Based Analysis, natural language processing (NLP), medical informatics.

101 Comparative Study Using Weka for Red Blood Cells Classification

Authors: Jameela Ali Alkrimi, Hamid A. Jalab, Loay E. George, Abdul Rahim Ahmad, Azizah Suliman, Karim Al-Jashamy

Abstract:

Red blood cells (RBCs) are the most common type of blood cell and are the most intensively studied in cell biology. A deficiency of RBCs, in which the hemoglobin level is lower than normal, is referred to as "anemia". Abnormalities in RBCs affect the exchange of oxygen. This paper presents a comparative study of various techniques for classifying RBCs as normal or abnormal (anemic) using WEKA, an open-source collection of machine learning algorithms for data mining applications. The algorithms tested are the radial basis function neural network, the support vector machine, and the k-nearest neighbors algorithm. Two sets of combined features were utilized for the classification of blood cell images. The first set, consisting exclusively of geometrical features, was used to identify whether the tested blood cell has a spherical or non-spherical shape, while the second set, consisting mainly of textural features, was used to recognize the types of the spherical cells. We provide an evaluation based on applying these classification methods to our RBC image dataset, obtained from Serdang Hospital, Malaysia, and measuring the accuracy of the test results. The best achieved classification rates are 97%, 98%, and 79% for the support vector machine, the radial basis function neural network, and the k-nearest neighbors algorithm, respectively.
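
For readers who prefer a scripted workflow, the sketch below reproduces the comparison setup with scikit-learn instead of WEKA. The feature matrix and labels are random placeholders standing in for the extracted geometrical/textural features, and an RBF-kernel SVM stands in for the RBF neural network, which scikit-learn does not provide directly.

```python
# Illustrative comparative classification sketch (not the WEKA experiment itself).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X = np.random.rand(200, 12)          # placeholder geometric + textural features
y = np.random.randint(0, 2, 200)     # placeholder labels: normal vs. abnormal

classifiers = {
    "k-NN (k=3)": KNeighborsClassifier(n_neighbors=3),
    "SVM (RBF kernel)": SVC(kernel="rbf"),   # stand-in for the RBF neural network
}
for name, clf in classifiers.items():
    acc = cross_val_score(make_pipeline(StandardScaler(), clf), X, y, cv=5).mean()
    print(f"{name}: mean accuracy = {acc:.2f}")
```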

Keywords: K-Nearest Neighbors, Neural Network, Radial Basis Function, Red blood cells, Support vector machine.

100 Automatic Staging and Subtype Determination for Non-Small Cell Lung Carcinoma Using PET Image Texture Analysis

Authors: Seyhan Karaçavuş, Bülent Yılmaz, Ömer Kayaaltı, Semra İçer, Arzu Taşdemir, Oğuzhan Ayyıldız, Kübra Eset, Eser Kaya

Abstract:

In this study, our goal was to perform tumor staging and subtype determination automatically using different texture analysis approaches for a very common cancer type, i.e., non-small cell lung carcinoma (NSCLC). In particular, we introduced a texture analysis approach, Laws' texture filters, to be used in this context for the first time. The 18F-FDG PET images of 42 patients with NSCLC were evaluated. The number of patients for each tumor stage, i.e., I-II, III or IV, was 14. Approximately 45% of the patients had adenocarcinoma (ADC) and approximately 55% had squamous cell carcinoma (SqCC). The MATLAB technical computing language was employed in the extraction of 51 features using first order statistics (FOS), the gray-level co-occurrence matrix (GLCM), the gray-level run-length matrix (GLRLM), and Laws' texture filters. The feature selection method employed was sequential forward selection (SFS). Selected textural features were used in automatic classification by k-nearest neighbors (k-NN) and support vector machines (SVM). In the automatic classification of tumor stage, the accuracy was approximately 59.5% with the k-NN classifier (k=3) and 69% with SVM (one-versus-one paradigm), using 5 features. In the automatic classification of tumor subtype, the accuracy was around 92.7% with the one-versus-one SVM. Texture analysis of FDG-PET images might be used, in addition to metabolic parameters, as an objective tool to assess tumor histopathological characteristics and in the automatic classification of tumor stage and subtype.
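
As one concrete example of the texture descriptors named above, the sketch below computes a few GLCM features for a tumor region of interest. It assumes scikit-image (version 0.19 or later for these function names) and uses a random placeholder ROI; it does not reproduce the paper's MATLAB pipeline or the Laws' filter bank.

```python
# Illustrative GLCM texture-feature extraction for a quantized ROI.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

roi = np.random.randint(0, 64, (32, 32), dtype=np.uint8)   # placeholder tumor ROI (64 gray levels)

glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2], levels=64,
                    symmetric=True, normed=True)
features = {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
print(features)
```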

Keywords: Cancer stage, cancer cell type, non-small cell lung carcinoma, PET, texture analysis.

99 Thresholding Approach for Automatic Detection of Pseudomonas aeruginosa Biofilms from Fluorescence in situ Hybridization Images

Authors: Zonglin Yang, Tatsuya Akiyama, Kerry S. Williamson, Michael J. Franklin, Thiruvarangan Ramaraj

Abstract:

Pseudomonas aeruginosa is an opportunistic pathogen that forms surface-associated microbial communities (biofilms) on artificial implant devices and on human tissue. Biofilm infections are difficult to treat with antibiotics, in part because the bacteria in biofilms are physiologically heterogeneous. One measure of biological heterogeneity in a population of cells is to quantify the cellular concentrations of ribosomes, which can be probed with fluorescently labeled nucleic acids. The fluorescent signal intensity following fluorescence in situ hybridization (FISH) analysis correlates with the cellular level of ribosomes. The goals here are to provide computationally and statistically robust approaches to automatically quantify cellular heterogeneity in biofilms from a large library of epifluorescent microscopy FISH images. In this work, the initial steps toward these goals were taken by developing an automated biofilm detection approach for use with FISH images. The approach allows rapid identification of biofilm regions from FISH images that are counterstained with fluorescent dyes. This methodology provides advances over other computational methods by allowing subtraction of spurious signals and non-biological fluorescent substrata. The method is a robust and user-friendly approach that enables users to semi-automatically detect biofilm boundaries and extract intensity values from fluorescent images for quantitative analysis of biofilm heterogeneity.
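
A minimal thresholding-based detection step of the kind described above might look like the sketch below, which assumes scikit-image is available and uses a random placeholder image. The authors' full pipeline additionally handles spurious signals and non-biological substrata, which this toy does not.

```python
# Minimal sketch: automatic thresholding and masking of a FISH image.
import numpy as np
from skimage import filters, morphology

fish_image = np.random.rand(256, 256)            # placeholder counterstained FISH image

threshold = filters.threshold_otsu(fish_image)   # global Otsu threshold
mask = fish_image > threshold                    # candidate biofilm pixels
mask = morphology.remove_small_objects(mask, min_size=64)   # drop small spurious blobs

mean_intensity = fish_image[mask].mean()         # intensity statistics within the biofilm mask
print("biofilm area (px):", mask.sum(), "mean signal:", round(float(mean_intensity), 3))
```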

Keywords: Image informatics, Pseudomonas aeruginosa, biofilm, FISH, computer vision, data visualization.

98 Gate Tunnel Current Calculation for NMOSFET Based on Deep Sub-Micron Effects

Authors: Ashwani K. Rana, Narottam Chand, Vinod Kapoor

Abstract:

Aggressive scaling of MOS devices requires the use of ultra-thin gate oxides to maintain a reasonable short channel effect and to take advantage of higher density, higher speed, lower cost, etc. Such thin oxides give rise to high electric fields, resulting in considerable gate tunneling current through the gate oxide in the nano regime. Consequently, accurate analysis of gate tunneling current is very important, especially in the context of low power applications. In this paper, a simple and efficient analytical model has been developed for the channel and source/drain overlap region gate tunneling current through ultra-thin gate oxide n-channel MOSFETs with the inevitable deep sub-micron effect (DSME). The results obtained have been verified against simulated and reported experimental results for the purpose of validation. It is shown that the calculated tunnel current fits the measured one well over the entire oxide thickness range. The proposed model is simple enough to be used in circuit simulators. It is observed that neglecting the deep sub-micron effect may lead to large errors in the calculated gate tunneling current. It is found that temperature has an almost negligible effect on gate tunneling current. It is also reported that gate tunneling current reduces with increasing gate oxide thickness. The impact of source/drain overlap length on gate tunneling current is also assessed.

Keywords: Gate tunneling current, analytical model, gate dielectrics, non uniform poly gate doping, MOSFET, fringing field effect and image charges.

97 Time Series Forecasting Using Various Deep Learning Models

Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan

Abstract:

Time Series Forecasting (TSF) is used to predict the target variables at a future time point based on learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the performance of the recent attention-based Transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (Recurrent Neural Network (RNN), Long Short-Term Memory (LSTM), Gated Recurrent Units (GRU), and Transformer) along with a baseline method. The (hourly) dataset we used is the Beijing Air Quality Dataset from the University of California, Irvine (UCI) repository, which includes a multivariate time series of many factors measured on an hourly basis for a period of 5 years (2010-14). For each model, we also report on the relationship between the performance and the look-back window size and the number of predicted time points into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131) for most of our single-step and multi-step predictions. The best look-back window size for predicting 1 hour into the future appears to be one day, while 2 or 4 days perform best for predicting 3 hours into the future.
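
The look-back window construction that all of the compared models share can be sketched as below. The series, window sizes and baseline are placeholders; the real experiments use the multivariate Beijing dataset and the deep learning models named above.

```python
# Sketch of sliding-window dataset construction plus a persistence baseline.
import numpy as np

def make_windows(series, look_back, horizon):
    """Turn a 1-D series into (X, y) pairs: X holds `look_back` past values,
    y holds the value `horizon` steps ahead of the window."""
    X, y = [], []
    for t in range(len(series) - look_back - horizon + 1):
        X.append(series[t:t + look_back])
        y.append(series[t + look_back + horizon - 1])
    return np.array(X), np.array(y)

pm25 = np.random.rand(1000)                           # placeholder hourly pollutant series
X, y = make_windows(pm25, look_back=24, horizon=1)    # one day back, predict 1 hour ahead
naive = X[:, -1]                                      # persistence baseline: repeat the last value
print("baseline MAE:", np.mean(np.abs(naive - y)))
```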

Keywords: Air quality prediction, deep learning algorithms, time series forecasting, look-back window.

96 Lighting Consumption Analysis in Retail Industry: Comparative Study

Authors: Elena C. Tamaş, Grațiela M. Țârlea, Gianni Flamaropol, Dragoș Hera

Abstract:

This article presents a comparative study of the electrical energy consumption for lighting in various types of large commercial buildings built in Romania after 2007, with 3, 4 or 5 versus 8, 9 or 10 operational years. Some buildings have building management systems (BMS) installed to monitor lighting performance from the opening day to the present, while others have implemented only local meters. First, for each analyzed building, the total required power and the energy consumption for lighting were calculated based on the number of lamps, the unit power and the average daily running hours. All objects and installations were grouped by the destination/location of the lighting (exterior parking or access, interior or covered parking, building interior and building perimeter). Second, mechanical counters were installed on all lighting objects and installations, and digital meters were additionally installed on those linked to the BMS for better monitoring. Some efficient solutions are proposed to reduce the power consumption, for example operating only one third of the lighting in the covered and exterior parking areas of those buildings where this is feasible. This lighting share can be applied on each level, especially during night shifts. Another example is to use dimmers to reduce the light level, depending on the work performed in the respective area, by which a 30% energy saving can be achieved. Using the right BMS to monitor energy consumption as a function of the average daily operating hours, and replacing non-performing luminaires with LED or other economical units, might significantly increase energy performance and reduce the energy consumption of the buildings.
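
The consumption calculation described above (lamps x unit power x running hours) is simple enough to sketch directly; the fixture counts, powers and hours below are placeholders, not data from the surveyed buildings.

```python
# Simple sketch of the lighting-consumption calculation (illustrative numbers).
fixtures = [
    # (area, number of lamps, unit power in W, average daily running hours)
    ("exterior parking", 120, 150, 12),
    ("covered parking",  300,  58, 24),
    ("sales area",       800,  36, 14),
]

total_kwh_per_day = 0.0
for name, lamps, unit_power_w, hours in fixtures:
    kwh = lamps * unit_power_w * hours / 1000.0
    total_kwh_per_day += kwh
    print(f"{name}: {kwh:.1f} kWh/day")

print(f"total: {total_kwh_per_day:.1f} kWh/day")
# Savings measures (running 1/3 of the parking fixtures on night shifts, or dimming
# by 30%) can be evaluated by scaling the corresponding terms.
```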

Keywords: Lighting consumption, commercial buildings, maintenance, energy performances.

95 Applying Kinect on the Development of a Customized 3D Mannequin

Authors: Shih-Wen Hsiao, Rong-Qi Chen

Abstract:

In the field of fashion design, the 3D mannequin is an assisting tool that can rapidly realize design concepts. When the concept of the 3D mannequin is applied to computer-aided fashion design, it connects with the development and application of design platforms and systems. It is therefore critical to develop a 3D mannequin module that corresponds to the needs of fashion design. This research proposes a concrete plan for developing and constructing a 3D mannequin system with Kinect. In this system, ergonomic measurements of an individual human figure can be attained in real time through the depth camera of the Kinect, and mesh morphing can then be implemented by transforming the locations of the control points on the model according to those ergonomic data, yielding an exclusive 3D mannequin model. In the proposed methodology, after the scanned points from the Kinect are corrected and smoothed, a complete human figure is reconstructed by the ICP algorithm together with image processing methods. The individual human figure can also be recognized and analyzed to obtain real measurements. Furthermore, the ergonomic measurements can be applied to shape morphing of the 3D mannequin, which is divided by feature curves. Since a standardized and customer-oriented 3D mannequin is generated through subdivision, the research can be applied to fashion design or to the presentation and display of 3D virtual clothes. In order to examine the practicality of the proposed structure, a 3D mannequin system was constructed with a Java program in this study, and its practicability was verified through repeated experiments.
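
The registration step mentioned above, iterative closest point (ICP), can be sketched in a few lines: alternate between nearest-neighbor correspondences and a least-squares rigid transform. The sketch below uses NumPy/SciPy with a toy point cloud and is not the authors' Java implementation.

```python
# Minimal ICP sketch with rigid (rotation + translation) alignment.
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    c_src, c_dst = src.mean(0), dst.mean(0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:            # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, c_dst - R @ c_src

def icp(source, target, iterations=30):
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iterations):
        _, idx = tree.query(src)                    # closest-point correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t                         # apply the incremental transform
    return src

scan = np.random.rand(500, 3)                            # placeholder Kinect point cloud
model = scan + np.array([0.05, 0.0, 0.02])               # shifted copy as a toy target
aligned = icp(scan, model)
print("mean residual:", np.linalg.norm(aligned - model, axis=1).mean())
```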

Keywords: 3D mannequin, Kinect scanner, iterative closest point, shape morphing, subdivision.

94 Response Time Behavior Trends of Proportional, Proportional Integral and Proportional Integral Derivative Modes on Lab Scale

Authors: Syed Zohaib Javaid Zaidi, W. Iqbal

Abstract:

Industrial automation depends on pneumatic control systems. Industrial units are now controlled with digital control systems to handle process variables such as temperature, pressure, flow rate and composition.

This research work evaluates the response time fluctuations of the proportional, proportional integral and proportional integral derivative modes of automated chemical process control. The controller output is measured for different values of gain with respect to time in the three modes (P, PI and PID). In the P mode, the controller output shows negligible change for different values of gain. When the controller output of the PI mode is examined at constant gain, decreasing the integral time causes the controller output to fluctuate more. The PID mode results are more interesting in that, when the rate minute is changed, the controller output also fluctuates with respect to time. The controller output in the integral and derivative modes shows smaller steady-state error, minimum offset and longer response time in controlling the process variable. In the P mode, the only tuning parameter is the steady-state gain, which gives larger errors in the controller output. The integral mode shows controller outputs with intermediate responses as the integral gain (ki) varies. Increasing the rate minute increases the derivative gain (kd), which produces controlled oscillations and smaller overshoot in the PID mode.
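
The relationship between the three modes can be sketched with a small discrete-time controller: the P, PI and PID modes differ only in which gain terms are non-zero. The gains, time step and setpoint below are arbitrary example values, not the tuning used in the experiments.

```python
# Illustrative discrete PID controller showing how the three modes build on each other.
def make_pid(kp, ki=0.0, kd=0.0, dt=0.1):
    state = {"integral": 0.0, "prev_error": 0.0}
    def controller(setpoint, measurement):
        error = setpoint - measurement
        state["integral"] += error * dt
        derivative = (error - state["prev_error"]) / dt
        state["prev_error"] = error
        return kp * error + ki * state["integral"] + kd * derivative
    return controller

p_only = make_pid(kp=2.0)                       # P mode
pi     = make_pid(kp=2.0, ki=0.5)               # PI mode: integral action removes offset
pid    = make_pid(kp=2.0, ki=0.5, kd=0.1)       # PID mode: derivative action damps oscillations

for ctrl, name in ((p_only, "P"), (pi, "PI"), (pid, "PID")):
    print(name, round(ctrl(setpoint=50.0, measurement=47.0), 3))
```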

Keywords: Controller output, P, PI and PID modes, steady-state gain.

93 Self-Healing Phenomenon Evaluation in Cementitious Matrix with Different Water/Cement Ratios and Crack Opening Age

Authors: V. G. Cappellesso, D. M. G. da Silva, J. A. Arndt, N. dos Santos Petry, A. B. Masuero, D. C. C. Dal Molin

Abstract:

Concrete elements are subject to cracking, which can provide an access point for deleterious agents that trigger pathological manifestations and reduce the service life of these structures. Finding ways to minimize or eliminate the effects of these aggressive agents' penetration, such as sealing the cracks, is one way of contributing to the durability of these structures. The cementitious self-healing phenomenon can be classified into two different processes. Autogenous self-healing can be defined as a natural process in which the sealing of the cracks occurs without the stimulation of external agents, meaning without different materials being added to the mixture, whereas the autonomous self-healing phenomenon depends on the insertion of a specific engineered material added to the cement matrix in order to promote its recovery. This work aims to evaluate the autogenous self-healing of concretes produced with different water/cement ratios and exposed to wet/dry cycles, considering two crack opening ages, 3 days and 28 days. The self-healing phenomenon was evaluated using two techniques: crack healing measurement using ultrasonic waves and image analysis performed with an optical microscope. Both methods allowed the self-healing of the cracks to be observed. For young crack opening ages and lower water/cement ratios, the self-healing capacity is higher than at more advanced crack opening ages and higher water/cement ratios. Regardless of the crack opening age, these concretes were found to stabilize the self-healing processes after 80 to 90 days.

Keywords: Self-healing, autogenous, water/cement ratio, curing cycles, test methods.

92 An Educational Application of Online Games for Learning Difficulties

Authors: M. Margoudi, Z. Smyrnaiou

Abstract:

The current paper presents the results of a case study. During the past few years the number of children diagnosed with learning difficulties has increased drastically, especially cases of ADHD (Attention Deficit Hyperactivity Disorder). One of the core characteristics of ADHD is a deficit in working memory functions. The review of the literature indicates a plethora of educational software that aims at training and enhancing working memory. Nevertheless, the current paper explores the possibility of using free, online games for the same purpose. Another issue of interest is the potential effect of working memory training on the core symptoms of ADHD. In order to explore the abovementioned research questions, three digital tests are employed, all of which were developed on the E-slate platform by the author, in order to check the levels of ADHD's symptoms and to be used as diagnostic tools, both at the beginning and at the end of the case study. The tools used during the main intervention of the research are free online games for the training of working memory. The research and the data analysis focus on the following axes: a) the presence of, and possible change in, two of the core symptoms of ADHD, attention and impulsivity, and b) a possible change in the general cognitive abilities of the individual. The case study was conducted with the participation of a thirteen-year-old female student diagnosed with ADHD, during after-school hours. The results of the study indicate positive changes both in the level of attention and in impulsivity. Therefore, we conclude that the training of working memory through the use of free, online games has a positive impact on the characteristics of ADHD. Finally, concerning the second research question, the change in general cognitive abilities, no significant changes were noted.

Keywords: ADHD, attention, impulsivity, online games.

91 Constructing Masculinity through Images: Content Analysis of Lifestyle Magazines in Croatia

Authors: Marija Lončar, Zorana Šuljug Vučica, Magdalena Nigoević

Abstract:

Diverse social, cultural and economic trends and changes in contemporary societies influence the ways masculinity is represented in a variety of media. Masculinity is constructed within media images as a dynamic process that changes slowly over time and is shaped by various social factors. In many societies, dominant masculinity is still associated with authority, heterosexuality, marriage, professional and financial success, ethnic dominance and physical strength. But contemporary media depict men in ways that suggest a change in the approach to media images. The number of media images of men that promote men's identity through their body has increased. With the male body more scrutinized and commodified, it is necessary to highlight how the body is represented and which visual elements are crucial, since the body has an important role in the construction of masculinities. The study includes a content analysis of male body images in the advertisements of different men's and women's lifestyle magazines available in Croatia. The main aim was to explore how masculinities are currently being portrayed through the body with regard to age, physical appearance, fashion, touch and gaze. The findings are also discussed in relation to female images, since women are central in many of the processes constructing masculinities, according to recent conceptualizations of masculinity. Although the construction of male images varies across body features, almost all of them convey the message that men's identity can be managed through manipulation and by enhancing appearance. Furthermore, they suggest that men should engage in "bodywork" through advertised products, activities and/or practices in order to achieve their preferred social image.

Keywords: Body images, content analysis, lifestyle magazines, masculinity.

90 Survey of Communication Technologies for IoT Deployments in Developing Regions

Authors: Namugenyi Ephrance Eunice, Julianne Sansa Otim, Marco Zennaro, Stephen D. Wolthusen

Abstract:

The Internet of Things (IoT) is a network of connected data processing devices, mechanical and digital machinery, items, animals, or people that may send data across a network without requiring human-to-human or human-to-computer interaction. Each component has sensors that can pick up on specific phenomena, as well as processing software and other technologies that can link to and communicate with other systems and/or devices over the Internet or other communication networks and exchange data with them. IoT is increasingly being used in fields other than consumer electronics, such as public safety, emergency response, industrial automation, autonomous vehicles, the Internet of Medical Things (IoMT), and general environmental monitoring. Consumer-based IoT applications, like smart home gadgets and wearables, are also becoming more prevalent. This paper presents the main IoT deployment areas for environmental monitoring in developing regions and the backhaul options suitable for them, based on a number of related works. The study includes an overview of existing IoT deployments and the underlying communication architectures, protocols, and technologies that support them. This overview shows that Low Power Wide Area Networks (LPWANs) are very well suited for environmental monitoring architectures designed for remote locations. LoRa technology, particularly the LoRaWAN protocol, has an advantage over other technologies due to its low power consumption, adaptability, and suitable communication range. The current challenges of the various architectures are discussed in detail, with the major issue identified as obstruction of communication paths by buildings, trees, hills, etc.

Keywords: Communication technologies, environmental monitoring, Internet of Things, IoT, IoT deployment challenges.

89 Stress Relaxation of Date at Different Temperature and Moisture Content of Product: A New Approach

Authors: D. Zare, M. Alirezaei, S.M. Nassiri

Abstract:

Iran is one of the greatest producers of dates in the world. However, due to the lack of information about their viscoelastic properties, much of the production is downgraded during harvesting and post-harvesting processes. In this study, the effects of temperature and product moisture content on stress relaxation characteristics were investigated. Freshly harvested dates (kabkab) at the tamar stage were placed in a controlled environment chamber to obtain different temperature levels (25, 35, 45, and 55 °C) and moisture contents (8.5, 8.7, 9.2, 15.3, 20, 32.2 % d.b.). A TAXT2 texture analyzer (Stable Microsystems, UK) was used to apply uniaxial compression tests. A chamber capable of controlling temperature was designed and fabricated around the plunger of the texture analyzer to control the temperature during the experiments. As a new approach, a CCD camera (A4tech, 30 fps) was mounted on a cylindrical glass probe to scan and record the contact area between the date and the disk. Afterwards, the pictures were analyzed using the image processing toolbox of the Matlab software. Individual date fruits were uniaxially compressed at a speed of 1 mm/s. A constant strain of 30% of the thickness of the date was applied to the horizontally oriented fruit. To select a suitable model for describing the stress relaxation of dates, the experimental data were fitted with three well-known stress relaxation models: the generalized Maxwell, Nussinovitch, and Peleg models. The constants in the mentioned models were determined and correlated with the temperature and moisture content of the product using non-linear regression analysis. It was found that the generalized Maxwell and Nussinovitch models describe the viscoelastic characteristics of date fruits more appropriately than the Peleg model.
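
Fitting a generalized Maxwell (Prony-series) model to a relaxation curve is a standard non-linear regression task, sketched below with SciPy on synthetic data. The model form and parameter values are illustrative and do not reproduce the study's fitted constants.

```python
# Sketch of fitting a two-term generalized Maxwell model to stress-relaxation data.
import numpy as np
from scipy.optimize import curve_fit

def maxwell(t, e_inf, e1, tau1, e2, tau2):
    """Equilibrium term plus two exponentially decaying Maxwell elements."""
    return e_inf + e1 * np.exp(-t / tau1) + e2 * np.exp(-t / tau2)

t = np.linspace(0, 60, 200)                                   # relaxation time, s
stress = maxwell(t, 2.0, 1.5, 3.0, 1.0, 20.0) + np.random.normal(0, 0.02, t.size)

params, _ = curve_fit(maxwell, t, stress, p0=[1.0, 1.0, 1.0, 1.0, 10.0])
print("fitted parameters:", np.round(params, 3))
```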

Keywords: Stress relaxation, Viscoelastic properties, Date, Texture analyzer.

88 A Set Theory Based Factoring Technique and Its Use for Low Power Logic Design

Authors: Padmanabhan Balasubramanian, Ryuta Arisaka

Abstract:

Factoring Boolean functions is one of the basic operations in algorithmic logic synthesis. A novel algebraic factorization heuristic for single-output combinational logic functions is presented in this paper, developed based on the set theory paradigm. The impact of factoring is analyzed mainly from a low power design perspective for standard cell based digital designs. The physical implementations of a number of MCNC/IWLS combinational benchmark functions and sub-functions are compared before and after factoring, based on a simple technology mapping procedure utilizing only standard gate primitives (readily available as standard cells in a technology library) and not cells corresponding to optimized complex logic. The power results were obtained at the gate level by means of an industry-standard power analysis tool from Synopsys, targeting a 130 nm (0.13 μm) UMC CMOS library, for the typical case. The wire loads were inserted automatically and the simulations were performed with maximum input activity. The gate-level simulations demonstrate the advantage of the proposed factoring technique in comparison with other existing methods from a low power perspective, for arbitrary examples. Though the benchmark experiments report mixed results, the mean savings in total power and dynamic power for the factored solution over a non-factored solution were 6.11% and 5.85%, respectively. In terms of leakage power, the average savings for the factored forms were significant, to the tune of 23.48%. The factored solution is expected to better its non-factored counterpart in terms of the power-delay product, as it is well known that factoring, in general, yields a delay-efficient multi-level solution.
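
To give a flavor of what algebraic factoring does, the toy sketch below extracts the most frequent literal from a sum-of-products expression represented as Python sets. It is far simpler than the set-theoretic heuristic of the paper and is intended only as an illustration of the operation itself.

```python
# Toy illustration of literal-based algebraic factoring on a sum-of-products form.
from collections import Counter

def factor_once(cubes):
    """cubes: list of sets of literals, e.g. [{'a','b'}, {'a','c'}] for ab + ac."""
    counts = Counter(lit for cube in cubes for lit in cube)
    literal, freq = counts.most_common(1)[0]
    if freq < 2:
        return None                                   # nothing worth factoring
    inside = [cube - {literal} for cube in cubes if literal in cube]
    outside = [cube for cube in cubes if literal not in cube]
    return literal, inside, outside

# F = ab + ac + bd  ->  a(b + c) + bd
lit, inside, outside = factor_once([{"a", "b"}, {"a", "c"}, {"b", "d"}])
print(f"{lit}({' + '.join(''.join(sorted(c)) for c in inside)})"
      + (" + " + " + ".join("".join(sorted(c)) for c in outside) if outside else ""))
```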

Keywords: Factorization, Set theory, Logic function, Standard cell based design, Low power.

87 The Discovery and Application of Perspective Representation in Modern Italy

Authors: Matthias Stange

Abstract:

In the early modern period, a different image of man began to prevail in Europe. The focus was on the self-determined human being and his abilities. At first, these developments could be seen in Italian painting and architecture, which again oriented themselves to the concepts and forms of antiquity, for example through the discovery of perspective representation by Brunelleschi or, later, orthogonal projection by Alberti, after the ancient knowledge of optics had been forgotten in the Middle Ages. The understanding of reality in the Middle Ages was not focused on the sensually perceptible world, but was determined by ecclesiastical dogmas. The empirical part of this study examines the rediscovery and development of perspective. With the paradigm of antiquity, the figure of the architect was also recognised again: the cultured man trained theoretically and practically in numerous subjects, as Vitruvius describes him. In this context, the role of the architect, the influence on the painting of the Quattrocento, and the influence on architectural representation in the Baroque period are examined. Baroque is commonly associated with the idea of illusionistic appearance as opposed to the tangible reality presented in the Renaissance. The study has shown that the central perspective projection developed by Filippo Brunelleschi enabled a new understanding of seeing and the dissemination of painted images. Brunelleschi's development made it possible to understand the sight of nature as a reflection of what is presented to the viewer's eye. Alberti later shortened Brunelleschi's central perspective construction for practical use in painting. In early modern Italian architecture and painting, these developments apparently supported each other. The pictorial representation of architecture initially served the development of an art form before it became established in building practice itself.

Keywords: Alberti, Brunelleschi, Central perspective projection, Orthogonal projection, Quattrocento, Baroque.

86 Preparation of Polymer-Stabilized Magnetic Iron Oxide as Selective Drug Nanocarriers to Human Acute Myeloid Leukemia

Authors: Kheireddine El-Boubbou

Abstract:

Drug delivery to target human acute myeloid leukemia (AML) using a nanoparticulate chemotherapeutic formulation that can deliver drugs selectively to AML cancer is hugely needed. In this work, we report the development of a nanoformulation made of polymeric-stabilized multifunctional magnetic iron oxide nanoparticles (PMNP) loaded with the anticancer drug Doxorubicin (Dox) as a promising drug carrier to treat AML. Dox@PMNP conjugates simultaneously exhibited high drug content, maximized fluorescence, and excellent release properties. Nanoparticulate uptake and cell death following addition of Dox@PMNPs were then evaluated in different types of human AML target cells, as well as on normal human cells. While the unloaded MNPs were not toxic to any of the cells, Dox@PMNPs were found to be highly toxic to the different AML cell lines, albeit at different inhibitory concentrations (IC50 values), but showed very little toxicity towards the normal cells. In comparison, free Dox showed significant potency concurrently to all the cell lines, suggesting huge potentials for the use of Dox@PMNPs as selective AML anticancer cargos. Live confocal imaging, fluorescence and electron microscopy confirmed that Dox is indeed delivered to the nucleus in relatively short periods of time, causing apoptotic cell death. Importantly, this targeted payload may potentially enhance the effectiveness of the drug in AML patients and may further allow physicians to image leukemic cells exposed to Dox@PMNPs using MRI.

Keywords: Magnetic nanoparticles, drug delivery, acute myeloid leukemia, iron oxide, cancer nanotherapy.

85 LCA and Multi-Criteria Analysis of Fly Ash Concrete Pavements

Authors: M. Ondova, A. Estokova

Abstract:

Rapid industrialization results in the increased use of natural resources and brings serious ecological and environmental imbalance due to the dumping of industrial wastes. Principles of sustainable construction have to be accepted with regard to the consumption of natural resources and the production of harmful emissions. Cement is a raw material of great importance in the building industry, and today large amounts of it are used in the construction of concrete pavements. Given the cost of raw materials and the CO2 emissions of cement production, replacing cement in concrete mixtures with more sustainable materials is necessary. To reduce this environmental impact, people all over the world are looking for a solution. Over the last ten years, the image of fly ash has completely changed from a polluting waste to a resource material, and it can solve major problems of cement use. Fly ash concretes are proposed as a potential approach for achieving substantial reductions in cement. Fly ash is known to improve the workability of concrete, extend the life cycle of concrete roads, and reduce energy use and greenhouse gases, as well as the amount of coal combustion products that must be disposed of in landfills.

Life cycle assessment also proved that a concrete pavement with fly ash cement replacement is considerably more environmentally friendly compared to standard concrete roads. In addition, fly ash is a cheap raw material, and cost savings are guaranteed. The strength properties and the resistance to frost and de-icing salts, which are important characteristics in the construction of concrete pavements, have also reached the required standards. In terms of human health, it cannot be stated that a concrete cover with fly ash is dangerous compared with a cover without fly ash. The final multi-criteria analysis also indicated that concrete with fly ash is clearly a suitable solution.

Keywords: Life cycle assessment, fly ash, waste, concrete pavements

84 Why Are Entrepreneurs Resistant to E-tools?

Authors: D. Ščeulovs, E. Gaile-Sarkane

Abstract:

Latvia is fourth in the world in terms of broadband internet speed. The total number of internet users in Latvia exceeds 70% of its population. The number of active mailboxes of the local internet e-mail service Inbox.lv accounts for 68% of the population and 97.6% of the total number of internet users. The Latvian portal Draugiem.lv is a social media phenomenon, used by 58.4% of the population and 83.5% of internet users. A majority of Latvian company profiles are available on social networks, the most popular being Twitter.com. These and other parameters prove that consumers and companies are actively using the Internet.

However, after the authors analyzed in a number of studies how enterprises are employing the e-environment, namely e-environment tools, they arrived at conclusions that are not as flattering as the aforementioned statistics. There is an obvious contradiction between the statistical data and the actual studies. As a result, the authors have posed a question: why are entrepreneurs resistant to e-tools? In order to answer this question, the authors have turned to the Technology Acceptance Model (TAM). The authors analyzed each phase and determined several factors affecting the use of the e-environment, reaching the main conclusion that entrepreneurs do not have a sufficient level of e-literacy (digital literacy).

The authors employ well-established quantitative and qualitative methods of research: grouping, analysis, statistical methods, factor analysis in the SPSS 20 environment, etc.

The theoretical and methodological background of the research is formed by scientific research and publications, material from the mass media and professional literature, and statistical information from legal institutions, as well as information collected by the authors during the survey.

Keywords: E-environment, e-environment tools, technology acceptance model, factors.

83 An Efficient Approach for Shear Behavior Definition of Plant Stalk

Authors: M. R. Kamandar, J. Massah

Abstract:

Information on the impact cutting behavior of plant stalks plays an important role in the design and fabrication of plant cutting equipment. It is difficult to establish a theoretical method for defining the cutting properties of plant stalks because the cutting process is complex. Thus, it is necessary to set up an experimental approach to determine cutting parameters for a single stalk. To measure the shear force, shear energy and shear strength of plant stalks, a special impact cutting tester was fabricated. It was similar to an Izod impact tester for metals, but a cutting blade and a data acquisition system were attached to the end of the pendulum's arm. The apparatus included four strain gauges and a digital indicator to show the real-time cutting force of the plant stalk. To measure the shear force and also to test the apparatus, the stalks of two plants, buxus and privet, were selected. The samples (buxus and privet stalks) were cut with the apparatus under an impact cutting process at four loading rates (1, 2, 3 and 4 m.s-1) and at three internodes (fifth, tenth and fifteenth). In the buxus cutting analysis, the minimum value of cutting energy was obtained at the fifth internode and a loading rate of 4 m.s-1, and the maximum value of shear energy was obtained at the fifteenth internode and a loading rate of 1 m.s-1. In the privet cutting analysis, the minimum value of shear consumption energy was obtained at the fifth internode and a loading rate of 4 m.s-1, and the maximum value of shear energy was obtained at the fifteenth internode and a loading rate of 1 m.s-1. The statistical analysis of both plants showed that increasing the impact cutting speed decreases the shear consumption energy and shear strength. In both scenarios, the results showed that shear force decreases as cutting speed increases.

Keywords: Buxus, privet, impact cutting, shear energy.

82 Effect of Exit Annular Area on the Flow Field Characteristics of an Unconfined Premixed Annular Swirl Burner

Authors: Vishnu Raj, Chockalingam Prathap

Abstract:

The objective of this study was to explore the impact of variation in the exit annular area on the local flow field features and the flame stability of an unconfined annular premixed swirl burner operated with a premixed n-butane air mixture at an equivalence ratio (Φ) = 1, 1 bar, and 300 K. A swirl burner with an axial swirl generator having a swirl number of 1.5 was used. Three different burner heads were chosen so that the exit area increased from 100% to 160% and 220%, with inner and outer diameters and cross-sectional areas of (1) 10 mm and 15 mm, 98 mm2, (2) 17.5 mm and 22.5 mm, 157 mm2, and (3) 25 mm and 30 mm, 216 mm2. The bulk velocity and the Reynolds number based on the hydraulic diameter and unburned gas properties were kept constant at 12 m/s and 4000. (i) Planar Particle Image Velocimetry (PIV) with TiO2 seeding particles and (ii) CH* chemiluminescence were used to measure the velocity fields and reaction zones of the swirl flames at 5 Hz, respectively. Velocity fields and jet spreading rates measured under isothermal and reactive conditions revealed that the presence of a flame significantly altered the flow field in the radial direction due to gas expansion. Important observations from the flame measurements were the height and maximum width of the recirculation bubbles normalized by the hydraulic diameter, and the jet spreading angles of the flames, which for the three exit area cases were: (a) 4.52, 1.95, 34◦, (b) 6.78, 2.37, 26◦, and (c) 8.73, 2.32, 22◦. The lean blowout (LBO) was also measured, and the respective equivalence ratios were 0.80, 0.92, and 0.82. The LBO was relatively narrow for the 157 mm2 case. For this case, PIV measurements showed that the turbulent kinetic energy and turbulence intensity were relatively high compared to the other two cases, resulting in higher stretch rates and a narrower LBO.

Keywords: Chemiluminescence, jet spreading rate, lean blow out, swirl flow.

81 Design and Validation of an Aerodynamic Model of the Cessna Citation X Horizontal Stabilizer Using both OpenVSP and Digital Datcom

Authors: Marine Segui, Matthieu Mantilla, Ruxandra Mihaela Botez

Abstract:

This research is part of a major project at the Research Laboratory in Active Controls, Avionics and Aeroservoelasticity (LARCASE) aiming to improve the cruise performance of a Cessna Citation X aircraft through an application of morphing wing technology on its horizontal tail. The horizontal stabilizer of the Cessna Citation X turns around its span axis with an angle between -8 and 2 degrees. Within this range, the horizontal stabilizer certainly generates some unwanted drag. To cancel this drag, the LARCASE proposes to trim the aircraft with a horizontal stabilizer equipped with morphing wing technology. This technology aims to optimize aerodynamic performance by changing the conventional horizontal tail shape during flight. As a consequence, this technology will be able to generate enough lift on the horizontal tail to balance the aircraft without generating unwanted drag. To conduct this project, an accurate aerodynamic model of the horizontal tail is first required. This aerodynamic model will ultimately allow a precise comparison between the results of a conventional horizontal tail and those of a morphed horizontal tail. This paper presents how this aerodynamic model was designed. It shows how the 2D geometry of the horizontal tail was collected and how the unknown airfoil shape of the horizontal tail was recovered. Finally, the complete horizontal tail airfoil shape was found, and a comparison between the aerodynamic polar of the real horizontal tail and that of the horizontal tail found in this paper shows a maximum difference of 0.04 in the lift or drag coefficient, which is very good. Aerodynamic polar data of the aircraft's horizontal tail were obtained from the CAE Inc. level D research aircraft flight simulator of the Cessna Citation X.

Keywords: Aerodynamic, Cessna, Citation X, coefficient, Datcom, drag, lift, longitudinal, model, OpenVSP.

80 Advanced Stochastic Models for Partially Developed Speckle

Authors: Jihad S. Daba (Jean-Pierre Dubois), Philip Jreije

Abstract:

Speckled images arise when coherent microwave, optical, and acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object- or target-induced noise that results when the surface of the object is Rayleigh rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise are complicated by the nature of the noise and are not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series of Laguerre-weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in a closed canonical form. It is observed that as the mean number of scatterers in a resolution cell is increased, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise, as demonstrated by the Central Limit theorem.
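
The limiting behavior mentioned in the last sentences can be illustrated with a quick random-phasor simulation: the intensity in each resolution cell is the squared magnitude of a coherent sum of unit phasors whose count follows a Poisson law. This sketch only demonstrates the exponential limit numerically; it is not the closed-form derivation of the paper.

```python
# Simulation sketch: partially vs. fully developed speckle via random phasor sums.
import numpy as np

rng = np.random.default_rng(0)

def speckle_intensity(mean_scatterers, cells=20_000):
    n = rng.poisson(mean_scatterers, cells)                 # scatterer count per resolution cell
    intensities = np.empty(cells)
    for i, k in enumerate(n):
        phases = rng.uniform(0, 2 * np.pi, k)
        field = np.sum(np.exp(1j * phases))                 # coherent sum of unit phasors
        intensities[i] = np.abs(field) ** 2
    return intensities

for mean_n in (2, 50):
    I = speckle_intensity(mean_n)
    # For fully developed speckle the intensity is exponentially distributed, so the
    # contrast (std/mean) tends to 1 as the mean number of scatterers grows.
    print(f"mean scatterers = {mean_n}: speckle contrast = {I.std() / I.mean():.2f}")
```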

Keywords: Doubly stochastic filtered process, Poisson point process, segmentation, speckle, ultrasound

79 Autonomous Robots' Visual Perception in Underground Terrains Using Statistical Region Merging

Authors: Omowunmi E. Isafiade, Isaac O. Osunmakinde, Antoine B. Bagula

Abstract:

Robots' visual perception is a field that is gaining increasing attention from researchers. This is partly due to emerging trends in the commercial availability of 3D scanning systems or devices that produce a high level of information accuracy for a variety of applications. In the history of mining, the mortality rate of mine workers has been alarming, and robots exhibit a great deal of potential to tackle safety issues in mines. However, an effective vision system is crucial to safe autonomous navigation in underground terrains. This work investigates robots' perception in underground terrains (mines and tunnels) using the statistical region merging (SRM) model. SRM reconstructs the main structural components of an image by a simple but effective statistical analysis. An investigation is conducted on different regions of the mine, such as the shaft, stope and gallery, using publicly available mine frames together with a stream of locally captured mine images. An investigation is also conducted on a stream of underground tunnel image frames, using the XBOX Kinect 3D sensors. The Kinect sensors produce streams of red, green and blue (RGB) and depth images of 640 x 480 resolution at 30 frames per second. Integrating the depth information with drivability gives a strong cue to the analysis, which detects 3D results augmenting drivable and non-drivable regions in 2D. The results of the 2D and 3D experiments with different terrains, mines and tunnels, together with the qualitative and quantitative evaluation, reveal that a good drivable region can be detected in dynamic underground terrains.

Keywords: Drivable Region Detection, Kinect Sensor, Robots' Perception, SRM, Underground Terrains.

78 Training Undergraduate Engineering Students in Robotics and Automation through Model-Based Design Training: A Case Study at Assumption University of Thailand

Authors: Sajed A. Habib

Abstract:

Problem-based learning (PBL) is a student-centered pedagogy that originated in the medical field and has also been used extensively in other knowledge disciplines with recognized advantages and limitations. PBL has been used in various undergraduate engineering programs with mixed outcomes. The current fourth industrial revolution (digital era or Industry 4.0) has made it essential for many science and engineering students to receive effective training in advanced courses such as industrial automation and robotics. This paper presents a case study at Assumption University of Thailand, where a PBL-like approach was used to teach some aspects of automation and robotics to selected groups of undergraduate engineering students. These students were given some basic level training in automation prior to participating in a subsequent training session in order to solve technical problems with increased complexity. The participating students’ evaluation of the training sessions in terms of learning effectiveness, skills enhancement, and incremental knowledge following the problem-solving session was captured through a follow-up survey consisting of 14 questions and a 5-point scoring system. From the most recent training event, an overall 70% of the respondents indicated that their skill levels were enhanced to a much greater level than they had had before the training, whereas 60.4% of the respondents from the same event indicated that their incremental knowledge following the session was much greater than what they had prior to the training. The instructor-facilitator involved in the training events suggested that this method of learning was more suitable for senior/advanced level students than those at the freshmen level as certain skills to effectively participate in such problem-solving sessions are acquired over a period of time, and not instantly.

Keywords: Automation, industry 4.0, model-based design training, problem-based learning.

77 Selective Encryption using ISMA Cryp in Real Time Video Streaming of H.264/AVC for DVB-H Application

Authors: Jay M. Joshi, Upena D. Dalal

Abstract:

Multimedia information availability has increased dramatically with the advent of video broadcasting on handheld devices. But with this availability come problems of maintaining the security of information that is displayed in public. ISMA Encryption and Authentication (ISMACryp) is one of the chosen technologies for service protection in DVB-H (Digital Video Broadcasting - Handheld), the TV system for portable handheld devices. ISMACryp content is encoded with H.264/AVC (advanced video coding), while all structural data are left as they are. Two modes of ISMACryp are available: the CTR (counter) mode and the CBC (cipher block chaining) mode. Both modes of ISMACryp are based on the 128-bit AES algorithm. AES algorithms are complex and require a long execution time, which is not suitable for real-time applications like live TV. The proposed system aims to gain a deep understanding of video data security in multimedia technologies and to provide security for real-time video applications using selective encryption for H.264/AVC. Five levels of security are proposed in this paper based on the content of the NAL unit in the Baseline Constrained profile of H.264/AVC. The selective encryption at the different levels encrypts the intra-prediction mode, residue data, inter-prediction mode or motion vectors only. Experimental results in this paper show that the fifth level, which is full ISMACryp, provides a higher level of security at the cost of more encryption time, while the first level provides a lower level of security by encrypting only motion vectors with a lower execution time, without compromising compression or the quality of the visual content. The encryption scheme integrates with the compression process at low cost and keeps the file format unchanged, with some direct operations supported. Simulations were carried out in Matlab.
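
The basic mechanism behind the lighter security levels, encrypting only a selected syntax element with AES in CTR mode while leaving structural data untouched, can be sketched generically as below. This assumes the Python 'cryptography' package and uses a made-up byte string; it is not the ISMACryp bitstream format or the authors' MATLAB implementation.

```python
# Illustrative AES-128 CTR encryption of a selected payload (e.g. motion-vector bytes).
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key = os.urandom(16)                 # 128-bit AES key
nonce = os.urandom(16)               # per-stream counter block / IV

def encrypt_selected(payload: bytes) -> bytes:
    encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return encryptor.update(payload) + encryptor.finalize()

nal_unit = b"\x00\x00\x00\x01" + b"motion-vector payload..."   # hypothetical example bytes
header, body = nal_unit[:4], nal_unit[4:]          # structural data is left untouched
protected = header + encrypt_selected(body)        # only the selected syntax elements are encrypted
print(protected.hex())
```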

Keywords: AES-128, CAVLC, H.264, ISMACryp
