Search results for: file format
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 705

615 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format

Authors: Maryam Fallahpoor, Biswajeet Pradhan

Abstract:

Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretations and, thereby, reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out as it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data, such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: neuroimaging informatics technology initiative (NIfTI) and digital imaging and communications in medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was designated as 1, while normal images were assigned a value of 0. Subsequently, a U-net-based lung segmentation module was applied to obtain 3D segmented lung regions. 
The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest Area under the Curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
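The intensity pre-processing described above (windowing to (−1000, 400) HU, normalization, and zero-centering) can be sketched as follows; the function name and the NumPy-based implementation are illustrative assumptions, not the authors' actual code.

```python
import numpy as np

def preprocess_volume(volume_hu, hu_min=-1000.0, hu_max=400.0):
    """Clip a CT volume to a HU window, scale to [0, 1], and zero-center.

    volume_hu: 3D array of Hounsfield units, e.g. 128 x 128 x 60 after
    resampling to 1 mm isotropic resolution, as in the study.
    """
    clipped = np.clip(volume_hu.astype(np.float32), hu_min, hu_max)
    normalized = (clipped - hu_min) / (hu_max - hu_min)  # scale to [0, 1]
    return normalized - normalized.mean()                # zero-center

# Synthetic example: a volume of air (-1000 HU) with a soft-tissue block (~40 HU)
volume = np.full((128, 128, 60), -1000.0)
volume[40:90, 40:90, 10:50] = 40.0
processed = preprocess_volume(volume)
```

After this step, each volume has zero mean and a bounded intensity range, which stabilizes CNN training regardless of the scanner's raw HU calibration.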

Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format

Procedia PDF Downloads 45
614 Myanmar Character Recognition Using Eight Direction Chain Code Frequency Features

Authors: Kyi Pyar Zaw, Zin Mar Kyu

Abstract:

Character recognition is the process of converting a text image file into an editable and searchable text file. Feature extraction is the heart of any character recognition system; the recognition rate may be low or high depending on the extracted features. In this paper, 25 features per character are used for recognition. There are three basic steps of character recognition: character segmentation, feature extraction, and classification. In the segmentation step, a horizontal cropping method is used for line segmentation and a vertical cropping method for character segmentation. In the feature extraction step, features are extracted in two ways. First, 8 features are extracted from the entire input character using eight-direction chain code frequency extraction. Second, the input character is divided into 16 blocks; for each block, 8 feature values are obtained through the same eight-direction chain code frequency extraction method, and the sum of these 8 values is defined as the single feature for that block, yielding 16 further features. A number-of-holes feature is used to cluster similar characters. With these features, almost all common Myanmar characters can be recognized at various font sizes. All 25 features are used in both the training and testing parts. In the classification step, characters are classified by matching all features of the input character against the already trained character features.
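The core eight-direction chain code frequency feature can be sketched as follows; the direction-to-code mapping and the boundary representation are illustrative assumptions (the paper does not fix a specific Freeman numbering). Consecutive boundary points are assumed to be 8-connected.

```python
import numpy as np

# Freeman-style 8-direction codes keyed by the (dx, dy) unit step between
# consecutive boundary points (illustrative mapping, image coordinates).
DIRECTIONS = {(1, 0): 0, (1, -1): 1, (0, -1): 2, (-1, -1): 3,
              (-1, 0): 4, (-1, 1): 5, (0, 1): 6, (1, 1): 7}

def chain_code_frequency(boundary):
    """Return the 8-bin direction frequency histogram of a closed boundary,
    given as an ordered list of (x, y) points."""
    hist = np.zeros(8, dtype=int)
    for (x0, y0), (x1, y1) in zip(boundary, boundary[1:] + boundary[:1]):
        step = (int(np.sign(x1 - x0)), int(np.sign(y1 - y0)))
        hist[DIRECTIONS[step]] += 1
    return hist

# A unit square traced point by point yields one step in each of the four
# axis-aligned directions (codes 0, 2, 4, 6).
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
freq = chain_code_frequency(square)
```

The 16 block features described above would apply the same function to each of the 16 sub-blocks and keep only the sum of each block's 8 bins; together with the number-of-holes feature, this gives the 25 features per character.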

Keywords: chain code frequency, character recognition, feature extraction, features matching, segmentation

Procedia PDF Downloads 290
613 To Determine the Effects of Regulatory Food Safety Inspections on the Grades of Different Categories of Retail Food Establishments across the Dubai Region

Authors: Shugufta Mohammad Zubair

Abstract:

This study explores the effect of the new food inspection system, also called the new inspection color card scheme, on the reduction of critical and major food safety violations in Dubai. Data were collected from all retail food service establishments located in two zones of the city. Each establishment was visited twice, once before the launch of the new system and once after. In each visit, the inspection checklist was used as the evaluation tool for observing critical and major violations. The old format of the checklist assigned scores based on violations; the new format, introduced with the color card scheme, is divided into administrative, general, major, and critical sections, which gives inspectors a better classification for identifying the critical and major violations of concern. The study found that violations have been marked more clearly since the launch of the new system, with inspectors able to mark and categorize violations effectively. There was a 10% decrease in the number of food establishments previously given an A grade, and B and C gradings also dropped considerably, by 5%.

Keywords: food inspection, risk assessment, color card scheme, violations

Procedia PDF Downloads 299
612 An Evaluation Model for Automatic Map Generalization

Authors: Quynhan Tran, Hong Fan, Quockhanh Pham

Abstract:

Automatic map generalization is a well-known problem in cartography, and research on it has accompanied the development of the discipline; traditionally, maps are plotted manually by cartographic experts. This paper studies non-scale automatic generalization of resident polygons and house marker symbols and proposes a methodology for evaluating the resulting maps based on the minimal spanning tree (MST). The MST before and after generalization is compared to evaluate whether the generalization result maintains the geographical distribution of features. The MST in vector format is first converted into raster format with a grid size of 2 mm (distance on the map). The number of matching grid cells before and after generalization is counted, and the ratio of overlapping cells to the total number of cells is calculated. Evaluation experiments were conducted to verify the results. Experiments show that this methodology can give an objective evaluation of feature distribution and lend specialists a hand when they evaluate the results of non-scale automatic generalization by eye.
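The grid-overlap comparison described above can be sketched as follows, assuming a simple cell-marking rasterization of the MST edges; the sampling density, grid size, and overlap definition are illustrative assumptions.

```python
import numpy as np

def rasterize_edges(edges, cell=2.0, size=(50, 50)):
    """Mark every grid cell (cell width in map units, e.g. 2 mm) that a
    sampled point of an MST edge falls into; returns a boolean grid."""
    grid = np.zeros(size, dtype=bool)
    for (x0, y0), (x1, y1) in edges:
        for t in np.linspace(0.0, 1.0, 32):          # sample along the edge
            x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
            i, j = int(x // cell), int(y // cell)
            if 0 <= i < size[0] and 0 <= j < size[1]:
                grid[i, j] = True
    return grid

def overlap_ratio(before, after):
    """Ratio of cells occupied by both MSTs to cells occupied by either."""
    return np.logical_and(before, after).sum() / np.logical_or(before, after).sum()

mst_before = rasterize_edges([((0, 0), (20, 0)), ((20, 0), (20, 20))])
mst_after = rasterize_edges([((0, 0), (20, 0))])   # one edge dropped
ratio = overlap_ratio(mst_before, mst_after)
```

A ratio close to 1 would indicate that the generalized map preserves the feature distribution captured by the original MST; dropping an edge, as in this toy example, lowers the ratio accordingly.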

Keywords: automatic cartography generalization, evaluation model, geographic feature distribution, minimal spanning tree

Procedia PDF Downloads 607
611 A Game-Based Product Modelling Environment for Non-Engineers

Authors: Guolong Zhong, Venkatesh Chennam Vijay, Ilias Oraifige

Abstract:

In the last 20 years, Knowledge Based Engineering (KBE) has shown its advantages in product development in different engineering areas such as automation, mechanical, civil and aerospace engineering in terms of digital design automation and cost reduction by automating repetitive design tasks through capturing, integrating, utilising and reusing the existing knowledge required in various aspects of the product design. However, in primary design stages, the descriptive information of a product is discrete and unorganized while knowledge is in various forms instead of pure data. Thus, it is crucial to have an integrated product model which can represent the entire product information and its associated knowledge at the beginning of the product design. One of the shortcomings of the existing product models is a lack of required knowledge representation in various aspects of product design and its mapping to an interoperable schema. To overcome the limitation of the existing product model and methodologies, two key factors are considered. First, the product model must have well-defined classes that can represent the entire product information and its associated knowledge. Second, the product model needs to be represented in an interoperable schema to ensure a steady data exchange between different product modelling platforms and CAD software. This paper introduces a method to provide a general product model as a generative representation of a product, which consists of the geometry information and non-geometry information, through a product modelling framework. The proposed method for capturing the knowledge from the designers through a knowledge file provides a simple and efficient way of collecting and transferring knowledge. Further, the knowledge schema provides a clear view and format on the data that need to be gathered in order to achieve a unified knowledge exchange between different platforms.
This study used a game-based platform to make the product modelling environment accessible to non-engineers. The paper then tests a use case based on the proposed game-based product modelling environment to validate its effectiveness among non-engineers.
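As an illustration of such a knowledge file in an interoperable schema, the fragment below groups geometry information, non-geometry information, and a design rule and round-trips it through JSON; every field name and rule here is a hypothetical example, not the paper's actual schema.

```python
import json

# A hypothetical knowledge-file entry combining geometry information,
# non-geometry information, and captured design knowledge (all field
# names and values are illustrative assumptions).
knowledge_file = {
    "product": "bracket",
    "geometry": {"type": "extrusion", "profile": "L-section",
                 "length_mm": 120, "thickness_mm": 4},
    "non_geometry": {"material": "AL6061", "process": "milling"},
    "rules": [
        {"if": "length_mm > 100", "then": "add_stiffening_rib"},
    ],
}

serialized = json.dumps(knowledge_file, indent=2)  # exchangeable text format
restored = json.loads(serialized)                  # lossless round trip
```

A text-based schema of this kind is what allows the same product knowledge to be exchanged between different modelling platforms and CAD tools without a proprietary binary format.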

Keywords: game-based learning, knowledge based engineering, product modelling, design automation

Procedia PDF Downloads 114
610 Implementation of Smart Card Automatic Fare Collection Technology in Small Transit Agencies for Standards Development

Authors: Walter E. Allen, Robert D. Murray

Abstract:

Many large transit agencies have adopted RFID technology and electronic automatic fare collection (AFC) or smart card systems, but small and rural agencies remain tied to obsolete manual, cash-based fare collection. Small countries or transit agencies can benefit from the implementation of smart card AFC technology, with the promise of increased passenger convenience, added passenger satisfaction and improved agency efficiency. For transit agencies, it reduces revenue loss and improves passenger flow and bus stop data. For countries, further implementation into security, distribution of social services or currency transactions can provide greater benefits. However, small countries or transit agencies cannot afford the expensive proprietary smart card solutions typically offered by the major system suppliers. Deployment of the Contactless Fare Media System (CFMS) Standard eliminates the proprietary solution, ultimately lowering the cost of implementation. Acumen Building Enterprise, Inc. chose the Yuma County Intergovernmental Public Transportation Authority's (YCIPTA) existing proprietary YCAT smart card system to implement CFMS. The revised system enables the purchase of fare product online with prepaid debit or credit cards using the Payment Gateway Processor. Open and interoperable smart card standards for transit have been developed. During the 90-day pilot operation, the transit agency gathered the data from the bus AcuFare 200 Card Reader, loaded (copied) the data to a USB thumb drive and uploaded the data to the Acumen Host Processing Center for consolidation into the transit agency master data file. The transition from the existing proprietary smart card data format to the new CFMS smart card data format was transparent to the transit agency cardholders.
It was proven that open standards and interoperable design can work and reduce both implementation and operational costs for small transit agencies or countries looking to expand smart card technology. Acumen was able to avoid implementing the Payment Card Industry (PCI) Data Security Standards (DSS), which are expensive to develop and costly to operate on a continuing basis. Due to the substantial additional complexities of implementation and the variety of options presented to the transit agency cardholder, Acumen chose to implement only the Directed Autoload. To improve the implementation efficiency and the results of a similar undertaking, it should be considered that some passengers lack credit cards and are averse to technology. There are more than 1,300 small and rural agencies in the United States, and this number grows tenfold when considering small countries and rural locations throughout Latin America and the world. Acumen is evaluating additional countries, sites or transit agencies that could benefit from smart card systems. Frequently, payment card systems require extensive security procedures for implementation. The project demonstrated the ability to purchase fare value, rides and passes with credit cards on the internet at a reasonable cost, without highly complex security requirements.

Keywords: automatic fare collection, near field communication, small transit agencies, smart cards

Procedia PDF Downloads 253
609 Mobile Application for Construction Sites Management

Authors: A. Khelifi, M. Al Kaabi, B. Al Rawashdeh

Abstract:

Infrastructure is one of the most important pillars of the UAE, which spends millions of dollars on investments in the construction sector. Research by Kuwait Finance House (KFH) showed clearly that UAE investments in the construction sector exceeded 30 billion dollars in 2013. There are many construction companies in the UAE, each taking responsibility for building different infrastructure. Large-scale construction projects consist of multiple human activities that can affect the efficiency and productivity of the running projects. The Construction Administration System is developed to increase efficiency and productivity at construction sites. It runs on two platforms, a web server and a mobile phone, and supports two main user types: mobile users and institution employees. With the Construction Administration mobile application, the user can manage and control several projects, create reports and send them in Portable Document Format (PDF) through email, view the physical location of each project, and capture and save photos. An institution employee can use the system to view all existing workers and projects, send emails and view the progress of each project.

Keywords: construction sites, management, mobile application, Portable Document Format (PDF)

Procedia PDF Downloads 349
608 Trends in Language Testing in Primary Schools in Rivers State, Nigeria

Authors: Okoh Chinasa, Asimuonye Augusta

Abstract:

This study investigated trends in language testing in primary schools in Rivers State, Nigeria. English language past question papers were collected from four (4) primary schools in the Onelga and Ahoada East Local Government Areas. Four research questions guided the study, which aimed at finding out the appropriateness of the test formats used for language testing and the language skills tested. The past question papers, which served as the instrument, were analyzed against criteria developed by the researchers in line with documentary frequency studies, a type of survey study. The study revealed that some of the four language skills were not adequately assessed and that the termly question papers were developed by a central examination body. From the past questions, an imbalance was observed in the test formats used. The paper recommends that all the language skills be tested using correct test formats, to ensure that pupils are given a fair chance to show what they know and can do in English, and so that teachers can use the test results for effective decision making.

Keywords: discrete test, integrative test, testing approach, test format

Procedia PDF Downloads 391
607 Identifying the Goals of a Multicultural Curriculum for the Primary Education Course

Authors: Fatemeh Havas Beigi

Abstract:

The purpose of this study is to identify the objectives of a multicultural curriculum for the primary education period from the perspective of ethnic teachers, education experts, and cultural professionals. The research paradigm is interpretive, the approach qualitative, and the strategy content analysis; sampling was purposeful with snowballing, and the sample of informants (Iranian ethnic teachers and experts) reached theoretical saturation at 67 people. Data were collected through semi-structured individual and focus-group interviews, recorded in audio format, and analyzed using first- and second-cycle coding. Based on the data analysis, 11 objectives for a multicultural curriculum were identified: attention to ethnic equality, expansion of educational opportunities and justice, peaceful coexistence, education against ethnic and racial discrimination, attention to human value and dignity, acceptance of religious diversity, acquaintance with ethnicities and cultures, promotion of teaching-learning, fostering of self-confidence, building of national unity, and development of cultural commonalities.

Keywords: objective, multicultural curriculum, connect, elementary education period

Procedia PDF Downloads 62
606 Teaching English in Low Resource-Environments: Problems and Prospects

Authors: Gift Chidi-Onwuta, Iwe Nkem Nkechinyere, Chikamadu Christabelle Chinyere

Abstract:

The teaching of English is a resource-driven activity that requires resource-rich classroom settings for the delivery of effective lessons and the acquisition of the interpersonal skills needed for integration in a target-language environment. Throughout the world, however, English is often taught in low-resource classrooms. This paper aims to reveal the common problems associated with teaching English in low-resource environments and the prospects for teachers who find themselves in such settings. A self-structured, validated questionnaire combining closed-ended, open-question, and scaling formats was administered to teachers across five countries: Nigeria, Cameroon, Iraq, Turkey, and Sudan. The study adopts situational language teaching theory (SLTT), which emphasizes a performance-improvement imperative; this model was chosen because it maintains that learning must be fun and enjoyable, like playing a favorite sport in real life, and teaching resources are what make learning engaging. The perceptions of teachers about the accessibility and functionality of teaching material resources, the nature of teaching outcomes in resource-poor environments, their levels of involvement in improvisation, and the prospects associated with resource limitations were sourced. Data were analysed using percentages and presented in frequency tables. Results showed that a great number of teachers across these nations do not have access to sufficient productive resource materials to aid effective English language teaching. Teaching outcomes, from the findings, are affected by low material resources; however, results show certain advantages to teaching English with limited resources: flexibility and autonomy with students, and creativity and innovation amongst teachers.
Results further revealed group work, story, critical thinking strategy, flex, cardboards and flashcards, dictation and dramatization as common teaching strategies, as well as materials adopted by teachers to overcome low resource-related challenges in classrooms.

Keywords: teaching materials, low-resource environments, English language teaching, situational language theory

Procedia PDF Downloads 104
605 Computational Fluid Dynamics (CFD) Simulations of Air Pollutant Dispersion: Validation of the Fire Dynamics Simulator Against the CUTE Experiments of the COST ES1006 Action

Authors: Virginie Hergault, Siham Chebbah, Bertrand Frere

Abstract:

Following in-house objectives, the Central Laboratory of the Paris Police Prefecture (LCPP) conducted a general review of the models and Computational Fluid Dynamics (CFD) codes used to simulate pollutant dispersion in the atmosphere. Starting from that review, and considering the main features of Large Eddy Simulation, the LCPP postulated that the Fire Dynamics Simulator (FDS) model, from the National Institute of Standards and Technology (NIST), should be well suited to air pollutant dispersion modeling. This paper focuses on the implementation and evaluation of FDS in the frame of the European COST ES1006 Action, which aimed at quantifying the performance of modeling approaches. The CUTE dataset, acquired in the city of Hamburg, and its wind tunnel mock-up were used. FDS results were compared with wind tunnel measurements from the CUTE trials on the one hand, and with the results of the models involved in the COST Action on the other. The most time-consuming part of creating input data for the simulations is transferring the obstacle geometry information into the format required by FDS; Python codes were therefore developed to convert building and topographic data to the FDS input file automatically. To evaluate the predictions of FDS against observations, statistical performance measures were used, including the fractional bias (FB), the normalized mean square error (NMSE) and the fraction of predictions within a factor of two of observations (FAC2). Like the CFD models tested in the COST Action, FDS demonstrates good agreement with the measured concentrations. Furthermore, the metrics assessment indicates that FB and NMSE fall within the acceptable tolerances.
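The three metrics named above have standard definitions in the model evaluation literature; a minimal sketch with synthetic observed and predicted concentrations follows (the sample values are illustrative, not CUTE data).

```python
import numpy as np

def fb(obs, pred):
    """Fractional bias: 0 is perfect; positive values indicate under-prediction."""
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def nmse(obs, pred):
    """Normalized mean square error: 0 is perfect."""
    return ((obs - pred) ** 2).mean() / (obs.mean() * pred.mean())

def fac2(obs, pred):
    """Fraction of predictions within a factor of two of observations."""
    ratio = pred / obs
    return np.mean((ratio >= 0.5) & (ratio <= 2.0))

# Synthetic paired concentrations (arbitrary units)
obs = np.array([1.0, 2.0, 4.0, 8.0])
pred = np.array([1.2, 1.8, 3.5, 9.0])
metrics = fb(obs, pred), nmse(obs, pred), fac2(obs, pred)
```

Commonly cited acceptance criteria in this literature are on the order of |FB| ≤ 0.3 and FAC2 ≥ 0.5; these are typical values, not necessarily the exact tolerances adopted by the COST Action.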

Keywords: numerical simulations, atmospheric dispersion, COST ES1006 action, CFD model, CUTE experiments, wind tunnel data, numerical results

Procedia PDF Downloads 106
604 Establishment of Landslide Warning System Using Surface or Sub-Surface Sensors Data

Authors: Neetu Tyagi, Sumit Sharma

Abstract:

The study presents the results of an integrated investigation of the Tangni landslide, located on NH-58 at Chamoli, Uttarakhand. Geological, geomorphological and geotechnical investigations were carried out to understand the mechanism of the landslide and to plan further investigation and monitoring. The movements were favored by continuous infiltration of rainfall water from the zones where the phyllites/slates and dolomites outcrop. The site investigations, including the monitoring of landslide movements and of water level fluctuations due to rainfall, give a better understanding of the landslide dynamics that have progressively caused soil instability at the Tangni site. The Early Warning System (EWS) comprised different types of sensors, all connected directly to a data logger, with raw data transferred to the Defence Terrain Research Laboratory (DTRL) server room via the File Transfer Protocol (FTP). Slip surfaces were found at depths ranging from 8 to 10 m by geophysical survey, and sensors were therefore installed to a depth of 15 m at various locations on the landslide. Rainfall is the main triggering factor of the landslide, and a model of unsaturated soil slope stability is developed in this study. Analysis of one year of sensor data indicated a sliding surface at depths between 6 and 12 m, with a total displacement of up to 6 cm per year recorded in the body of the landslide. The aim of this study is to set thresholds and generate early warnings, so that local people, already alert to the landslide, have a warning system of some kind.
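Threshold-based warning generation from displacement sensor data can be sketched as follows; the threshold values and alert levels are illustrative assumptions, not the thresholds derived in the study, which must come from the site's own sensor record (~6 cm total displacement per year).

```python
import numpy as np

def warning_level(displacement_mm, rate_mm_per_day,
                  disp_threshold=60.0, rate_threshold=1.0):
    """Return an alert level from cumulative displacement and its daily rate.

    The thresholds here are illustrative placeholders; in practice they are
    calibrated against the monitored landslide's history.
    """
    if displacement_mm >= disp_threshold or rate_mm_per_day >= rate_threshold:
        return "warning"
    if displacement_mm >= 0.5 * disp_threshold:
        return "watch"
    return "normal"

# One year of daily cumulative displacement readings (synthetic steady creep)
days = np.arange(365)
displacement = 60.0 * days / 364.0     # mm, reaching ~6 cm over the year
rate = np.gradient(displacement)       # mm/day
level_end = warning_level(displacement[-1], rate[-1])
```

In an operational EWS, this check would run on each new record received from the data logger over FTP, escalating from "watch" to "warning" as either the cumulative displacement or its rate crosses its threshold.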

Keywords: early warning system, file transfer protocol, geo-morphological, geotechnical, landslide

Procedia PDF Downloads 130
603 International Classification of Primary Care as a Reference for Coding the Demand for Care in Primary Health Care

Authors: Souhir Chelly, Chahida Harizi, Aicha Hechaichi, Sihem Aissaoui, Leila Ben Ayed, Maha Bergaoui, Mohamed Kouni Chahed

Abstract:

Introduction: The International Classification of Primary Care (ICPC) is part of the morbidity classification system. It has 17 chapters, each coded by an alphanumeric code: the letter corresponds to the chapter, the number to a rubric within the chapter. The objective of this study is to show the utility of this classification in coding the reasons for demand for care in primary health care (PHC), along with its advantages and limits. Methods: This is a cross-sectional descriptive study conducted in 4 PHC facilities in the Ariana district. Data on the demand for care during 2 days in the same week were collected, and the information was coded according to the ICPC. The data were entered and analyzed with the Epi Info 7 software. Results: A total of 523 demands for care were investigated. The patients who came for consultation were predominantly female (62.72%), and most were young, with an average age of 35 ± 26 years. Classified by ICPC rubric, 'infections' was the most common reason (49.9%), followed by 'other diagnoses' (40.2%), 'symptoms and complaints' (5.5%), 'trauma' (2.1%), 'procedures' (2.1%) and 'neoplasm' (0.3%). The main advantage of the ICPC is that it is a standardized tool, well suited to classifying the reasons for demand for care in PHC according to their specificity and capable of being used in a computerized PHC medical file. Its current limitations relate to the difficulty of classifying some reasons for demand for care. Conclusion: The ICPC was developed to provide primary healthcare with a coding reference that takes its specificity into account. The ICD is already in its 10th revision; the ICPC would gain, from revision to revision, in the efficiency needed to be generalized and used by PHC teams.
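The alphanumeric coding scheme (letter for chapter, number for rubric) can be illustrated as follows; the chapter table below is a small illustrative subset of the 17 ICPC chapters, and the rubric meanings are not reproduced here.

```python
# Decode an ICPC-style alphanumeric code: the letter identifies the chapter,
# the number a rubric within it. Only a few chapters are listed here as an
# illustrative subset; the full classification has 17.
CHAPTERS = {
    "A": "General and unspecified",
    "D": "Digestive",
    "R": "Respiratory",
}

def decode_icpc(code):
    """Split an ICPC-style code such as 'R74' into (chapter name, rubric number)."""
    chapter, rubric = code[0].upper(), int(code[1:])
    return CHAPTERS.get(chapter, "Unknown chapter"), rubric

chapter_name, rubric = decode_icpc("R74")
```

A lookup of this kind is what makes the classification straightforward to embed in a computerized PHC medical file: the stored code stays compact while the chapter and rubric labels can be resolved on display.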

Keywords: international classification of primary care, medical file, primary health care, Tunisia

Procedia PDF Downloads 236
602 The Use of Voice in Online Public Access Catalog as Faster Searching Device

Authors: Maisyatus Suadaa Irfana, Nove Eka Variant Anna, Dyah Puspitasari Sri Rahayu

Abstract:

Technological developments provide convenience to everyone. Until recently, human-computer communication was conducted via text; with the development of technology, it can now be conducted by voice, much as between human beings. This provides an easy facility for many people, especially those with special needs. In this work, voice search technology is applied to searching the book collections in the OPAC (Online Public Access Catalog), so library visitors will find it faster and easier to locate the books they need. Integration with Google is used to convert the voice into text. To optimize search time and results, the server downloads all the book data available in the server database and converts it into JSON format. Several processing steps are then applied: decomposition (parsing) of the JSON into arrays, index construction, and analysis of the results. The aim is to make searching much faster than in the usual OPAC, where the database is queried directly for every search request. A data update menu is provided so that users can perform their own data updates and obtain the latest information.
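The server-side flow described above (JSON conversion, parsing, index construction, lookup of the transcribed query) can be sketched as follows; the record fields and function names are assumptions for illustration.

```python
import json

# Book records serialized to JSON, as downloaded from the server database
# (field names are illustrative assumptions).
books = [
    {"id": 1, "title": "Digital Libraries"},
    {"id": 2, "title": "Voice Interfaces"},
    {"id": 3, "title": "Library Cataloging"},
]
payload = json.dumps(books)

def build_index(json_payload):
    """Parse the JSON payload and map each lowercased title word to book ids."""
    index = {}
    for record in json.loads(json_payload):
        for word in record["title"].lower().split():
            index.setdefault(word, []).append(record["id"])
    return index

def search(index, transcribed_query):
    """Match every word of the speech-to-text query against the index."""
    hits = set()
    for word in transcribed_query.lower().split():
        hits.update(index.get(word, []))
    return sorted(hits)

index = build_index(payload)
results = search(index, "library voice")
```

Because the index is built once from the downloaded JSON, each voice query becomes an in-memory dictionary lookup rather than a fresh database round trip, which is the source of the speed-up claimed above.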

Keywords: OPAC, voice, searching, faster

Procedia PDF Downloads 318
601 An Observation Approach to Reading Order for Single-Column and Two-Column Layout Templates

Authors: In-Tsang Lin, Chiching Wei

Abstract:

Reading order is an important task in many digitization scenarios involving the preservation of the logical structure of a document. A survey of the literature finds that state-of-the-art algorithms cannot reliably obtain the correct reading order in portable document format (PDF) files with rich formatting and diverse layout arrangements. In recent years, most studies of reading-order analysis have targeted the specific problem of associating layout components with logical labels, while less attention has been paid to detecting reading-order relationships between logical components, such as cross-references. Over three years of development, the company Foxit has refined its layout recognition (LR) engine, at revision 20601, to improve the accuracy of the reading order. The bounding box of each paragraph is obtained correctly by the Foxit LR engine, but the resulting reading order is not always correct for single-column and two-column layouts, due to table, formula, multiple mini separated bounding box, and footer issues. An algorithm is therefore developed to improve the accuracy of the reading order based on the Foxit LR structure. In this paper, a creative observation method (here called the MESH method) is provided to open a new chance in reading-order research. Two important parameters are introduced: the number of bounding boxes to the right of the present bounding box (NRight) and the number of bounding boxes under the present bounding box (NUnder). The normalized x-value (x divided by the whole width), the normalized y-value (y divided by the whole height), and the x- and y-positions of each bounding box are also taken into consideration.
Initial experimental results for the single-column layout format demonstrate a 19.33% absolute improvement in reading-order accuracy over 7 PDF files (150 pages in total) using the proposed method based on the LR structure, compared with the baseline method using the LR structure in revision 20601, whose reading-order accuracy is 72%. For the two-column layout format, preliminary results demonstrate a 44.44% absolute improvement in reading-order accuracy over 2 PDF files (18 pages in total) compared with the same baseline, whose reading-order accuracy is 0%. So far, the footer issue and part of the multiple mini separated bounding box issue can be solved with the MESH method. Three issues remain unsolved: the table issue, the formula issue, and randomly placed multiple mini separated bounding boxes. The detection of table position and the recognition of table structure are out of the scope of this paper and require separate research. In future work, the chosen tasks are to detect the position of tables on the page and to extract their content.
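The two MESH parameters can be computed for each bounding box as follows; the coordinate convention (origin at the top-left, y increasing downward) and the exact counting rules are assumptions based on the description above, not Foxit's implementation.

```python
def mesh_parameters(boxes, index, page_w, page_h):
    """Compute NRight, NUnder, and the normalized position of one box.

    boxes: list of (x0, y0, x1, y1) rectangles with a top-left origin (an
    assumed convention). NRight counts boxes starting to the right of this
    box's right edge; NUnder counts boxes starting below its bottom edge.
    """
    x0, y0, x1, y1 = boxes[index]
    n_right = sum(1 for i, (a0, _, _, _) in enumerate(boxes)
                  if i != index and a0 >= x1)
    n_under = sum(1 for i, (_, b0, _, _) in enumerate(boxes)
                  if i != index and b0 >= y1)
    return n_right, n_under, x0 / page_w, y0 / page_h

# Toy two-column page: left paragraph, right paragraph, full-width footer
boxes = [(50, 100, 280, 700), (320, 100, 550, 700), (50, 750, 550, 780)]
nr, nu, nx, ny = mesh_parameters(boxes, 0, page_w=600, page_h=800)
```

For the left-column paragraph in this toy page, one box lies to its right (the right column) and one below it (the footer), so the pair (NRight, NUnder) already distinguishes column position before any ordering decision is made.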

Keywords: document processing, reading order, observation method, layout recognition

Procedia PDF Downloads 150
599 Monte Carlo Simulation Study on Improving the Flattening Filter-Free Radiotherapy Beam Quality Using Filters from Low-Z Material

Authors: H. M. Alfrihidi, H.A. Albarakaty

Abstract:

Flattening filter-free (FFF) photon beam radiotherapy has increased in the last decade, enabled by advancements in treatment planning systems and radiation delivery techniques such as multi-leaf collimators. FFF beams have higher dose rates, which reduce treatment time. On the other hand, FFF beams have a higher surface dose, owing to the loss of the beam-hardening effect normally provided by the flattening filter (FF). The possibility of improving FFF beam quality using filters made of low-Z materials such as steel and aluminium (Al) was investigated using Monte Carlo (MC) simulations. The attenuation coefficient of low-Z materials is higher for low-energy photons than for high-energy photons, which leads to hardening of the FFF beam and, consequently, a reduction in the surface dose. The BEAMnrc user code, based on the Electron Gamma Shower (EGSnrc) MC code, was used to simulate the beam of a 6 MV TrueBeam linac. A phase-space (phsp) file provided by Varian Medical Systems was used as the radiation source in the simulation; this file was scored just above the jaws, at 27.88 cm from the target. The linac from the jaws downward was modelled, and the radiation transport was simulated and scored at 100 cm from the target. To study the effect of low-Z filters, steel and Al filters with a thickness of 1 cm were added below the jaws, and the phase-space file was scored at 100 cm from the target. For comparison, the FF beam was simulated using a similar setup. The BEAM Data Processor (BEAMdp) was used to analyse the energy spectra in the phase-space files. The dose distributions resulting from these beams were then simulated in a homogeneous water phantom using DOSXYZnrc. The dose profile was evaluated in terms of the surface dose, the lateral dose distribution, and the percentage depth dose (PDD). The energy spectra of the beams show that the FFF beam is softer than the FF beam.
The energy peaks for the FFF and FF beams are 0.525 MeV and 1.52 MeV, respectively. With a steel filter, the FFF beam's energy peak shifts to 1.1 MeV, while the Al filter does not affect the peak position. The steel and Al filters reduced the surface dose by 5% and 1.7%, respectively, and the dose at a depth of 10 cm (D10) rises by around 2% and 0.5% with the steel and Al filters, respectively. On the other hand, the steel and Al filters reduce the dose rate of the FFF beam by 34% and 14%, respectively; however, their effect on the dose rate is smaller than that of the tungsten FF, which reduces the dose rate by about 60%. In conclusion, filters made of low-Z materials decrease the surface dose and increase the D10 dose, allowing high-dose delivery to deep tumors with a low skin dose. Although these filters affect the dose rate, the effect is much smaller than that of the FF.
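The beam-hardening argument can be illustrated with a Beer-Lambert toy calculation. The sketch below uses rough, illustrative linear attenuation coefficients for steel (approximations, not NIST/XCOM values) and a two-bin toy spectrum; it is not part of the authors' BEAMnrc workflow:

```python
import math

# Rough, illustrative linear attenuation coefficients for steel (cm^-1);
# real work should use tabulated NIST XCOM data.
MU_STEEL = {0.5: 0.66, 1.5: 0.38}

def transmitted(fluence, mu, thickness_cm):
    """Beer-Lambert transmission of each energy bin through a uniform filter."""
    return {e: f * math.exp(-mu[e] * thickness_cm) for e, f in fluence.items()}

def mean_energy(fluence):
    """Fluence-weighted mean energy of a binned spectrum (MeV)."""
    return sum(e * f for e, f in fluence.items()) / sum(fluence.values())

beam = {0.5: 1.0, 1.5: 1.0}            # toy two-bin FFF spectrum
hardened = transmitted(beam, MU_STEEL, 1.0)  # 1 cm steel filter
```

After 1 cm of steel the low-energy bin is attenuated more strongly, so the mean energy of the transmitted spectrum rises: exactly the hardening (and hence reduced surface dose) the abstract reports.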

Keywords: flattening filter free, monte carlo, radiotherapy, surface dose

Procedia PDF Downloads 48
599 Spatial Analysis of Survival Pattern and Treatment Outcomes of Multi-Drug Resistant Tuberculosis (MDR-TB) Patients in Lagos, Nigeria

Authors: Akinsola Oluwatosin, Udofia Samuel, Odofin Mayowa

Abstract:

The study is aimed at a Geographic Information System (GIS)-based spatial analysis of the survival pattern and treatment outcomes of multi-drug resistant tuberculosis (MDR-TB) cases in Lagos, Nigeria, with the objective of informing priority areas for public health planning and resource allocation. MDR-TB develops due to problems such as irregular drug supply, poor drug quality, inappropriate prescription, and poor adherence to treatment. The shapefiles for this study were already georeferenced to the Minna datum. Patient information from various hospitals was collected in MS Excel and later converted to a .CSV file for processing in ArcMap. To superimpose the patient information on the spatial data, the addresses were geocoded to generate the longitude and latitude of each patient. SQL queries against the database were used to examine the various treatment patterns. To show the pattern of disease spread, spatial autocorrelation analysis was used, and the results were displayed graphically, showing the areas of the study region in which patients were dispersed, random, or clustered. Hot and cold spot analysis was performed to show high-density areas, and buffer analysis was used to examine the distance between patients and the closest health facility. The results show that 22% of the points were successfully matched, while 15% were tied. A greater percentage was unmatched, largely because most streets within the state are unnamed and many patients likely supplied incorrect addresses. MDR-TB patients of all age groups are concentrated within the Lagos-Mainland, Shomolu, Mushin, Surulere, Oshodi-Isolo, and Ifelodun LGAs. The 30-47 year age group accounted for the largest number of MDR-TB patients, about 184.
The treatment outcomes revealed that a large number of patients (300) were not on ART treatment, while only 45 patients were. The Z-score of the distribution is greater than 2.58, which means that the distribution is highly clustered at a significance level of 0.01.
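The nearest-facility step behind the buffer analysis can be sketched with a plain great-circle distance; the facility names and coordinates below are hypothetical, and the study's actual ArcMap workflow is not reproduced here:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two geocoded points, in km."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_facility(patient_latlon, facilities):
    """Return (facility, distance_km) for the closest health facility."""
    return min(((f, haversine_km(*patient_latlon, f["lat"], f["lon"]))
                for f in facilities), key=lambda t: t[1])

# Hypothetical Lagos-area coordinates, for illustration only.
facilities = [{"name": "Chest Clinic A", "lat": 6.52, "lon": 3.37},
              {"name": "Chest Clinic B", "lat": 6.60, "lon": 3.30}]
best, dist_km = nearest_facility((6.52, 3.38), facilities)
```

Computing this distance for every geocoded patient is what the buffer analysis aggregates when it reports how far patients live from care.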

Keywords: tuberculosis, patients, treatment, GIS, MDR-TB

Procedia PDF Downloads 124
598 Wave State of Self: Findings of Synchronistic Patterns in the Collective Unconscious

Authors: R. Dimitri Halley

Abstract:

The research within Jungian psychology presented here is on the wave state of Self. What has been discovered via shared dreaming, independently correlating dreams across dreamers, lies beyond the Self stage in the deepest layer, the wave state of Self: the very quantum ocean in which the Self archetype is embedded. A quantum wave, a rhyming of meaning constituting synergy across several dreamers, was discovered in dreams and in extensive shared dream work with small groups at a post-therapy stage. Within the format of shared dreaming, we find synergy patterns beyond what Jung called the Self archetype. Jung led us up to the phase of individuation and delivered the baton to von Franz to work out the next, synchronistic stage, proposed here as the finding of the quantum patterns making up the wave state of Self. These enfolded synchronistic patterns have been found in the group format of shared dreaming among individuals approximating individuation, and their unfolding is carried by belief and faith. The reason for this format and operating system is that beyond therapy, in living reality, we find no science, no thinking or even awareness in the therapeutic sense, but rather a state of mental processing resembling that of a spiritual attitude. Thinking as such is linear and cannot contain the deepest layer of Self, the quantum core of the human being. It is self-reflection that is the container for the process at the wave state of Self. Observation locks us into an outside-in reactive flow from a first-person perspective, toward the surface, where we see to believe; here, the direction of focus shifts to inside-out/intrinsic. The operating system, or language, at the wave level of Self is thus belief and synchronicity. Belief has until now been almost the sole province of organized religions but was viewed by Jung as an inherent property of the process of individuation.
The shared dreaming stage of the synchronistic patterns forms a larger story constituting a deep connectivity unfolding around individual Selves. Dreams of independent dreamers form patterns that come together like puzzle pieces into a larger story, and in this sense this group-work level builds on Jung as a post-individuation collective stage. Shared dream correlations will be presented, illustrating a larger story in terms of trails of shared synchronicity.

Keywords: belief, shared dreaming, synchronistic patterns, wave state of self

Procedia PDF Downloads 160
597 Hydrological Response of the Glacierised Catchment: Himalayan Perspective

Authors: Sonu Khanal, Mandira Shrestha

Abstract:

Snow and glaciers are the largest dependable reserves of water for the river systems originating in the Himalayas, so accurate estimates of the volume of water contained in the snowpack and of the rate of release of water from snow and glaciers are needed for efficient management of water resources. This research assesses the energy exchanges between the snowpack, the air above, and the soil below according to mass and energy balance, which makes it more apposite than models using a simple temperature index for snow and glacier melt computation. UEBGrid, a distributed energy-based model, is used to calculate the melt, which is then routed by Geo-SFM. The model's robustness is maintained by incorporating albedo generated from Landsat-7 ETM images on a seasonal basis for 2002-2003 and a substrate map derived from Landsat TM. The substrate file comprises four major thematic layers: snow, clean ice, glaciers, and barren land. This approach uses the CPC RFE-2 and MERRA gridded datasets as the source of precipitation and climatic variables. The subsequent model run for 2002-2008 shows a total annual melt of 17.15 m from the Marshyangdi Basin, of which 71% is contributed by glaciers, 18% by rain, and the rest by snowmelt. The albedo file is decisive in governing melt dynamics: a 30% increase in the generated surface albedo results in a 10% decrease in simulated discharge. The melt routed with the land cover and soil variables using Geo-SFM shows a Nash-Sutcliffe Efficiency of 0.60 against observed discharge for the study period.
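The reported model fit can be checked with the standard Nash-Sutcliffe Efficiency formula; a minimal sketch with made-up discharge values (the study's actual series are not reproduced here):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit; 0 means the model is
    no better than simply predicting the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var

# Made-up discharge series (m^3/s), purely for illustration.
observed = [120.0, 95.0, 80.0, 110.0]
simulated = [115.0, 100.0, 85.0, 105.0]
nse = nash_sutcliffe(observed, simulated)
```

An NSE of 0.60, as reported here, means the model reduces the squared error to 40% of what predicting the observed mean alone would give.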

Keywords: glacier, glacier melt, snowmelt, energy balance

Procedia PDF Downloads 431
596 Real-Time Land Use and Land Information System in Homagama Divisional Secretariat Division

Authors: Kumara Jayapathma J. H. M. S. S., Dampegama S. D. P. J.

Abstract:

Land is a valuable and limited resource that constantly changes with the growth of the population. An efficient land management system is essential to avoid conflicts associated with land. This paper aims to design a prototype model of a mobile GIS land use and land information system operating in real time. The Homagama Divisional Secretariat Division, situated in the Western Province of Sri Lanka, was selected as the study area. The prototype model was developed after reviewing the related literature. The methodology consisted of designing and modeling the prototype as an application running on a mobile platform. The system architecture mainly consists of a Google mapping app for real-time updates with Firebase support tools; accordingly, the implementation consists of front-end and back-end components. The application was designed in Android Studio with Java, using a GeoJSON file structure. Synchronizing GeoJSON files to Firebase was found to be an effective mobile solution for continuously updating the land use and Land Information System (LIS) in real time in the present scenario. The mobile-based land use and LIS developed in this study is a multi-user application catering to different hierarchy levels, such as basic users, supervisory managers, and database administrators. This mobile mapping application will help public sector field officers without GIS expertise to overcome land use planning challenges, with land use updated in real time.
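A land-use record in the app's GeoJSON structure might look like the following sketch; the property names and coordinates are illustrative assumptions, not the study's actual schema:

```python
import json

def land_parcel_feature(parcel_id, land_use, ring):
    """Build a GeoJSON Feature for one land parcel (polygon ring of
    [lon, lat] pairs, with the first point repeated last)."""
    return {
        "type": "Feature",
        "geometry": {"type": "Polygon", "coordinates": [ring]},
        "properties": {"parcel_id": parcel_id, "land_use": land_use},
    }

feature = land_parcel_feature(
    "HG-0001", "residential",
    [[79.97, 6.84], [79.98, 6.84], [79.98, 6.85], [79.97, 6.85], [79.97, 6.84]],
)
payload = json.dumps(feature)  # the string a Firebase/REST sync would carry
```

Because GeoJSON serializes to plain JSON, the same payload can be pushed to Firebase and rendered on the Google map without format conversion, which is what makes the real-time sync straightforward.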

Keywords: Android, Firebase, GeoJSON, GIS, JAVA, JSON, LIS, Mobile GIS, real-time, REST API

Procedia PDF Downloads 201
595 A Study of Police Culture Themes Towards the Public Among South African Police Service

Authors: Nkosingiphile M. Mbhele, Jean Steyn

Abstract:

A focus group discussion was conducted, comprising senior South African Police Service managers and police academics in South Africa. A thirty-item questionnaire measuring solidarity, isolation, and cynicism among functional South African Police Service officials was developed by reviewing the literature. This research uses a survey format to assess the police culture themes of solidarity, isolation, and cynicism among South African Police Service officers in the nine South African provinces. Although a survey format is used, the research engages in a quasi-experimental pre-test/post-test repeated-measures (longitudinal) design. Although there are differences among South African Police Service (SAPS) officers, overall there are signs of solidarity, isolation, and cynicism among SAPS members. Attitudes of solidarity, isolation, and cynicism are present among most police officials; they appear from the start of training and are held, maintained, or strengthened over the subsequent years of their SAPS careers. This is problematic for society with regard to community-oriented policing, since officers must interact with members of the community. To the authors' best knowledge, longitudinal studies of police culture are rare, and little has been researched on this topic. This paper offers to bridge that gap by providing answers on longitudinal police attitudes towards the public within the police culture themes of isolation and cynicism.

Keywords: South African police service, police culture, solidarity, isolation, cynicism, public

Procedia PDF Downloads 114
594 Human Factors and Ergonomics in Bottling Lines

Authors: Parameshwaran Nair

Abstract:

Filling and packaging lines for bottling beverages into glass, PET, or aluminum containers require specialized expertise and a particular configuration of equipment: filler, warmer, labeller, crater/recrater, shrink packer, carton erector, carton sealer, date coder, palletizer, etc. Over time, the packaging industry has evolved from manually operated single-station machines to highly automated high-speed lines, and human factors and ergonomics have gained significant consideration in the course of this transformation. A prerequisite for such bottling lines, irrespective of container type and size, is suitability for multi-format applications. They should handle format changeovers with minimal adjustment and offer variable capacity and speeds, providing great flexibility in managing accumulation times as a function of production characteristics. In terms of layout, they should likewise allow flexibility for operator movement and access to machine areas for maintenance. Packaging technology during the past few decades has risen to these challenges through a series of major breakthroughs interspersed with periods of refinement and improvement; the milestones are many and varied and are described briefly in this paper. In order to give a brief understanding of human factors and ergonomics in modern packaging lines, this paper highlights the various technologies, design considerations, and statutory requirements in packaging equipment for the different types of containers used in India.

Keywords: human factors, ergonomics, bottling lines, automated high-speed lines

Procedia PDF Downloads 396
593 On Exploring Search Heuristics for Improving the Efficiency in Web Information Extraction

Authors: Patricia Jiménez, Rafael Corchuelo

Abstract:

Nowadays the World Wide Web is the most popular source of information, comprising billions of on-line documents. Web mining is used to crawl through these documents, collect the information of interest, and process it with data mining tools so that the gathered information can be used in the best interest of a business, enabling companies to promote themselves. Unfortunately, it is not easy to extract the information a web site provides automatically when it lacks an API that transforms the user-friendly data in web documents into a structured, machine-readable format. Rule-based information extractors are tools intended to extract the information of interest automatically and offer it in a structured format that mining tools can process. However, the performance of an information extractor strongly depends on the search heuristic employed, since bad choices about how to learn a rule may easily result in a loss of effectiveness and/or efficiency. Improving the efficiency of search heuristics is of utmost importance in the field of Web Information Extraction, since typical datasets are very large. In this paper, we employ an information extractor based on a classical top-down algorithm that uses the so-called Information Gain heuristic introduced by Quinlan and Cameron-Jones. Unfortunately, Information Gain suffers from some well-known problems, so we analyse an intuitive alternative, Termini, which is clearly more efficient; we also analyse other proposals in the literature and conclude that none of them outperforms it.
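The Information Gain heuristic referred to above is Quinlan and Cameron-Jones's FOIL gain for top-down rule learning; a minimal sketch of the standard formula (our illustration, not the paper's implementation):

```python
import math

def foil_gain(p0, n0, p1, n1):
    """Quinlan & Cameron-Jones's FOIL information gain for specialising a
    rule: p*/n* are the positive/negative examples covered before (0) and
    after (1) a candidate condition is added to the rule."""
    info_before = -math.log2(p0 / (p0 + n0))   # bits needed before
    info_after = -math.log2(p1 / (p1 + n1))    # bits needed after
    return p1 * (info_before - info_after)
```

A candidate condition scores highly when it keeps many positive examples (large p1) while sharply raising the fraction of positives covered; the top-down learner evaluates this for every candidate at every step, which is why the heuristic's cost dominates on large datasets.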

Keywords: information extraction, search heuristics, semi-structured documents, web mining

Procedia PDF Downloads 307
592 A Literature Review Evaluating the Use of Online Problem-Based Learning and Case-Based Learning Within Dental Education

Authors: Thomas Turner

Abstract:

Due to the COVID-19 pandemic, alternative ways of delivering dental education were required, and as a result many institutions moved teaching online. The impact of this is poorly understood: are online problem-based learning (PBL) and case-based learning (CBL) effective, and are they suitable in the post-pandemic era? PBL and CBL are both types of interactive, group-based learning which are growing in popularity within many dental schools. PBL was first introduced in the 1960s and can be defined as learning which occurs through collaborative work to resolve a problem, whereas CBL encourages learning from clinical cases, encourages the application of knowledge, and helps prepare learners for clinical practice. The aim of this review is to evaluate the use of online PBL and CBL. A literature search was conducted using the CINAHL, Embase, PubMed, and Web of Science databases; literature was also identified from reference lists. Only studies from dental education were included. Seven suitable studies were identified. One of the studies found a high learner and facilitator satisfaction rate with online CBL. Interestingly, one study found learners preferred CBL over PBL within an online format. A study also found that, within the context of distance learning, learners preferred a hybrid curriculum including PBL over a traditional approach. A further study pointed to the limitations of PBL within an online format, such as reduced interaction, potentially hindering the development of communication skills, and the increased time and technology support required. An audience response system was also developed for use within CBL and had a high satisfaction rate. Interestingly, one study found that achievement of learning outcomes was correlated with the number of student and staff inputs within an online format, whereas another found that the quantity of learner interactions was important to group performance while the quantity of facilitator interactions was not.
This review identified generally favourable evidence for the benefits of online PBL and CBL. However, there is limited high-quality evidence evaluating these teaching methods within dental education, and there appears to be little evidence comparing online and face-to-face versions of these sessions. The importance of the quantity of learner interactions is evident; however, the importance of the quantity of facilitator interactions appears questionable. An element of this may come down to the quality of interactions rather than just their quantity. Limitations of online learning regarding technological issues and the time required for a session are also highlighted; however, as learners and facilitators become familiar with online formats, these may become less of an issue. It is also important that learners are encouraged to interact and communicate during these sessions to allow the development of communication skills. Interestingly, CBL appeared to be preferred to PBL in an online format. This may reflect the simpler nature of CBL; however, further research is required to explore this finding. Online CBL and PBL appear promising, but further research is required before online formats of these sessions are widely adopted in the post-pandemic era.

Keywords: case-based learning, online, problem-based learning, remote, virtual

Procedia PDF Downloads 46
591 C-eXpress: A Web-Based Analysis Platform for Comparative Functional Genomics and Proteomics in Human Cancer Cell Line, NCI-60 as an Example

Authors: Chi-Ching Lee, Po-Jung Huang, Kuo-Yang Huang, Petrus Tang

Abstract:

Background: Recent advances in high-throughput research technologies such as next-generation sequencing and multi-dimensional liquid chromatography make it possible to dissect the complete transcriptome and proteome in a single run for the first time. However, it is almost impossible for many laboratories to handle and analyse these big data without the support of a bioinformatics team. We aimed to provide a web-based analysis platform that lets users with only limited bio-computing knowledge study functional genomics and proteomics. Method: We use NCI-60 as an example dataset to demonstrate the power of the web-based analysis platform and data delivery system: C-eXpress takes a simple text file containing standard NCBI gene or protein IDs and expression levels (rpkm or fold) as its input and generates a distribution map of gene/protein expression levels in a heatmap diagram organized by color gradients. The diagram is hyperlinked to a dynamic HTML table that allows users to filter the datasets based on various gene features; a dynamic summary chart is generated automatically after each filtering step. Results: We implemented an integrated database that contains pre-defined annotations such as gene/protein properties (ID, name, length, MW, pI); pathways based on KEGG and the GO biological process; subcellular localization based on the GO cellular component; and functional classification based on the GO molecular function, kinase, peptidase, and transporter categories. Multiple ways of sorting columns and rows are also provided for comparative analysis and visualization of multiple samples.
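The simple input format described above can be parsed in a few lines; a sketch under the assumption of one whitespace-separated ID/value pair per line (the exact file format is not specified beyond the abstract):

```python
import io

def parse_expression_table(handle):
    """Parse C-eXpress-style input: one NCBI ID and one expression level
    (rpkm or fold) per line; '#' comment lines and blanks are skipped."""
    levels = {}
    for line in handle:
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        ident, value = line.split()[:2]
        levels[ident] = float(value)
    return levels

# A two-gene toy file; the IDs are illustrative NCBI RefSeq accessions.
sample = io.StringIO("# gene_id\trpkm\nNM_000546\t12.5\nNM_007294\t0.8\n")
levels = parse_expression_table(sample)
```

Keeping the input this flat is the design choice that lets wet-lab users export straight from a spreadsheet, with all annotation joined server-side from the pre-built database.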

Keywords: cancer, visualization, database, functional annotation

Procedia PDF Downloads 586
590 Podcasting: A Tool for an Enhanced Learning Experience of Introductory Courses to Science and Engineering Students

Authors: Yaser E. Greish, Emad F. Hindawy, Maryam S. Al Nehayan

Abstract:

Introductory courses such as General Chemistry I, General Physics I, and General Biology need special attention, as students taking these courses are usually in their first year of university. In addition to the language barrier most of them face, they encounter further difficulties if these elementary courses are taught in the traditional way; changing the routine method of teaching these courses is therefore warranted. In this regard, podcasting of chemistry lectures was used as an add-on to the traditional and non-traditional methods of teaching chemistry to science and non-science students. Podcasts here refer to video files distributed in a digital format through the Internet to personal computers or mobile devices. Podcasts can also be characterized by pedagogical strategy: three distinct teaching approaches are evident in the current literature, namely receptive viewing, problem-solving, and created video podcasts. While the digital format and delivery of video podcasts have stabilized over the past eight years, the types of podcasts vary considerably according to their purpose, degree of segmentation, pedagogical strategy, and academic focus. In this regard, the whole syllabus of the 'General Chemistry I' course was developed as podcasts and delivered to students throughout the semester. Students used the podcasted files extensively during their studies, especially as part of their preparation for exams. Student feedback strongly supported the idea of podcasting, reflecting its effect on the overall understanding of the subject and a consequent improvement in grades.

Keywords: podcasting, introductory course, interactivity, flipped classroom

Procedia PDF Downloads 240
589 Using Printouts as Social Media Evidence and Its Authentication in the Courtroom

Authors: Chih-Ping Chang

Abstract:

Unlike traditional physical evidence, social media evidence has its own characteristics: it is easily tampered with, it is recoverable, and it cannot be read without other devices (such as a computer). The original identity of a simple screenshot taken from a social network site must therefore be questioned. When police search and seize digital information, a common practice is to print out the digital data obtained and ask the parties present to sign the printout, without taking the original digital data back. Beyond the question of original identity, this way of obtaining evidence has two further consequences. First, it invites the allegation that the police tampered with or falsified evidence in order to frame the suspect. Second, hidden information is not easily discovered: the core evidence associated with a crime may not appear in the visible contents of files. Through discovery of the original file, hidden data related to it, such as the original producer, creation time, modification date, and even GPS location, can be revealed. Therefore, how to present this kind of evidence in the courtroom is arguably the most important task in ruling on social media evidence. This article first introduces forensic software such as EnCase, TCT, and FTK, and analyzes how it can prove the identity of digital data. The second part then returns to the courtroom and discusses the legal standard for authenticating social media evidence and the application of such forensic software there. In conclusion, the article offers a rethinking of what kind of authenticity this rule of evidence pursues: does the legal system automatically transcribe scientific knowledge, or does it seek to better render justice, not only on scientific fact but through multifaceted debate?
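One way forensic tools prove the identity of digital data is by hashing: matching cryptographic digests tie a printout's source back to the seized original. A minimal sketch of that idea (illustrative only, not EnCase's or FTK's actual procedure):

```python
import hashlib
import os
import tempfile

def evidence_digest(path, algorithm="sha256"):
    """Cryptographic digest of a seized file; a matching digest computed
    later ties a printout back to the original digital object."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo on a throwaway file standing in for seized data.
with tempfile.NamedTemporaryFile(delete=False) as fh:
    fh.write(b"chat log exported from the suspect's account")
    tmp_path = fh.name
digest = evidence_digest(tmp_path)
os.remove(tmp_path)
```

Because any single-bit change to the file changes the digest, a digest recorded at seizure time is strong evidence against later tampering claims, which is precisely the gap a signed printout alone leaves open.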

Keywords: federal rule of evidence, internet forensic, printouts as evidence, social media evidence, United States v. Vayner

Procedia PDF Downloads 267
588 Comparative Analysis of Canal Centering Ratio, Apical Transportation, and Remaining Dentin Thickness between Single File System Using Cone Beam Computed Tomography: An in vitro Study

Authors: Aditi Jain

Abstract:

Aim: To compare the canal transportation, centering ability, and remaining dentin thickness of the OneShape and WaveOne systems using CBCT. Objective: To identify the rotary system that best respects the original canal anatomy. Materials and Methods: Forty extracted human single-rooted premolars were used in the present study. Pre-instrumentation scans of all teeth were taken, canal curvatures were calculated, and the samples were randomly divided into two groups of twenty: Group 1, instrumented with the WaveOne system, and Group 2, with the ProTaper rotary system. Post-instrumentation scans were performed, and the two scans were compared to determine canal transportation, centering ability, and remaining dentin thickness at 1, 3, and 5 mm from the root apex. Results: Using Student's unpaired t-test, the results were as follows: for canal transportation, Group 1 showed a statistically significant difference at 3 mm and 6 mm and a non-significant difference at 9 mm, whereas Group 2 showed no statistically significant difference at 3 mm, 6 mm, or 9 mm. For centering ability and remaining dentin thickness, Group 1 showed a non-significant difference at 3 mm and 9 mm and a statistically significant difference at 6 mm. When remaining dentin thickness was compared between WaveOne and ProTaper at the three levels, there was no statistically significant difference between the two groups. Conclusion: The WaveOne single reciprocating file respects the original canal anatomy better than ProTaper and depicted the best centering ability.
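The abstract does not spell out its formulas, but canal transportation and the centering ratio are conventionally computed (following Gambill et al.) from the dentin thickness on opposite canal walls before and after instrumentation; a sketch under that assumption:

```python
def canal_transportation(a1, a2, b1, b2):
    """Transportation = (a1 - a2) - (b1 - b2), where a/b are the dentin
    thicknesses (mm) on opposite walls before (1) and after (2) shaping;
    0 means the canal was not transported toward either wall."""
    return (a1 - a2) - (b1 - b2)

def centering_ratio(a1, a2, b1, b2):
    """Ratio of the smaller to the larger dentin loss on the two walls;
    1.0 means the instrument stayed perfectly centred."""
    da, db = a1 - a2, b1 - b2
    lo, hi = sorted([da, db])
    return 1.0 if hi == 0 else lo / hi
```

Measuring these at each CBCT level (here 1, 3, and 5 mm from the apex) is what allows a per-level statistical comparison between the two file systems.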

Keywords: OneShape, WaveOne, transportation, centering ability, dentin thickness, CBCT (Cone Beam Computed Tomography)

Procedia PDF Downloads 172
587 A Patient Passport Application for Adults with Cystic Fibrosis

Authors: Tamara Vagg, Cathy Shortt, Claire Hickey, Joseph A. Eustace, Barry J. Plant, Sabin Tabirca

Abstract:

Introduction: Paper-based patient passports have been used advantageously for older patients, patients with diabetes, and patients with learning difficulties. However, these passports can suffer from data security issues, patients forgetting to bring the passport, patients being over-encumbered, and uncertainty about who is responsible for entering and managing the data. These issues could be resolved by transferring the paper-based system to a convenient platform such as a smartphone application (app). Background: Life expectancy for some Cystic Fibrosis (CF) patients is rising, and as such, new complications and procedures are predicted. Subsequently, there is a need for education and management interventions that can benefit CF adults. This research proposes a CF patient passport: a smartphone app that records basic medical information and gives CF adults access to it. Aim: To provide CF patients with their basic medical information via mobile multimedia so that they can receive care when travelling abroad or between CF centres. Moreover, by recording their basic medical information, CF patients may become more aware of their own condition and more active in their health care. Methods: The app was designed by a CF multidisciplinary team to be a lightweight reflection of a hospital patient file. The passport app was created using PhoneGap so that it can be deployed on both Android and iOS devices. Data entered into the app is encrypted and stored locally only. The app is password protected and includes the ability to set reminders and a graph to visualise weight and lung function over time. The app was introduced to seven participants as part of a stress test, in which participants were asked to test its performance and usability and report any issues identified.
Results: Feedback and suggestions received via this testing included the ability to reorder the list of clinical appointments by date, an open format for recording dates (in the event specifics are unknown), and a drop-down menu for data that is difficult to enter (such as the bugs found in mucus). The app was found to be usable and accessible and is now being prepared for a pilot study with adult CF patients. Conclusions: It is anticipated that such an app will be beneficial to adult CF patients when travelling abroad and between CF centres.

Keywords: Cystic Fibrosis, digital patient passport, mHealth, self management

Procedia PDF Downloads 219
586 Photoluminescence in Cerium Doped Fluorides Prepared by Slow Precipitation Method

Authors: Aarti Muley, S. J. Dhoblae

Abstract:

CaF₂ and BaF₂ doped with cerium were prepared by the slow precipitation method at different molar concentrations and cerium concentrations; both samples were also prepared by the direct method for comparison. The XRD pattern of BaF₂:Ce shows that it crystallizes in the BCC structure, with peaks matching JCPDS file no. 4-0452. The XRD pattern of CaF₂:Ce matches JCPDS file no. 75-0363 and likewise corresponds to the BCC phase. In CaF₂, double-humped photoluminescence spectra were observed at 320 nm and 340 nm when the sample was prepared by the direct precipitation method, with a peak ratio of unity. However, when the sample was prepared by the slow precipitation method, the double-humped emission spectrum of CaF₂:Ce was observed at 323 nm and 340 nm; the ratio between these peaks is 0.58, and the optimum concentration is obtained for 0.1 molar CaF₂ with a cerium concentration of 1.5%. When the cerium concentration is increased to 2%, the peak at 323 nm vanishes and the emission is observed at 342 nm with a shoulder at 360 nm; in this case, the intensity drops drastically. The excitation is observed at 305 nm with a small peak at 254 nm. One molar BaF₂ doped with 0.1% cerium synthesized by the direct precipitation method gives double-humped spectra at 308 nm and 320 nm; when prepared by the slow precipitation method with cerium concentrations of 0.05 m%, 0.1 m%, 0.15 m%, and 0.2 m%, a broad emission is observed around 325 nm with a shoulder at 350 nm. The excitation spectrum is narrow and observed at 290 nm. As the percentage of cerium is increased further, a shift is again observed: the emission appears at 360 nm with a small peak at 330 nm. This shifting of the emission spectra at low cerium concentrations can be related directly to particle size, as has also been reported for nanomaterials.

Keywords: calcium fluoride, barium fluoride, photoluminescence, slow precipitation method

Procedia PDF Downloads 82