Search results for: array list

1119 Analysis of Steles with Libyan Inscriptions of Grande Kabylia, Algeria

Authors: Samia Ait Ali Yahia

Abstract:

Several steles with Libyan inscriptions have been discovered in Grande Kabylia (Algeria), but very few researchers have taken an interest in them. Our work is to list, as far as possible, all of these steles in order to carry out a descriptive study of the corpus. The analysis of the steles will focus on the iconographic and epigraphic levels and on the different forms of the Libyan characters, in order to identify the alphabet used in Grande Kabylia.

Keywords: epigraphy, stele, Libyan inscription, Grande Kabylia

Procedia PDF Downloads 182
1118 Flexible and Color Tunable Inorganic Light Emitting Diode Array for High Resolution Optogenetic Devices

Authors: Keundong Lee, Dongha Yoo, Youngbin Tchoe, Gyu-Chul Yi

Abstract:

Light emitting diode (LED) arrays are an ideal optical stimulation tool for optogenetics, which controls the inhibition and excitation of specific neurons with light-sensitive ion channels or pumps. Although a fiber-optic cable with an external light source, either a laser or an LED mechanically connected to the end of the cable, has been widely used for illumination of neural tissue, a new approach using micro LEDs (µLEDs) has recently been demonstrated. The LEDs can be placed directly either on the cortical surface or within the deep brain using a penetrating depth probe. Accordingly, this method would not require a permanent opening in the skull if the LEDs were integrated with a miniature electrical power source and wireless communication. In addition, multiple color generation from a single µLED cell would make it possible to excite and/or inhibit neurons in localized regions. Here, we demonstrate flexible and color tunable µLEDs for optogenetic device applications. The flexible and color tunable LEDs were fabricated using multifaceted gallium nitride (GaN) nanorod arrays, with InxGa1−xN/GaN single quantum well (SQW) structures formed anisotropically on the nanorod tips and sidewalls. For various electroluminescence (EL) colors, current injection paths were controlled through a continuous p-GaN layer depending on the applied bias voltage. The current is injected through regions of different thickness and composition, thus changing the color of the light that the LED emits from red to blue. We believe that these flexible and color tunable µLEDs will enable us to control the activity of neurons by emitting various colors from a single µLED cell.

Keywords: light emitting diode, optogenetics, graphene, flexible optoelectronics

Procedia PDF Downloads 189
1117 Estimation of Seismic Ground Motion and Shaking Parameters Based on Microtremor Measurements at Palu City, Central Sulawesi Province, Indonesia

Authors: P. S. Thein, S. Pramumijoyo, K. S. Brotopuspito, J. Kiyono, W. Wilopo, A. Furukawa, A. Setianto

Abstract:

In this study, we estimated seismic ground motion and shaking parameters at Palu City based on microtremor measurements. Several earthquakes have struck along the Palu-Koro Fault in recent years; the magnitude Mw 6.3 event of January 23, 2005 (USGS epicenter) caused several casualties. We conducted a microtremor survey to estimate the strong ground motion distribution during the earthquake, and from this survey we produced maps of peak ground acceleration, peak ground velocity, seismic vulnerability index and ground shear strain for Palu City. We performed single-station microtremor observations at 151 sites in Palu City, and also conducted an 8-site microtremor array investigation to obtain a representative determination of the subsurface soil conditions. From the array observations, Palu City corresponds to relatively soft soil conditions with Vs ≤ 300 m/s; the predominant periods derived from horizontal-to-vertical spectral ratios (HVSRs) are in the range of 0.4 to 1.8 s and the corresponding frequencies are in the range of 0.7 to 3.3 Hz. Strong ground motions in the Palu area were predicted based on the empirical stochastic Green's function method. Peak ground acceleration and velocity exceed 400 gal and 30 kine in some areas, which is highly likely to cause severe damage to buildings. The microtremor survey results showed that hilly areas had a low seismic vulnerability index and ground shear strain, whereas the coastal alluvium was composed of material with a high seismic vulnerability index and ground shear strain.
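
As an illustration of the single-station processing described above, the sketch below computes an H/V spectral ratio and a predominant frequency from three-component records with NumPy; the windowing, smoothing and band choices are simplifying assumptions, not the authors' processing chain.

```python
import numpy as np

def hvsr(ns, ew, ud, fs, nfft=4096):
    """Illustrative single-station H/V spectral ratio from three-component
    microtremor records (ns, ew, ud are equal-length arrays, fs in Hz)."""
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    # Amplitude spectrum of each component (a simple tapered periodogram;
    # real processing would add smoothing and window rejection).
    spec = lambda x: np.abs(np.fft.rfft(x[:nfft] * np.hanning(nfft)))
    h = np.sqrt(spec(ns) * spec(ew))        # geometric mean of horizontals
    v = spec(ud)
    ratio = h / np.maximum(v, 1e-12)
    band = (freqs > 0.5) & (freqs < 10.0)   # typical microtremor band
    f0 = freqs[band][np.argmax(ratio[band])]
    return freqs, ratio, f0, 1.0 / f0       # predominant frequency and period

# Example with synthetic noise records sampled at 100 Hz
rng = np.random.default_rng(0)
ns, ew, ud = rng.standard_normal((3, 8192))
_, _, f0, t0 = hvsr(ns, ew, ud, fs=100.0)
print(f"predominant frequency ~ {f0:.2f} Hz, period ~ {t0:.2f} s")
```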

Keywords: Palu-Koro fault, microtremor, peak ground acceleration, peak ground velocity, seismic vulnerability index

Procedia PDF Downloads 386
1116 Analytical Solutions of Josephson Junctions Dynamics in a Resonant Cavity for Extended Dicke Model

Authors: S. I. Mukhin, S. Seidov, A. Mukherjee

Abstract:

The Dicke model is a key tool for the description of correlated states of quantum atomic systems excited by resonant photon absorption and subsequently emitting spontaneous coherent radiation in the superradiant state. The Dicke Hamiltonian (DH) is successfully used to describe the dynamics of a Josephson junction (JJ) array in a resonant cavity under an applied current. In this work, we have investigated a generalized model described by a DH with a frustrating interaction term, namely an infinitely coordinated interaction between all the spin-1/2 entities in the system. We consider an array of N superconducting islands, each divided into two sub-islands by a Josephson junction and operated in the charge qubit / Cooper pair box (CPB) regime; the array is placed inside a resonant cavity. One important aspect of the problem lies in the dynamical nature of the physical observables involved, such as the condensed electric field and the dipole moment, and it is important to understand how these quantities evolve in time in order to define the quantum phase of the system. The Dicke model without the frustrating term is solved to obtain the dynamical solutions of the physical observables in analytic form. Using Heisenberg's equations of motion for the operators and applying a newly developed rotating Holstein-Primakoff (HP) transformation to the DH, we arrive at four coupled nonlinear dynamical differential equations for the momentum and spin component operators. The system can be solved analytically using two time scales. The analytical solutions are expressed in terms of Jacobi elliptic functions for the metastable 'bound luminosity' dynamic state, with periodic coherent beating of the dipoles that connects the two doubly degenerate dipolar ordered phases discovered previously. We then extend the analysis to the DH with the frustrating interaction term. Inclusion of the frustrating term adds complexity to the system of differential equations, which becomes difficult to solve analytically; we therefore solve the semi-classical dynamic equations using perturbation theory for small values of the Josephson energy EJ. Because the Hamiltonian possesses parity symmetry, a phase transition can be found when this symmetry is broken. Introducing a spontaneous symmetry-breaking term into the DH, we derive solutions that show the occurrence of a finite condensate, indicating a quantum phase transition. Our results agree with existing results in the field.
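
The abstract gives no formulas; for orientation, the standard single-mode Dicke Hamiltonian is written out below, with a schematic infinite-range spin-spin term added to stand in for the frustrating interaction. The precise form of that term used by the authors is an assumption here.

```latex
% Standard single-mode Dicke Hamiltonian (\hbar = 1), with a schematic
% infinite-range "frustrating" spin-spin term of strength K added;
% the exact form used by the authors is not given in the abstract.
H \;=\; \omega\, a^{\dagger}a \;+\; \omega_{0}\, J_{z}
      \;+\; \frac{\lambda}{\sqrt{N}}\,\bigl(a + a^{\dagger}\bigr)\bigl(J_{+} + J_{-}\bigr)
      \;+\; \frac{K}{N}\, J_{x}^{2},
\qquad
J_{\alpha} \;=\; \tfrac{1}{2}\sum_{i=1}^{N}\sigma_{\alpha}^{(i)} .
```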

Keywords: Dicke Model, nonlinear dynamics, perturbation theory, superconductivity

Procedia PDF Downloads 103
1115 Transcriptomine: The Nuclear Receptor Signaling Transcriptome Database

Authors: Scott A. Ochsner, Christopher M. Watkins, Apollo McOwiti, David L. Steffen, Lauren B. Becnel, Neil J. McKenna

Abstract:

Understanding signaling by nuclear receptors (NRs) requires an appreciation of their cognate ligand- and tissue-specific transcriptomes. While target gene regulation data are abundant in this field, they reside in hundreds of discrete publications in formats refractory to routine query and analysis and, accordingly, their full value to the NR signaling community has not been realized. One of the mandates of the Nuclear Receptor Signaling Atlas (NURSA) is to facilitate access of the community to existing public datasets. Pursuant to this mandate, we are developing a freely accessible community web resource, Transcriptomine, to bring together the sum total of available expression array and RNA-Seq data points generated by the field in a single location. Transcriptomine currently contains over 25,000,000 gene fold-change datapoints from over 1200 contrasts relevant to over 100 NRs, ligands and coregulators in over 200 tissues and cell lines. Transcriptomine is designed to accommodate a spectrum of end users, ranging from the bench researcher to those with advanced bioinformatic training. Visualization tools allow users to build custom charts to compare and contrast patterns of gene regulation across different tissues and in response to different ligands. Our resource affords an entirely new paradigm for leveraging gene expression data in the NR signaling field, empowering users to query gene fold changes across diverse regulatory molecules, tissues and cell lines, target genes, biological functions and disease associations; such queries would otherwise be prohibitive in terms of time and effort. Transcriptomine will be regularly updated with gene lists from future genome-wide expression array and expression-sequencing datasets in the NR signaling field.
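
As a hedged illustration of the kind of query the resource supports, the sketch below filters fold-change datapoints by receptor, tissue and significance with pandas; the column names and example rows are invented and do not reflect the actual Transcriptomine schema or API.

```python
import pandas as pd

# Hypothetical schema for fold-change datapoints; the real Transcriptomine
# fields and API are not specified in the abstract.
datapoints = pd.DataFrame([
    {"receptor": "ERalpha", "ligand": "estradiol", "tissue": "MCF-7",
     "gene": "GREB1", "log2_fc": 3.1, "p_value": 1e-5},
    {"receptor": "ERalpha", "ligand": "estradiol", "tissue": "MCF-7",
     "gene": "PGR", "log2_fc": 2.4, "p_value": 3e-4},
    {"receptor": "PPARgamma", "ligand": "rosiglitazone", "tissue": "adipocyte",
     "gene": "FABP4", "log2_fc": 4.0, "p_value": 2e-6},
])

def query(df, receptor=None, tissue=None, min_abs_fc=1.0, alpha=0.05):
    """Filter fold-change datapoints the way a Transcriptomine-style query
    might: by regulatory molecule, tissue and statistical significance."""
    mask = (df["log2_fc"].abs() >= min_abs_fc) & (df["p_value"] <= alpha)
    if receptor:
        mask &= df["receptor"] == receptor
    if tissue:
        mask &= df["tissue"] == tissue
    return df[mask].sort_values("log2_fc", ascending=False)

print(query(datapoints, receptor="ERalpha", tissue="MCF-7"))
```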

Keywords: target gene database, informatics, gene expression, transcriptomics

Procedia PDF Downloads 249
1114 Large-Scale Screening for Membrane Protein Interactions Involved in Platelet-Monocyte Interactions

Authors: Yi Sun, George Ed Rainger, Steve P. Watson

Abstract:

Background: Beyond their classical roles in haemostasis and thrombosis, platelets are important in the initiation and development of various thrombo-inflammatory diseases. In atherosclerosis and deep vein thrombosis, for example, platelets bridge monocytes with the endothelium and form heterotypic aggregates with monocytes in the circulation. This can alter monocyte phenotype by inducing their activation and stimulating adhesion and migration. These interactions involve cell surface receptor-ligand pairs on both cells. The current list is likely incomplete, as new interactions of importance to platelet biology continue to be discovered, as illustrated by our discovery of PEAR-1 binding to FcεR1α. Results: We have developed a highly sensitive avidity-based assay to identify novel extracellular interactions among 126 recombinantly expressed platelet cell surface and secreted proteins involved in platelet aggregation. In this study, we will use this method to identify novel platelet-monocyte interactions. We aim to identify ligands for orphan receptors and novel partners of well-known proteins. Identified interactions will be studied in preliminary functional assays to demonstrate their relevance to the inflammatory processes supporting atherogenesis. Conclusions: Platelet-monocyte interactions are essential for the development of thrombo-inflammatory disease. Until relatively recently, available technologies limited such studies to individual protein interactions examined one at a time. The present work proposes, for the first time, to study cell surface platelet-monocyte interactions in a systematic, large-scale approach using a reliable screening method we have developed. If successful, this is likely to identify previously unknown ligands for important receptors, which will be investigated in detail, and to provide a list of novel interactions for the field. This should stimulate studies on developing alternative therapeutic strategies to treat vascular inflammatory disorders such as atherosclerosis, DVT and sepsis, and other clinically important inflammatory conditions.

Keywords: membrane proteins, large-scale screening, platelets, recombinant expression

Procedia PDF Downloads 117
1113 The Effectiveness of a School-Based Addiction Prevention Program: Pilot Evaluation of Rajasthan Addiction Prevention Project

Authors: Sadhana Sharma, Neha Sharma, Hardik Khandelwal, Arti Sharma

Abstract:

Background: It is widely acknowledged globally that parents must advocate for their children's drug and substance abuse prevention. However, many parents find it difficult to advocate due to systemic and logistical barriers. Alternatively, advocacy, awareness, and support for the prevention of drug and substance abuse could be introduced to children in schools. However, little research has been conducted on the development of substance abuse advocates in school settings. Objective: To evaluate the effectiveness of a school-based addiction prevention and control programme created as part of the Rajasthan Addiction Prevention Project (RAPP), a state-community partnership initiative. Methods: We conducted an evaluation to determine the impact of the RAPP on a primary outcome (substance abuse knowledge) and other outcomes (family–school partnership, empowerment, and support). Between September and December 2022, two schools participated in the intervention group (advocacy training) and two schools participated in the control group (waiting list). The RAPP designed specialised 2-hour training sessions to equip teachers and parents with the knowledge and skills necessary to advocate for their own children and those of other families. All participants were required to complete a pre- and post-survey. Results: The intervention group established school advocates in schools where trained parents volunteered to lead support groups for high-risk children. Compared to participants in the wait-list control group, those in the intervention group demonstrated greater education knowledge (P = 0.002) and self-mastery (P = 0.04), and decreased family–school partnership quality (P = 0.002). Conclusions: The experimental evaluation of the school-based advocacy programme revealed positive effects on substance abuse outcomes that persist over time. The approach was deemed feasible and acceptable by both parents and the school.

Keywords: prevention, school based, addiction, advocacy

Procedia PDF Downloads 62
1112 Information Technology: Assessing Indian Realities Vis-à-Vis World Trade Organisation Disciplines

Authors: Saloni Khanderia

Abstract:

The World Trade Organisation's (WTO) Information Technology Agreement (ITA) was concluded at the Singapore Ministerial Conference in 1996. The ITA is considered to be one of the biggest tariff-cutting deals because it eliminates all customs-related duties on the exportation of specific categories of information technology products to the territory of any other signatory to the Agreement. Over time, innovations in the information and communication technology (ICT) sector prompted consideration of expanding the list of products covered by the ITA, which took place in the form of the ITA-II negotiations during the WTO's Nairobi Ministerial Conference. India, which was an original Member of ITA-I, nevertheless decided to opt out of the negotiations to expand the list of products covered by the agreement. Instead, it preferred to give priority to its national policy initiative, the 'Make in India' programme [the MiI programme], which seeks to foster domestic production in, inter alia, the ICT sector. India justified its abstention from the ITA-II negotiations by stating that the zero-tariff regime created by ITA-I debilitated its electronics-manufacturing sector and instead resulted in an over-reliance on imported electronic inputs. The author undertakes doctrinal research to examine India's decision to opt out of the ITA-II negotiations against the backdrop of the MiI programme, which endeavours to improve productivity across the board. This paper accordingly scrutinises India's tariff-cutting strategies to weigh the better alternative for the country. In particular, it examines whether initiatives like the MiI programme could plausibly resuscitate the ailing domestic electronics-manufacturing sector. The author opines that the country's decision to opt out of the ITA-II negotiations should be perceived as a welcome step. Market-oriented reforms such as the MiI programme, which focuses on indigenous innovation to improve domestic manufacturing in the ICT sector, should instead be given priority in the present circumstances. Consequently, the MiI programme would help mould the country's current tariff policy in a manner that concurrently assists the promotion and sustenance of domestic manufacturing in the IT sector.

Keywords: electronics-manufacturing sector, information technology agreement, make in india programme, world trade organisation

Procedia PDF Downloads 208
1111 Youth Health Promotion Project for Indigenous People in Canada: Together against Bullying and Cyber-Dependence

Authors: Mohamed El Fares Djellatou, Fracoise Filion

Abstract:

The Ashukin program, whose name means 'bridge' in the Naskapi and Atikamekw languages, has been designed to offer a partnership between nursing students and an indigenous community, in which the students design a health promotion project tailored to the needs of the community. Intimidation in primary school and cyber-dependence in high school were among the concerns raised in a rural Atikamekw community. The goal of the project was to have a conversation with indigenous youths, aged 10-16 years old, on the challenges presented by intimidation and cyber-dependence, as well as promoting healthy relationships online and within the community. Methods: Multiple progressive inquiry questions (PIQs) were used to assess the feasibility and importance of this project for the Atikamekw nation and to determine a plan to follow. The theoretical foundations guiding the conception of the project were the Population Health Promotion Model (PHPM), the First Nations Holistic Lifelong Learning Model, and the Medicine Wheel. A broad array of social determinants of health were addressed, including healthy childhood development, personal health practices and coping skills, and education. The youths were encouraged to participate in interactive educational sessions, using PowerPoint presentations and pamphlets as the main strategies. Additional tools such as cultural artworks and physical activities were introduced to strengthen inter-relational and team spirit within the Indigenous population. A quality assurance tool (QAT) was developed specifically to determine the appropriateness of these health promotion tools, and improvements were guided by the feedback provided by the indigenous schools' teachers and social workers who filled in the QATs. After the educational sessions, quantitative results showed that 93.48% of primary school students were able to identify the different types of intimidation, 72.65% recognized more than two strategies, and 52.1% were able to list at least four resources to defuse intimidation. On the other hand, around 75% of the adolescents were able to name at least three negative effects of cyber-dependence, and 50% listed three strategies to reduce it. This project was meant to create a bridge with the First Nation through health promotion, a population known to be disadvantaged due to systemic health inequity and disparities. Culturally safe care was proposed to deal with the two identified priority issues, and an educational toolkit was given to both schools to ensure the sustainability of the project. The project was self-financed through fundraising activities, and it yielded better results than expected.

Keywords: indigenous, first nation, bullying, cyber-dependence, internet addiction, intimidation, youth, adolescents, school, community nursing, health promotion

Procedia PDF Downloads 77
1110 Hardware Implementation on Field Programmable Gate Array of Two-Stage Algorithm for Rough Set Reduct Generation

Authors: Tomasz Grzes, Maciej Kopczynski, Jaroslaw Stepaniuk

Abstract:

The rough sets theory developed by Prof. Z. Pawlak is one of the tools that can be used in intelligent systems for data analysis and processing. Banking, medicine, image recognition and security are among the possible fields of application. In all these fields the amount of collected data is increasing quickly, and with this increase the computation speed becomes the critical factor. Data reduction is one of the solutions to this problem, and in rough sets the redundancy can be removed by computing a reduct. Many algorithms for generating reducts have been developed, but most of them are software implementations only and therefore have many limitations: a microprocessor uses a fixed word length and consumes a lot of time both for fetching and for processing instructions and data, so software-based implementations are relatively slow. Hardware systems do not have these limitations and can process the data faster than software. A reduct is a subset of the condition attributes that preserves the discernibility of the objects; for a given decision table there can be more than one reduct. The core is the set of all indispensable condition attributes: none of its elements can be removed without affecting the classification power of the full set of condition attributes, and every reduct contains all the attributes from the core. In this paper, a hardware implementation of a two-stage greedy algorithm for finding one reduct is presented. The decision table is used as the input, and the output of the algorithm is a superreduct, i.e., a reduct with some additional, removable attributes. The first stage of the algorithm calculates the core using the discernibility matrix. The second stage generates the superreduct by enriching the core with the most common attributes, i.e., the attributes that are most frequent in the decision table. The algorithm described above has two disadvantages: (i) it generates a superreduct instead of a reduct, and (ii) the additional first stage may be unnecessary if the core is empty. For systems focused on fast computation of the reduct, however, the first disadvantage is not the key problem. The core calculation can be achieved with a combinational logic block and thus adds relatively little time to the whole process. The algorithm presented in this paper was implemented in a Field Programmable Gate Array (FPGA) as a digital device consisting of blocks that process the data in a single step. Calculating the core is done by comparators connected to a block called a 'singleton detector', which detects whether the input word contains only a single 'one'. Calculating the number of occurrences of an attribute is performed in a combinational block made up of a cascade of adders. The superreduct generation process is iterative and thus needs a sequential circuit to control the calculations. For research purposes, the algorithm was also implemented in the C language and run on a PC, and the execution times of the reduct calculation in hardware and in software were compared. The results show an increase in the speed of data processing.
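
A minimal Python sketch of the two-stage procedure described above is given below: the core is read off the singleton entries of the discernibility matrix, and the superreduct is then built greedily from the most frequent attributes (counted here over the matrix entries). It mirrors the software reference version, not the FPGA blocks.

```python
from collections import Counter

# Decision table: each row is (condition_attribute_values, decision).
table = [
    ((1, 0, 1), 0),
    ((1, 1, 0), 1),
    ((0, 1, 1), 1),
    ((0, 0, 1), 0),
]
n_attrs = len(table[0][0])

def discerning(i, j, attrs):
    """Attributes in `attrs` that discern objects i and j (different decisions)."""
    (xi, di), (xj, dj) = table[i], table[j]
    if di == dj:
        return set()
    return {a for a in attrs if xi[a] != xj[a]}

pairs = [(i, j) for i in range(len(table)) for j in range(i + 1, len(table))]
matrix = {p: discerning(*p, range(n_attrs)) for p in pairs}

# Stage 1: core = attributes appearing as singleton matrix entries
# (the role of the hardware 'singleton detector').
core = {next(iter(e)) for e in matrix.values() if len(e) == 1}

# Stage 2: greedily enrich the core with the most common attributes until every
# non-empty matrix entry is covered -> a superreduct (may contain extras).
freq = Counter(a for e in matrix.values() for a in e)
candidate = set(core)
while any(e and not (e & candidate) for e in matrix.values()):
    a = max((a for a in range(n_attrs) if a not in candidate), key=lambda a: freq[a])
    candidate.add(a)

print("core:", core, "superreduct:", candidate)
```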

Keywords: data reduction, digital systems design, field programmable gate array (FPGA), reduct, rough set

Procedia PDF Downloads 189
1109 Biofungicides in Nursery Production

Authors: Miroslava Markovic, Snezana Rajkovic, Ljubinko Rakonjac, Aleksandar Lucic

Abstract:

Oak powdery mildew is a serious problem on seedlings in nurseries as well as on naturally and artificially introduced progeny. The experiments were set up on oak seedlings in two nurseries located in Central Serbia, where control of the oak powdery mildew Microsphaera alphitoides Griff. et Maubl. was conducted through alternative protection measures using various dosages of the AQ-10 biofungicide, with and without an added polymer (which had so far never been used in this country for control of oak powdery mildew). Simultaneous testing was conducted on the efficiency of a chemical sulphur-based preparation (used in this area for many years to suppress powdery mildews, without the possibility of the pathogen developing resistance to the active ingredient). To date, the Republic of Serbia has registered no fungicides for the suppression of pathogens in forest ecosystems. In order to introduce the proper use of new disease-fighting agents into a country, certain relevant principles, requirements and criteria prescribed by the Forest Stewardship Council (FSC) must be observed, primarily with respect to measures for the assessment and mitigation of risks and the list of dangerous and highly dangerous pesticides with possible alternative protection. One of the main goals of the research was the adjustment of the protective measures to the FSC policy through the selection of eco-toxicologically favourable fungicides, given that only preparations named on the list of permitted active ingredients are approved for use in certified forests. The results of the research have demonstrated that the AQ-10 biofungicide can be used as part of an integrated disease management programme, as an alternative, through the application of several treatments during the growing season and combination with other active ingredients registered for these purposes, so as to curtail the use of standard fungicides for the control of powdery mildews on oak seedlings in nurseries. The best results in the suppression of oak powdery mildew were attained with the AQ-10 biofungicide (dose 50 or 70 g/ha) with the added polymer Nu Film-17 (dose 1.0 or 1.5 l/ha). If the treatment is applied at the appropriate time, even a smaller number of treatments and lower doses will be just as effective.

Keywords: oak powdery mildew, biofungicides, polymers, Microsphaera alphitoides

Procedia PDF Downloads 353
1108 Apoptosis Pathway Targeted by Thymoquinone in MCF7 Breast Cancer Cell Line

Authors: M. Marjaneh, M. Y. Narazah, H. Shahrul

Abstract:

Array-based gene expression analysis is a powerful tool to profile the expression of genes and to generate information on the therapeutic effects of new anti-cancer compounds. The apoptotic effect of thymoquinone was studied in the MCF7 breast cancer cell line using gene expression profiling with a cDNA microarray. The purity and yield of the RNA samples were determined using the RNeasy Plus Mini kit, and the Agilent RNA 6000 Nano LabChip kit was used to evaluate the quantity of the RNA samples. An AffinityScript RT oligo-dT promoter primer was used to generate cDNA strands, and T7 RNA polymerase was used to convert cDNA to cRNA. The cRNA samples and human universal reference RNA were labelled with Cy-3-CTP and Cy-5-CTP, respectively. Feature Extraction and GeneSpring software were used to analyse the data. The single-experiment analysis revealed the involvement of 64 pathways with up-regulated genes and 78 pathways with down-regulated genes. The MAPK and p38-MAPK pathways were inhibited due to the up-regulation of the PTPRR gene, and the inhibition of p38-MAPK suggested up-regulation of the TGF-ß pathway. Inhibition of p38-MAPK caused up-regulation of TP53 and down-regulation of Bcl2, indicating involvement of the intrinsic apoptotic pathway. Down-regulation of the CARD16 gene, an adaptor molecule regulating CASP1, suggested necrosis-like programmed cell death and the involvement of caspases in apoptosis. Furthermore, down-regulation of the GPCR and EGF-EGFR signalling pathways suggested a reduction of ER, and the involvement of the AhR pathway, which controls the cytochrome P450 and glucuronidation pathways, indicated metabolism of thymoquinone. The findings showed differential expression of several genes in apoptosis pathways with thymoquinone treatment in estrogen receptor-positive breast cancer cells.

Keywords: cDNA microarray, thymoquinone, CARD16, PTPRR, CASP10

Procedia PDF Downloads 322
1107 Merging of Results in Distributed Information Retrieval Systems

Authors: Larbi Guezouli, Imane Azzouz

Abstract:

This work is located in the domain of distributed information retrieval (DIR). A simplified view of DIR involves a multi-search over a set of collections, which requires the system to analyze the results found in these collections and to merge them before sending them to the user as a single list. Our work is to find a fusion method based on the relevance score of each result received from the collections and on the relevance of the local search engine of each collection.
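
As a hedged sketch of such a fusion step, the snippet below normalizes each local score and weights it by a collection-relevance score before producing a single ranked list; the paper's actual fusion formula is not given in the abstract.

```python
# Illustrative merging of ranked lists from several collections: each local
# document score is weighted by a collection-relevance score (the paper's
# exact fusion formula is not stated in the abstract).
results_per_collection = {
    "collection_A": [("docA1", 0.92), ("docA2", 0.75)],
    "collection_B": [("docB1", 0.88), ("docB2", 0.60)],
}
collection_relevance = {"collection_A": 0.9, "collection_B": 0.6}

def merge(results, coll_weight):
    merged = []
    for coll, docs in results.items():
        top = max(s for _, s in docs) or 1.0
        for doc_id, score in docs:
            # Normalize by the collection's top score, then weight by the
            # relevance of the collection's local search engine.
            merged.append((doc_id, (score / top) * coll_weight[coll]))
    return sorted(merged, key=lambda t: t[1], reverse=True)

for doc_id, score in merge(results_per_collection, collection_relevance):
    print(f"{doc_id}\t{score:.3f}")
```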

Keywords: information retrieval, distributed IR systems, merging results, datamining

Procedia PDF Downloads 305
1106 Characterization of Probability Distributions through Conditional Expectation of Pair of Generalized Order Statistics

Authors: Zubdahe Noor, Haseeb Athar

Abstract:

In this article, a relation for conditional expectation is first developed and then used to characterize a general class of distributions F(x) = 1 − e^(−a h(x)) through the conditional expectation of a difference of a pair of generalized order statistics. Some results are then reduced to particular cases. In the end, a table of distributions that are compatible with the given general class is presented.
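
For concreteness, a few standard members of the general class obtained from familiar choices of h(x) are listed below; these are routine examples, not the table presented in the paper.

```latex
F(x) \;=\; 1 - e^{-a\,h(x)}, \qquad a > 0,
\quad\text{e.g.}\quad
\begin{array}{lll}
h(x) = x,             & x > 0            & \text{(exponential)}\\[2pt]
h(x) = x^{p},         & x > 0,\ p > 0    & \text{(Weibull)}\\[2pt]
h(x) = \ln(x/\theta), & x > \theta > 0   & \text{(Pareto)}
\end{array}
```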

Keywords: generalized order statistics, order statistics, record values, conditional expectation, characterization

Procedia PDF Downloads 436
1105 Development of a Computer Based, Nutrition and Fitness Programme and Its Effect on Nutritional Status and Fitness of Obese Adults

Authors: Richa Soni, Vibha Bhatnagar, N. K. Jain

Abstract:

This study was conducted to develop a computer-mediated programme for weight management and physical fitness and to examine its efficacy in reducing weight and improving physical fitness in obese adults. A user-friendly, computer-based programme was developed to provide a simple, quick and easy method of assessing energy balance at the individual level. The programme had four main sections, viz. Personal Profile, Know About Your Weight, Fitness, and Food Exchange List. The computer programme provided facilities for creating an individual profile, tracking meals and physical activities, suggesting nutritional and exercise requirements, planning calorie-specific menus, keeping food diaries and revising the diet and exercise plans if needed. The programme also provided information on obesity, underweight and physical fitness, and an exhaustive food exchange list was included to assist users in making the right food choices. The developed programme was evaluated by a panel of 15 experts comprising endocrinologists, nutritionists and diet counselors. The suggestions given by the experts were noted down, and the entire programme was modified in light of them and re-evaluated by the same panel of experts. For assessing the impact of the programme, 22 obese subjects were selected purposively and randomly assigned to an intervention group (n=12) and a no-information control group (n=10). The intervention group was asked to follow the programme strictly for one month. A significant reduction in the intake of energy, fat and carbohydrates was observed, while the intake of fruits and green leafy vegetables increased. The programme was also found to be effective in reducing body weight, body fat percent and body fat mass, whereas total body water and physical fitness scores improved significantly. No significant alteration was observed in any parameter in the control group.
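
The energy-balance assessment mentioned above can be illustrated with a short sketch; the Mifflin-St Jeor resting energy equation and the activity factor used below are assumptions for illustration, since the abstract does not state which equations the programme uses.

```python
def daily_energy_balance(sex, weight_kg, height_cm, age_yr,
                         activity_factor, intake_kcal):
    """Rough energy-balance sketch. The programme's own equations are not
    given in the abstract; Mifflin-St Jeor is used here as an assumption."""
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age_yr + (5 if sex == "M" else -161)
    tdee = bmr * activity_factor          # total daily energy expenditure
    return intake_kcal - tdee             # >0 surplus, <0 deficit

# Example: sedentary (factor ~1.2) obese adult consuming 2400 kcal/day
balance = daily_energy_balance("F", weight_kg=92, height_cm=162,
                               age_yr=40, activity_factor=1.2,
                               intake_kcal=2400)
print(f"estimated daily energy balance: {balance:+.0f} kcal")
```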

Keywords: body composition, body weight, computer programme, physical fitness

Procedia PDF Downloads 260
1104 Sensor and Sensor System Design, Selection and Data Fusion Using Non-Deterministic Multi-Attribute Tradespace Exploration

Authors: Matthew Yeager, Christopher Willy, John Bischoff

Abstract:

The conceptualization and design phases of a system lifecycle consume a significant amount of the lifecycle budget in the form of direct tasking and capital, as well as the implicit costs associated with unforeseeable design errors that are only realized during downstream phases. Ad hoc or iterative approaches to generating system requirements oftentimes fail to consider the full array of feasible systems or product designs for a variety of reasons, including, but not limited to: initial conceptualization that oftentimes incorporates a priori or legacy features; the inability to capture, communicate and accommodate stakeholder preferences; inadequate technical designs and/or feasibility studies; and locally-, but not globally-, optimized subsystems and components. These design pitfalls can beget unanticipated developmental or system alterations with added costs, risks and support activities, heightening the risk of suboptimal system performance, premature obsolescence or forgone development. Supported by rapid advances in learning algorithms and hardware technology, sensors and sensor systems have become commonplace in both commercial and industrial products. The evolving array of hardware components (i.e., sensors, CPUs, modular/auxiliary access, etc.) as well as recognition, data fusion and communication protocols have all become increasingly complex and critical for design engineers during both conceptualization and implementation. This work seeks to develop and utilize a non-deterministic approach for sensor system design within the multi-attribute tradespace exploration (MATE) paradigm, a technique that incorporates decision theory into model-based techniques in order to explore complex design environments and discover better system designs. Developed to address the inherent design constraints in complex aerospace systems, MATE techniques enable project engineers to examine all viable system designs, assess attribute utility and system performance, and better align with stakeholder requirements. Whereas previous work has been focused on aerospace systems and conducted in a deterministic fashion, this study addresses a wider array of system design elements by incorporating both traditional tradespace elements (e.g., hardware components) as well as popular multi-sensor data fusion models and techniques. Furthermore, adding statistical performance features to this model-based MATE approach will enable non-deterministic techniques for various commercial systems that range in application, complexity and system behavior, demonstrating significant utility within the realm of formal systems decision-making.
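
A minimal, hedged illustration of the tradespace idea follows: candidate sensor-system designs are enumerated and scored with a weighted multi-attribute utility. The design variables, weights and utility curves are invented for illustration, and the uncertainty modelling that makes the approach non-deterministic is omitted.

```python
from itertools import product

# Hypothetical design variables and attribute models; real MATE studies use
# stakeholder-elicited utilities and uncertainty propagation (omitted here).
sensors = [("camera", 0.7, 2.0), ("lidar", 0.9, 5.0)]       # (name, quality, cost)
cpus    = [("low_power", 0.5, 1.0), ("gpu", 0.9, 4.0)]
fusion  = [("kalman", 0.6, 0.5), ("deep_fusion", 0.85, 2.5)]
weights = {"performance": 0.6, "affordability": 0.4}

def utility(design):
    (s, sq, sc), (c, cq, cc), (f, fq, fc) = design
    performance = (sq + cq + fq) / 3.0            # normalized to [0, 1]
    cost = sc + cc + fc
    affordability = max(0.0, 1.0 - cost / 12.0)   # crude cost-to-utility map
    mau = (weights["performance"] * performance
           + weights["affordability"] * affordability)
    return mau, cost, (s, c, f)

tradespace = sorted((utility(d) for d in product(sensors, cpus, fusion)), reverse=True)
for mau, cost, names in tradespace:
    print(f"{names}: utility={mau:.2f}, cost={cost:.1f}")
```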

Keywords: multi-attribute tradespace exploration, data fusion, sensors, systems engineering, system design

Procedia PDF Downloads 155
1103 The Cave Paintings of Libyc Inscriptions of Tifra, Kabylia, Algeria

Authors: Samia Ait Ali Yahia

Abstract:

The Tifra site is one of 54 sites with rock paintings discovered in Kabylia (Algeria). It consists of two shelters: Ifran I and Ifran II. From an aesthetic point of view, these two shelters appear poor: they show a human silhouette, a hand, enigmatic designs and, above all, Libyc inscriptions. The paint used is natural red ochre. Today, these paintings are threatened by tourist traffic at the sites and by the degradation that results from it. It is therefore vital to list and analyze these paintings before they disappear. The analysis will focus on the epigraphic and iconographic levels and on their meanings.

Keywords: cave painting, Libyc inscription, conservation, valorization

Procedia PDF Downloads 111
1102 Thermal Hydraulic Analysis of Sub-Channels of Pressurized Water Reactors with Hexagonal Array: A Numerical Approach

Authors: Md. Asif Ullah, M. A. R. Sarkar

Abstract:

This paper illustrates 2-D and 3-D simulations of the sub-channels of a Pressurized Water Reactor (PWR) having a hexagonal array of fuel rods. At steady state, the temperature of the outer surface of the fuel rod cladding is kept at about 1200°C, and the temperature of this isothermal surface is taken as a boundary condition for the simulation. Water at a temperature of 290°C enters as coolant into the primary circuit, which is pressurized up to 157 bar, and turbulent flow of the pressurized water is used for heat removal. In the 2-D model, temperature, velocity, pressure and Nusselt number distributions are simulated in a vertical sectional plane through the sub-channels of a hexagonal fuel rod assembly. Temperature, Nusselt number and the Y-component of the convective heat flux along a line in this plane near the end of the fuel rods are plotted for different Reynolds numbers, and a comparison between the X-component and Y-component of the convective heat flux in this vertical plane is analyzed. A hexagonal fuel rod assembly has three types of sub-channels according to geometrical shape, whose boundary conditions also differ. In the 3-D model, temperature, velocity, pressure, Nusselt number and total heat flux magnitude distributions for all three sub-channels are studied for a suitable Reynolds number. A horizontal sectional plane is taken from each of the three sub-channels to study the temperature, velocity, pressure, Nusselt number and convective heat flux distributions in it. Greater values of temperature, Nusselt number and the Y-component of convective heat flux are found for greater Reynolds numbers. The X-component of the convective heat flux is found to be non-zero near the bottom of the fuel rod and zero near its end, indicating that near the outlet the convective heat transfer occurs entirely along the direction of flow. As the length-to-radius ratio of the sub-channels is very high, the simulations are done for a short length of the sub-channels for ease of visualization. For the simulations, the Turbulent Flow (k-ε) module and the Heat Transfer in Fluids (ht) module of COMSOL MULTIPHYSICS 5.0 are used.
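
As a rough cross-check on simulated Nusselt numbers of this kind, a standard single-phase correlation such as Dittus-Boelter can be evaluated for sub-channel-like conditions; the property values and dimensions below are illustrative assumptions, not taken from the paper's COMSOL model.

```python
# Back-of-the-envelope turbulent forced-convection estimate for a sub-channel
# using the Dittus-Boelter correlation Nu = 0.023 Re^0.8 Pr^0.4 (heating).
# Property values are approximate for water near 300 C and 15.5 MPa; the
# hydraulic diameter and velocity are assumed, not taken from the paper.
rho, mu, k, cp = 730.0, 9.0e-5, 0.56, 5400.0   # kg/m3, Pa.s, W/(m K), J/(kg K)
d_h = 0.012          # hydraulic diameter of the sub-channel, m (assumed)
velocity = 5.0       # coolant velocity, m/s (assumed)

re = rho * velocity * d_h / mu          # Reynolds number
pr = cp * mu / k                        # Prandtl number
nu = 0.023 * re**0.8 * pr**0.4          # Nusselt number
h = nu * k / d_h                        # heat transfer coefficient, W/(m2 K)

print(f"Re = {re:.3e}, Pr = {pr:.2f}, Nu = {nu:.0f}, h = {h:.0f} W/(m2 K)")
```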

Keywords: sub-channels, Reynolds number, Nusselt number, convective heat transfer

Procedia PDF Downloads 343
1101 Liquefaction Potential Assessment Using Screw Driving Testing and Microtremor Data: A Case Study in the Philippines

Authors: Arturo Daag

Abstract:

The Philippine Institute of Volcanology and Seismology (PHIVOLCS) is enhancing its liquefaction hazard map towards a detailed probabilistic approach using screw driving testing (SDS) and geophysical data. The target sites for liquefaction assessment are public schools in Metro Manila. Since the target sites are in a highly urbanized setting, the objective of the project is to conduct non-destructive geotechnical studies using SDS combined with geophysical methods such as refraction microtremor arrays (ReMi), 3-component microtremor horizontal-to-vertical spectral ratio (HVSR) measurements, and ground penetrating radar (GPR). Initial tests were conducted in areas of the Province of Pampanga impacted by liquefaction during the Mw 6.1 Central Luzon earthquake of April 22, 2019. Numerous liquefaction events were documented in areas underlain by Quaternary alluvium and mostly covered by recent lahar deposits. The SDS-estimated values showed a good correlation with actual SPT values obtained from available borehole data, confirming that SDS can be an alternative tool for liquefaction assessment that is more efficient in terms of cost and time than SPT and CPT. Drilling boreholes can also be difficult in highly urbanized areas, so non-destructive geophysical methods were used to extend or extrapolate the SPT borehole data. The 3-component microtremor measurements yield a 1-D shear-wave velocity model of the upper 30 meters of the profile (Vs30). For the ReMi surveys, a 12-geophone array with 6- to 8-meter spacing was used. The microtremor data were evaluated through the factor of safety, which is the quotient of the cyclic resistance ratio (CRR) and the cyclic stress ratio (CSR). Complementary GPR was used to infer subsurface structures and groundwater conditions.
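
The factor of safety mentioned above is FS = CRR/CSR; as a hedged illustration, the sketch below evaluates the simplified Seed-Idriss cyclic stress ratio for one soil layer and compares it with an assumed CRR. The project's SDS-based CRR estimation is not reproduced here, and all input values are assumptions.

```python
# Simplified liquefaction factor-of-safety sketch for one soil layer:
# CSR from the Seed-Idriss simplified procedure, CRR assumed (the project's
# SDS-based CRR estimation is not reproduced here).
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    rd = 1.0 - 0.00765 * depth_m if depth_m <= 9.15 else 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

a_max_g = 0.4        # peak ground acceleration as a fraction of g (assumed)
sigma_v = 95.0       # total vertical stress at 5 m depth, kPa (assumed)
sigma_v_eff = 60.0   # effective vertical stress, kPa (assumed)
crr = 0.22           # cyclic resistance ratio from in-situ testing (assumed)

csr = cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m=5.0)
fs = crr / csr
print(f"CSR = {csr:.2f}, FS = {fs:.2f} ->",
      "liquefiable" if fs < 1.0 else "non-liquefiable")
```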

Keywords: screw drive testing, microtremor, ground penetrating RADAR, liquefaction

Procedia PDF Downloads 167
1100 Extremal Laplacian Energy of Threshold Graphs

Authors: Seyed Ahmad Mojallal

Abstract:

Let G be a connected threshold graph of order n with m edges and trace T. In this talk, we give a lower bound on the Laplacian energy in terms of n, m, and T of G. From this we determine the threshold graphs with the first four minimal Laplacian energies, and we also list the first 20 minimal Laplacian energies among threshold graphs. Let σ = σ(G) be the number of Laplacian eigenvalues greater than or equal to the average degree of the graph G. Using this concept, we obtain the threshold graphs with the largest and the second largest Laplacian energies.
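
As a hedged aside, Laplacian energy is commonly defined as LE(G) = Σ|μ_i - 2m/n| over the Laplacian eigenvalues μ_i; the sketch below computes LE and σ numerically for a small threshold graph built from a 0/1 creation sequence, purely as an illustration of the quantities involved.

```python
import networkx as nx
import numpy as np

def build_threshold_graph(creation_sequence):
    """Build a threshold graph from a 0/1 creation sequence:
    0 = add an isolated vertex, 1 = add a dominating vertex."""
    G = nx.Graph()
    for v, step in enumerate(creation_sequence):
        existing = list(G.nodes)
        G.add_node(v)
        if step == 1:                      # dominating vertex: join to all others
            G.add_edges_from((v, u) for u in existing)
    return G

def laplacian_energy(G):
    """LE(G) = sum_i |mu_i - 2m/n| over the Laplacian eigenvalues mu_i."""
    n, m = G.number_of_nodes(), G.number_of_edges()
    mu = np.linalg.eigvalsh(nx.laplacian_matrix(G).toarray().astype(float))
    avg_deg = 2.0 * m / n
    sigma = int(np.sum(mu >= avg_deg))     # count of eigenvalues >= average degree
    return float(np.sum(np.abs(mu - avg_deg))), sigma

G = build_threshold_graph([0, 1, 0, 0, 1, 1])
le, sigma = laplacian_energy(G)
print(f"n={G.number_of_nodes()}, m={G.number_of_edges()}, LE={le:.3f}, sigma={sigma}")
```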

Keywords: Laplacian eigenvalues, Laplacian energy, threshold graphs, extremal graphs

Procedia PDF Downloads 358
1099 Development of a Social Assistive Robot for Elderly Care

Authors: Edwin Foo, Woei Wen, Lui, Meijun Zhao, Shigeru Kuchii, Chin Sai Wong, Chung Sern Goh, Yi Hao He

Abstract:

This paper presents the development of an elderly care and assistive social robot. We named this robot JOS, and he is restricted to table-top operation. JOS is designed to have a maximum volume of 3600 cm3 with his base restricted to 250 mm, and his mission is to provide companionship and to assist and help the elderly. In order for JOS to accomplish his mission, he will be equipped with perception, reaction and cognition capabilities. His appearance will not be human-like but rather cute and approachable, and the robot is designed to be gender neutral. However, the robot will still have eyes, eyelids and a mouth; the eyes and eyelids are built entirely with Robotis Dynamixel AX18 motors. To realize this complex task, JOS is also equipped with a microphone array, a vision camera and an Intel i5 NUC computer, and is powered by a self-charging 12 V lithium battery. His face is constructed using one motor for each eyelid, two motors for the eyeballs, three motors for the neck mechanism and one motor for the lip movement. The vision sensor is housed on JOS's forehead and the microphone array is located below the mouth. For the vision system, Omron's latest OKAO vision sensor is used. It is a compact and versatile sensor that is only 60 mm by 40 mm in size and operates with only a 5 V supply. In addition, the OKAO vision sensor is capable of identifying the user and recognizing the user's expression. With these functions, JOS is able to track and identify the user. If he cannot recognize the user, JOS will ask the user whether he should remember them; if yes, JOS stores the user information, together with the captured face image, in a database. This allows JOS to recognize the user the next time they are with JOS. In addition, JOS is able to interpret the mood of the user through their facial expression, which allows the robot to understand the user's mood and behavior and react accordingly. Machine learning will later be incorporated to learn the behavior of the user so as to better understand the user's mood and requirements. For the speech system, the Microsoft speech and grammar engine is used for speech recognition. In order to use the speech engine, we need to build a speech grammar database that captures the words commonly used by the elderly. This database is built from research journals and literature on elderly speech and from interviews with elderly people about what they want the robot to assist them with. Using the results from the interviews and the journal research, we derived a set of common words the elderly frequently use to request help, and it is from this set that we build our grammar database. In situations where there is more than one person near JOS, he is able to identify the person who is talking to him through an in-house developed microphone array structure. In order to make the robot more interactive, we have also included the capability for the robot to express emotion to the user through facial expressions, by changing the position and movement of the eyelids and mouth. All robot emotions are produced in response to the user's mood and requests. Lastly, we expect to complete this phase of the project and test it with elderly people and delirium patients by February 2015.

Keywords: social robot, vision, elderly care, machine learning

Procedia PDF Downloads 418
1098 Implementation of Statistical Parameters to Form Entropic Mathematical Models

Authors: Gurcharan Singh Buttar

Abstract:

Although the two areas of statistics and information theory are independent in nature, it has been found that they can be combined to create applications in multidisciplinary mathematics. This is because, in the field of statistics, statistical parameters (measures) play an essential role with reference to the population (distribution) under investigation, while information measures are crucial in the study of the ambiguity, diversity and unpredictability present in an array of phenomena. The following communication is a link between the two, and it is demonstrated that well-known conventional statistical measures can be used as measures of information.
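
As a hedged numerical illustration of this link, the sketch below compares Shannon entropy with a conventional statistical measure (variance) across a few discrete distributions; it only shows that both respond to unpredictability and is not the specific entropic model of the paper.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def variance(values, p):
    """Variance of a discrete random variable with support `values`."""
    values, p = np.asarray(values, float), np.asarray(p, float)
    mean = np.sum(values * p)
    return float(np.sum(p * (values - mean) ** 2))

values = np.arange(1, 7)                     # a six-valued random variable
distributions = {
    "degenerate": [1, 0, 0, 0, 0, 0],
    "skewed":     [0.5, 0.2, 0.1, 0.1, 0.05, 0.05],
    "uniform":    [1/6] * 6,
}
for name, p in distributions.items():
    print(f"{name:10s} H = {shannon_entropy(p):.3f} bits, "
          f"Var = {variance(values, p):.3f}")
```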

Keywords: probability distribution, entropy, concavity, symmetry, variance, central tendency

Procedia PDF Downloads 134
1097 Reading and Writing Memories in Artificial and Human Reasoning

Authors: Ian O'Loughlin

Abstract:

Memory networks aim to integrate some of the recent successes in machine learning with a dynamic memory base that can be updated and deployed in artificial reasoning tasks. These models involve training networks to identify, update, and operate over stored elements in a large memory array in order, for example, to ably perform question and answer tasks parsing real-world and simulated discourses. This family of approaches still faces numerous challenges: the performance of these network models in simulated domains remains considerably better than in open, real-world domains, wide-context cues remain elusive in parsing words and sentences, and even moderately complex sentence structures remain problematic. This innovation, employing an array of stored and updatable ‘memory’ elements over which the system operates as it parses text input and develops responses to questions, is a compelling one for at least two reasons: first, it addresses one of the difficulties that standard machine learning techniques face, by providing a way to store a large bank of facts, offering a way forward for the kinds of long-term reasoning that, for example, recurrent neural networks trained on a corpus have difficulty performing. Second, the addition of a stored long-term memory component in artificial reasoning seems psychologically plausible; human reasoning appears replete with invocations of long-term memory, and the stored but dynamic elements in the arrays of memory networks are deeply reminiscent of the way that human memory is readily and often characterized. However, this apparent psychological plausibility is belied by a recent turn in the study of human memory in cognitive science. In recent years, the very notion that there is a stored element which enables remembering, however dynamic or reconstructive it may be, has come under deep suspicion. In the wake of constructive memory studies, amnesia and impairment studies, and studies of implicit memory—as well as following considerations from the cognitive neuroscience of memory and conceptual analyses from the philosophy of mind and cognitive science—researchers are now rejecting storage and retrieval, even in principle, and instead seeking and developing models of human memory wherein plasticity and dynamics are the rule rather than the exception. In these models, storage is entirely avoided by modeling memory using a recurrent neural network designed to fit a preconceived energy function that attains zero values only for desired memory patterns, so that these patterns are the sole stable equilibrium points in the attractor network. So although the array of long-term memory elements in memory networks seem psychologically appropriate for reasoning systems, they may actually be incurring difficulties that are theoretically analogous to those that older, storage-based models of human memory have demonstrated. The kind of emergent stability found in the attractor network models more closely fits our best understanding of human long-term memory than do the memory network arrays, despite appearances to the contrary.
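
A minimal sketch of the storage-free alternative described above is given below: a Hopfield-style attractor network whose energy function makes the desired patterns stable equilibria, so that 'remembering' is relaxation to an attractor rather than retrieval of a stored element. It is a textbook illustration, not a specific model from the literature reviewed here.

```python
import numpy as np

# Minimal Hopfield-style attractor network: patterns become stable equilibria
# of the energy E(s) = -1/2 s^T W s, and recall is relaxation from a noisy cue
# to the nearest attractor (no explicit read/write memory store).
rng = np.random.default_rng(1)
patterns = rng.choice([-1, 1], size=(3, 64))            # three +/-1 patterns

W = sum(np.outer(p, p) for p in patterns) / patterns.shape[1]  # Hebbian weights
np.fill_diagonal(W, 0.0)

def energy(s):
    return -0.5 * s @ W @ s

def recall(cue, steps=10):
    s = cue.copy()
    for _ in range(steps):                               # asynchronous updates
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

cue = patterns[0].copy()
flip = rng.choice(64, size=12, replace=False)
cue[flip] *= -1                                          # corrupt 12 of 64 bits
out = recall(cue)
print("overlap with original pattern:", int(out @ patterns[0]), "of 64")
print("energy: cue", round(energy(cue.astype(float)), 2),
      "-> recalled", round(energy(out.astype(float)), 2))
```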

Keywords: artificial reasoning, human memory, machine learning, neural networks

Procedia PDF Downloads 235
1096 Smart Books as a Supporting Tool for Developing Skills of Designing and Employing Webquest 2.0

Authors: Huda Alyami

Abstract:

The present study aims to measure the effectiveness of an 'interactive eBook' in developing the skills of designing and employing webquests among female intern teachers. The study uses descriptive analytical methodology as well as quasi-experimental methodology. The sample of the study consists of 30 female intern teachers from the Department of Special Education (in the tracks of Gifted Education and Learning Difficulties) during the first semester of the academic year 2015 at King Abdul-Aziz University in Jeddah city. The sample is divided into 15 female intern teachers for the experimental group and 15 for the control group. A set of qualitative and quantitative tools was prepared and verified for the study, comprising a list of webquest design skills, a list of webquest employment skills, a webquest knowledge achievement test, a product rating card, an observation card, and an interactive eBook. The study reached the following results: 1. After pre-test control, there are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and control groups in the post measurement of the webquest knowledge achievement test, in favor of the experimental group. 2. There are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and control groups in the post measurement of the product rating card, in favor of the experimental group. 3. There are statistically significant differences, at the significance level of (α ≤ 0.05), between the mean scores of the experimental and control groups in the post measurement of the observation card, in favor of the experimental group. In light of these findings, the study recommends taking advantage of interactive eBooks when teaching educational courses for various disciplines at the university level and creating participative educational platforms to share interactive eBooks for various disciplines at the local and regional levels. The study also suggests conducting further qualitative studies on the effectiveness of interactive eBooks, in addition to studies on the use of Web 2.0 in webquests.

Keywords: interactive eBook, webquest, design, employing, develop skills

Procedia PDF Downloads 160
1095 A POX Controller Module to Collect Web Traffic Statistics in SDN Environment

Authors: Wisam H. Muragaa, Kamaruzzaman Seman, Mohd Fadzli Marhusin

Abstract:

Software Defined Networking (SDN) is a new networking paradigm designed to facilitate the way the network is managed, measured, debugged and controlled dynamically, and to make it suitable for modern applications. Generally, measurement methods can be divided into two categories: active and passive. Active measurement injects test packets into the network in order to monitor their behaviour (the ping tool is an example), while passive measurement monitors the traffic for the purpose of deriving measurement values. Both types of methods are useful for the collection of traffic statistics and for monitoring network traffic. Although there has been work focusing on measuring traffic statistics in an SDN environment, it was only meant for measuring packet and byte rates for non-web traffic. In this study, a feasible method is designed to measure the number of packets and bytes in a given time interval and to facilitate obtaining statistics for both web traffic and non-web traffic. Web traffic here refers to HTTP requests at the application layer, while non-web traffic refers to ICMP and TCP requests. Thus, this work is more comprehensive than previous works. With a module developed on the POX OpenFlow controller, information is collected from each active flow in the OpenFlow switch and presented on the Command Line Interface (CLI) and in the Wireshark interface. The statistics displayed on the CLI and in Wireshark include the type of protocol and the number of bytes and packets, among others. In addition, the module shows the number of flows added to the switch whenever traffic is generated from and to hosts, in the same statistics list. To carry out this work effectively, our Python module sends a statistics request message to the switch every five seconds, requesting its current port and flow statistics, and the switch replies with the required information in a statistics reply message. The POX controller is thus notified and updated about any changes that happen in the network within a very short time. The aim of this study is therefore to prepare a list of the important statistics elements collected from the whole network, to be used in further research, particularly research dealing with the detection of network attacks that cause a sudden rise in the number of packets and bytes, such as Distributed Denial of Service (DDoS) attacks.
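
A minimal sketch of such a POX component is given below, following the widely used POX flow-statistics example: it polls every connected switch for flow statistics every five seconds and logs web versus non-web packet and byte counts when the FlowStatsReceived event fires. The module name and the port-80 classification rule are assumptions; the paper's CLI/Wireshark presentation is not reproduced.

```python
# Hedged sketch of a POX component (e.g. saved as ext/web_stats.py and started
# with ./pox.py forwarding.l2_learning web_stats). It polls flow statistics
# every five seconds and logs web vs non-web packet/byte counts.
from pox.core import core
import pox.openflow.libopenflow_01 as of
from pox.lib.recoco import Timer

log = core.getLogger()

def _request_stats():
    # Send an OpenFlow 1.0 flow-stats request to every connected switch.
    for connection in core.openflow._connections.values():
        connection.send(of.ofp_stats_request(body=of.ofp_flow_stats_request()))

def _handle_flow_stats(event):
    web_packets = web_bytes = other_packets = other_bytes = 0
    for f in event.stats:
        # Treat TCP port 80 flows as web (HTTP) traffic; everything else
        # (other TCP, ICMP, ...) is counted as non-web traffic.
        is_web = f.match.nw_proto == 6 and 80 in (f.match.tp_src, f.match.tp_dst)
        if is_web:
            web_packets += f.packet_count
            web_bytes += f.byte_count
        else:
            other_packets += f.packet_count
            other_bytes += f.byte_count
    log.info("switch %s: web %s pkts / %s bytes, non-web %s pkts / %s bytes",
             event.connection.dpid, web_packets, web_bytes,
             other_packets, other_bytes)

def launch():
    core.openflow.addListenerByName("FlowStatsReceived", _handle_flow_stats)
    Timer(5, _request_stats, recurring=True)   # poll every five seconds
```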

Keywords: mininet, OpenFlow, POX controller, SDN

Procedia PDF Downloads 196
1094 Novel Design of Quantum Dot Arrays to Enhance Near-Fields Excitation Resonances

Authors: Nour Hassan Ismail, Abdelmonem Nassar, Khaled Baz

Abstract:

Semiconductor crystals smaller than about 10 nm, known as quantum dots, have properties that differ from those of larger samples, including a band gap that becomes larger for smaller particles. These properties give rise to several applications for quantum dots. In this paper, new shapes of quantum dot arrays are used to enhance the photophysical properties of gold nanoparticles. This paper presents a study of the effect of nanoparticle shape, array and size on their absorption characteristics.

Keywords: quantum dots, nano-particles, LSPR

Procedia PDF Downloads 450
1093 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for various quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of them have been carried out on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project: the author usually starts with the analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, three polysemantic synonyms, true, loyal and faithful, have been analyzed in terms of the differences and similarities in their semantic structure. A corpus-based approach to the study of these adjectives involves the following. After the analysis of the dictionary data, the following corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study there were no special requirements regarding genre, mode or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g. word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas, e.g. true, true to, and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study. Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like 'An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders' or ''True,' said Phoebe, 'but I'd probably get to be a Union Official immediately'' were left out, as in the first example the faithful is a substantivized adjective and in the second example true is used alone with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable and convenient tool for obtaining data for further semantic study.
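
As a small, hedged sketch of the kind of co-occurrence extraction described above, the snippet below runs NLTK's bigram collocation finder over a toy token list and keeps only bigrams containing the adjectives under study; the BNC and COCA are queried through their own web interfaces, so this only illustrates the principle.

```python
from nltk.collocations import BigramAssocMeasures, BigramCollocationFinder

# Toy token list standing in for corpus output; the BNC/COCA web interfaces
# provide co-occurrence lists directly, so this only illustrates the idea.
tokens = ("he remained true to his word and stayed loyal to the king while "
          "the faithful dog remained faithful to its master and true to form "
          "the loyal servant stayed true to the cause").split()

targets = {"true", "loyal", "faithful"}
finder = BigramCollocationFinder.from_words(tokens)
# Keep only bigrams that contain one of the adjectives under study.
finder.apply_ngram_filter(lambda w1, w2: not ({w1, w2} & targets))

for (w1, w2), score in finder.score_ngrams(BigramAssocMeasures.raw_freq)[:10]:
    print(f"{w1} {w2}\t{score:.3f}")
```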

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 294
1092 A Closer Look on Economic and Fiscal Incentives for Digital TV Industry

Authors: Yunita Anwar, Maya Safira Dewi

Abstract:

With the increasing importance of the digital TV industry, several incentives must be given to support the growth of the industry. Prior research has found mixed evidence on the effect of economic and fiscal incentives on economic growth, which means these incentives do not necessarily boost economic growth while providing support to a particular industry. Focusing on the setting of the digital TV transition in Indonesia, this research conducts document analysis to examine the incentives that have been given in other countries and the incentives currently available in Indonesia. Our results recommend that VAT exemptions and local tax incentives be considered for addition to the list of incentives available to the digital TV industry.

Keywords: digital TV transition, economic incentives, fiscal incentives, policy

Procedia PDF Downloads 294
1091 Attempt to Reuse Used-PCs as Distributed Storage

Authors: Toshiya Kawato, Shin-ichi Motomura, Masayuki Higashino, Takao Kawamura

Abstract:

Storage for data is indispensable. If storage capacity becomes insufficient, we can increase it by adding new disks; it is, however, difficult to add a new disk when the budget is not sufficient. On the other hand, there are many unused idle resources, such as used personal computers, despite their use value. In order to solve these problems, used personal computers can be reused as storage. In this paper, we attempt to reuse used PCs as a distributed storage system. First, we list the characteristics of used PCs and design a storage system that exploits those characteristics. Next, we experimentally implement an auto-construction system that automatically constructs a distributed storage environment on used PCs.

Keywords: distributed storage, used personal computer, idle resource, auto construction

Procedia PDF Downloads 221
1090 Wavelets Contribution on Textual Data Analysis

Authors: Habiba Ben Abdessalem

Abstract:

The emergence of giant sets of textual data has pushed researchers to invest in this field. The purpose of textual data analysis methods is to facilitate access to this type of data by providing various graphic visualizations. Applying these methods requires a corpus pretreatment step, whose standards are set according to the objective of the problem studied. This step determines the list of forms contained in the contingency table by keeping only those that carry information. This step may, however, lead to noisy contingency tables, hence the use of a wavelet denoising function. The validity of the proposed approach is tested on a text database covering economic and political events in Tunisia over a well-defined period.
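
As a hedged sketch of the denoising step, the snippet below applies a 2-D wavelet decomposition to a toy contingency table with PyWavelets, soft-thresholds the detail coefficients and reconstructs; the wavelet family, threshold rule and data are illustrative assumptions, since the abstract does not specify them.

```python
import numpy as np
import pywt

# Hedged sketch: denoise a (toy) lexical contingency table with a 2-D wavelet
# transform. Wavelet family and threshold rule are illustrative choices; the
# paper does not specify the ones it uses.
rng = np.random.default_rng(0)
table = rng.poisson(3.0, size=(64, 32)).astype(float)    # forms x documents counts

coeffs = pywt.wavedec2(table, wavelet="haar", level=2)
threshold = 0.8 * np.median(np.abs(coeffs[-1][-1]))      # crude noise estimate

denoised_coeffs = [coeffs[0]]                            # keep approximation as-is
for detail_level in coeffs[1:]:
    denoised_coeffs.append(tuple(pywt.threshold(d, threshold, mode="soft")
                                 for d in detail_level))

denoised = pywt.waverec2(denoised_coeffs, wavelet="haar")[:table.shape[0], :table.shape[1]]
print("max change after denoising:", float(np.max(np.abs(denoised - table))))
```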

Keywords: textual data, wavelet, denoising, contingency table

Procedia PDF Downloads 255