Search results for: distributed algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3804

1014 In-Depth Analysis on Sequence Evolution and Molecular Interaction of Influenza Receptors (Hemagglutinin and Neuraminidase)

Authors: Dong Tran, Thanh Dac Van, Ly Le

Abstract:

Hemagglutinin (HA) and Neuraminidase (NA) play an important role in host immune evasion throughout the evolution of the influenza virus. The correlation between HA and NA evolution with respect to epitopic evolution and drug interaction has yet to be investigated. In this study, combining sequence-to-structure evolution with statistical analysis of epitope/binding-site specificity, we identified potential therapeutic features of HA and NA: a specific antibody binding site on HA and a specific binding distribution of current inhibitors within the NA active site. Our approach introduces the use of sequence variation and molecular interaction to provide an effective strategy for establishing experiment-based distributed representations of protein-protein/ligand complexes. The most important advantage of our method is that it does not require a complete dataset of complexes but instead infers feature interactions directly from sequence variation and molecular interaction. Using correlated sequence analysis, we additionally identified co-evolved mutations associated with maintaining HA/NA structural and functional variability in the face of immunity and therapeutic treatment. Our investigation of HA binding specificity revealed that a unique conserved stalk domain interacts with a unique loop domain of universal antibodies (CR9114, CT149, CR8043, CR8020, FI6v3, CR6261, F10). On the other hand, NA inhibitors (oseltamivir, zanamivir, laninamivir) showed specific conserved residue contributions similar to those of the NA substrate (sialic acid), which can be exploited for drug design. Our study provides important insight into the rational design and identification of novel therapeutics targeting universally recognized features of influenza HA/NA.

Keywords: influenza virus, hemagglutinin (HA), neuraminidase (NA), sequence evolution

Procedia PDF Downloads 131
1013 Cognitive Methods for Detecting Deception During the Criminal Investigation Process

Authors: Laid Fekih

Abstract:

Background: It is difficult to detect lying, deception, and misrepresentation simply by observing verbal or non-verbal expression during the criminal investigation process, despite the common belief that one can tell whether a person is lying or telling the truth just by the way they act or behave. Detecting lies and deception during criminal investigation requires further study and research to overcome the difficulties facing investigators. Method: The present study aimed to identify the effectiveness of cognitive methods and techniques in detecting deception during criminal investigation. It adopted a quasi-experimental method and covered a sample of 20 defendants distributed randomly into two homogeneous groups: an experimental group of 10 defendants subjected to criminal investigation using cognitive deception-detection techniques, and a second group of 10 defendants subjected to the direct investigation method. The tool used was a guided interview based on models of investigative questions following the cognitive deception-detection approach, which consists of Vrij's three techniques (imposing cognitive load, encouraging the interviewee to provide more information, and asking unexpected questions), compared against the direct investigation method. Results: Results revealed a significant difference between the two groups in lie-detection accuracy in favour of the defendants investigated with cognitive techniques; the cognitive deception-detection approach produced superior total accuracy rates both with human observers and through an analysis of objective criteria. The cognitive approach achieved 71% accuracy in truth detection and 70% in deception detection, compared with 52% and 49%, respectively, for the direct investigation method.
Conclusion: The study recommends that practitioners use cognitive deception-detection techniques, as these will correctly classify more individuals than the direct investigation method.

Keywords: the cognitive lie detection approach, deception, criminal investigation, mental health

Procedia PDF Downloads 35
1012 Impact of E-Resources and Their Accessibility by Faculty and Research Scholars of Academic Libraries: A Case Study

Authors: M. Jaculine Mary

Abstract:

Today, electronic resources are considered an integral part of information sources for providing efficient services to people aspiring to acquire knowledge in different fields. E-resources are resources in electronic format that can be accessed via the Internet in a digital library environment. The present study focuses on the accessibility and use of e-resources by faculty and research scholars of academic libraries in Coimbatore, Tamil Nadu, India. The main objectives are to identify their purpose in using e-resources, assess the users' Information and Communication Technology (ICT) skills, and determine their satisfaction with the availability of e-resources, their use of different e-resources, their overall satisfaction with e-resources, the impact of e-resources on their research, and the problems they face in accessing e-resources. The research methodology adopted for this study involves the analysis of survey responses gathered by distributing questionnaires to users. The findings are based on the responses received from questionnaires distributed to a sample population of 200 users. Among the 200 respondents, 55 percent of the e-resource users were research students and 45 percent were faculty members. It was found that a majority of users agreed that access to relevant, up-to-date information at a fast pace had influenced them to use e-resources. Most respondents were of the view that a greater number of computers in the library would facilitate quick learning. Academic libraries should arrange training and orientation programmes for research students and faculty members on the available e-resources. This study helps librarians in planning and developing e-resources to provide modern services to library users.
The study recommends that measures be taken to increase the accessibility of e-resource services among information seekers, in order to maximize use of the electronic resources available in academic libraries.

Keywords: academic libraries, accessibility, electronic resources, satisfaction level, survey

Procedia PDF Downloads 100
1011 Cirrhosis Mortality Prediction as Classification Using Frequent Subgraph Mining

Authors: Abdolghani Ebrahimi, Diego Klabjan, Chenxi Ge, Daniela Ladner, Parker Stride

Abstract:

In this work, we use machine learning and novel data analysis techniques to predict the one-year mortality of cirrhotic patients. Data from 2,322 patients with liver cirrhosis were collected at a single medical center. Different machine learning models were applied to predict one-year mortality. A comprehensive feature space including demographic information, comorbidities, clinical procedures, and laboratory tests was analyzed, and a temporal pattern mining technique called Frequent Subgraph Mining (FSM) was used. The Model for End-Stage Liver Disease (MELD) prediction of mortality served as the comparator. All of our models statistically significantly outperform the MELD-score model, showing an average 10% improvement in the area under the curve (AUC). The FSM technique by itself does not yield a significant improvement, owing to the sparsity of the temporal information FSM requires; however, FSM combined with an ensemble machine learning technique further improves model performance. With the abundance of data available in healthcare through electronic health records (EHR), existing predictive models can be refined to identify and treat patients at risk of higher mortality. To the best of our knowledge, this is the first work to apply modern machine learning algorithms and data analysis methods to predicting the one-year mortality of cirrhotic patients, and it builds a model that predicts one-year mortality significantly more accurately than the MELD score. We have also tested the potential of FSM and provided a new perspective on the importance of clinical features.

Keywords: machine learning, liver cirrhosis, subgraph mining, supervised learning

Procedia PDF Downloads 106
1010 Exploring the Use of Drones for Corn Borer Management: A Case Study in Central Italy

Authors: Luana Centorame, Alessio Ilari, Marco Giustozzi, Ester Foppa Pedretti

Abstract:

Maize is one of the most important agricultural cash crops in the world, supplying three different chains: food, feed, and bioenergy production. Nowadays, the European corn borer (ECB), Ostrinia nubilalis, is, to the best of the authors' knowledge, the most important pest for maize growers to control. The ECB is harmful to maize: young larvae are responsible for minor damage to the leaves, while the most serious damage is the tunneling of older larvae that burrow into the stalk. Soon after, larvae can affect cobs, and it has been found that ECB can foster mycotoxin contamination, which is why controlling it is crucial. Multiple control methods are available: agronomic, biological, and microbiological means, agrochemicals, and genetically modified plants. Meanwhile, the European Union's policy focuses on the transition to sustainable supply chains, which translates into the goal of reducing the use of agrochemicals by 50%. The current work aims to compare agrochemical treatment of ECB with biological control through beneficial insects released by drones. The methodology includes field trials of both chemical and biological control, taking a farm in central Italy as a case study. To assess the mechanical and technical efficacy of drones with respect to standard machinery, the available literature was consulted. The findings are positive: drones can get into the field promptly, under difficult conditions, and at lower cost than traditional techniques. At the same time, it is important to consider the limits of drones regarding pilot certification, no-fly zones, etc. In the future, it will be necessary to deepen the topic with real field applications of both systems, expanding the scenarios in which drones can be used and the types of material distributed.

Keywords: beneficial insects, corn borer management, drones, precision agriculture

Procedia PDF Downloads 63
1009 Fully Automated Methods for the Detection and Segmentation of Mitochondria in Microscopy Images

Authors: Blessing Ojeme, Frederick Quinn, Russell Karls, Shannon Quinn

Abstract:

The detection and segmentation of mitochondria from fluorescence microscopy images are crucial for understanding the complex structure of the nervous system. However, the constant fission and fusion of mitochondria and image distortion in the background make detection and segmentation challenging. In the literature, a number of open-source software tools and artificial intelligence (AI) methods have been described for analyzing mitochondrial images, achieving remarkable classification and quantitation results. However, the combined expertise in the medical field and AI required to utilize these tools poses a challenge to their full adoption in clinical settings. Motivated by the advantages of automated methods in terms of good performance, minimal detection time, ease of implementation, and cross-platform compatibility, this study proposes a fully automated framework for the detection and segmentation of mitochondria using both image shape information and descriptive statistics. Using the free, open-source Python language and the OpenCV library, the algorithms are implemented in three stages: pre-processing, image binarization, and coarse-to-fine segmentation. The proposed model is validated on a mitochondrial fluorescence dataset. Ground-truth labels generated with Labkit were also used to evaluate the performance of the detection and segmentation model. The study produces good detection and segmentation results and reports the challenges encountered during the image analysis of mitochondrial morphology in the fluorescence dataset. A discussion of the methods and future perspectives for fully automated frameworks concludes the paper.
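As a rough illustration of the three-stage pipeline just described (pre-processing, binarization, coarse-to-fine segmentation), the following sketch uses NumPy/SciPy in place of the OpenCV calls; the Otsu routine, blob-size cutoff, and parameter values are our own illustrative choices, not the paper's implementation:

```python
import numpy as np
from scipy import ndimage

def otsu_threshold(img):
    """Exhaustive 256-bin search for the threshold that maximizes
    between-class variance (the standard Otsu criterion)."""
    hist, edges = np.histogram(img, bins=256)
    hist = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w = np.cumsum(hist)                    # class-0 weight up to each bin
    mu = np.cumsum(hist * centers)         # class-0 cumulative mean mass
    mu_total = mu[-1]
    best_t, best_var = centers[0], -1.0
    for i in range(1, 255):
        w0, w1 = w[i], 1.0 - w[i]
        if w0 == 0 or w1 == 0:
            continue
        m0, m1 = mu[i] / w0, (mu_total - mu[i]) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, centers[i]
    return best_t

def detect_mitochondria(img, sigma=1.0, min_area=20):
    """Pre-process -> binarize -> coarse-to-fine segment."""
    smooth = ndimage.gaussian_filter(img.astype(float), sigma)  # pre-processing
    binary = smooth > otsu_threshold(smooth)                    # binarization
    labels, n = ndimage.label(binary)                           # coarse segmentation
    # fine stage: discard blobs below a plausible object size
    sizes = ndimage.sum(binary, labels, range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_area]
    mask = np.isin(labels, keep)
    return mask, len(keep)
```

In a full OpenCV version, the pre-processing stage would typically also include contrast equalization (the CLAHE mentioned in the keywords) before thresholding.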

Keywords: 2D, binarization, CLAHE, detection, fluorescence microscopy, mitochondria, segmentation

Procedia PDF Downloads 331
1008 Distributive Justice through Constitution

Authors: Rohtash

Abstract:

Academically, the literature on the concept of justice is vast, theories are voluminous, and definitions are numerous, yet justice remains very difficult to define. Through the ages, justice has been evolving, developing reasoning about how individuals and communities do the right thing that is just and fair to all in a society. Justice is a relative and dynamic concept, not an absolute one; it differs between societies according to their morality and ethics. The idea of justice cannot arise from a single morality but from the interaction of competing moralities and contending perspectives. Justice is a conditional and circumstantial term and therefore takes different meanings in different contexts. Justice is the application of the laws: a values-based concept intended to protect the rights and liberties of the people. It is a socially created concept with no physical reality; it exists in society on the basis of a spirit of sharing among the communities and members of society. The conception of justice among societies, communities, and individuals rests on their social coordination, and it can be effective only when people's judgments are based on collective reasoning and their behavior is shaped by social values, norms, and laws. People must accept, share, and respect the set of principles for delivering justice; justice can then serve as a reasonable solution to conflicts and a way to coordinate behavior in society. The subject matter of distributive justice is the public good and societal resources, which should be evenly distributed among the different sections of society according to principles developed and established by the State through legislation, public policy, and executive orders. The socioeconomic transformation of society is adopted by the constitution within the limits of its morality, giving a new dimension to transformative justice; therefore, both procedural and transformative justice are part of distributive justice.
Distributive justice is primarily an economic phenomenon: it concerns the allocation of resources among communities and individuals. Its subject matter is the distribution of rights, responsibilities, burdens, and benefits in society on the basis of the capacity and capability of individuals.

Keywords: distributive justice, constitutionalism, institutionalism, constitutional morality

Procedia PDF Downloads 40
1007 Macroeconomic Policy Coordination and Economic Growth Uncertainty in Nigeria

Authors: Ephraim Ugwu, Christopher Ehinomen

Abstract:

Despite efforts by the Nigerian government to harmonize macroeconomic policy implementation by establishing various committees to resolve disputes between the fiscal and monetary authorities, it is still evident that the federal government has continued its expansionary policy by increasing spending, thus creating a huge budget deficit. This study evaluates the effect of macroeconomic policy coordination on economic growth uncertainty in Nigeria from 1980 to 2020. Employing the autoregressive distributed lag (ARDL) bounds testing procedure, the empirical results show that the error correction term, ECM(-1), has a negative sign and is statistically significant, with a t-statistic of -5.612882. Therefore, the gap between the long-run equilibrium value and the actual value of the dependent variable is corrected at a speed of adjustment equal to 77% per year. The long-run results show that the estimated coefficient of the intercept term indicates that, other things remaining the same (ceteris paribus), economic growth uncertainty will continue to fall by 7.32%. The coefficient of the fiscal policy variable, PUBEXP, has a positive sign and is statistically significant, implying that as government expenditure increases by 1%, economic growth uncertainty increases by 1.67%. The coefficient of the monetary policy variable, MS, has a positive sign but is statistically insignificant. The coefficients of the merchandise trade variable, TRADE, and the exchange rate, EXR, have negative signs and are statistically significant, indicating that as the country's merchandise trade and the exchange rate increase by 1%, economic growth uncertainty falls by 0.38% and 0.06%, respectively. This study therefore advocates proper coordination of monetary, fiscal, and exchange rate policies in order to achieve stable economic growth.
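To make the error-correction interpretation concrete, here is a hedged, self-contained sketch of a two-step (Engle-Granger-style) error-correction regression on simulated data; the series, coefficients, and seed are illustrative and unrelated to the paper's Nigerian dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two cointegrated series: x is a random walk, and
# y tracks the long-run relation y = 2 + 0.5*x with stationary noise.
n = 500
x = np.cumsum(rng.normal(size=n))
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=n)

# Step 1: long-run equilibrium regression y ~ const + x (OLS).
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta          # deviation from long-run equilibrium

# Step 2: error-correction regression
#   dy_t = a + gamma * resid_{t-1} + b * dx_t + u_t
dy, dx = np.diff(y), np.diff(x)
Z = np.column_stack([np.ones(n - 1), resid[:-1], dx])
coef, *_ = np.linalg.lstsq(Z, dy, rcond=None)
gamma = coef[1]               # speed of adjustment; a negative value
print(round(gamma, 2))        # means deviations are corrected over time
```

A gamma of, say, -0.77 would correspond to the 77% yearly speed of adjustment reported in the abstract; the ARDL bounds test itself adds lag selection and an F-test on the levels terms, which this two-step sketch omits.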

Keywords: macroeconomic, policy coordination, growth uncertainty, ARDL, Nigeria

Procedia PDF Downloads 53
1000 Using Baculovirus Expression Vector System to Express Envelope Proteins of Chikungunya Virus in Insect Cells and Mammalian Cells

Authors: Tania Tzong, Chao-Yi Teng, Tzong-Yuan Wu

Abstract:

Chikungunya virus (CHIKV), transmitted to humans by Aedes mosquitoes, has now spread from Africa to Southeast Asia, South America, and Southern Europe. However, little is known about the antigenic targets for immunity, and there are no licensed vaccines or specific antiviral treatments for the disease caused by CHIKV. Baculovirus has been recognized as a novel vaccine vector with the attractive characteristics of an optional vaccine delivery vehicle, an approach that offers safety and efficacy for a CHIKV vaccine. In this study, the bi-cistronic recombinant baculoviruses vAc-CMV-CHIKV26S-Rhir-EGFP and vAc-CMV-pH-CHIKV26S-Lir-EGFP were produced. Both recombinant baculoviruses can express the EGFP reporter gene in insect cells to facilitate isolation and purification of the recombinant virus. Examination showed that these recombinant baculoviruses could induce syncytium formation in insect cells. Unexpectedly, immunofluorescence assays revealed expression of the E1 and E2 structural proteins of CHIKV in insect cells infected by vAc-CMV-CHIKV26S-Rhir-EGFP, which may imply that the CMV promoter can drive transcription of CHIKV 26S in insect cells. E1 and E2 were also expressed in mammalian cells transduced by vAc-CMV-CHIKV26S-Rhir-EGFP and vAc-CMV-pH-CHIKV26S-Lir-EGFP. The expression of E1 and E2 in insect and mammalian cells was further validated by Western blot analysis. The construct with dual tandem promoters (polyhedrin and CMV) showed higher expression of the E1 and E2 structural proteins than the construct with the CMV promoter only. Most of the E1 and E2 proteins expressed in mammalian cells were glycosylated. In the future, the structural proteins of CHIKV expressed in mammalian cells are expected to form virus-like particles, which could be used as a vaccine against chikungunya virus.

Keywords: chikungunya virus, virus-like particle, vaccines, baculovirus expression vector system

Procedia PDF Downloads 394
1005 Design of Microwave Building Block by Using Numerical Search Algorithm

Authors: Haifeng Zhou, Tsungyang Liow, Xiaoguang Tu, Eujin Lim, Chao Li, Junfeng Song, Xianshu Luo, Ying Huang, Lianxi Jia, Lianwee Luo, Qing Fang, Mingbin Yu, Guoqiang Lo

Abstract:

With the development of technology, countries have gradually allocated more and more frequency spectrum for civil and commercial use, especially the high radio-frequency bands that offer high information capacity. Field effects become more and more prominent in microwave components as frequency increases, which invalidates transmission line theory and complicates the design of microwave components. Here, a modeling approach based on numerical search algorithms is proposed for designing various building blocks of microwave circuits, avoiding complicated impedance matching and equivalent electrical circuit approximation. Concretely, a microwave component is discretized into a set of segments along the microwave propagation path. Each segment is initialized with random dimensions, which together construct a multi-dimensional parameter space. Numerical search algorithms (e.g., the pattern search algorithm) are then used to find the ideal geometrical parameters: the optimal parameter set is reached by evaluating the fitness of the S-parameters over a number of iterations. We have adopted this approach in our current projects and designed many microwave components, including sharp bends, T-branches, Y-branches, and microstrip-to-stripline converters. For example, a stripline 90° bend was designed within a 2.54 mm x 2.54 mm area for dual-band operation (Ka band and Ku band) with < 0.18 dB insertion loss and < -55 dB reflection. We expect that this approach can enrich the toolkit of microwave designers.
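A minimal sketch of the search loop described above, assuming a simple compass-style pattern search and a stand-in quadratic fitness; a real design would score S-parameters from an electromagnetic solver, and the segment-width target below is hypothetical:

```python
import numpy as np

def pattern_search(fitness, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    """Derivative-free compass search: poll +/- step along each axis,
    move to any improving point, otherwise shrink the step size."""
    x = np.asarray(x0, dtype=float)
    fx = fitness(x)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                ft = fitness(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink          # refine the search mesh
            if step < tol:
                break
    return x, fx

# Hypothetical surrogate fitness: penalize deviation of four segment
# widths from a target profile, standing in for an S-parameter score
# returned by a full-wave solver run.
target = np.array([0.8, 1.2, 0.6, 1.0])
fitness = lambda w: float(np.sum((w - target) ** 2))

best, best_f = pattern_search(fitness, x0=np.zeros(4))
```

In the actual workflow, each fitness evaluation would be one solver simulation, so the number of polls per iteration (2 per dimension here) directly sets the computational cost.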

Keywords: microwave component, microstrip and stripline, bend, power division, numerical search algorithm

Procedia PDF Downloads 349
1004 Application of Regularized Spatio-Temporal Models to the Analysis of Remote Sensing Data

Authors: Salihah Alghamdi, Surajit Ray

Abstract:

Space-time data can be observed over irregularly shaped manifolds, which may have complex boundaries or interior gaps. Most existing methods do not take the shape of the data domain into account, and as a result it is difficult to model irregularly shaped data while accommodating a complex domain. We used a method that can deal with space-time data distributed over non-planar regions. The method is based on partial differential equations and finite element analysis. The model can be estimated using a penalized least-squares approach with a regularization term that controls over-fitting. The model is regularized using two roughness penalties, which address spatial and temporal regularity separately: the integrated square of the second derivative of the basis function is used as the temporal penalty, while the spatial penalty consists of the integrated square of the Laplace operator, integrated exclusively over the domain of interest as determined by the finite element technique. In this paper, we applied a spatio-temporal regression model with partial differential equation regularization (ST-PDE) to analyze remote sensing data measuring the greenness of vegetation, quantified by the enhanced vegetation index (EVI). The EVI data consist of measurements taking values between -1 and 1 that reflect the level of greenness of a region over a period of time. We applied the ST-PDE approach to an irregularly shaped region of the EVI data. The approach efficiently accommodates irregularly shaped regions, taking the complex boundaries into account rather than smoothing across them. Furthermore, the approach succeeds in capturing the temporal variation in the data.
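One generic way to write the penalized least-squares objective described above (the symbols are ours: $\lambda_S$ and $\lambda_T$ weight the spatial and temporal roughness penalties, $\Omega$ is the irregular spatial domain, and $T$ is the time window):

```latex
\min_{f}\;\sum_{i=1}^{n}\bigl(z_i - f(\mathbf{p}_i, t_i)\bigr)^{2}
\;+\;\lambda_S \int_{T}\!\!\int_{\Omega}\bigl(\Delta f\bigr)^{2}\,d\Omega\,dt
\;+\;\lambda_T \int_{\Omega}\!\!\int_{T}\Bigl(\frac{\partial^{2} f}{\partial t^{2}}\Bigr)^{2}\,dt\,d\Omega
```

Here the second term is the spatial penalty (squared Laplacian integrated only over $\Omega$, so smoothing never crosses the boundary), and the third is the temporal penalty on the second time derivative of $f$.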

Keywords: irregularly shaped domain, partial differential equations, finite element analysis, complex boundary

Procedia PDF Downloads 111
1003 High Phosphate-Containing Foods and Beverages: Perceptions of the Future Healthcare Providers on Their Harmful Effect in Excessive Consumption

Authors: ATM Emdadul Haque

Abstract:

Phosphorus is an essential nutrient that is regularly consumed with food and exists in the body as phosphate. Phosphate is an important component of cellular structures and is needed for bone mineralization. Excessive accumulation of phosphate is an important driver of mortality in patients with chronic renal failure; of relevance, these patients usually receive health care from doctors, nurses, and pharmacists. Hence, this study was planned to determine the level of awareness of future healthcare providers about phosphate-containing foods and beverages and to assess their knowledge of the harmful effects of excess phosphate consumption. A questionnaire was developed and distributed among year-1 medical, nursing, and pharmacy students. A total of 432 students responded, with ages ranging from 18 to 24 years. About 70% of the respondents were female, with a majority (90.7%) of Malay ethnicity. Among the respondents, 29.9% were medical, 35.4% pharmacy, and 34.7% nursing students. While 79.2% of students knew that phosphate is an important component of the body, only 61.8% knew that consuming too much phosphate could be harmful. Although 97% of the students knew that carbonated soda contains high sugar, surprisingly 77% of them did not know that the same soda drinks contain high phosphate; similarly, 67% did not know of its presence in fast food. It was encouraging, however, that 94% of the students wanted to know more about the effects of phosphate consumption, 74.3% were willing to give up drinking soda and eating fast food, and 52% would consider taking green coconut water instead of soda drinks. It is therefore essential to take an educational initiative to increase the awareness of future healthcare providers about phosphate-containing food and its harmful effects in excessive consumption.

Keywords: high phosphate containing foods and beverages, excessive consumption, future health care providers, phosphorus

Procedia PDF Downloads 331
1002 Different Motor Inhibition Processes in Action Selection Stage: A Study with Spatial Stroop Paradigm

Authors: German Galvez-Garcia, Javier Albayay, Javiera Peña, Marta Lavin, George A. Michael

Abstract:

The aim of this research was to investigate whether the selection of actions requires different inhibition processes during the response selection stage. In Experiment 1, we compared the magnitude of the Spatial Stroop effect, which occurs in the response selection stage, for two motor actions (lifting vs. reaching) when participants performed both actions in the same block or in different blocks (mixed block vs. pure blocks). Within pure blocks, we obtained faster latencies when lifting actions were performed, but no differences in the magnitude of the Spatial Stroop effect were observed. Within the mixed block, we obtained faster latencies as well as a larger Spatial Stroop effect when reaching actions were performed. We concluded that when no action selection is required (the pure-blocks condition), inhibition works as a unitary system, whereas in the mixed-block condition, where action selection is required, different inhibitory processes take place within a common processing stage. In Experiment 2, we investigated this common processing stage in depth by limiting participants' available resources, requiring them to engage in a concurrent auditory task within a mixed-block condition. The Spatial Stroop effect interacted with Movement as it did in Experiment 1, but it did not significantly interact with available resources (Auditory task x Spatial Stroop effect x Movement interaction). Thus, we concluded that available resources are distributed equally to both inhibition processes, which reinforces the likelihood of a common processing stage in which the different inhibitory processes take place.

Keywords: inhibition process, motor processes, selective inhibition, dual task

Procedia PDF Downloads 355
1001 Harmonic Assessment and Mitigation in Medical Diagnosis Equipment

Authors: S. S. Adamu, H. S. Muhammad, D. S. Shuaibu

Abstract:

Poor power quality in electrical power systems can cause medical equipment at healthcare centres to malfunction and produce wrong medical diagnoses. Equipment such as X-ray machines and computerized axial tomography scanners can pollute the system because of their high level of harmonic production, which may cause a number of undesirable effects such as heating, equipment damage, and electromagnetic interference. The conventional mitigation approach uses passive inductor/capacitor (LC) filters, which have drawbacks such as large size, resonance problems, and fixed compensation behaviour. Current solutions generally employ active power filters with suitable control algorithms. This work focuses on assessing the level of Total Harmonic Distortion (THD) in medical facilities and the various ways of mitigating it, using the radiology unit of an existing hospital as a case study. The harmonics are measured with a power quality analyzer at the point of common coupling (PCC). The measured THD levels are found to be higher than the IEEE 519-1992 standard limits. The system is then modelled as a harmonic current source using MATLAB/Simulink. To mitigate the unwanted harmonic currents, a shunt active filter is developed using a synchronous detection algorithm to extract the fundamental component of the source currents, and a fuzzy logic controller is developed to control the filter. The THD values without the active power filter are validated against the measured values; with the developed filter, the harmonics are brought within the recommended limits.
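To make the THD assessment concrete, here is a short sketch of how THD can be estimated from a sampled waveform via the FFT; the 50 Hz fundamental and the harmonic amplitudes below are synthetic stand-ins for the measured radiology-load currents, not the paper's data:

```python
import numpy as np

def thd(signal, fs, f0, n_harmonics=10):
    """Total harmonic distortion: RMS of harmonics 2..N divided by the
    fundamental amplitude, read off the single-sided FFT spectrum."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal)) / n * 2   # single-sided amplitudes
    def amp(f):
        return spec[int(round(f * n / fs))]       # bin nearest frequency f
    fund = amp(f0)
    harm = np.array([amp(k * f0) for k in range(2, n_harmonics + 1)])
    return np.sqrt(np.sum(harm ** 2)) / fund

# Synthetic load current: 50 Hz fundamental plus 3rd and 5th harmonics,
# mimicking a distorting (rectifier-type) load.
fs, f0 = 10_000, 50
t = np.arange(0, 1, 1 / fs)   # exactly 1 s -> integer number of cycles
i_load = (1.0 * np.sin(2 * np.pi * f0 * t)
          + 0.20 * np.sin(2 * np.pi * 3 * f0 * t)
          + 0.10 * np.sin(2 * np.pi * 5 * f0 * t))

print(f"THD = {100 * thd(i_load, fs, f0):.1f}%")  # prints: THD = 22.4%
```

Sampling an integer number of fundamental cycles keeps each harmonic on an exact FFT bin; on field measurements, windowing would be needed to control spectral leakage before comparing against the IEEE 519 limits.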

Keywords: power quality, total harmonics distortion, shunt active filters, fuzzy logic

Procedia PDF Downloads 448
1000 Performance of the New Laboratory-Based Algorithm for HIV Diagnosis in Southwestern China

Authors: Yanhua Zhao, Chenli Rao, Dongdong Li, Chuanmin Tao

Abstract:

The Chinese Centers for Disease Control and Prevention (CCDC) issued a new laboratory-based algorithm for HIV diagnosis in April 2016, which initially screens with a combination HIV-1/HIV-2 antigen/antibody fourth-generation immunoassay (IA) followed, when reactive, by an HIV-1/HIV-2 undifferentiated antibody IA in duplicate. Specimens with concordant reactive results undergo supplemental tests with western blots or HIV-1 nucleic acid tests (NATs), while specimens with discordant or non-reactive duplicate results receive HIV-1 NATs, p24 antigen tests, or follow-up testing after 2-4 weeks. However, little data evaluating the application of the new algorithm has been reported to date. This study evaluated the performance of the new laboratory-based HIV diagnostic algorithm in an inpatient population of Southwest China over its initial 6 months, compared with the old algorithm. Plasma specimens collected from inpatients from May 1, 2016, to October 31, 2016, were submitted to the laboratory for HIV screening performed with both the new testing algorithm and the old version. The sensitivity and specificity of the two algorithms and the difference in the categorized numbers of specimens were calculated. Under the new algorithm, 170 of the 52,749 plasma specimens were confirmed as HIV-positive (0.32%). The sensitivity and specificity of the new algorithm were 100% (170/170) and 100% (52,579/52,579), respectively, while the old algorithm identified 167 HIV-1-positive specimens, for a sensitivity of 98.24% (167/170) and a specificity of 100% (52,579/52,579). Three acute HIV-1 infections (AHIs) and two early HIV-1 infections (EHIs) were identified by the new algorithm; the acute infections were missed by the old procedure. Compared with the old version, the new algorithm produced fewer WB-indeterminate results (2 vs. 16, p = 0.001), which led to fewer follow-up tests.
The new HIV testing algorithm is therefore more sensitive for detecting acute HIV-1 infections while maintaining the ability to verify established HIV-1 infections, and it dramatically decreases the number of WB-indeterminate specimens.
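Read as a decision flow, the screening steps above can be sketched schematically; the step labels and tuple encoding below are our own illustrative choices, not CCDC wording:

```python
def next_step(ag_ab_reactive, ab_duplicate):
    """Schematic triage for the two-stage screening flow described
    above. ab_duplicate is the pair of duplicate antibody-IA results
    ('R' reactive / 'N' non-reactive). Returns the next testing step,
    not a diagnosis; labels are illustrative."""
    if not ag_ab_reactive:
        # fourth-generation Ag/Ab screen non-reactive
        return "report HIV negative"
    if ab_duplicate == ("R", "R"):          # concordant reactive
        return "supplemental: western blot or HIV-1 NAT"
    # discordant, or concordant non-reactive, after a reactive screen
    return "HIV-1 NAT, p24 antigen test, or follow-up in 2-4 weeks"
```

The key property reported in the abstract is that routing concordant-reactive specimens to NAT as well as western blot catches acute infections that an antibody-only confirmatory step would miss.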

Keywords: algorithm, diagnosis, HIV, laboratory

Procedia PDF Downloads 370
999 Optimizing Super Resolution Generative Adversarial Networks for Resource-Efficient Single-Image Super-Resolution via Knowledge Distillation and Weight Pruning

Authors: Hussain Sajid, Jung-Hun Shin, Kum-Won Cho

Abstract:

Image super-resolution is a common computer vision problem with many important applications. Generative adversarial networks (GANs) have driven remarkable advances in single-image super-resolution (SR) by recovering photo-realistic images. However, the high memory requirements of GAN-based SR models (mainly the generators) lead to performance degradation and increased energy consumption, making them difficult to deploy on resource-constrained devices. To address this problem, this paper introduces an optimized and highly efficient architecture for the SR-GAN (generator) model using model compression techniques, namely knowledge distillation and pruning, which together reduce the model's storage requirements while improving its performance. Our method begins by distilling knowledge from a large pre-trained model into a lightweight model using different loss functions. Iterative weight pruning is then applied to the distilled model to remove less significant weights based on their magnitude, resulting in a sparser network. Knowledge distillation reduces the model size by 40%; pruning then reduces it further by 18%. To accelerate the learning process, we employ the Horovod framework for distributed training on a cluster of 2 nodes, each with 8 GPUs, resulting in improved training performance and faster convergence. Experimental results on various benchmarks demonstrate that the proposed compressed model significantly outperforms state-of-the-art methods in terms of peak signal-to-noise ratio (PSNR), structural similarity index measure (SSIM), and image quality for x4 super-resolution tasks.
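The two compression steps named above can be illustrated in miniature. The loss weighting `alpha` and the flat weight list are simplifying assumptions; a real SR generator would use tensor-valued losses and per-layer pruning:

```python
def distillation_loss(student_out, teacher_out, target, alpha=0.5):
    """Blend the ordinary task loss with a match-the-teacher loss (both MSE here)."""
    mse = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return alpha * mse(student_out, target) + (1 - alpha) * mse(student_out, teacher_out)

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights (one pruning iteration).

    Ties at the threshold may prune slightly more than the requested fraction.
    """
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]
```

Iterative pruning repeats `magnitude_prune` with a gradually increasing sparsity, retraining between steps so accuracy can recover.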

Keywords: single-image super-resolution, generative adversarial networks, knowledge distillation, pruning

Procedia PDF Downloads 49
998 Greenhouse Controlled with Graphical Plotting in Matlab

Authors: Bruno R. A. Oliveira, Italo V. V. Braga, Jonas P. Reges, Luiz P. O. Santos, Sidney C. Duarte, Emilson R. R. Melo, Auzuir R. Alexandria

Abstract:

This project aims to build a controlled greenhouse: a structure in which a given range of temperatures (°C), produced by radiation from an incandescent lamp, can be maintained around a predefined setpoint. This characterizes a kind of on-off control; the project's differential is the plotting of temperature-versus-time graphs in MATLAB via serial communication, so that the stove can be connected to a computer and its parameters monitored. The control was implemented with a PIC16F877A microcontroller, which converts analog signals to digital, performs serial communication through a MAX232 IC, and drives signal transistors; the PIC is programmed in Basic. The cooling system consists of two 12 V coolers distributed in the lateral structure, one for ventilation and the other for exhaust. The internal temperature is measured with an LM35DZ sensor. Another mechanism used in the greenhouse is a reed switch and magnet that detect the door position, sounding a buzzer when the door is open, and LEDs help identify the stove's current operating mode. To facilitate human-machine communication, an LCD display shows the temperature and other information in real time. Taking into account the limitations of the construction material and of the structure's electrical current conduction, the design operates without major problems in a range of approximately 65 to 70 °C. The project is efficient under these conditions, i.e., when information is needed about a given material to be tested at moderate temperatures. The automation of the greenhouse facilitates temperature control and provides a suitable environment for the most diverse applications.
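The on-off temperature control the PIC implements can be sketched in a few lines; the setpoint and dead band below are illustrative assumptions, not the project's actual values:

```python
def lamp_command(temp_c, lamp_on, setpoint=67.5, band=2.5):
    """On-off (bang-bang) control with a dead band to avoid rapid switching."""
    if temp_c >= setpoint + band:
        return False          # too hot: switch the incandescent lamp off
    if temp_c <= setpoint - band:
        return True           # too cold: switch the lamp on
    return lamp_on            # inside the dead band: keep the current state
```

The dead band (hysteresis) matters in practice: without it, sensor noise around the setpoint would make the relay chatter.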

Keywords: greenhouse, microcontroller, temperature, control, MATLAB

Procedia PDF Downloads 371
997 Design and Development of Fleet Management System for Multi-Agent Autonomous Surface Vessel

Authors: Zulkifli Zainal Abidin, Ahmad Shahril Mohd Ghani

Abstract:

Agent-based systems technology has been addressed as a new paradigm for conceptualizing, designing, and implementing software systems. Agents are sophisticated systems that act autonomously across open and distributed environments to solve problems. Nevertheless, it is impractical to rely on a single agent for all the computation involved in solving complex problems, and an increasing number of applications require multiple agents working together. A multi-agent system (MAS) is a loosely coupled network of agents that interact to solve problems beyond the individual capacity or knowledge of each problem solver. However, a MAS still requires a main system to govern or oversee the operation of the agents so that a unified goal is achieved. We developed a fleet management system (FMS) to manage the fleet of agents, plan routes for them, perform real-time data processing and analysis, and issue sets of general and specific instructions to the agents. This FMS must be able to perform real-time data processing, communicate with the autonomous surface vehicle (ASV) agents, and generate a bathymetric map from the data received from each ASV unit. The first algorithm communicates with the ASVs via radio using standard National Marine Electronics Association (NMEA) protocol sentences. The second algorithm handles path planning, formation, and pattern generation, and is tested using various sample data. Lastly, the bathymetry map generation algorithm uses the data collected by the agents to create a bathymetric map in real time. The outcome of this research is expected to be applicable to various other multi-agent systems.
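Communication with the ASVs uses standard NMEA 0183 sentences, whose integrity check is an XOR of every character between `$` and `*`. A minimal validator and position-field splitter (the GGA field layout is the standard one, not this project's own code):

```python
def nmea_checksum_ok(sentence):
    """Validate an NMEA 0183 sentence: XOR all characters between '$' and '*'."""
    body, _, checksum = sentence.lstrip("$").partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return f"{calc:02X}" == checksum.strip().upper()

def parse_gga(sentence):
    """Split the position fields out of a GGA (fix data) sentence."""
    fields = sentence.split(",")
    return {"time": fields[1], "lat": fields[2], "lat_hem": fields[3],
            "lon": fields[4], "lon_hem": fields[5]}
```

In an FMS, every incoming sentence would be checksum-validated before its position data is fed to the path-planning and bathymetry modules.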

Keywords: autonomous surface vehicle, fleet management system, multi agent system, bathymetry

Procedia PDF Downloads 225
996 Solubility and Dissolution Enhancement of Poorly Soluble Drugs Using Biosericin

Authors: Namdeo Jadhav, Nitin Salunkhe

Abstract:

Currently, sericin is treated as a waste product of the sericulture industry, especially at the reeling stage. Given its promising physicochemical properties, an attempt has been made to explore pharmaceutical applications of waste sericin in the fabrication of medicated solid dispersions. Solid dispersions (SDs) of poorly soluble drugs (lornoxicam, meloxicam, and felodipine) were prepared by spray drying, solvent evaporation, ball milling, and physical kneading at drug:sericin mass ratios of 1:0.5, 1:1, 1:1.5, 1:2, 1:2.5, and 1:3 w/w, and were investigated by solubility, ATR-FTIR, XRD, DSC, micromeritics and tabletability, surface morphology, and in-vitro dissolution studies. Sericin was observed to improve the solubility of the drugs 8- to 10-fold compared with the pure drugs. Hydrogen bonding between the drugs and sericin was confirmed from the ATR-FTIR spectra. Among these methods, the spray-dried (1:2 w/w) SDs showed a fully amorphous state, indicating molecularly distributed drug, as confirmed by the XRD and DSC studies. Spray-dried meloxicam SDs showed better compressibility and compactibility. Microphotographs of the spray-dried batches showed bowl-shaped particles for lornoxicam SDs (SDLX), bowl-shaped plus spherical particles for meloxicam SDs (SDMX), and spherical particles for felodipine SDs (SDFL). The SDLX, SDMX, and SDFL (1:2 w/w) dispersions displayed better dissolution performance than those prepared by the other methods. In conclusion, a hydrophilic sericin matrix can be used to deliver poorly water-soluble drugs, and its aerodynamic particle shape may hold great potential for various drug deliveries. If established as a pharmaceutical excipient, sericin holds the potential to revolutionise the economics of the pharmaceutical industry and of sericulture farming, especially in Asian countries.

Keywords: biosericin, poorly soluble drugs, solid dispersion, solubility and dissolution improvement

Procedia PDF Downloads 217
995 Fight against Money Laundering with Optical Character Recognition

Authors: Saikiran Subbagari, Avinash Malladhi

Abstract:

Anti-money laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess, and mitigate the risks associated with money laundering and to report any suspicious transactions to the governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. Amid the rise in financial crime, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and survey the OCR techniques employed in AML processes. These techniques encompass template-based, feature-based, and neural-network-based approaches, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching, and the stroke width transform (SWT). We evaluate each technique, discussing its strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the Office of Foreign Assets Control (OFAC) watchlist, and discuss how OCR helps overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies that illustrate the effectiveness of OCR-based AML.
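Watchlist screening of OCR output must tolerate recognition errors (e.g., `0` misread for `o`), so the comparison is typically fuzzy rather than exact. A sketch using the standard-library `difflib` similarity ratio; the 0.85 threshold is an illustrative assumption, not an OFAC requirement:

```python
from difflib import SequenceMatcher

def screen_name(extracted, watchlist, threshold=0.85):
    """Return the watchlist entries whose similarity to the OCR-extracted
    name meets the threshold, together with their similarity scores."""
    norm = lambda s: " ".join(s.lower().split())   # case/whitespace normalization
    hits = []
    for entry in watchlist:
        score = SequenceMatcher(None, norm(extracted), norm(entry)).ratio()
        if score >= threshold:
            hits.append((entry, score))
    return hits
```

Production systems layer on phonetic matching and alias lists, but the core idea is the same: score every candidate, then review anything above the threshold.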

Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition

Procedia PDF Downloads 104
994 Numerical Simulations of Acoustic Imaging in Hydrodynamic Tunnel with Model Adaptation and Boundary Layer Noise Reduction

Authors: Sylvain Amailland, Jean-Hugh Thomas, Charles Pézerat, Romuald Boucheron, Jean-Claude Pascal

Abstract:

Noise requirements for naval and research vessels increasingly demand quieter ships, both to fulfil current regulations and to reduce the effects on marine life. Hence, new methods dedicated to the characterization of propeller noise, the main source of noise in the far field, are needed. Studying cavitating propellers in a closed test section is attractive for analyzing hydrodynamic performance but poses significant difficulties for hydroacoustic study, especially because of reverberation and boundary layer noise in the tunnel. The aim of this paper is to present a numerical methodology for identifying hydroacoustic sources on marine propellers using hydrophone arrays in a large hydrodynamic tunnel. The main difficulties stem from the reverberation of the tunnel and the boundary layer noise, which strongly reduce the signal-to-noise ratio. We propose to estimate the reflection coefficients using an inverse method and reference transfer functions measured in the tunnel; this approach reduces the uncertainty of the propagation model used in the inverse problem. To reduce the boundary layer noise, a cleaning algorithm is presented that takes advantage of the low-rank and sparse structure of the cross-spectral matrices of the acoustic and boundary layer noise. This approach makes it possible to recover the acoustic signal even when it lies well below the boundary layer noise. The improvement brought by this method is visible in acoustic maps produced by beamforming and DAMAS algorithms.
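Cleaning algorithms that exploit low-rank plus sparse structure are commonly built from two standard operators: elementwise soft-thresholding (which promotes sparsity) and singular-value thresholding (which promotes low rank). A NumPy sketch of the two building blocks; the alternation schedule and threshold values of the actual cleaning algorithm are not reproduced here:

```python
import numpy as np

def soft_threshold(x, tau):
    """Shrink entries toward zero; entries smaller than tau vanish (sparsity)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def svd_threshold(M, tau):
    """Apply soft-thresholding to the singular values of M (low rank)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(soft_threshold(s, tau)) @ Vt
```

A principal-component-pursuit-style cleaner alternates these two operators on the measured cross-spectral matrix, separating its low-rank and sparse contributions.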

Keywords: acoustic imaging, boundary layer noise denoising, inverse problems, model adaptation

Procedia PDF Downloads 297
993 The Opportunities and Challenges of Adopting International Financial Reporting Standards in Saudi Capital Market

Authors: Abdullah Almulhim

Abstract:

The International Accounting Standards Board (IASB) was established in 2001 to develop International Financial Reporting Standards (IFRS) that bring transparency, accountability, and efficiency to financial markets around the world. In addition, IFRS provide a unified accounting language, which is especially important in the era of globalization. The establishment of a single set of high-quality international accounting standards is a matter of growing importance, as participants in an increasingly integrated world capital market demand comparability and transparency of financial reporting worldwide. Saudi Arabia became the 149th member of the World Trade Organization (WTO) on 11 December 2005, which has increased the need to convert to IFRS. Currently, the Saudi Arabian Monetary Authority (SAMA) requires banks and insurance companies in Saudi Arabia to report under IFRS Standards; until the end of 2016, however, SOCPA standards applied to all other companies, listed and unlisted. From 2017, listed Saudi companies are required to report under IFRS Standards as adopted by SOCPA. This paper investigates the expected benefits gained, and highlights the challenges faced, by listed companies on the Saudi Stock Exchange in adopting IFRS. Questionnaires were used as the main method of data collection; they were distributed to companies listed on the Saudi capital market, and the data obtained were imported into the SPSS statistical software for analysis. The expected results of this study will show both the benefits of and the challenges to IFRS adoption by Saudi listed companies; findings will be discussed upon completion of the initial analysis.

Keywords: challenges, IAS, IFRS, opportunities, Saudi, SOCPA

Procedia PDF Downloads 212
992 Long-Term Durability of Roller-Compacted Concrete Pavement

Authors: Jun Hee Lee, Young Kyu Kim, Seong Jae Hong, Chamroeun Chhorn, Seung Woo Lee

Abstract:

Roller-compacted concrete pavement (RCCP), an environmentally friendly pavement whose load-carrying capacity benefits from both cement hydration and the aggregate interlock produced by roller compaction, demonstrates excellent structural performance at relatively low water and cement contents. Even though excellent structural performance can be secured, roller-compacted concrete (RCC) must still be investigated under environmental loading and for its long-term durability under critical conditions. To secure long-term durability, an appropriate internal air-void structure is required in this concrete. In this study, a method for improving the long-term durability of RCCP is suggested by analyzing the internal air-void structure and the corresponding durability of RCC. The method involves measuring the air content, air voids, and air-void spacing factors of RCC mixtures with varying types and dosages of air-entraining (AE) agent. The tests were conducted according to the criteria in ASTM C 457, ASTM C 672, and KS F 2456. The freeze-thaw and scaling resistances of RCC without any chemical admixture were found to be quite low. Interestingly, an improvement in freeze-thaw and scaling resistance was observed for RCC with an appropriate AE agent content; the relative dynamic elastic modulus was more than 80% for those mixtures. In the RCC mixtures with an AE agent, the air content fell within a range of 2% to 3%, and an air-void spacing factor between 200 and 300 μm (close to the 250 μm recommended by the PCA) was secured. The long-term durability of RCC is directly related to the air-void spacing factor and can only be secured by ensuring an adequate spacing factor through the inclusion of an AE agent in the mixture.

Keywords: durability, RCCP, air spacing factor, surface scaling resistance test, freezing and thawing resistance test

Procedia PDF Downloads 222
991 Artificial Neural Network Modeling of a Closed Loop Pulsating Heat Pipe

Authors: Vipul M. Patel, Hemantkumar B. Mehta

Abstract:

Technological innovations in the electronics world demand novel, compact, simple, inexpensive, and effective heat transfer devices. The closed-loop pulsating heat pipe (CLPHP) is a passive phase-change heat transfer device with the potential to transfer heat quickly and efficiently from source to sink. The thermal performance of a CLPHP is governed by various parameters such as the number of U-turns, orientation, heat input, working fluid, and filling ratio. The present paper attempts to predict the thermal performance of a CLPHP using an artificial neural network (ANN). Filling ratio and heat input are taken as input parameters, while thermal resistance is set as the target parameter. The types of neural networks considered are radial basis, generalized regression, linear layer, cascade forward back propagation, feed forward back propagation, feed forward distributed time delay, layer recurrent, and Elman back propagation. Linear, logistic sigmoid, tangent sigmoid, and radial basis Gaussian functions are used as transfer functions. Prediction accuracy is measured against experimental data reported in the open literature, using the mean absolute relative deviation (MARD). The predictions of a generalized regression ANN model with a spread constant of 4.8 agree with the experimental data to within a MARD of ±1.81%.
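A generalized regression network is essentially a kernel-weighted average of the training targets, with the spread constant controlling the kernel width. A one-input sketch together with the MARD metric (a simplification: the model in the study takes two inputs, filling ratio and heat input):

```python
import math

def grnn_predict(x, train_x, train_y, spread=4.8):
    """Generalized regression prediction: Gaussian-kernel-weighted average
    of the training targets (Nadaraya-Watson form)."""
    weights = [math.exp(-((x - xi) ** 2) / (2.0 * spread ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)

def mard(predicted, actual):
    """Mean absolute relative deviation, in percent."""
    return 100.0 * sum(abs(p - a) / abs(a)
                       for p, a in zip(predicted, actual)) / len(predicted)
```

A small spread makes the model interpolate its training points closely; a larger spread (such as the 4.8 reported) smooths the prediction across neighbouring operating conditions.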

Keywords: ANN models, CLPHP, filling ratio, generalized regression, spread constant

Procedia PDF Downloads 257
990 Experimental Studies of Sigma Thin-Walled Beams Strengthen by CFRP Tapes

Authors: Katarzyna Rzeszut, Ilona Szewczak

Abstract:

This paper reviews selected methods of strengthening steel structures with carbon fiber reinforced polymer (CFRP) tapes and analyzes the influence of composite materials on thin-walled steel elements. The study also focuses on fast and effective methods of strengthening steel structures made of thin-walled profiles. Strengthening thin-walled structures is a very complex issue, owing to the inability to make welded joints in this type of element and the limited applicability of mechanical fasteners. Moreover, structures with thin-walled cross-sections are highly sensitive to imperfections and prone to interactive buckling, which may substantially reduce their critical load capacity. Given the lack of commonly used and recognized modern methods for strengthening thin-walled steel structures, the authors performed experimental studies of thin-walled sigma profiles strengthened with CFRP tapes. The paper presents the experimental stand and preliminary laboratory results on the effectiveness of strengthening steel beams made of thin-walled sigma profiles with CFRP tapes. The study includes six beams made of cold-rolled sigma profiles with a height of 140 mm, a wall thickness of 2.5 mm, and a length of 3 m, subjected to a uniformly distributed load. Four beams were strengthened with Sika CarboDur S carbon fiber tape, while the other two were tested without strengthening to provide reference results. Based on the results obtained, the suitability of the applied composite materials for strengthening thin-walled structures was evaluated.

Keywords: CFRP tapes, sigma profiles, steel thin-walled structures, strengthening

Procedia PDF Downloads 278
989 Distribution and Habitat Preference of Red Panda (Ailurus Fulgens Fulgens) in Jumla District, Nepal

Authors: Saroj Panthi, Sher Singh Thagunna

Abstract:

Reliable and sufficient information on the status, distribution, and habitat preference of the red panda (Ailurus fulgens fulgens) is lacking in Nepal. Research on the red panda in mid-western Nepal is very limited, so the species' status in the region is largely unknown. This study, conducted in May 2013 in three Village Development Committees (VDCs) of Jumla district, namely Godhemahadev, Malikathata, and Tamti, was an important step toward providing vital information on the distribution and habitat preference of this species. The study comprised reconnaissance, key informant surveys, interviews, and consultations to identify the most potential areas; an opportunistic survey using direct observation and indirect sign counts for presence and distribution; and a habitat assessment consisting of vegetation sampling and ocular estimation. The study revealed the presence of the red panda in three forests, namely Bahirepatan, Imilchadamar, and Tyakot of the Godhemahadev, Tamti, and Malikathata VDCs, respectively. The species was found distributed between 2880 and 3244 m, with an average dropping encounter rate of 1.04 per hour of searching effort and 12 pellets per dropping. Red pandas mostly preferred habitat in the elevation range of 2900-3000 m on southwest-facing steep slopes (36˚-45˚) associated with water sources within ≤100 m. Trees such as Acer spp., Betula utilis, and Quercus semecarpifolia, shrubs such as Elaeagnus parvifolia, Drepanostachyum spp., and Jasminum humile, and herbs such as Polygonatum cirrhifolium, Fragaria nubicola, and Galium asperifolium were found to be the species most preferred by the red panda. The red panda preferred habitat with dense crown coverage (>20%-100%) and 31%-50% ground cover. Fallen logs (39%) were the most preferred substrate for defecation.

Keywords: distribution, habitat preference, Jumla, red panda

Procedia PDF Downloads 283
988 Development of a Sequential Multimodal Biometric System for Web-Based Physical Access Control into a Security Safe

Authors: Babatunde Olumide Olawale, Oyebode Olumide Oyediran

Abstract:

The security safe is a place or building where classified documents and precious items are kept. To prevent unauthorised persons from gaining access to such safes, many technologies have been used. However, frequent reports of unauthorised persons gaining access to security safes and removing documents and items from them indicate that security gaps remain in the technologies currently used for access control. In this paper, we address this problem by developing a multimodal biometric system for physical access control into a security safe using face and voice recognition. The safe is accessed through a combination of face and speech pattern recognition, in that sequential order. User authentication is achieved through a camera/sensor unit and a microphone unit, both attached to the door of the safe: the user's face is captured by the camera/sensor, while the speech is captured by the microphone unit. The Scale Invariant Feature Transform (SIFT) algorithm was used to train images into templates for the face recognition system, while the Mel-Frequency Cepstral Coefficients (MFCC) algorithm was used to train the speech recognition system to recognise authorised users' speech. The two algorithms were hosted on two separate web-based servers, and for automatic analysis the developed system was simulated in a MATLAB environment. The results obtained show that the developed system granted access to authorised users while denying access to unauthorised persons.
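The sequential decision rule, face first and then voice, can be expressed as a short gate; the score thresholds below are illustrative assumptions, since the abstract does not report the operating points of the SIFT and MFCC matchers:

```python
def sequential_authenticate(face_score, voice_score, face_thr=0.8, voice_thr=0.8):
    """Two-stage multimodal gate: the voice matcher is only consulted
    after the face matcher accepts."""
    if face_score < face_thr:
        return "rejected at face stage"      # voice stage never runs
    if voice_score < voice_thr:
        return "rejected at voice stage"
    return "access granted"
```

Requiring both modalities in sequence lowers the false-accept rate compared with either matcher alone, at the cost of a slightly higher false-reject rate.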

Keywords: access control, multimodal biometrics, pattern recognition, security safe

Procedia PDF Downloads 295
987 StockTwits Sentiment Analysis on Stock Price Prediction

Authors: Min Chen, Rubi Gupta

Abstract:

Understanding and predicting stock market movements is a challenging problem. Stock markets are believed to be partially driven by public sentiment, which has led to numerous research efforts to predict stock market trends using sentiment expressed on social media such as Twitter, but with limited success. Recently, the microblogging website StockTwits has become increasingly popular among users sharing discussions and sentiments about stocks and the financial market. In this project, we analyze the text content of StockTwits tweets and extract financial sentiment using text featurization and machine learning algorithms. StockTwits tweets are first pre-processed using techniques including stopword removal, special character removal, and case normalization to remove noise. Features are then extracted from these preprocessed tweets through a text featurization process using bag-of-words, N-gram models, TF-IDF (term frequency-inverse document frequency), and latent semantic analysis. Machine learning models are trained to classify each tweet's sentiment as positive (bullish) or negative (bearish). The correlation between the aggregated daily sentiment and the daily stock price movement is then investigated using Pearson's correlation coefficient. Finally, the sentiment information is combined with time-series stock data to predict stock price movement. Experiments on five companies (Apple, Amazon, General Electric, Microsoft, and Target) over a period of nine months demonstrate the effectiveness of our study in improving prediction accuracy.
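Two pieces of this pipeline are easy to sketch: a toy lexicon-based daily sentiment score (a stand-in for the trained classifier, with a made-up word list) and the Pearson correlation coefficient used to compare sentiment with price movement:

```python
import math

BULLISH = {"long", "buy", "bullish", "calls"}     # illustrative lexicon only
BEARISH = {"short", "sell", "bearish", "puts"}

def daily_sentiment(tweets):
    """Aggregate bullish-minus-bearish word hits over one day's tweets."""
    score = 0
    for tweet in tweets:
        words = set(tweet.lower().split())
        score += len(words & BULLISH) - len(words & BEARISH)
    return score

def pearson(xs, ys):
    """Pearson's correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

In the study itself, the classifier output replaces the lexicon score, and the Pearson coefficient is computed between the aggregated daily sentiment series and the daily price-movement series.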

Keywords: machine learning, sentiment analysis, stock price prediction, tweet processing

Procedia PDF Downloads 121
986 Service Interactions Coordination Using a Declarative Approach: Focuses on Deontic Rule from Semantics of Business Vocabulary and Rules Models

Authors: Nurulhuda A. Manaf, Nor Najihah Zainal Abidin, Nur Amalina Jamaludin

Abstract:

Coordinating service interactions is a vital part of developing distributed applications that are built as networks of autonomous participants (e.g., software components, web services, online resources) and that involve collaboration among a diverse set of participant services from different providers. The complexity of coordinating service interactions reflects how important the right techniques and approaches are for designing and coordinating the interactions between participant services so that the overall goal of the collaboration is achieved. The objective of this research is to develop the capability to steer a complex service interaction towards a desired outcome. To that end, an efficient technique for modelling, generating, and verifying the coordination of service interactions is developed. The developed model describes service interactions using a service choreography approach with a declarative specification, advocating an Object Management Group (OMG) standard, the Semantics of Business Vocabulary and Rules (SBVR). This model, the SBVR model for service choreographies, focuses on declarative deontic rules expressing both obligation and prohibition, which are particularly useful for coordinating service interactions. The generated SBVR model is then formulated and transformed into an Alloy model, which is checked with the Alloy Analyzer. The transformation of SBVR into Alloy makes it possible to automatically generate the corresponding coordination of service interactions (service choreography), producing an immediate execution instance that satisfies the constraints of the specification, and to verify whether a specific request can be realised in the generated choreography.

Keywords: service choreography, service coordination, behavioural modelling, complex interactions, declarative specification, verification, model transformation, semantics of business vocabulary and rules, SBVR

Procedia PDF Downloads 117
985 Information Literacy Skills of Legal Practitioners in Khyber Pakhtunkhwa-Pakistan: An Empirical Study

Authors: Saeed Ullah Jan, Shaukat Ullah

Abstract:

Purpose of the study: The main theme of this study is to explore the information literacy skills of law practitioners in Khyber Pakhtunkhwa, Pakistan, under the heading "Information Literacy Skills of Legal Practitioners in Khyber Pakhtunkhwa-Pakistan: An Empirical Study." Research method and procedure: To conduct this quantitative study, a simple random sampling approach was used. An adapted questionnaire was distributed among 254 lawyers of Dera Ismail Khan through personal visits and electronic means, and the data collected were analyzed with the SPSS (Statistical Package for the Social Sciences) software. Delimitations of the study: The study is delimited to the southern district of Khyber Pakhtunkhwa, Dera Ismail Khan. Key findings: Most of the lawyers of district Dera Ismail Khan can recognize and understand the information they need, and a large number are capable of presenting information in both written and electronic form. However, they are not comfortable with the various legal databases or with different search and keyword techniques, and they have little knowledge of Boolean operators for locating online information. Conclusion and recommendations: Efforts should be made to arrange refresher courses and training workshops on the use of different legal databases and search techniques for the retrieval of information sources. This practice would enhance the information literacy skills of lawyers, which would ultimately result in a better legal system in Pakistan. Practical implications: The findings of the study will motivate policymakers and the authorities of legal forums to restructure information literacy programs to fulfil lawyers' information needs. Contribution to knowledge: No significant work has been done on lawyers' information literacy skills in Khyber Pakhtunkhwa, Pakistan;
this study brings a clear picture of the information literacy skills of law practitioners and addresses the problems they face during the information-seeking process.

Keywords: information literacy-Pakistan, information literacy-lawyers, information literacy-lawyers-KP, law practitioners-Pakistan

Procedia PDF Downloads 113