Department of Computer Science
Browsing Department of Computer Science by Issue Date
Now showing 1 - 20 of 103
Item: Genetic Algorithms and Machine Learning (Springer Nature, 1988)
Goldberg, David E.; Holland, John H.

Item: MAC security and security overhead analysis in the IEEE 802.15.4 wireless sensor networks (Springer, 2006)
Xiao, Yang; Chen, Hsiao-Hwa; Sun, Bo; Wang, Ruhai; Sethi, Sakshi; University of Alabama Tuscaloosa; National Sun Yat Sen University; Texas State University System; Lamar University
Sensor networks have many applications. However, with limited resources such as computation capability and memory, they are vulnerable to many kinds of attacks. The IEEE 802.15.4 specification defines the medium access control (MAC) layer and physical layer for wireless sensor networks. In this paper, we propose a security overhead analysis for the MAC layer in IEEE 802.15.4 wireless sensor networks. Furthermore, we survey the security mechanisms defined in the specification, including security objectives, security suites, security modes, encryption, and authentication. We then identify security vulnerabilities and attacks, and propose security enhancements to improve security and prevent attacks such as the same-nonce attack, denial-of-service attack, replay-protection attack, and ACK attack.
Our results show that, for example, with a 128-bit key length and 100 MIPS, the encryption overhead is 10.28 μs per block, and with 100 MIPS and a 1500-byte payload, the encryption overhead is as high as 5782.5 μs.

Item: Wireless network security (Hindawi, 2006)
Xiao, Yang; Lin, Yi-Bing; Du, Ding-Zhu; University of Alabama Tuscaloosa; National Yang Ming Chiao Tung University; University of Texas System; University of Texas Dallas

Item: Continuous drug infusion for diabetes therapy: A closed-loop control system design (Springer, 2007)
Chen, Jiming; Cao, Kejie; Sun, Youxian; Xiao, Yang; Su, Xu (Kevin); Zhejiang University; University of Alabama Tuscaloosa; University of Texas System; University of Texas at San Antonio (UTSA)
While a typical approach to diabetes therapy is discrete insulin infusion based on long-interval measurement, in this paper we design a closed-loop control system for continuous drug infusion to improve on the traditional discrete methods and make diabetes therapy automatic in practice. By exploring the accumulative function of the drug on insulin, a continuous injection model is proposed. Based on this model, proportional-integral-derivative (PID) and fuzzy logic controllers are designed to tackle the control problem of the resulting highly nonlinear plant. Even with serious glucose disturbances, such as nutrition absorption at meal time, the proposed scheme performs well in simulation experiments. Copyright (c) 2008 Jiming Chen et al.
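The closed-loop design in the item above couples a continuous infusion model with a PID controller. The following is a minimal sketch of a discrete PID loop only; the one-compartment plant dynamics, gains, units, and initial values here are invented for illustration and are not the model or tuning from the paper:

```python
# Hypothetical discrete PID loop driving glucose toward a setpoint.
# The plant model (constant upward drift, linear insulin effect) and
# all gains are illustrative assumptions, not the paper's design.

def simulate(setpoint=100.0, steps=200, dt=1.0,
             kp=0.05, ki=0.001, kd=0.1):
    glucose = 180.0                     # assumed initial glucose (mg/dL)
    integral = 0.0
    prev_error = setpoint - glucose
    for _ in range(steps):
        error = setpoint - glucose      # negative while glucose is high
        integral += error * dt
        derivative = (error - prev_error) / dt
        # Negative PID output maps to positive insulin infusion;
        # infusion cannot be negative, so clamp at zero.
        infusion = max(0.0, -(kp * error + ki * integral + kd * derivative))
        # Toy plant: body drifts glucose upward, insulin lowers it.
        glucose += dt * (2.0 - 5.0 * infusion)
        prev_error = error
    return glucose
```

Under these toy dynamics the loop settles near the setpoint; the integral term absorbs the constant drift, which is the role it would play against sustained disturbances such as meal-time glucose absorption.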
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Item: Temperature-aware routing for Telemedicine applications in embedded biomedical sensor networks (Springer, 2008)
Takahashi, Daisuke; Xiao, Yang; Hu, Fei; Chen, Jiming; Sun, Youxian; University of Alabama Tuscaloosa; Rochester Institute of Technology; Zhejiang University
Biomedical sensors, called in vivo sensors, are implanted in human bodies and can cause harmful effects on surrounding body tissues. In particular, a temperature rise in the in vivo sensors is dangerous for surrounding tissues, and a high temperature may damage them during long-term monitoring. In this paper, we propose a thermal-aware routing algorithm, called the least total-route-temperature (LTRT) protocol, in which node temperatures are converted into graph weights and minimum-temperature routes are obtained. Furthermore, we provide an extensive simulation evaluation comparing several other related schemes. Simulation results show the advantages of the proposed scheme.

Item: Which environment is more suitable for novice programmers: editor/command line/console environment vs. Integrated Development Environment (University of Alabama Libraries, 2009)
Dillon, Edward, Jr.; Brown, Marcus E.; University of Alabama Tuscaloosa
When novice programmers begin programming, they face many problems due to their lack of programming experience. Integrated Development Environments (IDEs) are used as a way to help novices become more effective at learning to program. The question is whether such an environment is more effective than a command line/console environment. This study addressed that question through interviews with students who were using these environments.
This study comprised two groups of undergraduate students taking courses in Computer Science. The first group consisted of students in a course sequence that began with the Microsoft Visual Studio IDE and moved to a command line environment for the last course in the sequence. The second group started programming with a command line environment. Interviews were conducted with both groups to gather information about these environments. The responses showed that students favored the Microsoft Visual Studio IDE. However, the differences in the results were not significant enough to conclude that an IDE in general is better than a command line environment. This information is intended to provide not only background but also potential foundational evidence for determining which environment may be more suitable for novice programmers, and it will serve as a basis for further research and studies in this area.

Item: Algorithms with applications in robotics (University of Alabama Libraries, 2009)
Munteanu, Bogdan; Borie, Richard B.; University of Alabama Tuscaloosa
Many real-world applications that involve computational steps are closely tied to theoretical computer science. For these systems to be deployed and used efficiently, a thorough analysis is required in advance. This dissertation deals with several real-world problems related to the field of robotics that can be mathematically modeled and analyzed. One of these problems, known as the pursuit-evasion problem, involves the use of independent automated robots to capture a fugitive hiding in a building or a cave system. This is an extensively studied problem in game theory and combinatorics with multiple variations. It can be modeled as a graph, and the goal is to minimize the cost of capturing the evader.
We deal with two completely different variations of this problem: a vision-based variant, in which the robots have limited vision and can react only when the fugitive is in line of sight; and a no-vision variant, in which the robots have no knowledge about the fugitive. Another problem we address is neighbor discovery in wireless networks using directional antennas, a problem that has received growing interest in recent years. Our approach to solving this problem, as well as our model, differs from the other results previously published in the literature. Besides modeling and formally analyzing these problems, our focus in this dissertation is to design efficient algorithms that solve them either completely or partially.

Item: Botnet: Classification, Attacks, Detection, Tracing, and Preventive Measures (Springer, 2009)
Liu, Jing; Xiao, Yang; Ghaboosi, Kaveh; Deng, Hongmei; Zhang, Jingyuan; University of Alabama Tuscaloosa; University of Oulu; Intelligent Automation Inc.
Botnets have become widespread in wired and wireless networks, whereas the relevant research is still in its initial stage. In this paper, a survey of botnets is provided. We first discuss fundamental concepts of botnets, including formation and exploitation, lifecycle, and the two major kinds of topologies. Several related attacks, detection, tracing, and countermeasures are then introduced, followed by recent research work and possible future challenges. Copyright (C) 2009 Jing Liu et al.

Item: CV-NICS: a lightweight solution to the correspondence problem (University of Alabama Libraries, 2009)
Jay, Graylin Trevor; Smith, Randy K.; University of Alabama Tuscaloosa
In this dissertation, I present a novel approach to solving the correspondence problem using basic statistical classification techniques.
While metrics such as Pearson's rho or cosine similarity are not powerful enough to solve the correspondence problem directly, their performance can be enhanced by augmenting the scene with random color static via a projector. Over time, this noise increases the statistical independence of imaged points not in correspondence, which allows the correspondence problem to be reduced to a simple similarity search over temporal features. Extensive experiments have shown the approach to be as effective as more complex structured light techniques at producing very dense correspondence data for a variety of scenes. The approach differentiates itself from traditional structured lighting by not relying on known camera or projector geometries and by allowing relatively lax capturing conditions. Due to its statistical nature, and unlike more recognition-focused techniques, the approach is naturally amenable to quality assessment and analysis. This dissertation provides background on the correspondence problem, presents empirical and analytical results regarding the new technique, and reviews the related work in the literature.

Item: Network security: design, analysis and tradeoff evaluation (University of Alabama Libraries, 2009)
Olteanu, Alina; Xiao, Yang; University of Alabama Tuscaloosa
Energy efficiency is an essential requirement for all wireless devices. Recent developments in wireless sensor networks (WSNs), wireless local area networks (WLANs), and wireless personal area networks (WPANs) have raised demand for energy-efficient algorithms and energy-efficient medium access control (MAC) protocols. When considering security in this context, additional overhead is added to the network, and efforts must be made to minimize the extra load while achieving the desired level of security. Security attacks in the Internet are linked to a different set of vulnerabilities.
The complex architecture of the Internet, spanning different administrative domains and legal systems, makes it easy for attackers to conceal the source of an attack and preserve their anonymity. This dissertation addresses several important issues in network security and performance, including intrusion detection, cipher design, security overhead analysis, and tracing, as follows. We first propose a model for intrusion detection in WSNs that optimizes network coverage and detection while minimizing the number of sensors and energy consumption. We then integrate a security mechanism into the sensor network in order to achieve secure communication. Specifically, we propose a lightweight block cipher based on a multiple recursive generator (MRG) that is suitable for WSNs and RFID, where power consumption, bandwidth, memory, and storage space are critical. Next, we consider security in WLANs and WPANs and apply the advanced encryption standard (AES) cipher to ensure secure transmission of frames. We integrate AES encryption at the MAC layer of 802.11 WLANs and 802.15.3 UWB WPANs, respectively, and study the overhead introduced by AES in this context. Finally, we analyze a type of security attack in the Internet where the intruder uses a chain of host machines before attacking the target. We discuss two mechanisms for tracing intruders in the Internet, one based on thumbprinting and the other on timestamping of transmission activities.

Item: ProtoGENI security: threats to resources and run-time interactions (University of Alabama Libraries, 2010)
Shalini, Fnu; Xiao, Yang; University of Alabama Tuscaloosa
The ever-extending threats to the Internet community, and the financial, physical, mental, and social damages they cause, have forced researchers to rethink the architecture, components, and services of the future Internet.
Security was not a concern when the existing Internet architecture was developed. There is a very high probability of attacking the current Internet without being caught, which supports the proliferation of cyber crimes. Security is one of the prime objectives of the future Internet, yet it remains a highly obscure term; maintaining the security of the Internet is a perpetual challenge, as attackers often have the intelligence and determination to break it. GENI is a virtual lab that provides the necessary resources and an environment close to the expected future Internet, so that researchers can test innovative ideas for developing a more secure, accountable, usable, and manageable future Internet. ProtoGENI is a prototype of GENI and is in operation for testing network research ideas. ProtoGENI requires rigorous observation and implementation improvements to achieve the security intended for GENI. The security of ProtoGENI is crucial, since experimental results can paint a false picture of security capabilities if they are obtained in an environment that can be manipulated by malicious users or that is inconsistent in its performance. Such weaknesses can drastically affect the security of the whole system and undermine the entire effort of developing a secure future Internet. This work is an effort to test and observe the existing security mechanisms and functioning of the ProtoGENI system, and to find exploitable attack loopholes. The initial experiments, results, and observations detail its functioning and security problems, which can be used to improve the overall ProtoGENI security architecture. Though security is a process and not a product, this work presents current security issues and suggestions for improving security settings across all components that work together to provide ProtoGENI facilities for testing innovative ideas for the future Internet. Threats to ProtoGENI resources and run-time interactions are the focus of this research work.
It explores the existing functioning and possible security weaknesses that could cause a non-functional, semi-functional, or malfunctioning system. Many observations made while executing experiments affect the performance of the system, and these observations can assist in improving overall ProtoGENI functionality. Results indicate that there are threats to resources and to run-time interactions between ProtoGENI components. Non-availability and non-usability of resources can severely affect network experiments. Cross-experiment communication is also possible in wireless Emulab experiments. An initial wireless communication analysis on Emulab provides details of wireless traffic behavior and traffic interference. Overall security at the host machine can be enhanced by modifying default security settings, including the SSH port number and root login rights. An alternative solution is provided for the default XML-RPC server settings to establish the initial setup for executing ProtoGENI experiments. These findings are subject to the timeline of the ProtoGENI and GENI projects. This work can assist novice ProtoGENI researchers in understanding the basic functionality, associated problems, and possible solutions. These initial findings on security issues in the existing ProtoGENI system, together with the observations, will assist in improving the overall security functionality of ProtoGENI.

Item: Eagle eye: an accountable logging framework for distributed systems (University of Alabama Libraries, 2010)
Kathiresshan, Nandhakumar; Xiao, Yang; University of Alabama Tuscaloosa
Security in computer systems has been a major concern since the very beginning. Although security has been addressed in various aspects, accountability is a mainstream aspect of security that is lacking in today's computer systems. The ability not just to detect errors but also to find the reason for a failure, and the system responsible, is crucial.
In this thesis, we study the various accountability tactics available, how each contributes to providing strong accountability, and their merits and tradeoffs. Accountability in distributed systems is a major issue that must be dealt with more effectively and efficiently. This thesis introduces Eagle Eye, a novel approach to overlaying accountability on distributed systems. It is a standalone application that does not merge with the application being monitored. Eagle Eye works by maintaining secure log files of all packets sent and received. Faults are detected using these recorded log files, both on demand and periodically. Eagle Eye can be used with a wide variety of applications, as it only requires that results be deterministic rather than arbitrary. Eagle Eye was applied to three different protocols over a peer-to-peer system and over the network file system, and was analyzed.

Item: On labeled paths (University of Alabama Libraries, 2010)
Wiegand, Nathan; Borie, Richard B.; University of Alabama Tuscaloosa
Labeled graph theory is the marriage of two problem domains common to computer science: graph theory and automata theory. Though each has been independently studied in depth, there has been little investigation of their intersection, labeled paths. This dissertation examines three results in the area of labeled path problems. The first result presents an empirical analysis of two context-free labeled all-pairs shortest-path algorithms using MapReduce as the experimental platform. The second and third results examine labeled paths in the context of formal languages beyond the context-free languages. The second result is a lower bound on the length of the longest shortest path when the formal language constraining the path is a member of the control language hierarchy.
Finally, the third result presents a labeled all-pairs shortest-path algorithm for each level of the infinite K Linear-Hierarchy.

Item: UbiMice: a fluid interaction model for multicomputer workspace (University of Alabama Libraries, 2010)
Liu, Li; Zhang, Jingyuan; University of Alabama Tuscaloosa
Nowadays it is not uncommon for people to have multiple computers in a workspace. A multicomputer workspace allows a user to perform various tasks in parallel. At the same time, there is a shortage of fluid interaction models for this workspace, while the excessive mice and keyboards attached to multiple computers bring chaos to the workspace and clutter the desktop. A user has to switch back and forth to interact with different workstations. Furthermore, information often needs to be exchanged among the computers. For information to cross the boundary between two computers, several technologies such as file sharing have already been developed. However, these technologies still force the hands to move between different keyboards and/or mice, which is interruptive. This dissertation proposes a fluid interaction model called UbiMice for the multicomputer workspace to address these issues. In the UbiMice model, co-located computers form a multicomputer workspace, and a user needs only one set of input devices to interact with any computer in the workspace. This eliminates the gaps of operating multiple input devices and the desktop clutter brought by an excess of keyboards and mice. An interaction focus is represented by a cursor, which can move from computer to computer, interact with any computer, and carry information among the computers. The proposed model also allows multiple cursors to be used by multiple users simultaneously for collaborative work. A security mechanism is provided with the model to protect information from unauthorized access in the multi-user case. The UbiMice model makes it possible to build a seamless workspace from multiple computers.
Traditional input devices are augmented with the capability of serving multiple computers. Users can interact with the workspace intuitively, as if they were interacting with a single computer. This dissertation studies the architecture of the UbiMice model together with a proof-of-concept implementation. The model finds many novel applications in different settings and benefits a wide range of user groups.

Item: Creation of crash-countermeasure police patrol routes targeting hotspot road segments (University of Alabama Libraries, 2010)
Steil, Dana Andrew; Parrish, Allen Scott; University of Alabama Tuscaloosa
This dissertation addresses the problem of expressing, executing, evaluating, and engaging patrol routing algorithms that target event hotspots on roadways. An "event hotspot" is a location that is over-represented in some event occurrence, such as crashes, citations, or any other event of interest. Recommended patrol routes can be used by organizations such as police agencies, emergency medical responders, and taxi services that patrol roadway segments at appropriate times to assist with or deter their target events. Patrol routing algorithms are used to specify the movements of simulated mobile agents on a roadway system. The patrol algorithms are first expressed using TURN (Technique for Ultimate Route Navigation), our extensible domain-specific language (DSL) created for this purpose. Algorithms specified in TURN syntax are then executed and evaluated in a custom simulation environment. Patrol routing algorithms deemed fit for a specific context are then engaged by users via a web-based geographic information system (GIS) portal. In this dissertation, details of the patrol routing model are followed by two case studies. The first case study evaluates agent response times to events when agents are dispatched from region-based staging points.
The second case study evaluates several nondeterministic highway patrol routing algorithms according to four metrics: response times, network coverage, hotspot coverage, and hotspot exposure. The case study results demonstrate the applicability of the patrol routing system.

Item: A data mining approach to identify perpetrators: an integration framework and case studies (University of Alabama Libraries, 2010)
Ding, Li; Dixon, Brandon; University of Alabama Tuscaloosa
Data mining and social network analysis have been widely used in law enforcement to solve crimes. Research questions such as the strength of ties in social networks, crime pattern discovery, and prioritizing offenders have been studied in this area. However, most of these studies failed to consider the noisy nature of the data, and the techniques they proposed have been applied only to small-scale data sets. It is therefore an important task to design a framework that can work on large-scale data sets and tolerate noisy data. In this dissertation, we built an integrated crime detection framework that combines two data mining techniques, decision trees and genetic algorithms, with graph theory to address these problems. Our crime pattern analysis is based on all offenders in the state of Alabama over the past 50 years, and our constructed social network contains all Alabama residents, which allows us to fully evaluate the proposed models. Two case studies were conducted to evaluate the framework. One is based on 625 inmates released from the Madison County jail in 2004; our experimental results show that our recommended risk level has a strong correlation with future offenses. The other case study is based on 100 real police reports.
The experimental results show that the median ranking of arrestees remains in the top 3% of the returned list.

Item: Cooperation paradigms for overcoming communication limitations in multirobot wide area coverage (University of Alabama Libraries, 2011)
Wellman, Briana Lowe; Anderson, Monica D.; University of Alabama Tuscaloosa
Multi-robot systems are an important research topic in wide area coverage applications such as hazardous waste clean-up, bomb detection, surveillance, and search and rescue missions. They can work in parallel and complete tasks faster than a single robot. Communication can support cooperation to speed up execution, reduce duplication, and prevent interference. Communication among team members is achieved explicitly or implicitly. In explicit communication, messages are intentionally transmitted and received from robot to robot. In implicit communication, robots observe the environment and other robots' actions. Although many systems use explicit communication, in the exploration of large, open areas (e.g., stadiums and parks), persistent intra-team digital communication is not guaranteed. Therefore, alternative approaches that do not rely on message passing throughout exploration are needed. Novel contributions to overcoming communication limitations in wide area coverage include: (1) insight into how information shared between robots that are close together has more influence on immediate action selection than information shared between robots that are farther apart, so that spatial and temporal locality can be instrumental in determining relevance in subsequent action selection; (2) an approach in which observation leverages spatial and temporal locality to infer state rather than relying on digital messaging; and (3) an approach in which robots use spatial rendezvous to exchange information instead of continuously passing messages: robots explore an environment in sectors, or designated areas, and periodically meet to communicate map information about what they explored.
Simulations and physical experiments were conducted, and the results suggest that both approaches can serve as alternatives to cooperation based on continuous point-to-point communication.

Item: Log analysis technique: Picviz (University of Alabama Libraries, 2011)
Sen, Shraddha Pradip; Xiao, Yang; University of Alabama Tuscaloosa
The log data generated during processes such as networking, web surfing, and failures is quite large. Such log data must be processed and analyzed so that it can be used to improve software quality and performance and to enable proactive fault detection and handling. A number of log analysis techniques have been presented over the years, and Picviz is one such technique. Picviz is a parallel coordinate plot used to display huge amounts of data for security purposes. The data can be displayed in multiple dimensions using the parallel coordinate system. The primary goal of this software is to ease the analysis of data and the discovery of correlations among the various variables. Since the software deals with huge amounts of data, representing all the information at once creates an image with congested or clustered lines. This makes it difficult to distinguish between lines and to obtain any kind of information from the image, which is the main objective of the software. The generated image is not clear enough to detect any correlation among the variables. This research work describes two methods (plugins) to solve this problem; the idea is to group lines into sets and represent each set as a single line. This reduces the total number of lines in the figure and makes it easier to read and understand.
The two methods are Grouping based on Comparison of Data and Grouping Consecutive Data.

Item: Code refactoring under constraints (University of Alabama Libraries, 2011)
Liang, Yan; Kraft, Nicholas A.; Smith, Randy K.; University of Alabama Tuscaloosa
Code refactoring is the process of changing the internal structure of a program without changing its external behavior. Most refactoring tools ensure behavior preservation by enforcing preconditions that must hold for a refactoring to be valid. However, their approaches have three drawbacks that make refactoring results far from satisfactory and reduce the use of refactoring tools in practice. First, programmers are not sure how code will be changed by these tools, due to the invisible refactoring rules hidden behind the tools' interfaces. Second, current refactoring tools have limited extensibility to accommodate new refactorings. Last, most refactoring tools lack mechanisms that allow programmers to specify their own preconditions to indicate which properties of a program are of interest. We consider refactoring a code change activity that, like other constraints imposed on code during software development and maintenance (such as naming rules), should be visible, easily extensible, and adaptable. It should also incorporate developers' opinions, the implementation style of existing code, and other good coding practices. We propose a model-based approach to precondition specification and checking in which preconditions can be declared explicitly and dynamically against a designated program metamodel and verified against concrete program models. This dissertation applies the model-based approach to precondition specification and checking to C++ source code refactoring. Based on an analysis of primitive refactorings, we design a C++ language metamodel to support constraint specification and code inspection for refactoring purposes.
We then specify preconditions for 18 primitive refactorings against the metamodel, with primary concerns of syntax error prevention and semantic preservation. The impact of a programmer's perspective on these specifications is discussed. As another example demonstrating the importance of supporting visible, extensible, and adaptable precondition specification and checking, we use the template method and singleton patterns to discuss how design patterns can affect refactoring decisions. We set up an experimental environment in which we build the language metamodel, develop a program model extraction tool, and simulate the process of precondition specification and verification following the proposed approach.

Item: Combining information retrieval modules and structural information for source code bug localization and feature location (University of Alabama Libraries, 2011)
Shao, Peng; Smith, Randy K.; Kraft, Nicholas A.; University of Alabama Tuscaloosa
Bug localization and feature location in source code are software evolution tasks in which developers use information about a bug or feature present in a software system to locate the source code elements, such as classes or methods, that must be modified either to correct the bug or to implement the feature. Automating bug localization and feature location is necessary due to the size and complexity of modern software systems. Recently, researchers have developed static bug localization and feature location techniques that use information retrieval methods, such as latent semantic indexing (LSI), to model lexical information from source code, such as identifiers and comments. This research presents a new technique, LSICG, which combines LSI, modeling lexical information, with call graphs, modeling structural information. The output is a list of methods ranked in descending order by the likelihood of requiring modification to correct the bug or implement the feature under consideration.
Three case studies, comparing LSI and LSICG at the method and class levels of granularity on 25 features in JavaHMO, 35 bugs in Rhino, and 3 features and 6 bugs in jEdit, demonstrate that the LSICG technique provides improved performance compared to LSI alone.
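As a rough illustration of the kind of lexical-plus-structural combination the item above describes, the sketch below ranks methods by text similarity to a query and boosts methods whose call-graph neighbors also match. Note the simplifications: it uses raw term-frequency cosine similarity rather than true LSI (which requires an SVD of the term-document matrix), and the corpus, call graph, and weighting scheme are invented, not taken from the LSICG paper.

```python
import math

def cosine(a, b):
    # Cosine similarity between two sparse term-frequency dicts.
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def tf(text):
    # Term frequencies over whitespace-split, lowercased tokens.
    freqs = {}
    for tok in text.lower().split():
        freqs[tok] = freqs.get(tok, 0) + 1
    return freqs

def rank(query, methods, call_graph, alpha=0.7):
    # Lexical score: similarity of the query to each method's
    # identifiers/comments. Structural score: mean lexical score of
    # the method's call-graph neighbors. alpha blends the two.
    lexical = {m: cosine(tf(query), tf(src)) for m, src in methods.items()}
    combined = {}
    for m in methods:
        neighbors = call_graph.get(m, [])
        structural = (sum(lexical[n] for n in neighbors) / len(neighbors)
                      if neighbors else 0.0)
        combined[m] = alpha * lexical[m] + (1 - alpha) * structural
    # Methods most likely to need modification come first.
    return sorted(combined, key=combined.get, reverse=True)
```

On a toy corpus, a method with no lexical overlap with the query can still outrank an unrelated one purely because it calls a strongly matching method, which is the intuition behind adding structural information to a purely lexical ranking.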