一、Comprehensive Knowledge of Computer Science

  1. Describe your understanding of software architecture.
  2. Introduce the current mainstream software development platforms, and briefly describe the software project you took part in developing that impressed you most.
  3. Describe your understanding of wireless sensor networks and their applications.
  4. Describe current new technologies in wired networks and their applications.
  5. Describe your understanding of Web 2.0.
  6. From the perspective of computer technology, propose your own entrepreneurial idea and analyze the technical challenges it involves.
  7. Describe your understanding of, and expectations for, the research direction you are applying to.

二、Professional English Translation

In information technology, big data is a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. The challenges include capture, curation, storage, search, sharing, analysis, and visualization. The trend to larger data sets is due to the additional information derivable from analysis of a single large set of related data, as compared to separate smaller sets with the same total amount of data, allowing correlations to be found to “spot business trends, determine quality of research, prevent diseases, link legal citations, combat crime, and determine real-time roadway traffic conditions.”

在信息技术中,大数据指规模庞大且复杂、以致难以用现有数据库管理工具或传统数据处理应用程序进行处理的数据集合。其挑战包括捕获、管理、存储、搜索、共享、分析和可视化。数据集趋向于更大规模,是因为与数据总量相同的多个独立小数据集相比,对单个大型相关数据集进行分析可以获得额外信息,从而能够发现相关性,用以“发现业务趋势、确定研究质量、预防疾病、关联法律引文、打击犯罪以及确定实时道路交通状况”。

As of 2012, limits on the size of data sets that are feasible to process in a reasonable amount of time were on the order of exabytes of data. Scientists regularly encounter limitations due to large data sets in many areas, including meteorology, genomics, connectomics, complex physics simulations, and biological and environmental research. The limitations also affect Internet search, finance and business informatics. Data sets grow in size in part because they are increasingly being gathered by ubiquitous information-sensing mobile devices, aerial sensory technologies (remote sensing), software logs, cameras, microphones, radio-frequency identification readers, and wireless sensor networks. The world’s technological per-capita capacity to store information has roughly doubled every 40 months since the 1980s; as of 2012, every day 2.5 quintillion (2.5×10¹⁸) bytes of data were created. The challenge for large enterprises is determining who should own big data initiatives that straddle the entire organization.

截至2012年,在合理时间内可处理的数据集规模上限约为EB级。在气象学、基因组学、连接组学、复杂物理模拟以及生物和环境研究等许多领域,科学家经常因大型数据集而受到限制。这些限制同样影响互联网搜索、金融和商业信息学。数据集规模不断增长,部分原因是它们越来越多地由无处不在的信息感知移动设备、航空遥感技术(遥感)、软件日志、摄像头、麦克风、射频识别读取器和无线传感器网络收集。自20世纪80年代以来,全世界人均信息存储的技术能力大约每40个月翻一番;截至2012年,每天产生2.5×10¹⁸字节的数据。大型企业面临的挑战在于,确定由谁来主导横跨整个组织的大数据计划。