73. Microsoft SharePoint
Microsoft SharePoint Server provides many different types of logs, many of which are configurable. Logs are written to files, databases, and the Windows EventLog. NXLog can be configured to collect these logs, as is shown in the following sections.
See Monitoring and Reporting in SharePoint Server on TechNet for more information about SharePoint logging.
73.1. Diagnostic Logs
SharePoint diagnostic logs are handled by the Unified Logging Service (ULS), the primary logging mechanism in SharePoint. The ULS writes events to the Windows EventLog and to trace log files. The EventLog and trace log levels of each category or subcategory can be adjusted individually.
The trace log files are generated by and stored locally on each server running SharePoint in the farm, using file names containing the server hostname and timestamp (HOSTNAME-YYYYMMDD-HHMM.log). SharePoint trace logs are created at regular intervals and whenever there is an IISRESET. It is common for many trace logs to be generated within a 24-hour period.
If configured in the farm settings, each SharePoint server also writes trace logs to the logging database. These logs are written by the Diagnostic Data Provider: Trace Log job. NXLog can be configured to collect these logs from the logging database.
For more information about diagnostic logging, see Configure diagnostic logging in SharePoint Server on TechNet.
73.1.1. ULS Trace Log Format
The Unified Logging Service (ULS) trace log files are tab-delimited.
Timestamp ⇥Process ⇥TID ⇥Area ⇥Category ⇥EventID⇥Level ⇥Message ⇥Correlation
10/12/2017 16:02:18.30 ⇥hostcontrollerservice.exe (0x0948) ⇥0x191C⇥SharePoint Foundation ⇥Topology ⇥aup1c⇥Medium ⇥Current app domain: hostcontrollerservice.exe (1)
10/12/2017 16:02:18.30 ⇥OWSTIMER.EXE (0x11B8) ⇥0x1AB4⇥SharePoint Foundation ⇥Config DB ⇥azcxo⇥Medium ⇥SPPersistedObjectCollectionCache: Missed memory and file cache, falling back to SQL query. CollectionType=Children, ObjectType=Microsoft.SharePoint.Administration.SPWebApplication, CollectionParentId=30801f0f-cca6-40bc-9f30-5a4608bbb420, Object Count=1, Stack= at Microsoft.SharePoint.Administration.SPPersistedObjectCollectionCache.Get[T](SPPersistedObjectCollection`1 collection) at Microsoft.SharePoint.Administration.SPConfigurationDatabase.Microsoft.SharePoint.Administration.ISPPersistedStoreProvider.GetBackingList[U](SPPersistedObjectCollection`1 persistedCollection) at Microsoft.SharePoint.Administration.SPPersistedObjectCollection`1.get_BackingList() at Microsoft.SharePoint.Administration.SPPersistedObjectCollection`1.<GetEnumeratorImpl>d__0.MoveNext() at Microsoft.Sh...
10/12/2017 16:02:18.30*⇥OWSTIMER.EXE (0x11B8) ⇥0x1AB4⇥SharePoint Foundation ⇥Config DB ⇥azcxo⇥Medium ⇥...arePoint.Utilities.SPServerPerformanceInspector.GetLocalWebApplications() at Microsoft.SharePoint.Utilities.SPServerPerformanceInspector..ctor() at Microsoft.SharePoint.Utilities.SPServerPerformanceInspector..cctor() at Microsoft.SharePoint.Administration.SPTimerStore.InitializeTimer(Int64& cacheVersion, Object& jobDefinitions, Int32& timerMode, Guid& serverId, Boolean& isServerBusy) at Microsoft.SharePoint.Administration.SPNativeConfigurationProvider.InitializeTimer(Int64& cacheVersion, Object& jobDefinitions, Int32& timerMode, Guid& serverId, Boolean& isServerBusy)
The ULS log file contains the following fields.
- Timestamp: When the event was logged, in local time
- Process: Image name of the process logging its activity, followed by its process ID (PID) in parentheses
- TID: Thread ID
- Area: Component that produced the event (SharePoint Portal Server, SharePoint Server Search, etc.)
- Category: Detailed category of the event (Topology, Taxonomy, User Profiles, etc.)
- EventID: Internal event ID
- Level: Log level of the message (Critical, Unexpected, High, etc.)
- Message: The message from the application
- Correlation: Unique GUID-based ID, generated for each request received by the SharePoint server (unique to each request, not each error)
As shown by the second and third events in the log sample above, long messages span multiple records. In this case, the timestamp of each subsequent record is followed by an asterisk (*). However, trace log messages are not guaranteed to appear consecutively within the trace log. See Writing to the Trace Log on MSDN.
73.1.2. Configuring Diagnostic Logging
Adjust the log levels, trace log retention policy, and trace log location as follows.
Warning: The diagnostic logging settings are farm-wide.
1. Log in to Central Administration and go to the diagnostic logging settings page.
2. In the Event Throttling section, use the checkboxes to select a set of categories or subcategories for which to modify the logging level. Expand categories as necessary to view the corresponding subcategories.
3. Set the event log and trace log levels for the selected categories or subcategories.
   Warning: Only select the Verbose level for troubleshooting, as a large number of logs will be generated.
4. To set other levels for other categories or subcategories, click OK and repeat from step 1.
5. In the Trace Log section, adjust the trace log path and retention policy as required. The specified log location must exist on all servers in the farm.
6. Click OK to apply the settings.
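The same farm-wide settings can also be applied with the SharePoint Management Shell. The following is a minimal sketch; the category, severity levels, log location, and retention period shown are example values, not recommendations.

```powershell
# Load the SharePoint cmdlets (not needed inside the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell

# Set the EventLog and trace log levels for a category (example values)
Set-SPLogLevel -Identity "SharePoint Foundation:Topology" `
               -EventSeverity Warning -TraceSeverity Medium

# Set the trace log location and retention policy (example values)
Set-SPDiagnosticConfig -LogLocation "D:\SharePointLogs" -DaysToKeepLogs 14
```

As with the Central Administration steps above, the specified log location must exist on all servers in the farm.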
Further steps are required to enable writing trace logs to the logging database. For configuring the logging database itself (server, name, and authentication), see the Configuring Usage Logging section.
1. Log in to Central Administration and go to the timer job definitions page.
2. Click the Diagnostic Data Provider: Trace Log job.
3. Click the Enable button to enable the job.
4. Open the Diagnostic Data Provider: Trace Log job again and click Run Now to run the job immediately.
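The timer job can also be enabled and run from PowerShell instead of Central Administration; the following is a minimal sketch.

```powershell
# Load the SharePoint cmdlets (not needed inside the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell

# Find the trace log collection job by its display name
$job = Get-SPTimerJob | Where-Object {
    $_.DisplayName -eq "Diagnostic Data Provider: Trace Log"
}

# Enable the job, then run it immediately
Enable-SPTimerJob $job
Start-SPTimerJob $job
```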
73.1.3. Collecting Diagnostic Logs
The xm_csv module can be used to parse the tab-delimited trace log files on the local server.
This configuration collects logs from the ULS trace log files and uses xm_csv to parse them. $EventTime and $Hostname fields are added to the event record. Each event is converted to JSON format and written to a file.
Note: The defined SHAREPOINT_LOGS path should be set to the trace log file directory configured in the Configuring Diagnostic Logging section.
define SHAREPOINT_LOGS C:\Program Files\Common Files\microsoft shared\Web Server \
Extensions\16\LOGS
<Extension json>
Module xm_json
</Extension>
<Extension uls_parser>
Module xm_csv
Fields Timestamp, Process, TID, Area, Category, EventID, Level, Message, \
Correlation
Delimiter \t
</Extension>
<Input trace_file>
Module im_file
# Use a file mask to read from ULS trace log files only
File '%SHAREPOINT_LOGS%\*-????????-????.log'
<Exec>
# Drop header lines and empty lines
if $raw_event =~ /^(\xEF\xBB\xBF|Timestamp)/ drop();
else
{
# Remove extra spaces
$raw_event =~ s/ +(?=\t)//g;
# Parse with uls_parser instance defined above
uls_parser->parse_csv();
# Set $EventTime field (second precision only)
$EventTime = strptime($Timestamp, "%m/%d/%Y %H:%M:%S");
# Add $Hostname field
$Hostname = hostname_fqdn();
}
</Exec>
</Input>
<Output out>
Module om_file
File 'C:\logs\uls.json'
Exec to_json();
</Output>
{
"EventReceivedTime": "2017-10-12 16:02:20",
"SourceModuleName": "uls",
"SourceModuleType": "im_file",
"Timestamp": "10/12/2017 16:02:18.30",
"Process": "hostcontrollerservice.exe (0x0948)",
"TID": "0x191C",
"Area": "SharePoint Foundation",
"Category": "Topology",
"EventID": "aup1c",
"Level": "Medium",
"Message": "Current app domain: hostcontrollerservice.exe (1)",
"EventTime": "2017-10-12 16:02:18",
"Hostname": "WIN-SHARE.test.com"
}
The im_odbc module can be used to collect diagnostic logs from the farm-wide logging database.
The following Input configuration collects logs from the ULSTraceLog view in the WSS_UsageApplication database.
Note: The datetime data type is not timezone-aware, and the timestamps are stored in UTC. Therefore, an offset is applied when setting the $EventTime field in the configuration below.
<Input trace_db>
Module im_odbc
ConnectionString Driver={ODBC Driver 13 for SQL Server};\
SERVER=SHARESERVE1;DATABASE=WSS_UsageApplication;\
Trusted_Connection=yes
IdType timestamp
# With ReadFromLast and MaxIdSQL, NXLog will start reading from the last
# record when reading from the database for the first time.
#ReadFromLast TRUE
#MaxIdSQL SELECT MAX(LogTime) AS maxid FROM dbo.ULSTraceLog
SQL SELECT LogTime AS id, * FROM dbo.ULSTraceLog \
WHERE LogTime > CAST(? AS datetime)
<Exec>
# Set $EventTime with correct time zone, remove incorrect fields
$EventTime = parsedate(strftime($id, '%Y-%m-%d %H:%M:%SZ'));
delete($id);
delete($LogTime);
</Exec>
</Input>
See the Windows EventLog section below for an example configuration that reads events from the Windows EventLog.
73.2. Usage and Health Data Logs
SharePoint also collects usage and health data to show how it is used. The system generates health and administrative reports from these logs. Usage and health data logs are written as tab-delimited data to various *.usage files in the configured log location path, and also to the logging database.
FarmId⇥UserLogin⇥SiteSubscriptionId⇥TimestampUtc⇥CorrelationId⇥Action⇥Target⇥Details
42319181-e881-44f1-b422-d7ab5f8b0117⇥TEST\Administrator⇥00000000-0000-0000-0000-000000000000⇥2017-10-17 23:15:26.667⇥00000000-0000-0000-0000-000000000000⇥Administration.Feature.Install⇥AccSrvRestrictedList⇥{"Id":"a4d4ee2c-a6cb-4191-ab0a-21bb5bde92fb"}
42319181-e881-44f1-b422-d7ab5f8b0117⇥TEST\Administrator⇥00000000-0000-0000-0000-000000000000⇥2017-10-17 23:15:26.839⇥00000000-0000-0000-0000-000000000000⇥Administration.Feature.Install⇥ExpirationWorkflow⇥{"Id":"c85e5759-f323-4efb-b548-443d2216efb5"}
For more information, see Overview of monitoring in SharePoint Server on TechNet.
73.2.1. Configuring Usage Logging
Usage and health data collection can be enabled and configured as follows. For more information about configuring usage and health data logging, see Configure usage and health data collection in SharePoint Server on TechNet.
Warning: The usage and health data collection settings are farm-wide.
1. Log in to Central Administration and go to the usage and health data collection settings page.
2. In the Usage Data Collection section, check Enable usage data collection.
3. In the Event Selection section, use the checkboxes to select the required event categories. It is recommended to enable only those categories for which regular reports are required.
4. In the Usage Data Collection Settings section, specify the path for the usage log files. The specified log location must exist on all servers in the farm.
5. In the Health Data Collection section, check Enable health data collection. Click Health Logging Schedule to edit the job definitions for the Microsoft SharePoint Foundation Timer service.
6. Click the Log Collection Schedule link to edit the job definitions for the Microsoft SharePoint Foundation Usage service.
7. In the Logging Database Server section, adjust the authentication method as required. To change the database server and name, see Log usage data in a different logging database by using Windows PowerShell on TechNet.
8. Click OK to apply the settings.
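The usage data collection settings can also be applied with the Set-SPUsageService cmdlet; the following is a minimal sketch, with an example log location and size limit.

```powershell
# Load the SharePoint cmdlets (not needed inside the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell

# Enable usage logging and set the log location and size limit (example values)
Set-SPUsageService -LoggingEnabled 1 `
                   -UsageLogLocation "D:\SharePointUsageLogs" `
                   -UsageLogMaxSpaceGB 2
```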
73.2.2. Collecting Usage Logs
The xm_csv module can be used to parse the tab-delimited usage and health log files on the local server.
This configuration collects logs from the AdministrativeActions usage log file (see Using Administrative Actions logging in SharePoint Server 2016 on TechNet) and uses xm_csv to parse them. $EventTime and $Hostname fields are added to the event record. Each event is converted to JSON format and written to a file.
Note: The defined SHAREPOINT_LOGS path should be set to the usage log file directory configured in the Configuring Usage Logging section.
Note: Unlike the diagnostic/trace logs, the various usage/health data categories generate logs with differing field sets. Therefore, it is not practical to parse multiple types of usage/health logs with a single xm_csv parser instance.
define SHAREPOINT_LOGS C:\Program Files\Common Files\microsoft shared\Web Server \
Extensions\16\LOGS
<Extension json>
Module xm_json
</Extension>
<Extension admin_actions_parser>
Module xm_csv
Fields FarmId, UserLogin, SiteSubscriptionId, TimestampUtc, \
CorrelationId, Action, Target, Details
Delimiter \t
</Extension>
<Input admin_actions_file>
Module im_file
# Use a file mask to read from the USAGE files only
File '%SHAREPOINT_LOGS%\AdministrativeActions\*.usage'
<Exec>
# Drop header lines and empty lines
if $raw_event =~ /^(\xEF\xBB\xBF|FarmId)/ drop();
else
{
# Parse with parser instance defined above
admin_actions_parser->parse_csv();
# Set $EventTime field
$EventTime = parsedate($TimestampUtc + "Z");
# Add $Hostname field
$Hostname = hostname_fqdn();
}
</Exec>
</Input>
<Output out>
Module om_file
File 'C:\logs\uls.json'
Exec to_json();
</Output>
{
"EventReceivedTime": "2017-10-17 20:46:14",
"SourceModuleName": "admin_actions",
"SourceModuleType": "im_file",
"FarmId": "42319181-e881-44f1-b422-d7ab5f8b0117",
"UserLogin": "TEST\\Administrator",
"SiteSubscriptionId": "00000000-0000-0000-0000-000000000000",
"TimestampUtc": "2017-10-17 23:15:26.667",
"CorrelationId": "00000000-0000-0000-0000-000000000000",
"Action": "Administration.Feature.Install",
"Target": "AccSrvRestrictedList",
"Details": {
"Id": "a4d4ee2c-a6cb-4191-ab0a-21bb5bde92fb"
},
"EventTime": "2017-10-17 16:15:26",
"Hostname": "WIN-SHARE.test.com"
}
The im_odbc module can be used to collect usage and health logs from the farm-wide logging database.
The following Input configuration collects Administrative Actions logs from the AdministrativeActions view in the WSS_UsageApplication database.
Note: The datetime data type is not timezone-aware, and the timestamps are stored in UTC. Therefore, an offset is applied when setting the $EventTime field in the configuration below.
<Input admin_actions_db>
Module im_odbc
ConnectionString Driver={ODBC Driver 13 for SQL Server};\
SERVER=SHARESERVE1;DATABASE=WSS_UsageApplication;\
Trusted_Connection=yes
IdType timestamp
# With ReadFromLast and MaxIdSQL, NXLog will start reading from the last
# record when reading from the database for the first time.
#ReadFromLast TRUE
#MaxIdSQL SELECT MAX(LogTime) AS maxid FROM dbo.AdministrativeActions
SQL SELECT LogTime AS id, * FROM dbo.AdministrativeActions \
WHERE LogTime > CAST(? AS datetime)
<Exec>
# Set $EventTime with correct time zone, remove incorrect fields
$EventTime = parsedate(strftime($id, '%Y-%m-%d %H:%M:%SZ'));
delete($id);
delete($LogTime);
</Exec>
</Input>
See the Windows EventLog section for an example configuration that reads events from the Windows EventLog.
73.3. Audit Logs
SharePoint Information Management provides an audit feature that allows tracking of user actions on a site’s content. The audit events are stored in the dbo.AuditData table in the WSS_Content database. The events can be collected via the SharePoint API or by reading the database directly.
Audit logging is disabled by default, and can be enabled on a per-site basis. To enable audit logging, follow these steps. For more details, see the Configure audit settings for a site collection article on Office Support.
1. Log in to Central Administration and go to the site collection policy settings page.
2. Verify that the Auditing policy is set to Available.
3. On the site collection home page, click Site actions (gear icon), then Site settings.
4. On the Site Settings page, in the Site Collection Administration section, click Site collection audit settings.
   Note: If the Site Collection Administration section is not shown, make sure you have adequate permissions.
5. Set audit log trimming settings, select the events to audit, and click OK.
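Auditing can also be enabled programmatically through the SharePoint object model. The following is a minimal sketch, assuming a hypothetical site collection URL; the audited event types shown (View and Update) are example values.

```powershell
# Load the SharePoint cmdlets (not needed inside the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell

# The site collection URL below is a placeholder for this example
$site = Get-SPSite "http://shareserve1/sites/example"

# Enable auditing of View and Update events for the site collection
$site.Audit.AuditFlags = [Microsoft.SharePoint.SPAuditMaskType]::View -bor
                         [Microsoft.SharePoint.SPAuditMaskType]::Update
$site.Audit.Update()
$site.Dispose()
```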
73.3.1. Reading Audit Logs via the API
A PowerShell script can be used to collect audit logs via SharePoint’s API.
In order for NXLog to have SharePoint Shell access when running as a service, run the following PowerShell commands. This will add the NT AUTHORITY\SYSTEM user to the SharePoint_Shell_Access role for the SharePoint configuration database.
PS> Add-PSSnapin Microsoft.SharePoint.Powershell
PS> Add-SPShellAdmin -UserName "NT AUTHORITY\SYSTEM"
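The collector script itself is not shown here; the following is a minimal sketch of how audit entries can be queried through the SharePoint object model, assuming a hypothetical site collection URL and a one-hour collection window. A script along these lines could be invoked periodically and its output collected by NXLog.

```powershell
# Load the SharePoint cmdlets (not needed inside the SharePoint Management Shell)
Add-PSSnapin Microsoft.SharePoint.PowerShell

# The site collection URL below is a placeholder for this example
$site = Get-SPSite "http://shareserve1/sites/example"

# Query audit entries from the last hour (example window)
$query = New-Object Microsoft.SharePoint.SPAuditQuery($site)
$query.SetRangeStart([DateTime]::UtcNow.AddHours(-1))
$entries = $site.Audit.GetEntries($query)

# Emit one tab-delimited record per audit entry; adjust fields as needed
foreach ($entry in $entries) {
    "{0}`t{1}`t{2}`t{3}" -f $entry.Occurred, $entry.UserId,
                            $entry.Event, $entry.DocLocation
}
$site.Dispose()
```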
73.3.2. Reading Audit Logs From the Database
It is also possible to read the audit logs directly from the SharePoint database.
This configuration collects audit events from the AuditData table in the WSS_Content database.
Note: The datetime data type is not timezone-aware, and the timestamps are stored in UTC. Therefore, an offset is applied when setting the $EventTime field in the configuration below.
<Input audit_db>
Module im_odbc
ConnectionString Driver={ODBC Driver 13 for SQL Server}; \
Server=SHARESERVE1; Database=WSS_Content; \
Trusted_Connection=yes
IdType timestamp
# With ReadFromLast and MaxIdSQL, NXLog will start reading from the last
# record when reading from the database for the first time.
#ReadFromLast TRUE
#MaxIdSQL SELECT MAX(Occurred) AS maxid FROM dbo.AuditData
SQL SELECT Occurred AS id, * FROM dbo.AuditData \
WHERE Occurred > CAST(? AS datetime)
<Exec>
# Set $EventTime with correct time zone, remove incorrect fields
$EventTime = parsedate(strftime($id, '%Y-%m-%d %H:%M:%SZ'));
delete($id);
delete($Occurred);
</Exec>
</Input>
73.4. Windows EventLog
SharePoint generates Windows event logs according to the configured diagnostic log levels (see the Diagnostic Logs section). NXLog can be configured to collect logs from the Windows EventLog as shown below. For more information about collecting Windows EventLog events with NXLog, see the Windows Event Log chapter.
This configuration uses the im_msvistalog module to collect all logs from four SharePoint crimson channels, as well as Application channel events of Warning or higher level. The Application channel will include other non-SharePoint events. There may be other SharePoint events generated which will not be collected with this query, depending on the configuration and the channels used.
<Input eventlog>
Module im_msvistalog
<QueryXML>
<QueryList>
<Query Id="0" Path="Application">
<Select Path="Application">
*[System[(Level=1 or Level=2 or Level=3)]]</Select>
<Select Path="System">
*[System[(Level=1 or Level=2 or Level=3)]]</Select>
<Select Path="Microsoft-Office Server-Search/Operational">
*</Select>
<Select Path="Microsoft-Office-EduServer Diagnostics">*</Select>
<Select Path="Microsoft-SharePoint Products-Shared/Operational">
*</Select>
<Select Path="Microsoft-SharePoint Products-Shared/Audit">*</Select>
</Query>
</QueryList>
</QueryXML>
</Input>
73.5. IIS Logs
SharePoint uses Internet Information Services (IIS) to serve the configured sites as well as the Central Administration site. IIS generates its own logs.
See the Microsoft IIS chapter for more information about collecting events from IIS with NXLog.